Push a dataframe from Vertex Notebook to BigQuery
2 min read · May 24, 2024
Hey guys, in today’s very short blog we will see how to push a dataframe from a Vertex AI notebook to BigQuery. So without further ado, let’s do it…
Read the full article here — https://machinelearningprojects.net/push-a-dataframe-from-vertex-notebook-to-bigquery/
Steps to Push a dataframe from Vertex Notebook to BigQuery
- Open a new Python notebook and paste the code below into it.
- Change the placeholder variables like ‘Database_Name’ and ‘Table_Name’ (in BigQuery terms, the dataset name and table name).
- You also need a KMS key, which BigQuery uses to encrypt the destination table (customer-managed encryption, CMEK). Ask your DevOps team for it.
- ‘df’ is the DataFrame that you need to push.
- And finally, run the code.
Code
# Push a dataframe from Vertex Notebook to BigQuery
from google.cloud import bigquery

def push(dbname, tablename, df):
    kms_key_name = 'your-kms-key'  # CMEK used to encrypt the destination table
    project_name = 'project_name'  # something like prj-abc-data-prod
    job_config = bigquery.LoadJobConfig(
        destination_encryption_configuration=bigquery.EncryptionConfiguration(kms_key_name=kms_key_name)
    )
    client = bigquery.Client()
    datapath = f'{project_name}.{dbname}.{tablename}'
    # Truncated in the original; WRITE_TRUNCATE (overwrite the table) is an
    # assumed choice here, WRITE_APPEND is the other common option
    job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE
    # Load the dataframe into the table and wait for the job to finish
    job = client.load_table_from_dataframe(df, datapath, job_config=job_config)
    job.result()
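Before submitting the actual load job, it can help to sanity-check the fully qualified table ID and the column names locally, since this needs no GCP credentials. A minimal sketch, where the project, dataset, table, and column names are all hypothetical placeholders:

```python
import re

# Hypothetical names; replace with your own project, dataset, and table
project_name = 'project_name'  # something like prj-abc-data-prod
dbname, tablename = 'my_dataset', 'my_table'

# BigQuery addresses tables as 'project.dataset.table'
datapath = f'{project_name}.{dbname}.{tablename}'
print(datapath)

# Classic BigQuery column names allow letters, digits, and underscores,
# and must not start with a digit; check df.columns before loading
valid = re.compile(r'[A-Za-z_][A-Za-z0-9_]*')
columns = ['user_id', 'signup_date']  # stand-in for df.columns
assert all(valid.fullmatch(c) for c in columns)
```

Once the ID and columns look right, call push(dbname, tablename, df) to run the load.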