
How To Upload A Local Csv To Google Big Query Using Python

I'm trying to upload a local CSV to Google BigQuery using Python:

def uploadCsvToGbq(self, table_name):
    load_config = {
        'destinationTable': {
            'projectId': self.project

Solution 1:

First, install or upgrade the Google API Python client:

pip install --upgrade google-api-python-client

Then at the top of your Python file write:

from googleapiclient.http import MediaFileUpload

But take care not to miss any parentheses. Better to write:

result = bigquery.jobs().insert(
    projectId=PROJECT_ID,
    body={'jobReference': {'jobId': job_id},
          'configuration': {'load': load_config}},
    media_body=upload,
).execute(num_retries=5)

And by the way, you are going to upload all of your CSV rows, including the header row that defines the columns.
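For context, here is a minimal sketch of the pieces the snippet above assumes; the project, dataset, table, and file path are placeholders, and skipLeadingRows is the load option that tells BigQuery to drop that header row:

from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

PROJECT_ID = 'my-project'            # placeholder project id
bigquery = build('bigquery', 'v2')   # uses your default credentials

load_config = {
    'destinationTable': {
        'projectId': PROJECT_ID,
        'datasetId': 'my_dataset',   # placeholder dataset
        'tableId': 'my_table',       # placeholder table
    },
    'sourceFormat': 'CSV',
    'skipLeadingRows': 1,            # drop the header row
}

# Wrap the local CSV for a resumable media upload, then pass it as
# media_body to bigquery.jobs().insert() as shown above
upload = MediaFileUpload('/path/to/filename.csv',
                         mimetype='application/octet-stream',
                         resumable=True)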

Solution 2:

One of the easiest ways to upload a CSV file to GBQ is through pandas. Just read the CSV file into pandas with pd.read_csv(), then push the DataFrame to GBQ with df.to_gbq(full_table_id, project_id=project_id).

import pandas as pd

# Read the local CSV into a DataFrame
df = pd.read_csv('/..localpath/filename.csv')

# full_table_id takes the form 'dataset.table'
df.to_gbq(full_table_id, project_id=project_id)

Or you can use the BigQuery client API:

from google.cloud import bigquery
import pandas as pd

df = pd.read_csv('/..localpath/filename.csv')

# Create a client and point it at the target dataset and table
client = bigquery.Client()
dataset_ref = client.dataset('my_dataset')
table_ref = dataset_ref.table('new_table')

# Load the DataFrame and wait for the job to finish
client.load_table_from_dataframe(df, table_ref).result()
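If you do not need pandas at all, the same client can load the CSV file directly; this is a sketch assuming the dataset and table above and a header row in the file:

from google.cloud import bigquery

client = bigquery.Client()
table_ref = client.dataset('my_dataset').table('new_table')

# Let BigQuery infer the schema and skip the header row
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)

with open('/..localpath/filename.csv', 'rb') as source_file:
    load_job = client.load_table_from_file(source_file, table_ref, job_config=job_config)
load_job.result()  # wait for the load to complete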

