
I need to automate uploading local csv file into Google Cloud storage bucket in Python. Which Python library can I use? Any sample code would be much appreciated.

4 Answers

7

We can use the Google API Python client (google-api-python-client) to upload files to Google Cloud Storage.

First, install the API client as follows.

>pip install --upgrade google-api-python-client

Then, set up Application Default Credentials (the command below was originally under gcloud beta but has since graduated).

>gcloud auth application-default login

Below is sample code that uploads a local file to Google Cloud Storage using Application Default Credentials.

from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

# Use the Application Default Credentials set up in the previous step.
credentials = GoogleCredentials.get_application_default()
service = discovery.build('storage', 'v1', credentials=credentials)

filename = 'C:\\MyFiles\\sample.csv'
bucket = 'my_bucket'

# 'name' is the destination object name inside the bucket.
body = {'name': 'dest_file_name.csv'}
req = service.objects().insert(bucket=bucket, body=body, media_body=filename)
resp = req.execute()

This uploads the file into my_bucket. The full Cloud Storage URL for the uploaded object is gs://my_bucket/dest_file_name.csv
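The response returned by execute() is the created object's resource (a dict containing bucket, name, size, and so on). As a hedged sketch, the steps above can be wrapped in a small reusable helper — upload_csv and its parameter names are illustrative, not part of the API:

```python
def upload_csv(service, bucket, local_path, dest_name):
    """Upload a local file through an authorized storage v1 service
    object and return the resulting gs:// URI.

    Illustrative helper, not part of the client library's API.
    """
    body = {'name': dest_name}  # destination object name in the bucket
    req = service.objects().insert(bucket=bucket, body=body,
                                   media_body=local_path)
    resp = req.execute()  # resource dict for the created object
    return 'gs://{}/{}'.format(resp['bucket'], resp['name'])
```

With the service built as above, `upload_csv(service, 'my_bucket', 'C:\\MyFiles\\sample.csv', 'dest_file_name.csv')` would return the gs:// URI of the new object.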


2

Another way is to use the Cloud Storage client library directly.

First, connect to Cloud Storage. In current releases the package is google-cloud-storage and the import is google.cloud; older versions used from gcloud import storage.

from google.cloud import storage
client = storage.Client()

Then select the bucket and choose the destination (remote) filename.

bucket = client.get_bucket('<your-bucket-name>')
blob = bucket.blob('remote_file.txt')

Finally, upload the local file. I prefer the following way, but there are alternatives.

blob.upload_from_filename('local_file.txt')

If your data is already in a variable, the line above would force you to write it to disk first and then upload, which may not be ideal. Instead, you can write to the blob directly from a string.

blob.upload_from_string('this is test content!')
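Building on that, here is a short hedged sketch: assemble a CSV payload entirely in memory with the standard csv module and hand it straight to upload_from_string (blob is the object created above; rows_to_csv_string is an illustrative helper, not a library function):

```python
import csv
import io

def rows_to_csv_string(rows):
    """Serialize rows (lists of values) to CSV text in memory.

    Illustrative helper; avoids writing a temp file before uploading.
    """
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    return buf.getvalue()

payload = rows_to_csv_string([['id', 'name'], [1, 'alice']])
# blob.upload_from_string(payload, content_type='text/csv')  # needs credentials
```

The content_type argument sets the object's Content-Type metadata so the file is served as CSV rather than plain text.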


-1

You don't need any client library at all. You can create a POST request to the method's /upload URI and add the query parameter

uploadType=media

For example:

POST https://www.googleapis.com/upload/storage/v1/b/myBucket/o?uploadType=media

Add a name query parameter to identify which resource the upload is associated with.

For example, to specify that an object's name is myObject:

POST https://www.googleapis.com/upload/storage/v1/b/myBucket/o?uploadType=media&name=myObject

Add the file's data to the request body. Add the following HTTP headers:

  • Content-Type. Set to the MIME media type of the object being uploaded.
  • Content-Length. Set to the number of bytes you are uploading. This header is not required if you are using chunked transfer encoding.

Example upload:

POST https://www.googleapis.com/upload/storage/v1/b/myBucket/o?uploadType=media&name=myObject HTTP/1.1
Content-Type: text/csv
Content-Length: [NUMBER_OF_BYTES_IN_FILE]
Authorization: Bearer [YOUR_AUTH_TOKEN]

[DATA]
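In Python, this raw request can be sketched with the standard library alone (hedged: media_upload_url is an illustrative helper, and TOKEN stands in for an OAuth2 access token you must obtain yourself, e.g. via gcloud auth print-access-token):

```python
from urllib.parse import quote, urlencode

def media_upload_url(bucket, object_name):
    """Build the simple ("media") upload URI of the JSON API shown above.

    Illustrative helper; quoting keeps bucket/object names URL-safe.
    """
    return ('https://www.googleapis.com/upload/storage/v1/b/'
            + quote(bucket, safe='') + '/o?'
            + urlencode({'uploadType': 'media', 'name': object_name}))

# Sending it (not executed here; TOKEN is a placeholder access token):
# import requests
# with open('sample.csv', 'rb') as f:
#     requests.post(media_upload_url('myBucket', 'sample.csv'), data=f,
#                   headers={'Content-Type': 'text/csv',
#                            'Authorization': 'Bearer ' + TOKEN})
```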


-2

You can use the pandas library (together with the pandas-gbq package). For example:

import pandas as pd
tobq = pd.read_csv("local.csv")
# to_gbq loads the DataFrame into a BigQuery table
tobq.to_gbq("big_query_table_name", "project_name", private_key="big_query_private_key.json")

1 Comment

Your answer is not GCS related, but about BigQuery.
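For the GCS case, pandas can still help: with the optional gcsfs package installed, to_csv accepts a gs:// path directly. A hedged sketch (bucket and file names are placeholders):

```python
import pandas as pd

# Small example DataFrame standing in for the data read from local.csv.
df = pd.DataFrame({'id': [1, 2], 'name': ['alice', 'bob']})

# The same CSV text that would land in the bucket.
csv_text = df.to_csv(index=False)

# Writing straight to GCS (not run here; requires gcsfs + credentials):
# df.to_csv('gs://my_bucket/data.csv', index=False)
```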
