-2 votes
0 answers
32 views

Google Storage transfer job unable to access S3 bucket via IAM Role [closed]

I am creating a GCP Storage Transfer Service job with S3 as the source and a GCS bucket as the destination. I am following this link: https://docs.cloud.google.com/storage-transfer/docs/source-amazon-s3#...
user785461's user avatar
Best practices
0 votes
0 replies
47 views

Optimal way to upload large files (up to 10GB) to GCS from a frontend client

My use case is simple in nature. I have a platform where users can upload files up to 20GB. My current solution is: the frontend client asks for a presigned URL, which the backend generates and returns ...
Asif Alam's user avatar
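For the presigned-upload flow described above, one common approach for multi-gigabyte files is a V4 signed URL that initiates a resumable upload session. This is a minimal sketch, not a verified answer; the bucket/object names and the commented-out signing call are assumptions.

```python
from datetime import timedelta

# Sketch of the presigned-resumable pattern: the backend signs a POST URL
# carrying "x-goog-resumable: start"; the client POSTs to it, receives a
# session URI in the Location header, then PUTs the file (optionally in
# chunks) to that session URI. Names below are illustrative.

def resumable_signing_params() -> dict:
    """Keyword arguments a backend could pass to Blob.generate_signed_url
    to mint a URL that *starts* a resumable session."""
    return {
        "version": "v4",
        "method": "POST",
        "expiration": timedelta(minutes=15),  # window to start the upload
        "headers": {"x-goog-resumable": "start"},
    }

# With service-account credentials available, the backend side would be roughly:
# from google.cloud import storage
# blob = storage.Client().bucket("uploads-bucket").blob("user-file.bin")
# url = blob.generate_signed_url(**resumable_signing_params())

print(resumable_signing_params()["headers"])
```

The client never needs GCS credentials: it only sees the signed URL and the session URI, and the resumable session survives network interruptions, which matters at 10GB+.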
1 vote
0 answers
27 views

Is it possible to do an offline upload using a GCS bucket presigned URL (resumable vs. write)?

I have a frontend client which lets users upload any number of files of any size (think up to a 100 GB file). Currently I am using GCS bucket presigned URLs to upload the file. My current ...
Asif Alam's user avatar
0 votes
0 answers
27 views

Firebase CLI fails with "An unexpected error has occurred" only when deploying storage

I'm trying to deploy Cloud Storage CORS settings using the Firebase CLI, but it consistently fails with a generic "Error: An unexpected error has occurred." Project ID: oa-maintenance-v2 ...
プラス田口's user avatar
0 votes
0 answers
37 views

Compute Engine image creation fails with “Required ‘read’ permission for storage object of a bucket” from Java SDK, works via REST API and Console

I am trying to create a virtual image from a tar file that is present in the storage bucket. When I try it from my Java SDK code, which is my actual requirement, I get this: Required ‘read’ permission for ‘${...
ROHAN ACHAR V's user avatar
2 votes
1 answer
76 views

How to save Google Cloud GET Object API mediaLink response to local storage?

I wrote this code that uses Google's Cloud API to get an object from my bucket and download it. It worked perfectly when I had my bucket set to public (allUsers added to Principal with all the required ...
Art T.'s user avatar
  • 31
0 votes
1 answer
70 views

distcp creates a file in the GCP bucket instead of a file inside the directory

Context: using distcp, I am trying to copy an HDFS directory, including its files, to a GCP bucket. I am using hadoop distcp -Dhadoop.security.credential.provider.path=jceks://$JCEKS_FILE hdfs://nameservice1/...
Jhon's user avatar
  • 49
1 vote
2 answers
92 views

ModuleNotFoundError: No module named 'google' in WSL

I was trying to use Google Cloud Storage in a Python virtual environment. I tried installing google-cloud-storage, and whenever I run the code I get the error ModuleNotFoundError: No module ...
Asem Shaath's user avatar
0 votes
1 answer
86 views

GCSToGCSOperator moves the folder along with the files when move_object is set to True

I have a requirement to move files from a source folder to a destination folder in different GCS buckets. I am using GCSToGCSOperator with the following config: source_bucket: "source_bucket" ...
A B's user avatar
  • 1,936
1 vote
0 answers
209 views

Running Hugging Face models offline using snapshot download causes errors

I have been trying to run some models from Hugging Face locally. The script is hosted on Google Cloud Run. Since running the instance multiple times triggers rate limiting, I have downloaded the ...
GentleClash's user avatar
0 votes
0 answers
42 views

Firebase Storage image URLs keep returning an access denied response

I have a small app where I am using Firebase Functions to upload an image into Firebase Storage. Once done, I store this image URL against an object in the Firebase DB and then reuse the image in the app ...
feeyam's user avatar
  • 1
1 vote
0 answers
97 views

How can I authenticate and upload files to Google Cloud Storage in Ballerina without manually generating access tokens?

I'm trying to upload files to a Google Cloud Storage bucket from a Ballerina application. Right now, the only way I’ve found to authenticate is by manually generating an access token using a service-...
Virul Nirmala Wickremesinghe's user avatar
0 votes
1 answer
72 views

How to prevent Django from generating migrations when using dynamic GoogleCloudStorage in a FileField?

On my Django project I have a model with a property used to store videos in a specific, single Google Cloud Storage bucket, using a FileField. The model is defined like this: from storages....
Raul Chiarella's user avatar
0 votes
1 answer
74 views

How to limit size of upload using Google Cloud Signed URL

I'm using Google Cloud Storage Signed URLs with the NodeJS SDK to allow my client app to directly upload to Cloud Storage. I want to limit the size of the file the client is allowed to upload. How do ...
yoonicode's user avatar
  • 1,467
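One known mechanism for the size-limit question above is the x-goog-content-length-range extension header, which is baked into the V4 signature so the client cannot alter it. A minimal sketch; bucket/object names and the commented-out signing call are illustrative assumptions.

```python
# Sketch: GCS rejects an upload through this signed URL if its
# Content-Length falls outside [min_bytes, max_bytes].

def size_limit_headers(max_bytes: int, min_bytes: int = 0) -> dict:
    """Extension header pinned at signing time; the client must echo it
    back with the PUT, or the signature check fails."""
    return {"x-goog-content-length-range": f"{min_bytes},{max_bytes}"}

# With credentials, the signing side would be roughly:
# from datetime import timedelta
# from google.cloud import storage
# blob = storage.Client().bucket("my-bucket").blob("upload.bin")
# url = blob.generate_signed_url(
#     version="v4", method="PUT", expiration=timedelta(minutes=15),
#     headers=size_limit_headers(10 * 1024 * 1024))  # cap at 10 MiB

print(size_limit_headers(10 * 1024 * 1024))
```

The question uses the Node.js SDK; the same header is passed via extensionHeaders there, but the enforcement is server-side either way.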
0 votes
1 answer
81 views

Cannot read properties of undefined (reading 'cloudStorageURI') Generation 2 Google Cloud Storage Function

I am migrating a generation 1 Node.js Google Cloud Function to generation 2 Cloud Run. It uses an onDocumentCreated trigger to create and save a file to Cloud Storage, built from a collection of data ...
Steve Klock's user avatar
1 vote
1 answer
95 views

How to unit test a Flask method that uses the Google Cloud Storage library

I'm trying to create unit tests for this method, but I'm getting an error saying credentials were not found: from io import BytesIO from google.auth.exceptions import DefaultCredentialsError from google....
willie revillame's user avatar
1 vote
1 answer
71 views

GCS bucket to Looker Studio custom community viz does not connect

First time trying anything like this. I want to create a custom viz in Looker Studio with d3.js. I have created a bucket, d3js-bucket, and made it public. IAM has Organization Administrator ...
Einarr's user avatar
  • 350
1 vote
1 answer
70 views

How to support multiple Google Cloud Storage buckets in a Django FileField without breaking .url resolution?

I'm using Django with django-storages and GoogleCloudStorage backend. My model has a FileField like this: raw_file_gcp = models.FileField(storage=GoogleCloudStorage(bucket_name='videos-raw')) At ...
Raul Chiarella's user avatar
0 votes
0 answers
47 views

How to extract first n rows of a .tsv.bgz file with gsutil?

How can I extract the first n rows of a block-compressed tab-separated value text file (.tsv.bgz), which can only be accessed with gsutil, into a text file? The original file is very large, so I wonder ...
Xunzhi Zhang's user avatar
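BGZF (.bgz) is a gzip-compatible format, so one way to answer the question above is to stream-decompress and stop after n lines rather than download the whole object; the shell equivalent would be roughly gsutil cat gs://… | zcat | head -n 100. A sketch, with the commented-out blob access and all names being illustrative assumptions:

```python
import gzip
import io
from itertools import islice

def head_lines(fileobj, n: int) -> list:
    """Stream-decompress a gzip/BGZF file object and return its first n lines."""
    with gzip.open(fileobj, mode="rt") as fh:
        return list(islice(fh, n))  # stops reading once n lines are consumed

# Against a real bucket the same function works on a streamed blob, so only
# the first compressed blocks are ever downloaded:
# from google.cloud import storage
# blob = storage.Client().bucket("my-bucket").blob("big.tsv.bgz")
# with blob.open("rb") as raw:
#     rows = head_lines(raw, 100)

# Local demonstration with an in-memory gzip "file":
data = gzip.compress(b"a\t1\nb\t2\nc\t3\n")
print(head_lines(io.BytesIO(data), 2))  # → ['a\t1\n', 'b\t2\n']
```

Early termination is what keeps this cheap: gzip decompression is sequential, so reading the first n lines touches only a prefix of the compressed stream.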
1 vote
0 answers
505 views

Vertex AI Agent Deployment Fails with ModuleNotFoundError: No module named 'data_science'

Context I'm trying to deploy a custom multi-agent app on Vertex AI Reasoning Engine (using Google ADK / Agent Builder). I'm using a .whl file that includes my entire custom agent code, organized under ...
cryptickey's user avatar
0 votes
0 answers
62 views

Document AI Batch Process error: "Unknown field for GcsDocuments: gcs_uri" in Cloud Shell

I am attempting to use Google Cloud Document AI's asynchronous batch processing feature in a new Google Cloud project, but I am consistently encountering a ValueError. I also notice a limitation in ...
R34's user avatar
  • 1
0 votes
1 answer
77 views

How to preserve only the timestamp and mode of a file, and not other POSIX metadata, when using the "gcloud storage" command-line interface?

When I use the -P option, as in gcloud storage cp -r -P gs://some/bucket/object ./, it tries to set the user ID as well and fails with the following message: ERROR: Root permissions required to set UID ...
amisax's user avatar
  • 126
1 vote
2 answers
476 views

How do I upload unstructured documents with metadata to a Google Cloud Platform data store with the Python SDK?

I am trying to upload unstructured data to a Google Cloud Platform (GCP) data store from a GCP Storage Bucket using the Python SDK. I want to use unstructured data with metadata, which is mentioned ...
Fruity Fritz's user avatar
0 votes
1 answer
119 views

Google Cloud Document AI cannot access Firebase Storage bucket - permission error

I'm trying to import documents from Firebase Storage into Google Cloud Document AI but I'm getting a persistent permission error even after adding the Storage Admin role to the Document AI service ...
Kyrylo Petrenko's user avatar
0 votes
0 answers
104 views

Error getting access token for service account: Connection refused: connect, iss: [...]

I was having some issues with configuring Google Cloud Storage and I cannot for the life of me figure out what the issue is. Initially, I implemented everything with wrapper classes and an ...
ivan's user avatar
  • 1
1 vote
0 answers
124 views

gcloud storage cp downloads exceed file size and result in corrupt file (e.g. 12.7GiB for 9.2GiB file)

I’m trying to download a .tar.gz file from a public Google Cloud Storage bucket using the gcloud CLI tool. Here’s the command I use: gcloud storage cp gs://my-bucket/large-file.tar.gz ./ The file is ...
Maximilian Burr's user avatar
0 votes
1 answer
267 views

java.lang.NoClassDefFoundError: com/google/auth/Credentials

While running test cases, I am encountering the following error: pyspark.errors.exceptions.captured.IllegalArgumentException: Cannot initialize FileIO implementation org.apache.iceberg.gcp.gcs....
Prashant Kumar's user avatar
0 votes
1 answer
198 views

polars read_csv_batched not working with GCS

I am trying to load a 1GB+ CSV file from GCS and am encountering memory issues, so I am trying to use read_csv_batched per "Memory issues sorting larger than memory file with polars". The documentation for ...
sicsmpr's user avatar
  • 55
1 vote
0 answers
38 views

dynamic wildcard is not working with load_table_from_uri

I was trying to load multiple gcs files to bigquery via Airflow using api load_table_from_uri uri="gs://b5a34db6ab213379-eu-pm-test-uat-data-ingest-temp/test_pm_cm_data/"+str(run_id).strip()+...
Vikrant Singh Rana's user avatar
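A likely cause of the wildcard failure above: BigQuery load-job source URIs accept at most one * wildcard each, so a URI that concatenates several dynamic parts with multiple wildcards is rejected. A sketch that collapses the dynamic suffix into a single trailing wildcard; bucket name, prefix, and table ID are illustrative assumptions.

```python
# Sketch: build a load-job source URI with exactly one '*' wildcard.

def build_source_uri(bucket: str, prefix: str, run_id) -> str:
    """BigQuery load URIs support at most one '*' per URI, so collapse the
    run-specific file pattern into a single trailing wildcard."""
    return f"gs://{bucket}/{prefix}/{str(run_id).strip()}*"

uri = build_source_uri("my-ingest-bucket", "test_pm_cm_data", " run_42 ")
print(uri)  # → gs://my-ingest-bucket/test_pm_cm_data/run_42*

# With credentials, the load itself would be roughly:
# from google.cloud import bigquery
# client = bigquery.Client()
# client.load_table_from_uri(uri, "project.dataset.table").result()
```

load_table_from_uri also accepts a list of URIs, so several single-wildcard patterns can be passed together when one wildcard cannot express the selection.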
0 votes
1 answer
76 views

On macOS, the pre-compiled version of crcmod is not being detected

➜ datasets gsutil --version gsutil version: 5.27 checksum: 5cf9fcad0f47bc86542d009bbe69f297 (OK) boto version: 2.49.0 python version: 3.10.13 | packaged by conda-forge | (main, Dec 23 2023, 15:35:25) ...
WurmD's user avatar
  • 1,493
0 votes
1 answer
80 views

The CORS policy on Firebase Storage is blocking any upload from my Flutter web app even though it has been set

I have set the CORS configuration on my Firebase Storage bucket like this: [ { "origin": ["https://arptc-connect.web.app", "*"], "method": [&...
Armando Sudi's user avatar
0 votes
0 answers
30 views

Can a Google bucket be directly accessed using the OAuth 2.0 client credentials flow?

Can Google Cloud Storage (GCS) be directly accessed using the OAuth 2.0 client credentials flow (client ID + client secret) for file uploads? ...
Shyam Singh's user avatar
1 vote
0 answers
69 views

why isn't my GCS-triggered Python Gen2 Cloud Function executing/logging despite Editor role?

I am trying to trigger a simple Python 3.11 Gen2 Cloud Function (parse-ticket-v2) in project loadsnap-prod (ID: 266229951076, region us-central1) when a file is finalized in the GCS bucket gs://...
Brian Murphy's user avatar
-1 votes
1 answer
79 views

GCP: Why can't my Backend Bucket find my public files?

Short Version: I have configured a backend bucket on my load balancer and mapped it to /__/auth/; that bucket contains a publicly accessible file named handler, but when I hit /__/auth/handler I get an ...
David's user avatar
  • 15k
0 votes
1 answer
57 views

pulumi failure trying to create google cloud certificate authority (python)

I'm hitting a wall trying to create a GCP certificate authority in Pulumi (Python). The issue happens when creating the authority: I get a 404 that it cannot find the authority (that it is creating). ...
Paul Forgey's user avatar
0 votes
2 answers
54 views

Why are renames of (non-empty) folders not triggering Google Cloud Storage events?

I'm subscribing to Google Cloud Storage events with Firebase Functions (2nd gen) on the Node.js runtime, as described here. My bucket has Hierarchical Namespace enabled. For most operations, I ...
mgw's user avatar
  • 260
2 votes
1 answer
95 views

Is it possible to use Copy-Item between providers via a custom implementation of CopyItem in a provider?

Background I am using the google cloud sdk which implements a provider to its storage service. Inside this provider, I can gci and cd around just fine, but when I want to cp an object onto my local ...
Blaisem's user avatar
  • 659
-1 votes
1 answer
132 views

How to set custom download filename for a public file URL without signing it?

I have a use case where I want users to download a file from a public URL, but with a custom filename. For private files (e.g., in AWS S3 or GCS), I can generate a signed URL and use the Content-...
Sudhanshu Bansal's user avatar
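For GCS specifically (S3 aside), one option for the question above is to store the desired filename on the object itself as Content-Disposition metadata, so the plain public URL serves it without any signing. A sketch; the bucket/object names and the commented-out metadata update are illustrative assumptions, and this does require write access to the object once.

```python
# Sketch: set Content-Disposition on the object so every unauthenticated
# download of the public URL is saved under the custom filename.

def attachment_disposition(filename: str) -> str:
    """Value to store in the object's Content-Disposition metadata."""
    return f'attachment; filename="{filename}"'

print(attachment_disposition("report-2024.pdf"))
# → attachment; filename="report-2024.pdf"

# With write access, set it once on the public object:
# from google.cloud import storage
# blob = storage.Client().bucket("public-bucket").blob("files/1234")
# blob.content_disposition = attachment_disposition("report-2024.pdf")
# blob.patch()  # persists the metadata change
```

The trade-off versus signed URLs: the filename is fixed per object rather than per request, which fits public files that should always download under one name.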
0 votes
1 answer
188 views

Error Exporting from BigQuery to GCS: 'Operation cannot be performed on a nested schema'

I encountered an error while trying to execute an ETL task to export data from a BigQuery table to Google Cloud Storage (GCS). Here is the exact error message: raise self._exception google.api_core....
user22239200's user avatar
-1 votes
1 answer
40 views

How can I serve large images/videos from my backend Express API?

I have a project that requires users to upload many large images and videos and click save, and if the user refreshes ...
Sree ram Sekar's user avatar
0 votes
0 answers
229 views

How to save a file to a cloud storage bucket from a cloud run function

I am super new to using Google Cloud and a very novice coder. I am trying to create an automated system that saves a graph as a JPEG in a Cloud Storage bucket (this will be a Cloud Run function that ...
Amy Lock's user avatar
1 vote
1 answer
381 views

DuckDB persist secrets across sessions or provide an alternative config file for GCS credentials

I want to read Parquet files stored in a GCS bucket via DuckDB as CLI i.e. duckdb in an environment where I setup a Service Account and I created the HMAC credentials like gcloud storage hmac create \ ...
TPPZ's user avatar
  • 4,949
1 vote
0 answers
103 views

How to handle evolving Parquet schemas from GCS when loading into BigQuery?

We are designing a data ingestion pipeline where Parquet files are delivered weekly into a GCS bucket. The bucket structure is: gs://my-bucket/YYYY/MM/DD/<instance-version>/<instance-id>/&...
dadadima's user avatar
  • 958
0 votes
1 answer
54 views

OpenSSL::PKey::RSAError attempting to use Rails ActiveStorage connect to GCS

I have an application deployed on Heroku that I want to connect to GCS using ActiveStorage. I am explicitly specifying credentials in config/storage.yml as specified in the JSON key file and the ...
aec's user avatar
  • 1,193
0 votes
0 answers
134 views

Caching in Self-hosted GitLab Runner Not Working with GCS (Tried Local Cache as Well)

I am running a self-hosted GitLab instance with GitLab Runner and trying to set up caching for my CI jobs using Google Cloud Storage (GCS). However, caching is not working as expected. I’ve also tried ...
Toufik Benkhelifa's user avatar
0 votes
1 answer
55 views

Spring RestClient fails with GCS signed URL (403 SignatureDoesNotMatch) while RestTemplate/HttpURLConnection work

Context: Trying to download a file from Google Cloud Storage using a pre-signed URL with Spring Boot 3.4.4's RestClient. The same URL works perfectly with both RestTemplate and raw HttpURLConnection. ...
tschi's user avatar
  • 36
0 votes
0 answers
25 views

SolrException: Couldn't restore since doesn't exist; when restoring an existing backup from Google Cloud Storage

I am trying to restore a backup from a GCS bucket using the next API call: http://mysolr.dns:8983/solr/admin/collections?action=RESTORE&repository=gcs&location=my-backup-folder/&name=...
Leonardo Sergei Santoyo Cortés's user avatar
0 votes
1 answer
191 views

FoundryVTT instance crashing with error "EPERM: operation not permitted, utime '/data/Config/options.json.lock'" - Google Cloud Run/Storage

I'm attempting to deploy an instance of FoundryVTT as a container on Cloud Run. I've set it up to mount a Cloud Storage bucket as a volume so that its data will persist when it restarts. The app ...
splatman73's user avatar
0 votes
2 answers
176 views

How to Generate a Signed URL for storage bucket in Terraform without providing credentials [closed]

I am trying to generate a signed URL for an object in Google Cloud Storage using Terraform. Here is my code for reference variables.tf variable "create_bucket_object" { default = true } ...
iamarunk's user avatar
  • 149
0 votes
1 answer
72 views

Will BigQuery Incremental Transfer reload files when GCS lifecycle archives them?

I’m evaluating BigQuery Data Transfer Service to ingest daily files that land in a GCS bucket. My goal: Load each new daily file once into BigQuery. After 30 days, archive files using a GCS ...
Etienne Neveu's user avatar
