Questions tagged [google-cloud-storage]

0 votes · 0 answers · 4 views

Objects copied by Storage Transfer Service

Is there a way to see what objects got copied by a Storage Transfer Service job? Do I need to add eventing to the storage bucket and scrape the logs?
dustymugs
1 vote · 1 answer · 176 views

Uploading to nested buckets in google cloud storage

In Google Cloud Storage, I have a bucket called 'cats' inside a root bucket called 'images'. I'm using the google-api-ruby-client gem to upload files. I'm able to upload a file to the root bucket 'images', but uploading to 'images/cats' does not work. I am aware that the buckets in Google Cloud Storage do no...
raza.sayed
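Buckets cannot nest; 'images/cats' only works as an object-name prefix inside the single real bucket. The question uses Ruby, but the idea is the same in any client; a minimal sketch in Python (bucket and file names are placeholders), assuming the `google-cloud-storage` package:

```python
def object_name(folder: str, filename: str) -> str:
    # GCS has no real folders, only name prefixes; 'cats/kitten.jpg' is one object.
    return f"{folder.strip('/')}/{filename}"

def upload_to_folder(bucket_name: str, folder: str, filename: str, local_path: str) -> None:
    # Lazy import so the sketch is readable without the library installed.
    from google.cloud import storage
    client = storage.Client()
    bucket = client.bucket(bucket_name)          # the only real bucket, e.g. 'images'
    blob = bucket.blob(object_name(folder, filename))  # e.g. 'cats/kitten.jpg'
    blob.upload_from_filename(local_path)
```

The console renders the `cats/` prefix as a folder, but the upload always targets the `images` bucket itself.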
1 vote · 1 answer · 319 views

How to (set permissions to) access a Google Storage bucket from a command line program?

In my particular case, I would like to open an object with TensorBoard (a TensorFlow component). The command line instruction is the following: # tensorboard --logdir=gs://mybucket/myobject gs://mybucket/myobject is not a public object, so the line above generates a forbidden-access error. The close...
znat
1 vote · 2 answers · 652 views

BigQuery load_table_from_storage won't recognize uri

I'm running into the following error when trying to load a table from Google Cloud Storage: BadRequest: 400 Load configuration must specify at least one source URI (POST https://www.googleapis.com/bigquery/v2/projects/fansidata/jobs) Meanwhile my URI is valid (i.e. I can see it in the GCS web app) ur...
Paul Bendevis
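That 400 typically means the job configuration never received the URI in the expected shape: load jobs take a *list* of source URIs, so a bare string passed where a list is expected can silently produce an empty configuration. A hedged sketch using the current `google-cloud-bigquery` client (dataset and table names are placeholders; `load_table_from_storage` is from an older client generation):

```python
def as_source_uris(uri) -> list:
    # Load jobs expect a list of gs:// URIs; wrap a bare string defensively.
    return [uri] if isinstance(uri, str) else list(uri)

def load_csv_from_gcs(dataset: str, table: str, uri: str) -> None:
    from google.cloud import bigquery  # lazy import
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # assumes a header row
        autodetect=True,
    )
    job = client.load_table_from_uri(
        as_source_uris(uri),
        f"{client.project}.{dataset}.{table}",
        job_config=job_config,
    )
    job.result()  # block until the load job finishes or raises
```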
1 vote · 2 answers · 380 views

download a file from google cloud storage with the api

I currently use gsutil cp to download files from my bucket but that requires you to have a bunch of stuff installed. I just want to download it using a simple API request like: http://storage.googleapis.com/mybucket/pulltest/pulltest.csv This gives me access denied. How are the auth parameters forma...
Jackery Xu
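A plain `https://storage.googleapis.com/...` GET only works for publicly readable objects; for private ones the usual installation-free options are a signed URL or an OAuth Bearer token. A minimal sketch (bucket/object names are placeholders; the signed-URL path assumes service-account credentials and the `google-cloud-storage` package):

```python
def public_url(bucket: str, obj: str) -> str:
    # Works without any auth, but only if the object is publicly readable.
    from urllib.parse import quote
    return f"https://storage.googleapis.com/{bucket}/{quote(obj)}"

def signed_url(bucket_name: str, obj: str, minutes: int = 15) -> str:
    # Time-limited URL for a private object; anyone holding it can GET the object.
    import datetime
    from google.cloud import storage  # lazy import
    blob = storage.Client().bucket(bucket_name).blob(obj)
    return blob.generate_signed_url(expiration=datetime.timedelta(minutes=minutes))
```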
1 vote · 2 answers · 1.7k views

Google Cloud Platform Storage error: “Locked Domain Expired” when trying to access my storage items

I am currently hosting some static .css, .js, .ttf, and image resources for my site in Google Cloud Platform storage buckets. I am able to access these resources for up to approximately 20 minutes after uploading them. After about 20 or so minutes I get a 401 error message when I try to access t...
Josh L
1 vote · 2 answers · 476 views

How can I give my Google Cloud App Engine access to my Firebase Storage bucket

I'm getting an error: API error 7 (images: ACCESS_DENIED) Normally I would have the Firebase app associated with my Google App Engine service, but since Firestore is not compatible with App Engine, I have to create separate instances of the service. However, now I need my App Engine image service t...
MonkeyBonkey
1 vote · 1 answer · 156 views

AWS S3 to Google Cloud Storage Transfer not working with Python Client Library because “precondition check failed”

I tried testing out cloud transfer function that would transfer an object from AWS S3 to GCS (as a one-off task) but I keep getting googleapiclient.errors.HttpError: . Here is the code: import argparse import datetime import json from pprint import pprint import googleapiclient.discovery def main(d...
claudiadast
0 votes · 0 answers · 3 views

Dataflow TextIO.write issues with scaling

I created a simple dataflow pipeline that reads byte arrays from pubsub, windows them, and writes to a text file in GCS. I found that with lower traffic topics this worked perfectly, however I ran it on a topic that does about 2.4GB per minute and some problems started to arise. When kicking off the...
Tuubeee
1 vote · 2 answers · 360 views

How to rename an object in Google Storage bucket?

How to rename an object in Google Storage bucket through the API? See also An error attempting to rename a Google Bucket object (Google bug?)
porton
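GCS objects have no in-place rename; the operation is a copy to the new name followed by a delete of the old one, which the Python client wraps as `Bucket.rename_blob`. A minimal sketch (names are placeholders), assuming the `google-cloud-storage` package:

```python
def renamed(path: str, new_basename: str) -> str:
    # Compute the new object name while keeping the 'folder' prefix.
    prefix, _, _ = path.rpartition("/")
    return f"{prefix}/{new_basename}" if prefix else new_basename

def rename_object(bucket_name: str, old_name: str, new_name: str) -> None:
    # rename_blob copies the object to new_name and deletes the original.
    from google.cloud import storage  # lazy import
    bucket = storage.Client().bucket(bucket_name)
    bucket.rename_blob(bucket.blob(old_name), new_name)
```

Because it is copy-then-delete, the "rename" is not atomic and re-writes the data server-side.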
1 vote · 2 answers · 1k views

How to host a static website on google cloud storage?

So, I've spent about 5 days searching for an answer here and on Google Docs, including having one of their support people help me. My domain still doesn't resolve to the website. For the record, the website works if I use the ugly url (http://storage.googleapis.com/7thgradeplay.org/index.html). I ha...
Kevin Flavin
1 vote · 2 answers · 876 views

How to write to a cloud storage bucket with a firebase cloud function triggered from firestore?

To make sure this is as clean and isolated as possible, I created a new test project with nothing in it other than these steps. I then enabled cloud firestore in the console. I created a new collection called 'blarg' (great name, eh!). I added a document to it, and used the auto-id, then added a fi...
xrd
1 vote · 1 answer · 88 views

Google Cloud Storage: what is the difference between WRITER and OWNER?

This seems like a simple question, but I have not been able to find the answer online. What specific actions can the OWNER of a bucket do that a WRITER of that bucket (or any object) cannot do? The reason I ask is that I noticed that one of my 'logs' buckets has [email protected] as an OWNER a...
SheRey
1 vote · 1 answer · 245 views

Python Client for Google Cloud Storage and large files

In the documentation, there are no hard limits on file sizes, but when I try to upload 100MB files, it fails without any error logs in the administration portal. Has anyone seen this before? What's the best way to upload large files (100MB+ < 2GB) on App Engine?
AR_
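Large uploads are usually more reliable as chunked resumable uploads, which the Python client enables by setting a `chunk_size` on the blob (it must be a multiple of 256 KiB). A hedged sketch, assuming the `google-cloud-storage` package:

```python
CHUNK_UNIT = 256 * 1024  # resumable-upload chunks must be multiples of 256 KiB

def chunk_size(mib: int) -> int:
    # Return a chunk size in bytes that satisfies the 256 KiB multiple rule.
    size = mib * 1024 * 1024
    assert size % CHUNK_UNIT == 0
    return size

def upload_large(bucket_name: str, obj: str, local_path: str) -> None:
    from google.cloud import storage  # lazy import
    # With chunk_size set, the file is streamed in resumable 8 MiB chunks
    # instead of one large request that can time out silently.
    blob = storage.Client().bucket(bucket_name).blob(obj, chunk_size=chunk_size(8))
    blob.upload_from_filename(local_path)
```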
1 vote · 1 answer · 328 views

Issues creating table from bucket file

I have a big table (about 10 million rows) that I'm trying to pull into BigQuery. I had to upload the CSV into the bucket due to the size constraints when creating the table. When I try to create the table using the Datastore, the job fails with the error: Error Reason: invalid. Get more informat...
OM Asphyxiate
1 vote · 1 answer · 308 views

Create and Share bucket in google cloud storage using google api's

I need to create and share a bucket in Google Cloud Storage using the Google APIs, but I got an error when trying with Postman (REST client). I am trying with URL ------- https://www.googleapis.com/storage/v1/b?project=testproject request body ------------ { 'name':'testbucketmanafnew' } I got an error {...
Abdul Manaf
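One likely culprit in the request body above: `{ 'name': ... }` uses single quotes, which is not valid JSON, so the API rejects it. A hedged sketch that builds a valid body and, alternatively, creates the bucket through the `google-cloud-storage` client (project and bucket names are the question's own):

```python
import json

def bucket_request_body(name: str) -> str:
    # json.dumps always emits double-quoted, spec-valid JSON.
    return json.dumps({"name": name})

def create_bucket(project: str, name: str) -> None:
    from google.cloud import storage  # lazy import
    # Equivalent to POST /storage/v1/b?project=<project> with the JSON body above.
    storage.Client(project=project).create_bucket(name)
```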
1 vote · 1 answer · 996 views

Uploading an image to google cloud storage on a rails site

I am trying to allow users to upload images to my site. I have the image 'uploading' to my controller and have a variable with the value that is my image. For example: @image_data = data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAC8AAAAqCAYAAAAj6gIfAAAABHNCSVQICAgIfAhkiAAABCJJREFUWIXVmT1MW1cUx3/G8AqG...
Rorschach
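The `data:image/png;base64,...` value is a data URI: the bytes must be base64-decoded before upload, and the MIME type in the header can be reused as the object's content type. The question is about Rails, but the decoding logic is the same everywhere; a sketch in Python for consistency with the other examples (bucket/object names are placeholders):

```python
import base64

def decode_data_uri(data_uri: str) -> tuple[str, bytes]:
    # Split 'data:<mime>;base64,<payload>' into (mime type, raw bytes).
    header, _, payload = data_uri.partition(",")
    mime = header.removeprefix("data:").removesuffix(";base64")
    return mime, base64.b64decode(payload)

def upload_image(bucket_name: str, obj: str, data_uri: str) -> None:
    from google.cloud import storage  # lazy import
    mime, raw = decode_data_uri(data_uri)
    blob = storage.Client().bucket(bucket_name).blob(obj)
    blob.upload_from_string(raw, content_type=mime)  # e.g. image/png
```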
1 vote · 1 answer · 1.3k views

Google API Python Client: MediaIoBaseDownload: Problems with 'contentEncoding' of type 'gzip'

I am running in circles trying to figure out how to download a CSV file that is 'contentEncoded' as 'gzip' from Google cloud using their google-api-python-client. My issue, I am not able to download a file that has 'contentEncoding' as 'gzip', nor its 'md5Hash' matching what was downloaded, nor its...
jeff00seattle
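The md5 mismatch is expected: for objects stored with `contentEncoding: gzip`, the API serves decompressed bytes by default, while `md5Hash` describes the compressed bytes as stored. One common fix in the modern Python client is a raw download followed by local decompression; a hedged sketch, assuming the `google-cloud-storage` package:

```python
import gzip

def download_stored_bytes(bucket_name: str, obj: str, dest: str) -> None:
    # raw_download=True fetches the object exactly as stored (still gzipped),
    # so the local file's md5 matches the reported md5Hash.
    from google.cloud import storage  # lazy import
    blob = storage.Client().bucket(bucket_name).blob(obj)
    blob.download_to_filename(dest, raw_download=True)

def gunzip(data: bytes) -> bytes:
    # Decompress locally after a raw download to recover the CSV text.
    return gzip.decompress(data)
```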
1 vote · 1 answer · 710 views

How do I exclude hidden files and directories when using gsutil to rsync?

I have a Jekyll blog with a directory structure that contains lots of hidden files and directories like .DS_Store, .idea and .git. It also has intermediate build artifacts and scripts that begin with _ like _deploy.sh and _drafts. I want to write a script that uploads everything to a bucket on Googl...
mimming
1 vote · 1 answer · 540 views

How to programmatically delete all the contents within a folder with in Cloud storage bucket

I have a folder called 'myfolder' within a Cloud Storage bucket. It has files like a.log, b.log etc. How can I programmatically delete all these files from the folder in the bucket? I would like some Java example code to do it.
sanjeev
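The question asks for Java, but the pattern is identical in every client: a 'folder' is just a name prefix, so you list the blobs under the prefix and delete each one. A sketch in Python for consistency with the other examples (bucket/folder names are the question's own):

```python
def folder_prefix(folder: str) -> str:
    # Normalize 'myfolder' to 'myfolder/' so 'myfolder2/x' is not matched.
    return folder if folder.endswith("/") else folder + "/"

def delete_folder(bucket_name: str, folder: str) -> None:
    from google.cloud import storage  # lazy import
    bucket = storage.Client().bucket(bucket_name)
    # list_blobs paginates automatically; every object under the prefix is removed.
    for blob in bucket.list_blobs(prefix=folder_prefix(folder)):
        blob.delete()
```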
1 vote · 1 answer · 320 views

Failure deploying AppEngine version because of Cloud Storage JSON API enablement

I am trying to deploy to Google AppEngine. (More precisely: Managed VM.) I ran gcloud preview app deploy d:\dev\mytest\yaml-war\app.yaml --version=joshua20160316d --project=mytest-test1. After about 30 minutes, it failed. The error message directs me to log lines in the Cloud Developer Console (...
Joshua Fox
1 vote · 2 answers · 477 views

How to edit the metadata information of an object in cloud storage bucket using golang

I've tried to insert a CSV file from a local machine into a Cloud Storage bucket, but it is stored as a text file. When I tried including the Metadata option in object := &storage.Object{Name: objectName, Metadata: map[string]string{'Content-Type': 'text/csv; charset=utf-8'}} it is creating another slo...
Phani Kumar Dytha
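The underlying issue is the same in any client: Content-Type is a first-class object property, not a custom `Metadata` key, so putting it in the metadata map creates a user-metadata entry instead of changing how the object is served. A hedged sketch of both ideas in Python for consistency with the other examples (names are placeholders):

```python
# Headers that are first-class object properties, not custom metadata.
RESERVED = {"content-type", "content-encoding", "cache-control",
            "content-disposition", "content-language"}

def split_metadata(meta: dict) -> tuple[dict, dict]:
    # Separate first-class properties from true custom metadata.
    props = {k: v for k, v in meta.items() if k.lower() in RESERVED}
    custom = {k: v for k, v in meta.items() if k.lower() not in RESERVED}
    return props, custom

def set_content_type(bucket_name: str, obj: str, content_type: str) -> None:
    from google.cloud import storage  # lazy import
    blob = storage.Client().bucket(bucket_name).blob(obj)
    blob.content_type = content_type  # e.g. 'text/csv; charset=utf-8'
    blob.patch()  # update the existing object in place, no re-upload needed
```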
1 vote · 1 answer · 27 views

Google Cloud Platform - Data Distribution

I am trying to figure out a proper solution for the following: We have a client from which we want to receive data, for instance a binary that is 200Mbytes updated daily. We want them to deposit that data file(s) onto a local server near them (Europe). We then want to do one of the following: We...
radiator
1 vote · 1 answer · 628 views

How to download part of file from Google Storage

I use the Google Storage API for C# and have not found how to download part of a file from Google Storage. Is it possible? This is my code: using (var stream = new MemoryStream()) { ObjectsResource.GetRequest request = _service.Objects.Get(bucketName, fileName); request.MediaDownloader.ChunkSize = chunkSize; r...
Dennis Vinogradov
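Partial reads are supported by the API via HTTP Range requests; whether the C# MediaDownloader exposes that directly I can't confirm, but for comparison the Python client takes inclusive `start`/`end` byte offsets. A hedged sketch, assuming the `google-cloud-storage` package (names are placeholders):

```python
def byte_range(offset: int, length: int) -> tuple[int, int]:
    # Inclusive (start, end) byte indices, as download_as_bytes expects.
    return offset, offset + length - 1

def download_range(bucket_name: str, obj: str, offset: int, length: int) -> bytes:
    from google.cloud import storage  # lazy import
    blob = storage.Client().bucket(bucket_name).blob(obj)
    start, end = byte_range(offset, length)
    return blob.download_as_bytes(start=start, end=end)  # issues a Range request
```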
1 vote · 1 answer · 520 views

HttpError 400 when trying to upload pdf file on Google Cloud Storage

I am setting up a service that: receives an email with an attachment file uploads this file to cloud storage uses that file as a source for further processing I arrived at step 2 where the error is occurring. I am using the discovery API for Google services to authenticate. Mine is a simple flas...
ciacicode
1 vote · 1 answer · 65 views

can't delete multiple objects with blank spaces and Cyrillic in file_name using REST API, google cloud storage

My batch delete code looks like: @staticmethod def gcs_batch_delete(gcs_file_names): logging.debug('Deleting ' + str(len(gcs_file_names)) + ' files.') boundary = '===============7330845974216740156==' headers = {'Content-Type': 'multipart/mixed; boundary="' + boundary + '"'} data = '--' + boundary +...
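The usual cause of failures like this is that the object name is pasted into the batch-request path without percent-encoding, so spaces and Cyrillic bytes break the inner DELETE lines. A hedged sketch of the encoding step (the path shape assumes the JSON API's object endpoint):

```python
from urllib.parse import quote

def encoded_object_path(bucket: str, name: str) -> str:
    # Percent-encode the full object name, including spaces, Cyrillic
    # (UTF-8 bytes), and slashes, for use inside a batch DELETE part.
    return f"/storage/v1/b/{bucket}/o/{quote(name, safe='')}"
```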
1 vote · 2 answers · 112 views

App Engine: Copy live Datastore to local dev Datastore (that still works)

This used to be possible by downloading with the bulkloader and uploading to the local dev server. However, the bulkloader download has been non-functional for several months now, due to not supporting oauth2. A few places recommend downloading from a cloud storage backup, and uploading to the local...
PaintingInAir
1 vote · 1 answer · 283 views

Details, We're sorry, but the Customer-Supplied Encryption Key feature is not available in your location

I tried to use my own key to encrypt the data to the Google Cloud Storage service and got that error. I chose US-Central as the location for the server. Don't get it. Exact message: AccessDeniedException: 403 AccessDenied: [(Details, We're sorry, but the Customer-Supplied Encryption Key feature is not...
1 vote · 2 answers · 252 views

What is my exact URI for my Google Cloud Bucket?

I want to upload a picture to my Google Cloud Storage bucket from my iOS application. I am using an HTTP POST method to add the picture. However, I want to know the exact URI to use in my code for an HTTP POST method, because the Google documentation uses so many and is unclear about which one applies. If the nam...
1 vote · 1 answer · 431 views

store terabytes of data and later import to elasticsearch

I am looking for a good way to store up to 20 terabytes of data (social media postings, twitter data, etc) in the cloud and gradually feed it into Elasticsearch (to enable faceted searching) so that it can be quickly searched. I was going to break this into 2 steps. Saving the data to storage and th...
Jen
1 vote · 1 answer · 359 views

response headers for google cloud storage

Using Google Cloud Storage, I do not get CORS header in response for one file, but all other files contain it. I uploaded a folder of SVG files to be served from Google Cloud Storage. I put them in a new bucket. I changed the permissions so that all files would be served to the public. I used the gs...
cyrf
1 vote · 2 answers · 200 views

GAE import errors; sys.path shows wrong path for appengine libraries

I'm developing a web app for Google App Engine in Python on Windows 10. Everything was working fine when my main.py was just serving templates. import os import urllib from google.appengine.api import users import jinja2 import webapp2 JINJA_ENVIRONMENT = jinja2.Environment( loader=jinja2.FileSyste...
JackOfAll
1 vote · 2 answers · 168 views

Google Cloud Container Builder from Cloud Source Repository

Is it possible to build a docker container using Google Cloud Container Builder from source code in Google Cloud Source Repository? The docs say the code must be in Cloud Storage so I assume the answer is no but this seems crazy. Am I missing something? Is code in Google Source Code accessible via C...
Matthew Jones
1 vote · 1 answer · 448 views

reading numpy arrays from GCS into spark

I have 100 npz files containing numpy arrays in google storage. I have setup dataproc with jupyter and I am trying to read all the numpy arrays into spark RDD. What is the best way to load the numpy arrays from a google storage into pyspark? Is there an easy way like np.load('gs://path/to/array.npz...
ajkl
1 vote · 1 answer · 341 views

Creating and Deleting Buckets Programmatically in Google Cloud Storage [closed]

I am using Google App Engine and Google Cloud Storage. I would like to create a bucket daily using a cron job. Also, delete a bucket programmatically. I was able to manually create a bucket using the Google Cloud Console. How do I create/delete buckets from GAE using python? Also, is it a good desi...
ssk
1 vote · 1 answer · 288 views

GCP Cloud Storage - Wildcard Prefix List

This is a question of how to accomplish a certain task with the GCP Cloud Storage API. I have a bucket with a 'folder' structure as follows: ID / Year / Month / Day / FILES. I need to search for all files with the following format: ID/2016/04/03/. I had hoped I could use a * in the prefix (*/2016/04/0...
Dovy
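The list API's `prefix` parameter is a literal string, so a leading `*` cannot work; the common workarounds are either one listing per known ID, or listing more broadly and filtering names client-side. A hedged sketch of the client-side filter (pattern and names are the question's own; the listing part assumes the `google-cloud-storage` package):

```python
import fnmatch

def match_names(names, pattern: str):
    # Client-side wildcard filter; fnmatch's '*' also crosses '/' here,
    # which is what a '*/2016/04/03/*' pattern needs.
    return [n for n in names if fnmatch.fnmatch(n, pattern)]

def list_day(bucket_name: str, pattern: str = "*/2016/04/03/*"):
    from google.cloud import storage  # lazy import
    names = [b.name for b in storage.Client().bucket(bucket_name).list_blobs()]
    return match_names(names, pattern)
```

For large buckets, listing per ID with `prefix=f"{id}/2016/04/03/"` avoids pulling every name.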
1 vote · 1 answer · 511 views

how do you perform hadoop fs -getmerge on dataproc from google storage

How do you use getmerge on dataproc for part files which are dumped to the google storage bucket. If I try this hadoop fs -getmerge gs://my-bucket/temp/part-* gs://my-bucket/temp_merged I get an error getmerge: /temp_merged (Permission denied) It works fine for hadoop fs -getmerge gs://my-bucket/t...
ajkl
1 vote · 1 answer · 470 views

Is skipping leading rows when reading files in google dataflow possible

I want to skip leading rows when reading files while using Google Dataflow. Is that feature available in the latest version? The files are kept in Google Storage. I will be writing these files to BigQuery. The bq load command has the option --skip_leading_rows, which skips the leading rows when rea...
abhishek jha
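TextIO reads every line, header included, so a common workaround inside the pipeline is to filter out the known header string before further processing (in Beam this predicate would sit inside a `Filter` transform). A minimal sketch of the predicate itself (the header value is an assumption):

```python
def is_data_row(line: str, header: str) -> bool:
    # True for every line except the known header; usable as a Filter predicate.
    return line != header

def drop_header(lines, header: str):
    # Plain-Python equivalent of filtering the PCollection with is_data_row.
    return [line for line in lines if is_data_row(line, header)]
```

If BigQuery is the only destination, a direct load job with `skip_leading_rows` (the API equivalent of `bq load --skip_leading_rows`) sidesteps the problem entirely.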
1 vote · 1 answer · 303 views

Read files from local computer and write to BigQuery or google storage using google dataflow

Is there a way to read csv files from a local computer and write it to big query or storage using google dataflow? If it exists, what runner should be used? All the google dataflow examples either read from cloud and write to either to cloud storage or big query. I use DirectPipelineRunner for readi...
abhishek jha
1 vote · 1 answer · 433 views

How can I produce a sorted export of a large BigQuery table?

I would like to produce a sorted CSV export of a large BigQuery table in Google Cloud Storage. Currently to do this, we start with an unsorted table, then do a SELECT * FROM table ORDER BY col1, col2 into another table, and then export that table to GCS. This works well, since the export seems to us...
Steven
