Questions tagged [google-cloud-platform]

1 vote · 1 answer · 407 views

Add GCP credentials to airflow via command line

Airflow allows us to add connection information via the command-line tool airflow connections. This can help with automated deployment of Airflow installations via Ansible or other dev-ops tools. It is unclear how connections to Google Cloud Platform (service accounts) can be added to airflow via command lin...
Sebastian
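One possible approach, sketched here under the assumption of the Airflow 1.10-style CLI: store the service-account key file on the worker and pass its path in the connection extras. The key-file path and project name below are hypothetical, and the command is echoed rather than executed so it can be reviewed or dropped into an Ansible task first.

```shell
# Sketch only: assumes Airflow 1.10-style CLI flags; KEY_PATH is hypothetical.
KEY_PATH="/opt/airflow/keys/service-account.json"
EXTRA="{\"extra__google_cloud_platform__key_path\": \"${KEY_PATH}\", \"extra__google_cloud_platform__project\": \"my-project\"}"
# Echo the command instead of running it, so it can be reviewed before deployment.
echo airflow connections --add \
    --conn_id=google_cloud_default \
    --conn_type=google_cloud_platform \
    --conn_extra="${EXTRA}"
```

The extra__google_cloud_platform__* keys mirror what the Airflow web UI stores when a GCP connection is created by hand.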
1 vote · 1 answer · 423 views

Dialogflow API v2 - Unexpected error while acquiring application default credentials: Could not load the default credentials

I am trying to implement a chat bot application with Google Dialogflow. I was following this GitHub tutorial https://github.com/dialogflow/dialogflow-nodejs-client-v2 to implement the API. This is my code: var express = require('express'); var router = express.Router(); const projectId = 'my-project-id'...
pavithra rox
1 vote · 1 answer · 1.5k views

Ansible with Google Cloud Platform GCE

I used Ansible to create a GCE cluster following the guideline at: https://docs.ansible.com/ansible/latest/scenario_guides/guide_gce.html At the end of the GCE creation, I used the add_host Ansible module to register all instances in their corresponding groups, e.g. gce_master_ip. But then when...
Tim Raynor
1 vote · 2 answers · 1k views

Create GCR secret error: exactly one NAME is required, got 26

I am trying to create a docker-registry secret for GCR, but am getting a really cryptic error message. This is the kubectl command that I am running: kubectl create secret docker-registry gcrsecret --docker-username=_json_key --docker-password=”$(cat wk-test-1-b3c9659d9a07.json)” --docker-server=ht...
Ryan Salmons
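The ” characters in the command are typographic quotes, which bash treats as literal text rather than as quoting, so the $(cat ...) output undergoes word splitting and kubectl receives one NAME per whitespace-separated token of the key file. A minimal bash sketch of that splitting, using a hypothetical stand-in for the JSON key file:

```shell
# Stand-in for the service-account key file contents (hypothetical, shortened).
json='{"type": "service_account", "project_id": "demo"}'

set -- $json          # unquoted expansion: splits on whitespace, like with ”...”
unquoted_words=$#

set -- "$json"        # straight double quotes: stays a single argument
quoted_words=$#

echo "unquoted=$unquoted_words quoted=$quoted_words"
```

Here unquoted_words is 4 while quoted_words is 1; replacing ”...” with straight "..." makes the whole key file a single --docker-password argument.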
1 vote · 2 answers · 967 views

Network default is not accessible to Dataflow Service account

Having issues starting a Dataflow job (2018-07-16_04_25_02-6605099454046602382) in a project without a local VPC network, where I get this error: Workflow failed. Causes: Network default is not accessible to Dataflow Service account. There is a shared VPC connected to the project with a network called...
Brodin
1 vote · 1 answer · 63 views

Package.json file not found when using kubernetes

I'm trying to set up Kubernetes on my local environment using Docker. I've built the necessary Docker image with this Dockerfile: FROM node:9.11.1 WORKDIR /usr/src/app COPY package.json /usr/src/app/ RUN npm install COPY . /usr/src/app/ EXPOSE 3002 CMD [ 'npm', 'start' ] I then pushed this image to my...
Christofer Johnson
1 vote · 1 answer · 439 views

Google Cloud: How to list granted permission for user or service account?

Is it possible to get a list of all permissions that have been granted (specifically or transitively) to a user or GCP service account, ideally filtered by resource, through gcloud or the web UI?
Andreas Jansson
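For role bindings at the project level, one commonly cited recipe uses gcloud's --flatten and --filter options; the project name and member below are hypothetical. Note this lists the roles granted to a member, not the individual permissions inside each role (those would need a gcloud iam roles describe per role). The command is built as a string and echoed so it can be inspected before running:

```shell
# Hypothetical member; swap in a user: or serviceAccount: principal of interest.
MEMBER="serviceAccount:my-sa@my-project.iam.gserviceaccount.com"
# Flatten the policy's bindings, then keep only rows mentioning MEMBER.
CMD="gcloud projects get-iam-policy my-project \
  --flatten=bindings[].members \
  --format=table(bindings.role) \
  --filter=bindings.members:${MEMBER}"
echo "$CMD"
```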
1 vote · 1 answer · 158 views

Cloud composer tasks fail without reason or logs

I run Airflow in a managed Cloud Composer environment (version 1.9.0), which runs on a Kubernetes 1.10.9-gke.5 cluster. All my DAGs run daily at 3:00 AM or 4:00 AM. But some mornings I see a few tasks that failed during the night without a reason. When checking the log using the UI - I see no l...
Ary Jazz
0 votes · 0 answers · 5 views

Stop Streaming pipeline when no more messages to consume

I have a streaming Dataflow pipeline job which reads messages from a given Pub/Sub topic. I understand there is an auto-ack once the bundles are committed. How can I make the pipeline stop when there are no more messages to consume?
user3483129
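There is no built-in "stop when the topic is empty" behavior for streaming pipelines; the usual pattern is to drain the job, which stops pulling new messages and finishes in-flight work before terminating. A sketch (the job ID and region below are placeholders, and the command is echoed rather than executed):

```shell
JOB_ID="my-streaming-job-id"   # placeholder: substitute the real Dataflow job ID
REGION="us-central1"           # placeholder region
# Drain finishes buffered/in-flight work, then stops the streaming job.
echo gcloud dataflow jobs drain "$JOB_ID" --region="$REGION"
```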
1 vote · 1 answer · 59 views

Cloud Bigtable doesn't appear to be removing data that should be garbage collected

I am using a Cloud Bigtable development cluster. I changed the max versions to 1 for a specific column family, but it doesn't seem to have affected my data. When I perform a lookup, old versions still exist. What am I missing? I run: #cbt setgcpolicy table column_family maxversions=1 #cbt ls table Famil...
Barbaros Yıldız
1 vote · 1 answer · 37 views

Duplicate Filename in GCP Storage

I am streaming files to GCP Storage (bucket). Doing so results in a frequent error (roughly 2 million times a day) claiming that my filename policy must generate a unique name. I've tried multiple ways of guaranteeing a unique name such as using currentTimeMillis, currentThread, encrypting the file...
Scicrazed
1 vote · 1 answer · 49 views

GCP: Instance creation failed

I recently tried to create an instance group on the Google Cloud Platform (GCP) with 50 n1-standard-1 instances in zone us-east1-b, each with P100 GPUs. I requested and got approval for 200 P100 GPUs in this zone. My CPU, IP addresses, and Routes for this zone and globally all meet the quotas listed...
Coder
1 vote · 2 answers · 79 views

Computing GroupBy once then passing it to multiple transformations in Google DataFlow (Python SDK)

I am using Python SDK for Apache Beam to run a feature extraction pipeline on Google DataFlow. I need to run multiple transformations all of which expect items to be grouped by key. Based on the answer to this question, DataFlow is unable to automatically spot and reuse repeated transformations like...
kpax
1 vote · 1 answer · 38 views

Where did region europe-west1 go?

I'm unable to create a bucket in region europe-west1. Does anyone have the same problem? The region list I see on the dashboard (screenshot) differs from the region list in the docs (screenshot).
Carl Engene
1 vote · 1 answer · 38 views

Dataproc submit a Hadoop job via Python client

I'm trying to use the Dataproc API by converting a gcloud command to an API call, but I can't find a good example in the documentation. %pip install google-cloud-dataproc The only good sample I found is this one, which works fine: from google.cloud import dataproc_v1 client = dataproc_v1.ClusterControllerClient...
spicyramen
1 vote · 2 answers · 52 views

Google Healthcare API on GKE with PubSub - INVALID_ARGUMENT

We have been testing out the Google Healthcare API specifically with HL7 and as I've run through the tutorials I've hit a roadblock. I should mention that I have a fair bit of experience with Kubernetes and AWS, but not so much Google Cloud. This step here is what is giving me trouble: https://cloud...
damianesteban
1 vote · 2 answers · 37 views

Cloud Function storage trigger on a bucket in another project

Unable to create a Cloud Function trigger on a bucket that belongs to another project. Deploying function (may take a while - up to 2 minutes)...failed. ERROR: (gcloud.functions.deploy) OperationError: code=7, message=Insufficient permissions to (re)configure a trigger (permission denied for bucket ing-a...
1 vote · 1 answer · 29 views

Find all rows with Null value(s) in a specific column(s) in Big Query

Is there a way to improve the following? I need to count all rows with NULL value(s) in a specific column. SELECT SUM(IF(column1 IS NULL, 1, 0)) AS column1, SUM(IF(column2 IS NULL, 1, 0)) AS column2 FROM `dataset.table`;
spicyramen
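One common simplification in BigQuery standard SQL is COUNTIF, which is shorthand for SUM(IF(cond, 1, 0)). A sketch reusing the column and table names from the question, wrapped in an echoed bq invocation so it can be reviewed before running:

```shell
# COUNTIF(cond) counts rows where cond is true; names are from the question.
SQL='SELECT COUNTIF(column1 IS NULL) AS column1, COUNTIF(column2 IS NULL) AS column2 FROM `dataset.table`'
echo bq query --use_legacy_sql=false "$SQL"
```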
1 vote · 2 answers · 43 views

Google PubSub - serviceAccount:[email protected] doesn't exist

I am trying to subscribe to Gmail for mail notifications using Google's Pub/Sub and I've gone through the documentation and tutorials provided by Google. At one point, the docs state that I need to publish the correct rights to my PubSub topic: You need to grant publish privileges to serviceAccount:...
Maxwell
1 vote · 1 answer · 16 views

Dataprep importing files with different number of columns into a dataset

I am trying to create a parameterized dataset that imports files from GCS and puts them under each other. This all works fine (Import Data > Parameterize). To give a bit of context, I store each day a .csv file with a different name referring to that date. Now it happens that my provider added a new...
JohnAndrews
0 votes · 1 answer · 11 views

Connecting to Google Cloud by external IP - the site is not reachable

I was using a VM instance on Google Cloud for a while. Yesterday I stopped it as I did many times before, but after starting it today I can't access it via the external IP. I can, however, access it via terminal by typing gcloud compute ssh rstudio. I checked that I'm typing the correct IP, and I am. Therefore, I have...
jakes
1 vote · 1 answer · 876 views

Connecting Google Cloud Compute to Google Cloud SQL ERROR 2013 (HY000)

When trying to connect to mysql using the Docker Cloud SQL proxy like so: mysql -u -p -S /cloudsql/:: I received this error: ERROR 2013 (HY000): Lost connection to MySQL server at 'reading initial communication packet', system error: 95. According to the documentation, this is how I'm supposed to set up th...
Matthew Harrison
1 vote · 1 answer · 919 views

What is the difference between gcloud alpha commands and gcloud beta commands?

I have a question about gcloud (google-cloud-sdk). I found gcloud components while using the interactive shell of Google Cloud, with a command like this: [hostname]# gcloud components list and I got these results: │ Installed │ gcloud Alpha Commands │ alpha │ < 1 MiB │ │ Installed | gclo...
Lee. YunSu
0 votes · 1 answer · 143 views

Possibility of updating data in real-time on a client

I have the following scenario that I was wondering if it's possible/feasible to implement. I apologize if this is considered an overly 'broad' question, but I think SO would be the best place to ask this. Let us suppose I have a website and I want to display a graph to an end-user. For the purposes...
David542
0 votes · 0 answers · 3 views

Anonymous caller does not have storage.objects.create access but my JWT has scope https://www.googleapis.com/auth/devstorage.full_control

I am following the documentation for server to server OAuth2 flow (creating my own JWT as opposed to using the library). I've created a service account with the right permissions to upload to my storage bucket. I successfully get an access_token from https://www.googleapis.com/oauth2/v4/token with...
uniisoverrated
1 vote · 1 answer · 472 views

Difference between gcloud auth activate-service-account --key-file and GOOGLE_APPLICATION_CREDENTIALS

I'm creating a shell script to handle automation for some of our workflows. This workflow includes accessing Google Buckets via Apache Beam GCP. I'm using a .json file with my service account. In which situations do I need to use: gcloud auth activate-service-account --key-file myfile.json vs export...
spicyramen
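The two mechanisms feed different consumers: activate-service-account writes into gcloud's own credential store (used by gcloud, gsutil, and bq invocations), while GOOGLE_APPLICATION_CREDENTIALS is read by client libraries through Application Default Credentials, which is what Apache Beam's GCP IO uses. A sketch using the myfile.json name from the question (the gcloud call is echoed, not run):

```shell
# 1) Credentials for the gcloud CLI family (gcloud, gsutil, bq) - echoed only:
echo gcloud auth activate-service-account --key-file=myfile.json

# 2) Application Default Credentials for client libraries (e.g. Beam GCP IO):
export GOOGLE_APPLICATION_CREDENTIALS="myfile.json"
echo "$GOOGLE_APPLICATION_CREDENTIALS"
```

In a script that drives both gcloud commands and client-library code, setting both is a common and harmless choice.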
1 vote · 1 answer · 32 views

AppEngine nodejs : how to protect a cron URL from public access?

https://cloud.google.com/appengine/docs/standard/python/config/cron A cron handler is just a normal handler defined in app.yaml. You can prevent users from accessing URLs used by scheduled tasks by restricting access to administrator accounts. However, this option is not available in Node.js. What's t...
ben
1 vote · 1 answer · 125 views

My Firebase Cloud Function fails with Object.value is not a function error?

Why am I getting the following error when I try to fetch the values of my Firebase child nodes under registrationTokens? Database structure: 'fcmtokens' : { 'dBQdpR7l1WT2utKVxdX2' : { 'registrationTokens' : { 'O': 'c4PSCAUAg5s:Yw95DyVxwElE88LwX7' } } } Console output: TypeError: Object.values is n...
Roggie
1 vote · 1 answer · 149 views

Cloud Spanner is using a secondary index when it should not

An existing query that performed quickly using the primary key massively slowed down (10ms -> 8sec) without notice because a secondary index that has been created for another use-case is now used automatically. The 'Explanation' of the Cloud-Spanner-Web-Query tells me that the secondary index is use...
1 vote · 1 answer · 219 views

List all the tables in a dataset in bigquery using bq CLI and store them to google cloud storage

I have around 108 tables in a dataset. I am trying to extract all those tables using the following bash script: # get list of tables tables=$(bq ls '$project:$dataset' | awk '{print $1}' | tail +3) # extract into storage for table in $tables do bq extract --destination_format 'NEWLINE_DELIMITED_JSON...
Syed Arefinul Haque
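One likely culprit in the script as shown: the single quotes around '$project:$dataset' prevent variable expansion, so bq ls receives the literal string instead of the project and dataset names. A minimal bash demonstration (the variable values are hypothetical):

```shell
project="my-project"    # hypothetical
dataset="my_dataset"    # hypothetical
literal='$project:$dataset'      # single quotes: no expansion - what bq ls got
expanded="$project:$dataset"     # double quotes: expands as intended
echo "literal=$literal expanded=$expanded"
```

Switching the bq ls argument to double quotes (and quoting "$table" in the loop) should make the extraction work per table.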
1 vote · 3 answers · 72 views

When casting a timestamp to date yields a (Date + 1)

In Google Spanner, I am trying to cast some timestamps to date, and I found this issue. When executing the script below: SELECT EXTRACT(DATE FROM CAST('2019-01-01T07:56:34Z' AS TIMESTAMP)) I get the output 2018-12-31, rather than 2019-01-01. How should I parse it to get the exact date?
Logical
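A plausible explanation: EXTRACT in Cloud Spanner defaults to the America/Los_Angeles time zone, where 07:56 UTC is still the previous calendar day; pinning the zone to UTC should yield 2019-01-01. A sketch of the adjusted query, held as a string and echoed rather than run against Spanner here:

```shell
# AT TIME ZONE pins the zone EXTRACT uses instead of the default time zone.
SQL='SELECT EXTRACT(DATE FROM CAST("2019-01-01T07:56:34Z" AS TIMESTAMP) AT TIME ZONE "UTC")'
echo "$SQL"
```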
1 vote · 2 answers · 288 views

Run a python script on schedule on Google App Engine

I'm looking for a good samaritan that can provide with a very basic skeleton to run a python script using Google App Engine. I have read the documentation, check on related SO questions but I'm lost with the WebApp format. All I want to do is run one python script which accepts arguments or several...
Racu
1 vote · 2 answers · 39 views

GCloud AppEngine under Eclipse doesn't start (Mac OS)

I want to run a GCloud App Engine (Spring Boot Application) in Eclipse under Mac OS. When I start the App Engine, I get the following error: The Google Cloud SDK could not be found in the customary locations and no path was provided. -> [Help 1] The SDK is installed, the installation path is set to...
user2191287
1 vote · 2 answers · 81 views

2019: Dynamic cron jobs Google App Engine

I am developing a reporting service (i.e. Database reports via email) for a project on Google App Engine, naturally using the Google Cloud Platform. I am using Python and Django but I feel that may be unimportant to my question specifically. I want to be able to allow users of my application schedul...
ViaTech
1 vote · 1 answer · 40 views

Pubsub Authorization Issue

I've generated a Google Cloud API key without restrictions. I'm passing that to my topic:publish URL as a query parameter, but I'm experiencing an authorization error. { 'error': { 'code': 403, 'message': 'User not authorized to perform this action.', 'status': 'PERMISSION_DENIED' } } I'm unsure...
Bryant Jackson
1 vote · 1 answer · 114 views

GCP Cloud Functions - Memory Consumption

How does Cloud Functions compute memory consumption? Is it the total amount of memory of all the functions that are currently running at the moment? Let's say the total memory assigned is 512 MB, with 3 running functions using 60 MB each. Does that mean we use 180 MB in total? Or does each function get its own memor...
user1157751
1 vote · 1 answer · 60 views

Cloud function deployment time

I am deploying a function to Cloud Functions, but it takes substantial time to deploy. How can I optimize my deployment? I have tried deploying with and without go.mod. I have also tried including the vendor directory (go mod vendor). gcloud functions deploy FuncX --entry-point FuncX --runtime go111 --trigger-http
kyle-la
1 vote · 2 answers · 48 views

My google cloud instance is no longer able to resolve external hostnames

Yesterday I had to revert to a recent snapshot of my vm. This vm was working flawlessly at the time I took it. But now I can no longer resolve any url from this host. All git pull commands, all curl requests, host lookups, etc.. are failing. For instance: # host www.google.com ; connection timed out...
ChrisF
1 vote · 2 answers · 311 views

golang on GCP => listen tcp :443: bind: permission denied

I have an issue trying to set up HTTPS on Google Cloud Platform using golang + Let's Encrypt. I already have a domain pointing at the IP of the instance. I also have a Let's Encrypt certificate and chain saved in /etc/letsencrypt/live/mydomain.com/. I already set up myapp to use the cert and configured t...
Roberto
1 vote · 2 answers · 47 views

google ml-engine cloud storage as a file

I am working in Python with Google Cloud ML Engine. The documentation I have found indicates that data storage should be done with buckets and blobs https://cloud.google.com/ml-engine/docs/tensorflow/working-with-cloud-storage However, much of my code, and the libraries it calls, work with files....
user1902291
