Why is the cloud.google.com/go/functions Go package not able to list gen2 functions? - go

I am trying to call ListFunctions via the Go SDK using a service account JSON key.
The returned iterator has information about gen1 functions but not gen2.
https://go.dev/play/p/ZwZ-jGscFCl - Please add a service account JSON key to run it locally.
The same service account credentials are used with the gcloud binary, and it is able to fetch all the functions.
Is there an issue with the Go SDK?

I have one answer and one remark.
Answer:
Use v2 of the API in Go, not v1. The v1 API does not expose the environment (generation) field. Here is the API spec of v2; look at the environment enum.
For the past two weeks, the v2 API has also supported gen1 functions, so you can get them all at the same time with the same library.
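For illustration, here is a minimal sketch of listing functions with the v2 client (cloud.google.com/go/functions/apiv2). The project ID, the key file path and the option.WithCredentialsFile call are placeholders for this example, and import paths may differ slightly between module versions:

package main

import (
	"context"
	"fmt"
	"log"

	functions "cloud.google.com/go/functions/apiv2"
	"cloud.google.com/go/functions/apiv2/functionspb"
	"google.golang.org/api/iterator"
	"google.golang.org/api/option"
)

func main() {
	ctx := context.Background()

	// Assumption: authenticating with a key file only for illustration;
	// prefer user credentials or service account impersonation locally.
	client, err := functions.NewFunctionClient(ctx,
		option.WithCredentialsFile("service-account.json"))
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// "-" lists functions across all locations of the project.
	it := client.ListFunctions(ctx, &functionspb.ListFunctionsRequest{
		Parent: "projects/YOUR_PROJECT_ID/locations/-",
	})
	for {
		fn, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		// In the v2 API, Environment reports GEN_1 or GEN_2.
		fmt.Println(fn.GetName(), fn.GetEnvironment())
	}
}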
Remark:
You don't need, and shouldn't (mustn't), use a service account key file locally. Your user account (with or without service account impersonation) is made for that. Look at my comments on that question.

Related

Running/Testing an AWS Serverless API written in Terraform

No clear path to do development in a serverless environment.
I have an API Gateway backed by some Lambda functions declared in Terraform. I deploy to the cloud and everything is fine, but how do I go about setting up a proper workflow for development? It seems like a struggle to push every small code change to the cloud while developing just to run your code. Terraform has started getting some support from the SAM framework to run your Lambda functions locally (https://aws.amazon.com/blogs/compute/better-together-aws-sam-cli-and-hashicorp-terraform/), but there is still no way to simulate a local server and test your endpoints in Postman, for example.
First of all, I use the Serverless framework instead of Terraform, so my answer is based on what you provided and what I found around.
From what I understood so far from the provided documentation, you are able to run the SAM CLI with Terraform (cf. the Local testing chapter).
You might follow this documentation to invoke local functions.
I recommend using JSON files to create use cases instead of stdin injection.
The first step is to create your payload in a JSON file and to invoke your Lambda with that JSON payload, like:
sam local invoke "YOUR_LAMBDA_NAME" -e ./path/to/yourjsonfile.json
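For illustration, a hypothetical minimal payload for yourjsonfile.json (assuming an API Gateway-style proxy event; the exact fields depend on how your Lambda is triggered) might look like:

{
  "httpMethod": "GET",
  "path": "/items",
  "queryStringParameters": {
    "id": "123"
  },
  "body": null
}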

How do I check that a Google Cloud service account has a particular permission programmatically?

I'm making an integration with a user-supplied GCS bucket. The user will give me a service account, and I want to verify that service account has object write permissions enabled to the bucket. I am failing to find documentation on a good way to do this. I expected there to be an easy way to check this in the GCS client library, but it doesn't seem as simple as myBucket.CanWrite(). What's the right way to do this? Do I need to have the bucket involved, or is there a way, given a service account json file, to just check that storage.objects.create exists on it?
IAM permissions can be granted at org, folder, project and resource (e.g. GCS Bucket) level. You will need to be careful that you check correctly.
For permissions granted explicitly to the bucket:
Use APIs Explorer to find the Cloud Storage service
Use the Cloud Storage API reference to find the method
Use BucketAccessControls:get to retrieve a member's (e.g. a Service Account's) permission (if any).
APIs Explorer (sometimes) has code examples but, knowing the method, you can find it in the Go SDK.
The documentation includes a summary for ACLs using the List method, but I think you'll want to use Get (or equivalent).
NOTE I've not done this.
There doesn't appear to be a specific match to the underlying API's Get in the Go library.
From a Client, you can use Bucket method with a Bucket name to get a BucketHandle and then use the ACL method to retrieve the bucket's ACL (which should include the Service Account's email address and role, if any).
Or you can use the IAM method to get a Handle from the IAM library (!) and then use the Policy method to get the resource's IAM Policy, which will include the Service Account's email address and IAM role (if any).
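As a rough sketch of the above with cloud.google.com/go/storage (the bucket name and permission string below are placeholders), you can read the bucket-level IAM policy, and, as an additional option not described above, ask the API which permissions the caller's own credentials hold via testIamPermissions:

package main

import (
	"context"
	"fmt"
	"log"

	"cloud.google.com/go/storage"
)

func main() {
	ctx := context.Background()
	client, err := storage.NewClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	bucket := client.Bucket("your-bucket-name") // placeholder bucket name

	// IAM policy granted directly on the bucket: roles and their members.
	policy, err := bucket.IAM().Policy(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, role := range policy.Roles() {
		fmt.Println(role, policy.Members(role))
	}

	// Alternative: run this while authenticated as the service account to
	// ask which of the listed permissions the caller actually has here.
	perms, err := bucket.IAM().TestPermissions(ctx, []string{"storage.objects.create"})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("granted:", perms)
}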
As per DazWilkin's answer, you can check the permission at different levels, and it can be difficult to clearly know whether an account has a permission.
For that, Google Cloud released a service: the IAM Troubleshooter. It's part of the Policy Intelligence suite, which helps you understand, analyse and troubleshoot IAM permissions.
The API to call is described in the documentation.
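A rough sketch of calling that API from Go, assuming the generated cloud.google.com/go/policytroubleshooter/apiv1 client (the principal, full resource name and permission below are placeholders, and the protobuf import path may differ between module versions):

package main

import (
	"context"
	"fmt"
	"log"

	policytroubleshooter "cloud.google.com/go/policytroubleshooter/apiv1"
	"cloud.google.com/go/policytroubleshooter/apiv1/policytroubleshooterpb"
)

func main() {
	ctx := context.Background()
	client, err := policytroubleshooter.NewIamCheckerClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Ask whether the principal holds the permission on the given resource.
	resp, err := client.TroubleshootIamPolicy(ctx, &policytroubleshooterpb.TroubleshootIamPolicyRequest{
		AccessTuple: &policytroubleshooterpb.AccessTuple{
			Principal:        "my-sa@my-project.iam.gserviceaccount.com",              // placeholder
			FullResourceName: "//storage.googleapis.com/projects/_/buckets/my-bucket", // placeholder
			Permission:       "storage.objects.create",
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%v\n", resp)
}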

google cloud sdk is installed in my EC2 instance and I could access gcloud. But bq is not available even though I see it in the list of components

I installed the google-cloud-sdk in my Matillion instance hosted on EC2. I am able to access the gcloud command in the SSH session and also by using a bash component in Matillion.
However, I am not able to run bq commands. I see it has been installed as part of the Cloud SDK. I was able to configure my account and everything, but it doesn't work.
Can someone help me with this?
As per the documentation, it's necessary to activate the BigQuery API in order to use the bq command-line tool.
These are all the steps that you need to follow:
In the Cloud Console, on the project selector page, select or create a Cloud project.
Install and initialize the Cloud SDK.
BigQuery is automatically enabled in new projects. To activate BigQuery in a preexisting project, go to Enable the BigQuery API.
I was also getting the same error as you, and activating the API was the solution.

Get ARN of vendored layers

It looks like AWS layers such as AWSLambda-Python37-SciPy1x have a different account ID and version in the ARN in different regions, e.g.
us-east-1: arn:aws:lambda:us-east-1:668099181075:layer:AWSLambda-Python37-SciPy1x:22
us-east-2: arn:aws:lambda:us-east-2:259788987135:layer:AWSLambda-Python37-SciPy1x:20
From a script I need to add the layer that pertains to the Lambda's region, but I'm not finding an AWS CLI or boto3 command that will give me the ARN of a "published" layer (i.e. one that an AWS admin has made accessible to all accounts); I can only find my own layers (e.g. aws lambda list-layers).
The AWS Lambda console in the web browser shows the vendored layers, so I loaded the page, looked through the JS console, and saw the following request being made:
https://console.aws.amazon.com/lambda/services/ajax?operation=listAwsVendedLayers&locale=en
So it looks like the REST API has an operation to get that, but I cannot find the equivalent anywhere in the AWS CLI or boto3.
Any ideas (short of using curl with the proper request headers and auth info, which is a pain)? Perhaps there is a way to run a "raw" request in boto3 so I could give it this listAwsVendedLayers operation? I looked in the docs and could not find anything.

How to use Service Accounts with gsutil, for uploading to CS + BigQuery

How do I upload data to Google BigQuery with gsutil, by using a Service Account I created in the Google APIs Console?
First I'm trying to upload data to Cloud Storage using gsutil, as that seems to be the recommended model. Everything works fine with gmail user approval, but it does not allow me to use a service account.
It seems I can use the Python API to get an access token using signed JWT credentials, but I would prefer using a command-line tool like gsutil with support for resumable uploads etc.
EDIT: I would like to use gsutil in a cron to upload files to Cloud Storage every night and then import them to BigQuery.
Any help or directions to go would be appreciated.
To extend Mike's answer, you'll need to:
Download the service account key file and put it in, e.g., /etc/backup-account.json
gcloud auth activate-service-account --key-file /etc/backup-account.json
And now all calls use said service account.
Google Cloud Storage just released a new version (3.26) of gsutil that supports service accounts (as well as a number of other features and bug fixes). If you already have gsutil installed you can get this version by running:
gsutil update
In brief, you can configure a service account by running:
gsutil config -e
See gsutil help config for more details about using the config command.
See gsutil help creds for information about the different flavors of credentials (and different use cases) that gsutil supports.
Mike Schwartz, Google Cloud Storage Team
Service accounts are generally used to identify applications, but when using gsutil you're an interactive user, and it's more natural to use your personal account. You can always associate your Google Cloud Storage resources with both your personal account and/or a service account (via access control lists or the developer console Team tab), so my advice would be to use your personal account with gsutil and then use a service account for your application.
First of all, you should be using the bq command-line tool to interact with BigQuery from the command line. (Read about it here and download it here.)
I agree with Marc that it's a good idea to use your personal credentials with both gsutil and bq; that said, the bq command-line tool does support the use of service accounts. The command to use service account auth might look something like this:
bq --service_account 1234567890@developer.gserviceaccount.com --service_account_credential_store keep_me_safe --service_account_private_key_file myfile.key query 'select count(*) from publicdata:samples.shakespeare'
Type bq --help for more info.
It's also pretty easy to use service accounts in your code via Python or Java. Here's a quick example using some code from the BigQuery Authorization guide.
import httplib2
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials

# REPLACE WITH YOUR Project ID
PROJECT_NUMBER = 'XXXXXXXXXXX'
# REPLACE WITH THE SERVICE ACCOUNT EMAIL FROM GOOGLE DEV CONSOLE
SERVICE_ACCOUNT_EMAIL = 'XXXXX@developer.gserviceaccount.com'

# Read the service account's PKCS #12 private key.
f = open('key.p12', 'rb')
key = f.read()
f.close()

credentials = SignedJwtAssertionCredentials(
    SERVICE_ACCOUNT_EMAIL,
    key,
    scope='https://www.googleapis.com/auth/bigquery')

# Authorize an HTTP client with the signed JWT credentials.
http = httplib2.Http()
http = credentials.authorize(http)

service = build('bigquery', 'v2')
datasets = service.datasets()
response = datasets.list(projectId=PROJECT_NUMBER).execute(http)

print('Dataset list:\n')
for dataset in response['datasets']:
    print("%s\n" % dataset['id'])
Posting as an answer, instead of a comment, based on Jonathan's request
Yes, an OAuth grant made by an individual user will no longer be valid if the user no longer exists. So, if you use the user-based flow with your personal account, your automated processes will fail if you leave the company.
We should support service accounts with gsutil, but don't yet.
You could do one of:
Probably add the feature quickly to gsutil/oauth2_plugin/oauth2_helper.py using the existing Python OAuth client implementation of service accounts
Retrieve the access token externally via the service account flow and store it in the cache location specified in ~/.boto (slightly hacky)
Create a role account yourself (via gmail.com or Google Apps), grant permission to that account, and use it for the OAuth flow
We've filed the feature request to support service accounts for gsutil, and have some initial positive feedback from the team (though we can't give an ETA).
As of today you don't need to run any command to set up a service account to be used with gsutil. All you have to do is create ~/.boto with the following content:
[Credentials]
gs_service_key_file=/path/to/your/service-account.json
Edit: you can also tell gsutil where it should look for the .boto file by setting BOTO_CONFIG (docs).
For example, I use one service account per project with the following config, where /app is the path to my app directory:
.env:
BOTO_CONFIG=/app/.boto
.boto:
[Credentials]
gs_service_key_file=/app/service-account.json
script.sh:
export $(xargs < .env)
gsutil ...
In the script above, export $(xargs < .env) serves to load the .env file (source). It tells gsutil the location of the .boto file, which in turn tells it the location of the service account. When using the Google Cloud Python library you can do all of this with GOOGLE_APPLICATION_CREDENTIALS, but that’s not supported by gsutil.
