Google Cloud Logging Authentication / permissions - google-cloud-logging

I am using the Golang library cloud.google.com/go/logging and want to send runtime logs.
I already have a GOOGLE_APPLICATION_CREDENTIALS .json file - and am using Google Storage and Firebase - so I know the credentials are working.
With logging, I get the error "Error 403: The caller does not have permission, forbidden".
The account in the application credentials is a service account, and I have been looking at the IAM permissions. There is no obvious permission for logging (there are other Stackdriver permissions, for debug, trace etc., but these don't seem to work).
So, assuming I am in the right place so far - what permissions does the service account need in order to send logging data to Stackdriver Logging?

If we look at the API for writing entries to a log, we find that the IAM permission logging.logEntries.create is required.
A more detailed article can be found in the Access control guide.
This describes a variety of roles, including:
roles/logging.logWriter

According to the official documentation:
Using the Stackdriver Logging library for Go requires the Cloud IAM Logs Writer role on Google Cloud. Most Google Cloud environments provide this role by default.
1. App Engine grants the Logs Writer role by default.
2. On Google Kubernetes Engine, you must add the logging.write access scope when creating the cluster.
3. When using Compute Engine VM instances, add the cloud-platform access scope to each instance.
4. To use the Stackdriver Logging library for Go outside of Google Cloud, including running the library on your own workstation, on your data center's computers, or on the VM instances of another cloud provider, you must supply your Google Cloud project ID and appropriate service account credentials directly to the Stackdriver Logging library for Go.
You can create and obtain service account credentials manually. When specifying the Role field, use the Logs Writer role. For more information on Cloud Identity and Access Management roles, go to Access control guide.
Setting Up Stackdriver Logging for Go
gcloud iam service-accounts list
gcloud projects add-iam-policy-binding my-project-123 \
--member serviceAccount:my-sa-123@my-project-123.iam.gserviceaccount.com \
--role roles/logging.logWriter
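
With that binding in place, a minimal sketch (not the library's official sample) of sending an entry with cloud.google.com/go/logging could look like the following; my-project-123 and my-app-log are placeholders, and the client reads GOOGLE_APPLICATION_CREDENTIALS automatically:

package main

import (
	"context"
	"log"

	"cloud.google.com/go/logging"
)

func main() {
	ctx := context.Background()

	// NewClient picks up GOOGLE_APPLICATION_CREDENTIALS; the service account
	// behind it needs roles/logging.logWriter (logging.logEntries.create).
	client, err := logging.NewClient(ctx, "my-project-123")
	if err != nil {
		log.Fatalf("logging.NewClient: %v", err)
	}

	logger := client.Logger("my-app-log")
	logger.Log(logging.Entry{Severity: logging.Info, Payload: "runtime logging works"})

	// Close flushes any buffered entries before the program exits.
	if err := client.Close(); err != nil {
		log.Fatalf("client.Close: %v", err)
	}
}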

Related

Must the application utilizing the Gmail API run in Google Cloud Platform?

If I want to use Google's Gmail API within my web application to receive and send emails, must the web application be deployed on Google Cloud as a precondition, so that any on-premise hosting will fail? Is this the price one must pay to use it?
Your application's code can be hosted anywhere you want. However, you do need to create a Google Cloud account to create a project, enable the APIs and get the application credentials:
Cloud APIs use application credentials for identifying the calling applications. Credential types include API keys, OAuth 2.0 clients, and service accounts. You can use Google Cloud console to create, retrieve, and manage your application credentials. For more information about application credentials, see Authentication Overview.
Once you have your project's credentials you can just create the code within your current app and use the credentials wherever they are needed. You can refer to one of Google's quickstarts for that.
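
As a rough illustration (in Go, but any language with a Gmail API client works the same way), the code below could run on-premise or anywhere else; credentials.json is a placeholder for whatever credential file your chosen auth flow produces:

package main

import (
	"context"
	"fmt"
	"log"

	"google.golang.org/api/gmail/v1"
	"google.golang.org/api/option"
)

func main() {
	ctx := context.Background()

	// Only the credential file comes from the Google Cloud project where the
	// Gmail API is enabled; the binary itself can be hosted anywhere.
	svc, err := gmail.NewService(ctx, option.WithCredentialsFile("credentials.json"))
	if err != nil {
		log.Fatalf("gmail.NewService: %v", err)
	}

	labels, err := svc.Users.Labels.List("me").Do()
	if err != nil {
		log.Fatalf("listing labels: %v", err)
	}
	for _, l := range labels.Labels {
		fmt.Println(l.Name)
	}
}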
Sources:
Getting started with Google Cloud APIs
Developing on Google Workspace
Gmail API Overview

Android Management API: Failed to patch policy - Caller is not authorized to manage enterprise

I have been working with the Android Management API to try and manage the policy of my company's existing enterprise. My company account has the Owner role within the organization and the roles Owner and Service Account Admin for the service account mentioned later.
I followed the Quickstart Guide to get familiar with the API and made some modifications along the way for a more permanent solution, such as creating a service account with the Android Management User role via the Google Cloud Platform and generating a JSON key to acquire credentials, rather than going through the OAuth2 flow as in the guide. This allowed me to authenticate properly, but when it comes time to patch the policy like so,
# 'androidmanagement' is the Google API client built from the service account's JSON key
androidmanagement.enterprises().policies().patch(
    name=policy_name,
    body=policy_json
).execute()
I get the following error:
<HttpError 403 when requesting https://androidmanagement.googleapis.com/v1/enterprises/XXXXXXXXX/policies/<policy_name>?alt=json returned "Caller is not authorized to manage enterprise.". Details: "Caller is not authorized to manage enterprise.">
I have verified that the service account I am authenticating with has the Android Management User role, and thus has the androidmanagement.enterprises.manage permission.
I have also attempted to make this call with an elevated admin role in the organization.
Is there a chance that I need to have created the enterprise with my own account to manage the enterprise? The guide suggests that an organization can create multiple enterprises. In which case, would I need to create a new Google account not associated with my organization's enterprise and create a new enterprise that way?
It is advisable to use your own Google account to call the Android Management API, since your organization account may not be compatible with the quickstart.
To access the Android Management API your service account requires the androidmanagement.enterprises.manage permission, which can be granted by the Android Management User role (or roles/androidmanagement.user). Kindly check this link for details regarding creating a service account.
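If the role is missing, one way to grant it is a binding analogous to the gcloud example earlier on this page (the project and service-account names are placeholders):
gcloud projects add-iam-policy-binding my-project-123 \
--member serviceAccount:my-sa-123@my-project-123.iam.gserviceaccount.com \
--role roles/androidmanagement.user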
Please keep in mind that the enterprise you created as part of the colab instructions can only be managed using the colab itself. To allow your cloud project to manage an enterprise, you will need to create one using the client configuration from your cloud project.

Setting up Dialogflow CX to save audio recordings in Google Cloud Storage

I have setup a Google Cloud Storage bucket in the same project as my Dialogflow CX Agent, and in the settings for the agent, under the Speech & IVR tab, set the Google Cloud Storage URI to match that bucket:
gs://my-bucket/calls
I can see that the Dialogflow Agent has access to the bucket as it has a Service Agent listed in the Permissions tab of the bucket.
Furthermore, since I successfully enabled logging, I can see that the bucket is correctly configured, as in the log payload I can see the following property:
interactiveVoiceResponseSettings: {
  audioExportGcsDestination: {
    uri: "gs://my-bucket/calls"
  }
}
However, when making calls, nothing appears in that bucket's folder.
Is there another configuration option I'm missing to enable this feature?
Or perhaps it is not yet functional?
Thank you.
The 'Google Cloud Storage URI' option in the Dialogflow CX Speech and IVR agent settings is currently supported for the following use cases:
1. If you use 1-click telephony partner integrations (for example, AudioCodes or Avaya).
2. If you use the Contact Center AI solution provided by Google partners.
If you use 1-click telephony partner integrations, and the 'Google Cloud Storage URI' option is not working for you, please check the following:
Go to the GCP project IAM page and find the automatically created service account of the format one-click@df-cx-<ALPHANUMERIC_VALUE>-<ALPHANUMERIC_VALUE>.iam.gserviceaccount.com.
Make sure that this service account has the 'GCS Storage Bucket Owner' role assigned to it.
Note that the 'Google Cloud Storage URI' option doesn't support Detect Intent API requests.

How to grant EC2 access to SQS

The docs are very confusing to me. I have read through the SQS access docs. But what really throws me is this page: http://docs.aws.amazon.com/aws-sdk-php/v2/guide/service-sqs.html
You can provide your credential profile like in the preceding example, specify your access keys directly (via key and secret), or you can choose to omit any credential information if you are using AWS Identity and Access Management (IAM) roles for EC2 instances or credentials sourced from the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.
1) Regarding the part I have bolded (using IAM roles for EC2 instances), how is that possible? I cannot find any steps describing how to grant EC2 instances access to SQS using IAM roles. This is very confusing.
2) Where would the aforementioned environment variables be placed? And where would you get the key and secret from?
Can someone help clarify?
There are several ways that applications can discover AWS credentials. Any software using the AWS SDK automatically looks in these locations. This includes the AWS Command-Line Interface (CLI), which is a Python app that uses the AWS SDK.
Your bold words refer to #3, below:
1. Environment Variables
The SDK will look for the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables. This is a great way to provide credentials because there is no danger of accidentally including a credentials file in GitHub or other repositories. On Windows, use the System control panel to set the variables. On Mac/Linux, just export the variables from the shell.
The credentials are provided when IAM users are created. It would be your responsibility to put those credentials into the environment variables.
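For example, on Mac/Linux (the values are placeholders for the keys issued to your IAM user):
export AWS_ACCESS_KEY_ID=AKIAEXAMPLEKEYID
export AWS_SECRET_ACCESS_KEY=exampleSecretAccessKey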
2. Local Credentials File
The SDK will look in local configuration files, such as:
~/.aws/credentials
C:\users\awsuser\.aws\credentials
These files are great for storing user-specific credentials and can actually store multiple profiles, each with their own credentials. This is useful for switching between different environments such as Dev and Test.
The credentials are provided when IAM users are created. It would be your responsibility to put those credentials into the configuration file.
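For illustration, a credentials file holding two hypothetical profiles might look like this (all values are placeholders):
[default]
aws_access_key_id = AKIAEXAMPLEKEYID1
aws_secret_access_key = exampleSecretAccessKey1

[dev]
aws_access_key_id = AKIAEXAMPLEKEYID2
aws_secret_access_key = exampleSecretAccessKey2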
3. IAM Roles on an Amazon EC2 instance
An IAM role can be associated with an Amazon EC2 instance at launch time. Temporary credentials will then automatically be provided via the instance metadata service at the URL:
http://instance-data/latest/meta-data/iam/security-credentials/<role-name>/
This will return meta-data that contains AWS credentials, for example:
{
  "Code" : "Success",
  "LastUpdated" : "2015-08-27T05:09:23Z",
  "Type" : "AWS-HMAC",
  "AccessKeyId" : "ASIAI5OXLTT3D5NCV5MS",
  "SecretAccessKey" : "sGoHyFaVLIsjm4WszUXJfyS1TVN6bAIWIrcFrRlt",
  "Token" : "AQoDYXdzED4a4AP79/SbIPdV5N8k....lZwERog07b6rgU=",
  "Expiration" : "2015-08-27T11:11:50Z"
}
These credentials inherit the permissions of the IAM role that was assigned when the instance was launched. They automatically rotate every 6 hours (note the Expiration in this example, approximately 6 hours after the LastUpdated time).
Applications that use the AWS SDK will automatically look at this URL to retrieve security credentials. Of course, they will only be available when running on an Amazon EC2 instance.
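
To make this concrete, here is a minimal sketch using the AWS SDK for Go (the quoted page is about the PHP SDK, but every SDK discovers credentials in the same order); no credentials appear in the code, and the region and queue listing are illustrative only:

package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sqs"
)

func main() {
	// The SDK checks the environment variables, then ~/.aws/credentials,
	// then the EC2 instance-metadata role, without any code changes.
	sess := session.Must(session.NewSession(&aws.Config{
		Region: aws.String("us-east-1"),
	}))

	svc := sqs.New(sess)
	out, err := svc.ListQueues(&sqs.ListQueuesInput{})
	if err != nil {
		log.Fatalf("ListQueues: %v", err)
	}
	for _, url := range out.QueueUrls {
		fmt.Println(aws.StringValue(url))
	}
}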
Credentials Provider Chain
Each particular AWS SDK (e.g. Java, .NET, PHP) may look for credentials in different locations. For further details, refer to the appropriate documentation, e.g.:
Providing AWS Credentials in the AWS SDK for Java
Providing AWS Credentials in the AWS SDK for .Net
Providing AWS Credentials in the AWS SDK for PHP

How can we use IAM Roles in AWS that can be used to access API in an application?

How can I use the role feature to access the AWS API from an application? How can I implement this use case?
I am able to fetch the credentials by launching an EC2 role-based instance.
Any help is appreciated. Thanks in advance.
IAM roles are used to manage access to AWS services and resources. They are not meant to be used as an application-level authorization engine. From the documentation:
AWS Identity and Access Management (IAM) enables you to securely control access to AWS services and resources for your users. Using IAM, you can create and manage AWS users and groups and use permissions to allow and deny their access to AWS resources.
