AWS Data Migration Service IAM Policy Error?

In AWS DMS, I am getting an error when I try to create a target endpoint.
I am trying to migrate MariaDB to Elasticsearch, so the service for the target endpoint is the AWS Elasticsearch Service.
However, it requires me to add an IAM user, but even when I add an IAM user with the Administrator policy, it keeps saying:
The IAM Role arn:aws:iam::[number]:user/[username] is not configured properly.AccessDenied
What kind of IAM policies are required for this task?

Note that the error shows a user ARN (...:user/[username]); DMS expects a role that the DMS service itself can assume, not an IAM user. You can get the policy document from here:
https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Target.DynamoDB.html
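For the Elasticsearch target specifically, a minimal sketch of such a role, assuming the es:ESHttp* actions the DMS documentation lists for this target (the account ID, region, and domain name below are placeholders). The trust policy lets the DMS service assume the role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "dms.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}

and the permissions policy attached to the same role grants access to the domain:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "es:ESHttpDelete",
        "es:ESHttpGet",
        "es:ESHttpHead",
        "es:ESHttpPost",
        "es:ESHttpPut"
      ],
      "Resource": "arn:aws:es:us-east-1:123456789012:domain/my-domain/*"
    }
  ]
}

Then pass the role's ARN (arn:aws:iam::[number]:role/[rolename]) to the target endpoint instead of the user ARN.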

Related

Not authorized to perform cloudformation DescribeStacks despite AdministratorAccess [Serverless, Lambda, IAM]

I tried to deploy using the following command:
sls deploy --stage=stage --profile=[my-profile]
And it gives the following error:
Serverless Error ----------------------------------------
User: arn:aws:iam::[my-iam-user] is not authorized to perform: cloudformation:DescribeStacks on resource: arn:aws:cloudformation:ap-northeast-1:[my-lambda-endpoint]/* with an explicit deny
My IAM user has the AdministratorAccess policy, so I can't understand why the error occurred.
If you really have admin privileges for that user, there's only one way they can be restricted, and it only applies when you're in a member account of an AWS Organization: Service Control Policies (SCPs). SCPs can be used by the management account to restrict access to certain regions or services.
If you're not in an AWS Organizations member account and are getting this error, you probably do not have the permissions you think you have with that user.
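One way to check, sketched with the AWS CLI (the account ID is a placeholder, and list-policies-for-target must be run from the management account):

# Confirm the account belongs to an organization at all
aws organizations describe-organization

# List the SCPs attached to a member account
aws organizations list-policies-for-target \
  --target-id 111122223333 \
  --filter SERVICE_CONTROL_POLICY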

Access issue while connecting to Azure Data Lake Gen2 from Databricks

I am getting the access issue below while trying to connect from Databricks to a Gen2 data lake using a service principal and OAuth 2.0.
Steps performed (following the reference article):
created a new service principal
granted the necessary access to this service principal from the Azure storage account IAM, with Contributor role access
enabled firewalls and private endpoint connections on Databricks and the storage account
StatusCode=403
StatusDescription=This request is not authorized to perform this operation using this permission.
ErrorCode=AuthorizationPermissionMismatch
ErrorMessage=This request is not authorized to perform this operation using this permission.
However, when I tried connecting via access keys, it worked without any issue. Now I started suspecting that #3 from my steps is the reason for this access issue. If so, do I need to grant any additional access to make it succeed? Any thoughts?
When performing the steps in the "Assign the application to a role" section, make sure to assign the Storage Blob Data Contributor role to the service principal; management-plane roles such as Contributor or Owner do not grant access to the blob data itself.
Repro: I granted the Owner role to the service principal and ran dbutils.fs.ls("mnt/azure/"); it returned the same error message as above.
Solution: I then assigned the Storage Blob Data Contributor role to the service principal, and the same command finally returned the output without any error.
For more details, refer to "Tutorial: Azure Data Lake Storage Gen2, Azure Databricks & Spark".
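If you prefer doing this from the CLI, a sketch of the role assignment (the service principal app ID and the storage account scope are placeholders):

az role assignment create \
  --assignee <service-principal-app-id> \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"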

Google Cloud Logging Authentication / permissions

I am using the Golang library cloud.google.com/go/logging and want to send runtime logs.
I already have a GOOGLE_APPLICATION_CREDENTIALS .json file, and I am using Google Storage and Firebase with it, so I know the credentials work.
With logging, I get the error "Error 403: The caller does not have permission, forbidden".
The account in the application credentials is a service account, and I have been looking at the IAM permissions. There is no obvious permission for logging (there are other Stackdriver permissions, for debug, trace, etc., but these don't seem to work).
So, assuming I am in the right place so far: what permissions does the service account need in order to send logging data to Stackdriver Logging?
If we look at the API for writing entries to a log, we find that the IAM permission logging.logEntries.create is required.
A more detailed article can be found in the Access control guide.
It describes a variety of roles, including:
roles/logging.logWriter
According to the official documentation:
Using the Stackdriver Logging library for Go requires the Cloud IAM Logs Writer role on Google Cloud. Most Google Cloud environments provide this role by default.
1. App Engine grants the Logs Writer role by default.
2. On Google Kubernetes Engine, you must add the logging.write access scope when creating the cluster.
3. When using Compute Engine VM instances, add the cloud-platform access scope to each instance.
4. To use the Stackdriver Logging library for Go outside of Google Cloud, including running the library on your own workstation, on your data center's computers, or on the VM instances of another cloud provider, you must supply your Google Cloud project ID and appropriate service account credentials directly to the Stackdriver Logging library for Go.
You can create and obtain service account credentials manually. When specifying the Role field, use the Logs Writer role. For more information on Cloud Identity and Access Management roles, go to the Access control guide.
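Creating such a key manually and pointing the library at it can be sketched as follows (the key file name and service account address are placeholders):

gcloud iam service-accounts keys create key.json \
  --iam-account my-sa-123@my-project-123.iam.gserviceaccount.com
export GOOGLE_APPLICATION_CREDENTIALS=key.json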
Setting Up Stackdriver Logging for Go:
gcloud iam service-accounts list
gcloud projects add-iam-policy-binding my-project-123 \
  --member serviceAccount:my-sa-123@my-project-123.iam.gserviceaccount.com \
  --role roles/logging.logWriter
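Once the service account has roles/logging.logWriter, writing an entry from Go looks roughly like this (the project ID and log name are placeholders):

package main

import (
	"context"
	"log"

	"cloud.google.com/go/logging"
)

func main() {
	ctx := context.Background()

	// Credentials are picked up from GOOGLE_APPLICATION_CREDENTIALS.
	client, err := logging.NewClient(ctx, "my-project-123")
	if err != nil {
		log.Fatalf("logging.NewClient: %v", err)
	}
	// Close flushes any buffered entries before exit.
	defer client.Close()

	// Writing an entry requires logging.logEntries.create on the project.
	logger := client.Logger("my-log")
	logger.Log(logging.Entry{
		Severity: logging.Info,
		Payload:  "runtime log entry from Go",
	})
}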

Serverless: multiple services erroring with "AWS provider credentials not found. Learn how to set up AWS provider credentials in our docs here"

We have a repo with about 15 services. In the deployment step we intermittently get the following error:
AWS provider credentials not found. Learn how to set up AWS provider credentials in our docs here
When we run the same service again, it works. Could this be due to the number of services we are deploying?
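For reference, the standard ways to supply the credentials the error message refers to, sketched with placeholder key values:

# Via environment variables
export AWS_ACCESS_KEY_ID=AKIAEXAMPLE
export AWS_SECRET_ACCESS_KEY=<secret>

# Or via a named profile matching the --profile flag used at deploy time
serverless config credentials --provider aws --profile my-profile \
  --key AKIAEXAMPLE --secret <secret>

If a retry of the same service succeeds, the configuration itself is evidently in place.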

How can we use IAM Roles in AWS to access the API from an application?

How can I use the role feature to access the AWS API from an application? How can I implement this use case?
I am able to fetch the credentials by launching an EC2 instance with a role attached.
Any help is appreciated
Thanks in advance.
IAM roles are used to manage access to AWS services and resources; IAM is not meant to be used as an application-level authorization engine. From the documentation:
AWS Identity and Access Management (IAM) enables you to securely control access to AWS services and resources for your users. Using IAM, you can create and manage AWS users and groups and use permissions to allow and deny their access to AWS resources.
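That said, application code running on an EC2 instance with a role attached does not need to fetch the credentials itself; the SDK's default credential chain resolves the instance profile automatically. A minimal sketch with the AWS SDK for Go v2 (the bucket name is a placeholder):

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	ctx := context.Background()

	// LoadDefaultConfig walks the default credential chain:
	// env vars, shared config files, then the EC2 instance role.
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatalf("load config: %v", err)
	}

	// Service clients sign requests with the role's temporary
	// credentials, which the SDK refreshes automatically.
	client := s3.NewFromConfig(cfg)
	out, err := client.ListObjectsV2(ctx, &s3.ListObjectsV2Input{
		Bucket: aws.String("my-example-bucket"),
	})
	if err != nil {
		log.Fatalf("list objects: %v", err)
	}
	for _, obj := range out.Contents {
		fmt.Println(aws.ToString(obj.Key))
	}
}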
