Not able to run AWS CLI commands through a Windows script (.bat file)

I am trying to create an "aws_configure.bat" file which will run AWS commands. I need to configure "aws_configure.bat" as a Windows task. I created my script with the content below:
aws configure set AWS_ACCESS_KEY_ID <mykey>
aws configure set aws_secret_access_key <myskey>
aws configure set region us-west-2
aws dynamodb list-tables
When I try to run this script, it only prints the first line in the cmd window. Can someone please suggest what the problem is here? Why is my script not able to run the AWS CLI commands? (I have installed the AWS CLI on my system, and when I run these commands directly in a cmd window, everything works fine.)

You should consider creating and configuring your AWS credentials outside of your batch file, then referencing the named profile from the batch file.
Run aws configure --profile myprofile, and provide the information required.
Then from your batch file, call aws dynamodb list-tables --profile myprofile.
To set up the preferred/default profile, set AWS_PROFILE=myprofile in the system environment. With this method, you will not need to reference the profile in the batch file.
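For example, with the named profile in place, the batch file itself reduces to something like this minimal sketch (the profile name myprofile is assumed to have been created beforehand with aws configure --profile myprofile):
rem aws_configure.bat -- credentials come from the named profile, not from the script
call aws dynamodb list-tables --profile myprofile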

Related

How can I run AWS Lambda locally and access DynamoDB?

I am trying to run and test an AWS Lambda service written in Golang locally using the SAM CLI. I have two problems:
The Lambda does not work locally if I use .zip files. When I deploy the code to AWS, it works without an issue, but if I try to run locally with .zip files, I get the following error:
A required privilege is not held by the client: 'handler' -> 'C:\Users\user\AppData\Local\Temp\tmpbvrpc0a9\bootstrap'
If I don't use .zip, then it works locally, but I still want to deploy as .zip, and it is not feasible to change the template.yml every time I want to test locally.
If I try to access AWS resources, I need to set the following environment variables:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_SESSION_TOKEN
However, if I set these variables in template.yml and then use sam local start-api --env-vars to fill them with the credentials, the local environment works and can access AWS resources. But when I deploy the code to the real AWS it gives an error, since these variable names are reserved. I also tried using different names for these variables, but then the local environment does not work. I also tried omitting them from template.yml and just using the local env-vars, but environment variables must be present in template.yml and cannot be created with env-vars; env-vars can only fill existing variables with values.
How can I make the local environment work but still be able to deploy to AWS?
For accessing AWS resources you need to look at IAM permissions rather than programmatic access keys; check this document out for CloudFormation.
To be clear, virtually nothing deployed on AWS needs those keys; it is all about applying permissions to the resource itself (Lambda, EC2, etc.). Those keys are only really needed for the AWS CLI and some local environments like Serverless and SAM.
The Serverless Framework now supports Golang; if you're new, I'd say give that a go while you get up to speed with IAM/CloudFormation.
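As a minimal sketch of that last point, assuming the credentials live in a local named profile called dev, the SAM CLI can pick them up itself instead of having them wired into template.yml:
sam local start-api --profile dev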

How to run the aws command contained in a Bash script without configuring it?

I am trying to run a bash script to upload a file into an S3 bucket,
like: aws s3 cp xyx.txt s3://<tos3bucket>
Is it possible to run the aws command without configuring it with $ aws configure, as detailed below?
Either by an external file, or by a command-line option like -u 'key' -p 'value'?
My aim is to run the AWS CLI without configuring it.
I have tried the below:
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
But I got:
upload failed: ... Unable to locate credentials
If I configure the AWS CLI, it works.
Yes, you can run the aws CLI command without configuring it or exporting the key-value pairs, but keep in mind to add a space before the command so that it is not written to shell history (in bash this requires HISTCONTROL to include ignorespace).
<space> AWS_DEFAULT_REGION=us-west-2 AWS_ACCESS_KEY_ID="your_key" AWS_SECRET_ACCESS_KEY="secret" aws ec2 describe-instances
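Applied to the original upload, that looks something like the following sketch (the key values are placeholders, the bucket placeholder is the one from the question, and the leading space again keeps the keys out of shell history):
<space> AWS_DEFAULT_REGION=us-west-2 AWS_ACCESS_KEY_ID="your_key" AWS_SECRET_ACCESS_KEY="your_secret" aws s3 cp xyx.txt s3://<tos3bucket>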

aws s3 cli not working in Windows Task Scheduler

When I run the following AWS CLI command in the console, it works correctly.
I have an AWS access key and secret configured.
aws s3 sync "C:\uploadfolder" s3://uploadfolder
However, when I run it from Windows Task Scheduler on Windows 10 as well as on Windows Server 2012, I get the following error:
cannot find the file specified 0x80070002
It does not seem to be a corrupted profile, because it fails on both Windows versions and other commands run as expected.
Is there any step that I missed, or is there any other special command needed when running the AWS CLI from Windows Task Scheduler?
Your CLI command is attempting to sync a FILE called "uploadfolder". You need to change to the directory first, then run the command. Your command should instead be:
cd C:\uploadfolder
aws s3 sync . s3://uploadfolder/
This will recursively copy all files in your local directory that are not in your s3 bucket. If you would also like the sync command to delete files that are no longer in the local directory, you also need to add the --delete flag.
aws s3 sync . s3://uploadfolder/ --delete
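Put together as a single batch file for the scheduled task, a minimal sketch would look like the lines below (the full path to aws.exe is an assumption based on the default AWS CLI v2 install location; spelling it out avoids relying on the scheduled task's PATH):
cd /d C:\uploadfolder
"C:\Program Files\Amazon\AWSCLIV2\aws.exe" s3 sync . s3://uploadfolder/ --delete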

Specify an AWS CLI profile in a script when two exist

I'm attempting to use a script which automatically creates snapshots of all EBS volumes on an AWS instance. This script is running on several other instances with no issue.
The current instance already has an AWS profile configured which is used for other purposes. My understanding is I should be able to specify the profile my script uses, but I can't get this to work.
I've added a new set of credentials to the /home/ubuntu/.aws file by placing the following under the default credentials that are already there:
[snapshot_creator]
aws_access_key_id=s;aldkas;dlkas;ldk
aws_secret_access_key=sdoij34895u98jret
In the script I have tried adding AWS_PROFILE=snapshot_creator, but when I run it I get the error Unable to locate credentials. You can configure credentials by running "aws configure".
So I deleted my changes to /home/ubuntu/.aws and instead ran aws configure --profile snapshot_creator. However, after entering all the information I get the error [Errno 17] File exists: '/home/ubuntu/.aws'.
So I added my changes to the .aws file again, and this time, for every single command in the script starting with aws ec2, I added the parameter --profile snapshot_creator. But now when I run the script I get The config profile (snapshot_creator) could not be found.
How can I tell the script to use this profile? I don't want to change the environment variables for the instance, because the AWS CLI is already used there for other purposes.
Credentials should be stored in the file "/home/ubuntu/.aws/credentials".
The error
File exists: '/home/ubuntu/.aws'
suggests that aws configure could not create the ".aws" directory because a plain file with that name already exists. Can you delete the ".aws" file and re-run the configure command? It should then create the credentials file under "/home/ubuntu/.aws/".
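In other words, a minimal sketch of the fix (the backup step and the create-snapshot call are only illustrative, with a placeholder volume ID):
mv /home/ubuntu/.aws /home/ubuntu/.aws.bak    # .aws must be a directory, so move the stray file aside
aws configure --profile snapshot_creator      # writes /home/ubuntu/.aws/credentials
aws ec2 create-snapshot --volume-id <volume-id> --profile snapshot_creator   # example call from the script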

AWS CLI doesn't find config file when running in batch file after maven command

When I just run the aws lambda update-function-code command in cmd with the appropriate parameters, everything works fine. It also works when I run the command in a batch file. But when I run mvn package before aws lambda update-function-code in a batch file, I get the following error:
'You must specify a region. You can also configure your region by running "aws configure"'
I have already configured the region and I know it is configured correctly; otherwise, just running the aws lambda command on its own would also throw an error.
The config file is also at the location Amazon suggests.
My batch file looks like this:
call mvn package
call aws lambda update-function-code --function-name <functionName> --zip-file fileb://<path/to/jar>
(Of course, the words in brackets are just placeholders.)
You could specify the AWS region as a command line option in the batch file
call aws --region us-east-1 lambda update-function-code --function-name <functionName> --zip-file fileb://<path/to/jar>
Any kind of switching of regions could be handled by logic in the batch file.
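For example, a minimal sketch of such logic (the IS_PROD variable and the region values are purely illustrative):
set REGION=us-east-1
if "%IS_PROD%"=="true" set REGION=eu-west-1
call aws --region %REGION% lambda update-function-code --function-name <functionName> --zip-file fileb://<path/to/jar>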
I solved the problem!
Maven sets some environment variables which interfere with the aws lambda command. Because of call, these variables persist until the batch file has completely finished executing. To keep them from lingering, I wrapped the Maven step in SETLOCAL and ENDLOCAL as follows:
SETLOCAL
call mvn package
ENDLOCAL
call aws lambda update-function-code --function-name <functionName> --zip-file fileb://<path/to/jar>
Now everything works like a charm.
