boto3 describe_instance_status with filter "running" skips instances in region ap-east-1

The following code snippet uses the latest version of boto3 and looks for all "running" instances in ap-east-1, where the client is created with that specific region (ap-east-1):
import boto3
from botocore.exceptions import ClientError

ec2 = boto3.client("ec2", region_name="ap-east-1")

try:
    running_instances = ec2.describe_instance_status(
        Filters=[
            {
                "Name": "instance-state-name",
                "Values": ["running"],
            },
        ],
        InstanceIds=<list of instance_ids>,
    )
except ClientError as e:
    <catch exception>
The result is an empty list even though there are running EC2 instances.
The snippet above works for all other regions, though.
The equivalent AWS CLI command, aws ec2 describe-instance-status --region ap-east-1 --filters Name="instance-state-name",Values="running" --instance-ids <list of instance ids>, returns the running instances with the same filter.
What am I missing for this region specifically while using boto3?
Is there a specific version of boto3 that works for the ap-east-1 region?

https://github.com/boto/boto3/issues/3575 - I asked the question here and the maintainers helped debug it.
ap-east-1 is an opt-in region, and Regional STS needed to be enabled at the organization payer level for such regions.
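Relatedly, for opt-in regions it can help to force boto3 to use the regional STS endpoint. A minimal sketch, assuming credentials are already configured; AWS_STS_REGIONAL_ENDPOINTS is the documented botocore setting for this:
import os
import boto3

# Force the regional STS endpoint; this must be set before any
# client is created so that botocore picks it up.
os.environ["AWS_STS_REGIONAL_ENDPOINTS"] = "regional"

ec2 = boto3.client("ec2", region_name="ap-east-1")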

Related

Passing complex parameters to `aws cloudformation deploy`

From PowerShell I'm calling aws cloudformation deploy against LocalStack using a template generated by the CDK:
aws --endpoint-url http://localhost:4566 cloudformation deploy --template-file ./cdk.out/my.template.json --stack-name my-stack --parameter-overrides functionLambdaSourceBucketNameParameter55F17A81=`{`"BucketName`":`"my-bucket`",`"ObjectKey`":`"code.zip`"`} functionLambdaSourceObjectKeyParameterB7223CBC=`{`"BucketName`":`"my-bucket`",`"ObjectKey`":`"code.zip`"`}
The command executes, and I get back a cryptic message: 'Parameters'. Also, the stack events show a failure but don't include any reason:
    "ResourceType": "AWS::CloudFormation::Stack",
    "Timestamp": "2020-09-24T19:03:28.388072+00:00",
    "ResourceStatus": "CREATE_FAILED"
I assume there is something wrong with the format of my parameters but I cannot find any examples of complex parameter values. The parameters are CodeParameters which have both BucketName and ObjectKey properties.
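For what it's worth, CDK usually synthesizes asset locations like this into two separate String parameters rather than one JSON-valued parameter, so each override may just need to be a plain string. A sketch of that assumption using boto3 against the same LocalStack endpoint (the parameter names are taken from the command above; the plain-string format is the assumption):
import boto3

cfn = boto3.client("cloudformation", endpoint_url="http://localhost:4566")
with open("./cdk.out/my.template.json") as f:
    template_body = f.read()

cfn.create_stack(
    StackName="my-stack",
    TemplateBody=template_body,
    Parameters=[
        # Plain string values, not JSON objects (assumption)
        {"ParameterKey": "functionLambdaSourceBucketNameParameter55F17A81",
         "ParameterValue": "my-bucket"},
        {"ParameterKey": "functionLambdaSourceObjectKeyParameterB7223CBC",
         "ParameterValue": "code.zip"},
    ],
)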

Parameter validation failed:\nUnknown parameter in input: \"include\", must be one of: cluster, services

I am writing a Lambda function to update all the services in all the ECS clusters based on their tags. For that I need to extract the tags from the description of each service, but the corresponding call raises the error above.
import boto3
import botocore

client = boto3.client('ecs')

def lambda_handler(event, context):
    responseToListClusters = client.list_clusters()  # gets list of clusters
    clusterArnsList = responseToListClusters['clusterArns']  # extracts list of clusterArns
    for CLUSTER in clusterArnsList:
        responseToListServices = client.list_services(cluster=CLUSTER)  # gets list of services
        serviceArnsList = responseToListServices['serviceArns']  # extracts list of serviceArns
        for SERVICE in serviceArnsList:
            responseToDescribeServices = client.describe_services(cluster=CLUSTER, services=[SERVICE], include=['TAGS'])
            print(responseToDescribeServices)
            # client.update_service(cluster=CLUSTER, service=SERVICE, desiredCount=1)  # updates all services
You are encountering this error because AWS Lambda by default runs with an older version of boto3.
Currently, the AWS Lambda Python runtimes ship the following versions:
python3.7: boto3-1.9.42, botocore-1.12.42
python3.6: boto3-1.7.74, botocore-1.10.74
python2.7: N/A
Reference: Lambda Runtimes
To upgrade the boto3 version, you can refer to the following articles:
AWS Lambda Console - Upgrade boto3 version
https://www.mandsconsulting.com/lambda-functions-with-newer-version-of-boto3-than-available-by-default/
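A quick way to confirm which SDK version your function actually runs is to return it from the handler. A minimal sketch; deploy it on its own, or add the same prints to your existing handler:
import boto3
import botocore

def lambda_handler(event, context):
    # If these predate the release that added the `include` parameter to
    # describe_services, the runtime's bundled SDK is the culprit.
    return {"boto3": boto3.__version__, "botocore": botocore.__version__}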

Code that runs in Lambda but not in plain Python

I have code that works in Lambda, but the same code does not work on my system.
import boto3

asgName = "test"

def lambda_handler(event, context):
    client = boto3.client('autoscaling')
    asgName = "test"
    response = client.describe_auto_scaling_groups(AutoScalingGroupNames=[asgName])
    if not response['AutoScalingGroups']:
        return 'No such ASG'
    ...
    ...
    ...
When I try to run the code below on Linux, it reports 'No such ASG':
import boto3

asgName = "test"
client = boto3.client('autoscaling')
response = client.describe_auto_scaling_groups(AutoScalingGroupNames=[asgName])
if not response['AutoScalingGroups']:
    print('No such ASG')  # `return` is only valid inside a function
The first thing to check is that you are connecting to the correct AWS region. A default region can be specified in the ~/.aws/config file; if your machine's default (for example us-east-1, N. Virginia) differs from the region the Lambda function runs in, the lookup will come back empty.
In your code, you can specify the region with:
client = boto3.client('autoscaling', region_name='us-west-2')
The next thing to check is that the credentials are associated with the correct account. The AWS Lambda function is obviously running in your desired account, but you should confirm that the code running on Linux is using the same AWS account.
You can do this by using the AWS Command-Line Interface (CLI), which will use the same credentials as your Python code on the Linux computer. Run:
aws autoscaling describe-auto-scaling-groups --auto-scaling-group-names test
It should give the same result as the Python code running on that computer.
You might need to specify the region:
aws autoscaling describe-auto-scaling-groups --auto-scaling-group-names test --region us-west-2
(Of course, change your region as appropriate.)
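You can also check both account and region directly from Python. A minimal sketch using standard STS and session calls:
import boto3

# Which account do the local credentials resolve to?
print(boto3.client('sts').get_caller_identity()['Account'])

# Which region will boto3 use by default? (None means not configured)
print(boto3.session.Session().region_name)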

Putting to local DynamoDB table with Python boto3 times out

I am attempting to programmatically put data into a locally running DynamoDB container by triggering a Python Lambda function.
I'm trying to follow the template provided here: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/GettingStarted.Python.03.html
I am using the amazon/dynamodb-local image you can download here: https://hub.docker.com/r/amazon/dynamodb-local
Using Ubuntu 18.04.2 LTS to run the container and Lambda server
AWS SAM CLI to run my Lambda API
Docker version 18.09.4
Python 3.6 (you can see this in the SAM logs below)
The startup command for the Python Lambda is just "sam local start-api"
First, my Lambda code:
import json
import boto3

def lambda_handler(event, context):
    print("before grabbing dynamodb")
    # dynamodb = boto3.resource('dynamodb', endpoint_url="http://localhost:8000", region_name='us-west-2', aws_access_key_id='RANDOM', aws_secret_access_key='RANDOM')
    dynamodb = boto3.resource('dynamodb', endpoint_url="http://localhost:8000")
    table = dynamodb.Table('ContactRequests')
    try:
        response = table.put_item(
            Item={
                'id': "1234",
                'name': "test user",
                'email': "testEmail@gmail.com"
            }
        )
        print("response: " + str(response))
    except Exception as e:
        # minimal except clause so the try block is syntactically complete
        print("put_item failed: " + str(e))
    return {
        "statusCode": 200,
        "body": json.dumps({
            "message": "hello world"
        }),
    }
I know that this ContactRequests table is available at localhost:8000, because I can list my Docker container's DynamoDB tables (I have also tested a variety of values in the boto3.resource call, including access keys, region names, and secret keys, with no improvement in the result):
dev@ubuntu:~/Projects$ aws dynamodb list-tables --endpoint-url http://localhost:8000
{
    "TableNames": [
        "ContactRequests"
    ]
}
I am also able to successfully hit the localhost:8000/shell that DynamoDB Local offers.
Unfortunately, when I hit the endpoint that triggers this handler, I get a timeout that logs like so:
Fetching lambci/lambda:python3.6 Docker container image......
2019-04-09 15:52:08 Mounting /home/dev/Projects/sam-app/.aws-sam/build/HelloWorldFunction as /var/task:ro inside runtime container
2019-04-09 15:52:12 Function 'HelloWorldFunction' timed out after 3 seconds
2019-04-09 15:52:13 Function returned an invalid response (must include one of: body, headers or statusCode in the response object). Response received:
2019-04-09 15:52:13 127.0.0.1 - - [09/Apr/2019 15:52:13] "GET /hello HTTP/1.1" 502 -
Notice that none of my print statements are reached; if I remove the call to table.put_item, the prints are successfully executed.
I've seen similar questions on Stack Overflow, such as "lambda python dynamodb write gets timeout error", which state that the problem is that I am using a local DB. But shouldn't I still be able to write to a local DB with boto3 if I point it at my locally running DynamoDB instance?
The Docker container running your Lambda function can't reach DynamoDB at 127.0.0.1, because inside that container 127.0.0.1 refers to the Lambda container itself. Try the name of your DynamoDB Local Docker container as the host name for the endpoint instead:
dynamodb = boto3.resource('dynamodb', endpoint_url="http://<DynamoDB_LOCAL_NAME>:8000")
You can use docker ps to find the <DynamoDB_LOCAL_NAME> or give it a name:
docker run --name dynamodb amazon/dynamodb-local
and then connect:
dynamodb = boto3.resource('dynamodb', endpoint_url="http://dynamodb:8000")
Found the solution to the problem here: connecting AWS SAM Local with dynamodb in docker
The asker there noted that both containers may need to be on the same Docker network:
docker network create lambda-local
So I created this network, then updated my docker and sam commands to use it, like so:
docker run --name dynamodb -p 8000:8000 --network lambda-local amazon/dynamodb-local
sam local start-api --docker-network lambda-local
After that, I no longer experienced the timeout issue.
I'm still working on understanding exactly why this was the issue.
To be fair, though, it was also important that I use the DynamoDB container name as the host in my boto3 resource call.
So in the end, it was a combination of the solution above and the answer provided by "Reto Aebersold" that created the final solution:
dynamodb = boto3.resource('dynamodb', endpoint_url="http://<DynamoDB_LOCAL_NAME>:8000")
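Putting the pieces together, a minimal sketch of the verification step (the container name dynamodb matches the docker run command above; the region and credential values are placeholders, since DynamoDB Local does not validate real AWS credentials):
import boto3

# List tables through the container-name endpoint on the shared network;
# if this succeeds from inside the SAM container, put_item will reach it too.
client = boto3.client(
    'dynamodb',
    endpoint_url="http://dynamodb:8000",
    region_name='us-west-2',
    aws_access_key_id='RANDOM',
    aws_secret_access_key='RANDOM',
)
print(client.list_tables()['TableNames'])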

EC2 describe-images returns empty data set for windows AMI ami-6a70e303

$ aws ec2 describe-images --image-ids ami-6a70e303
{
    "Images": [],
    "ResponseMetadata": {
        "RequestId": "348eb2b0-b975-4632-915e-f2e344d275bd"
    }
}
This is from us-east-1.
This AMI is Amazon's Windows_Server-2008-R2_SP1-English-64Bit-Base-2013.02.22.
Any ideas as to why it is not returning data?
This image has apparently been deregistered. The command describes only images available to you; having been deleted, this image is no longer available.
More info here: http://thecloudmarket.com/image/ami-6a70e303--windows-server-2008-r2-sp1-english-64bit-base-2013-02-22#/definition
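The same behavior is visible from boto3 (a minimal sketch; as the CLI output above shows, a deregistered AMI yields an empty Images list rather than an error):
import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')
images = ec2.describe_images(ImageIds=['ami-6a70e303'])['Images']
if not images:
    # Deregistered, or never shared with this account
    print("AMI not found")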
