Getting data from AWS SSM Parameter Store in ruby script using aws-sdk-v1

I'm trying to get secrets from the SSM Parameter Store, but we're on aws-sdk-v1 (Ruby). For v2 and v3 I can find plenty of examples, but not for v1. For example, here is a code snippet for aws-sdk-v2:
ssm_client = Aws::SSM::Client.new(
  region: region
)
param_response = ssm_client.get_parameter(
  name: parameter_id,
  with_decryption: true
).to_h
Does anyone know how to do this if I'm on aws-sdk-v1?
PS: Upgrading from aws-sdk v1 to v2/v3 is not a viable option; please suggest a solution that runs on aws-sdk-v1.
Ruby version: '1.9.3'

SSM is already present in the aws-sdk-v1.
There was a dependency issue in my Gemfile. I was using aws-sdk-resources ~> 3, which is fundamentally incompatible with aws-sdk ~> 1.
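A hedged sketch of a v1-style call. The `AWS::SSM::Client` class name and the snake_case request options follow v1's conventions, but whether your installed v1 gem version actually ships an SSM client is an assumption to verify; the `fetch_secret` helper is hypothetical and duck-typed, so it works with any client object that responds to get_parameter:

```ruby
# Hypothetical helper: works with any client object that responds to
# get_parameter and returns a hash-like response. v1 response objects
# support symbol-key access to their data, as plain hashes do here.
def fetch_secret(ssm_client, parameter_id)
  resp = ssm_client.get_parameter(name: parameter_id, with_decryption: true)
  resp[:parameter][:value]
end

# With the v1 gem, construction would look roughly like this (assumption --
# check that your aws-sdk ~> 1.x version actually includes SSM):
#
#   require 'aws-sdk'            # the v1 gem
#   AWS.config(region: 'us-east-1')
#   client = AWS::SSM::Client.new
#   fetch_secret(client, 'my_parameter')
```

The helper keeps the AWS-specific construction separate, so it can be exercised against a fake client in tests.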

Related

Azure ruby sdk storage account create params, no method allow_blob_public_access

While doing a create on storage_accounts
I get:
NoMethodError: undefined method allow_blob_public_access= for #<Azure::Storage::Mgmt::V2019_06_01::Models::StorageAccountCreateParameters>
This is the link to StorageAccountCreateParameters on Microsoft's GitHub.
My code looks like this:
sa_create_params = StorageModels::StorageAccountCreateParameters.new.tap do |sacp|
  sacp.kind = 'StorageV2'
  sacp.kind = payload['StorageAccountType'] if payload && payload['StorageAccountType']
  sacp.sku = sku
  sacp.location = params['region']
  sacp.access_tier = 'hot'
  sacp.access_tier = payload['AccessTier'] if payload && payload['AccessTier']
  sacp.tags = system_tags(params)
  sacp.allow_blob_public_access = false
end
Without the last line, regarding public access, it works just fine. I've tried upgrading the gems (hence the current versions), and looking at their GitHub it seems pretty self-explanatory. I'm at a loss; all help is much appreciated.
I'm using the following gems:
ms_rest_azure 0.11.0
azure_mgmt_storage 0.21.0
Update: it seems the name should be sacp.properties.allow_blob_public_access, as per the link.
But this also throws a NoMethodError.
I updated all azure gems to the latest version.
Relevant gems:
spec.add_dependency 'ms_rest_azure', '~> 0.12.0'
spec.add_dependency 'azure_mgmt_storage', '~> 0.22.0'
This did the trick
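As an aside, the default-then-override pattern the snippet uses (set a default, then overwrite it only when the payload supplies a value) can be sketched independently of the Azure models; the OpenStruct stand-in and the build_create_params helper below are illustrative only, with payload keys mirroring the question:

```ruby
require 'ostruct'

# Minimal stand-in for the Azure create-params model, demonstrating the
# "default first, payload override second" pattern from the question.
def build_create_params(payload)
  OpenStruct.new.tap do |sacp|
    sacp.kind = 'StorageV2'
    sacp.kind = payload['StorageAccountType'] if payload && payload['StorageAccountType']
    sacp.access_tier = 'hot'
    sacp.access_tier = payload['AccessTier'] if payload && payload['AccessTier']
  end
end
```

When the payload is nil or missing a key, the defaults survive; otherwise the payload value wins.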

Is there a way to add new aws regions to Ansible ec2 module, which still uses old boto?

The Ansible ec2 module uses the old boto library, which is outdated (last commit in 2018). How do you provision instances in the newer regions?
My current Ansible version is 2.9.6, and the 2.10 and 2.11 changelogs say nothing about a change to boto3.
region list in boto:
[eu-west-1, eu-west-2, cn-north-1, us-east-2, us-gov-west-1, ca-central-1, ap-southeast-2, us-west-2, ap-southeast-1, us-east-1, sa-east-1, us-west-1, ap-northeast-2, eu-central-1, ap-south-1, ap-northeast-1]
region list using boto3:
['af-south-1', 'ap-east-1', 'ap-northeast-1', 'ap-northeast-2', 'ap-south-1', 'ap-southeast-1', 'ap-southeast-2', 'ca-central-1', 'eu-central-1', 'eu-north-1', 'eu-south-1', 'eu-west-1', 'eu-west-2', 'eu-west-3', 'me-south-1', 'sa-east-1', 'us-east-1', 'us-east-2', 'us-west-1', 'us-west-2']
Here is a quote from https://docs.ansible.com/ansible/latest/collections/amazon/aws/ec2_module.html
Note: This module uses the older boto Python module to interact with the EC2 API. amazon.aws.ec2 will still receive bug fixes, but no new features. Consider using the amazon.aws.ec2_instance module instead.
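Following that note, the usual path to the newer regions is to switch to the boto3-based amazon.aws.ec2_instance module rather than patching boto's region list. A sketch of such a task; the AMI ID, key name, and subnet ID are placeholders:

```yaml
- name: Launch an instance in a region old boto does not know about
  amazon.aws.ec2_instance:
    name: example
    region: af-south-1                 # resolvable because boto3 is used
    instance_type: t3.micro
    image_id: ami-xxxxxxxxxxxxxxxxx    # placeholder
    key_name: my-key                   # placeholder
    vpc_subnet_id: subnet-xxxxxxxx     # placeholder
    state: present
```

This requires the amazon.aws collection and boto3/botocore on the controller, which are separate installs from the old boto.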

What is the right way to develop using Paperclip and S3, but without connecting to real S3 bucket?

My project uses Paperclip and Amazon S3, but I need a development/test environment that doesn't connect directly to S3. I've tried FakeS3, but with no luck, since I am using aws-sdk version 2 (and all the other websites show how to proceed using v1).
Is there a way to do that? How?
My Gemfile:
gem 'aws-sdk', '~> 2.5', '>= 2.5.3'
gem 'paperclip', '~> 5.1'
group :development, :test do
  gem 'fakes3', '~> 0.2.4'
end
The AWS SDK for Ruby v2 offers client stubbing (stub_data and stub_responses) to test code that uses the SDK.
It offers many options, and one of them is:
# stub data in the constructor
client = Aws::S3::Client.new(stub_responses: {
  list_buckets: { buckets: [{ name: 'my-bucket' }] },
  get_object: { body: 'data' },
})
client.list_buckets.buckets.map(&:name) #=> ['my-bucket']
client.get_object(bucket: 'name', key: 'key').body.read #=> 'data'
This way you control what gets returned by the SDK without needing to use the real service.
http://docs.aws.amazon.com/sdkforruby/api/Aws/ClientStubs.html

Can't create bucket using aws-sdk ruby gem. Aws::S3::Errors::SignatureDoesNotMatch

I have a new computer and I'm trying to set up my AWS CLI environment so that I can run a management console I've created.
This is the code I'm running:
def create_bucket(bucket_args)
  # use a local variable; assigning a constant inside a method
  # ("AWS_S3 = ...") raises a dynamic constant assignment SyntaxError
  s3_client = Aws::S3::Client.new(signature_version: 'v4')
  s3_client.create_bucket(bucket_args)
end
Which raises this error:
Aws::S3::Errors::SignatureDoesNotMatch - The request signature we calculated does not match the signature you provided. Check your key and signing method.:
This was working properly on my other computer, which I no longer have access to. I remember debugging this same error on the other computer, and I thought I had resolved it by adding signature_version = s3v4 to my ~/.aws/config file. But this fix is not working on my new computer, and I'm not sure why.
To give some more context: I am using aws-sdk (2.5.5) and these aws cli specs: aws-cli/1.11.2 Python/2.7.12 Linux/4.4.0-38-generic botocore/1.4.60
In this case the issue was that my aws credentials (in ~/.aws/credentials) - specifically my secret token - were invalid.
The original had a slash in it:
xx/xxxxxxxxxxxxxxxxxxxxxxxxxx
which I didn't notice at first, so when I double clicked the token to select the word, it didn't include the first three characters. I then pasted this into the terminal when running aws configure.
To fix this, I found the correct, original secret access token and set the correct value in ~/.aws/credentials.
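For reference, the relevant file layouts; secret keys can legitimately contain slashes and other special characters, so copy the whole value rather than double-click-selecting a "word". The key values below are placeholders, and the s3 signature_version setting is the one mentioned above:

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xx/xxxxxxxxxxxxxxxxxxxxxxxxxx

# ~/.aws/config
[default]
region = us-east-1
s3 =
    signature_version = s3v4
```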

Chef aws driver tags don't work using Etc.getlogin

I am currently using Chef solo on a Windows machine. I used the fog driver before, which created tags for my instances on AWS. Recently, I moved to the aws driver and noticed that it does not handle tagging, so I tried writing my own code to create the tags. One of the tags is "Owner", which tells me who created the instance. For this, I am using the following code:
def get_admin_machine_options()
  case get_provisioner()
  when "cccis-environments-aws"
    general_machine_options = {
      ssh_username: "root",
      create_timeout: 7000,
      use_private_ip_for_ssh: true,
      aws_tags: { Owner: Etc.getlogin.to_s }
    }
    general_bootstrap_options = {
      key_name: KEY_NAME,
      image_id: "AMI",
      instance_type: "m3.large",
      subnet_id: "subnet",
      security_group_ids: ["sg-"],
    }
    bootstrap_options = Chef::Mixin::DeepMerge.hash_only_merge(general_bootstrap_options, {})
    return Chef::Mixin::DeepMerge.hash_only_merge(general_machine_options, { bootstrap_options: bootstrap_options })
  else
    raise "Unknown provisioner #{get_setting('CHEF_PROFILE')}"
  end
end

machine admin_name do
  recipe "random.rb"
  machine_options get_admin_machine_options()
  ohai_hints ohai_hints
  action $provisioningAction
end
Now, this works fine on my machine: the instance is created with the proper tags. But when I run the same code on someone else's machine, it doesn't create the tags at all. I find this very weird. Does anyone know what's happening? It's the same code!
Okay, so I found the issue. I was using the gem chef-provisioning-aws 1.2.1, and everyone else was on 1.1.1.
Version 1.1.1 does not support tagging, so it went right past it.
They uninstalled the old gem and installed the new one, and it worked like a charm!
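One more thing worth knowing about the Owner tag above: Etc.getlogin can return nil (for example when there is no controlling terminal, as under some service contexts), in which case Etc.getlogin.to_s silently produces an empty tag. A defensive sketch; the environment-variable fallbacks and the owner_name helper are a common convention, not part of the original code:

```ruby
require 'etc'

# Resolve the current login name with fallbacks, since Etc.getlogin
# may return nil when there is no controlling terminal.
def owner_name
  login = Etc.getlogin || ENV['USER'] || ENV['USERNAME']
  login.nil? || login.empty? ? 'unknown' : login
end
```

Using owner_name instead of Etc.getlogin.to_s guarantees the Owner tag always carries a non-empty value.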
