How do I pass the value of the variable to a command in a bash script?
Specifically, I want to create an AWS S3 bucket with a (partially) random name. This is what I have so far:
#!/bin/bash
random_id=$(cat /dev/urandom | tr -dc 'a-z0-9' | fold -w 8 | head -n 1)
bucket_name=s3://mybucket-$random_id
echo Bucket name: ${bucket_name}
aws s3 mb ${bucket_name}
The output I get:
Bucket name: s3://mybucket-z4nnli2k
Parameter validation failed:cket-z4nnli2k
": Bucket name must match the regex "^[a-zA-Z0-9.\-_]{1,255}$"
The bucket name is generated correctly, but aws s3 mb ${bucket_name} fails. If I just run aws s3 mb s3://mybucket-z4nnli2k then the bucket is created, so I assume that aws s3 mb ${bucket_name} is not the correct way to pass the value of the bucket_name to the aws s3 mb command.
It must be something obvious but I have almost zero experience with shell scripts and can't figure it out.
How do I pass the value of bucket_name to the aws s3 mb command?
Thanks to the comments above, this is what I got working:
#!/bin/bash
random_id=$(cat /dev/urandom | tr -dc 'a-z0-9' | fold -w 8 | head -n 1)
bucket_name=s3://mybucket-$random_id
echo Bucket name: ${bucket_name}
aws s3 mb "${bucket_name}"
I also had to run dos2unix on the script; apparently there were Windows-style line breaks (carriage returns) in the file.
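If dos2unix is not at hand, a hedged alternative is to strip any stray carriage returns inside the script itself before using the value:
# remove Windows-style carriage returns, if any crept in
bucket_name=${bucket_name//$'\r'/}
aws s3 mb "${bucket_name}"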
Related
I am using this code to get my AWS regions. I want to work on them in a while loop.
awsRegionList=$(aws ec2 describe-regions | jq -r '.Regions[] | .RegionName')
while [I can't find the expression that works with my variable]:
do
echo " working on : (I want here the regionName)"
done
In bash, the straightforward way to iterate over a whitespace-separated list is a for loop, not a while loop:
awsRegionList=$(aws ec2 describe-regions | jq -r '.Regions[] | .RegionName')
for region in $awsRegionList
do
echo " working on : ${region}"
done
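If you specifically want a while loop, you can pipe the jq output into read line by line (note that the loop body then runs in a subshell, so variables set inside it won't survive the loop):
aws ec2 describe-regions | jq -r '.Regions[].RegionName' |
while IFS= read -r region
do
    echo " working on : ${region}"
done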
I'm hoping to walk through some Kinesis data using bash, with a command like:
aws kinesis get-records --shard-iterator <long shard info> | jq '[.|.Records[].Data]' | grep \"ey | sed -e 's/^[ \t]*\"//;s/[ \t]*\",$//'
I can get the base64 data from the stream. What I'm having trouble with is piping this through base64 so I can see the actual data.
If I send it through using a combination of head -n and tail I can see individual values, but any attempt to pass through more than 2-3 lines fails. Errors are typically one set of JSON values followed by garbage data, and the whole output is preceded by:
Invalid character in input stream.
To see the JSON values I use <long bash command from above> | xargs base64 -D
-- Caveat: Using bash on OSX
This works (assuming you've copied the base64 data to a file):
while IFS= read -r line; do echo "$line" | base64 -D && printf "\n"; done < <infile>
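An alternative sketch that skips the grep/sed cleanup and lets jq emit the raw base64 strings directly; the $iterator variable here is a stand-in for your shard iterator (on GNU/Linux use base64 -d instead of -D):
aws kinesis get-records --shard-iterator "$iterator" |
  jq -r '.Records[].Data' |
  while IFS= read -r record; do
    printf '%s\n' "$record" | base64 -D   # -d on GNU/Linux
    printf '\n'
  done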
I have developed Kines, a friendly CLI for Amazon Kinesis Data Streams. It can be useful for debugging.
You can install it using pip.
pip install kines
Then you can run the kines walk command on a stream and shard to view the decoded data.
kines walk <stream-name> <shard-id>
I have files that exist under an S3 path like below:
s3://tttt/2018/11/01/02 --> (tttt = S3 bucket, 2018 = year, 11 = month, 01 = day, 02 = hour)
New files are inserted into S3 all the time, as in the example below:
s3://tttt/2018/10/01/01/ls.s3.4cede5e7d25c.2018-10-01T01.00.tag_lre.txt.gz
s3://tttt/2018/10/01/02/ls.s3.4cede5e7d25c.2018-10-01T02.00.tag_lre.txt.gz
I would like to pick out via a Bash script:
1. max year
2. max month
3. max day
4. max hour
The script I built looks like this (but does not work well):
#!/bin/bash
result=`aws s3 ls "s3://tttt/2018/" | awk '{print $2}' |tail -n 1`
result1=`aws s3 ls "s3://tttt/2018/${result:0:2}/" | awk '{print $NF-1}' |tail -n 1`
result2=2018/${result:0:2}/${result1:0:2}/
Any ideas how to write it?
You did not show the output of aws s3 ls "s3://tttt/2018/" | tail -1.
With
result="s3://tttt/2018/10/01/01/ls.s3.4cede5e7d25c.2018-10-01T01.00.tag_lre.txt.gz"
you can do
IFS=/ read -r s3 empty bucket year month day hour file <<< "${result}"
result:
set | egrep "bucket=|year=|month=|day=|hour="
bucket=tttt
day=01
hour=01
month=10
year=2018
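To answer the original question (max year, month, day, hour), here is a hedged sketch: descend one prefix level at a time, always taking the lexically largest entry. This relies on the path parts being zero-padded, as in the examples, and on each level containing only date prefixes:
#!/bin/bash
# hypothetical sketch: find the newest year/month/day/hour prefix
latest="s3://tttt/"
for level in year month day hour; do
    # "aws s3 ls" prints directories as "PRE <name>/"; field 2 is the name
    next=$(aws s3 ls "$latest" | awk '{print $2}' | sort | tail -n 1)
    latest="${latest}${next}"
done
echo "latest prefix: ${latest}"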
I am listing all the files in an S3 bucket and writing them to a text file. For example, my bucket has the following list of files:
text.zip
fixed.zip
hello.zip
good test.zip
I use the following code:
fileList=$(aws s3 ls s3://$inputBucketName/ | awk '{print $4}')
if [ ! -z "$fileList" ]
then
$AWS_CLI s3 ls s3://$inputBucketName/ | awk '{print $1,$2,$4}' > s3op.txt
sort -k1,1 -k2 s3op.txt > s3op_srt.txt
awk '{print $3}' s3op_srt.txt > filesOrder.txt
fi
cat filesOrder.txt;
After this, I iterate over the file names from the file I created (I delete each file in S3 at the end of the loop, so it won't be processed again):
fileName=`head -1 filesOrder.txt`
The files are listed like this:
text.zip
fixed.zip
hello.zip
good
So the problem is that the listing does not handle file names with spaces correctly.
Since the file name is returned as "good" and not as "good test.zip", the script cannot delete the file from S3.
Expected Result is
text.zip
fixed.zip
hello.zip
good test.zip
I used following command to delete files in S3:
aws s3 rm s3://$inputBucketName/$fileName
Put the full file path in double quotes.
For example:
aws s3 rm "s3://test-bucket/good test.zip"
In your case, it would be:
aws s3 rm "s3://$inputBucketName/$fileName"
This way, even if fileName contains spaces, the file will be deleted.
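The truncation itself comes from awk '{print $4}', which stops at the first space. A hedged way to keep full names is to strip the leading date, time, and size columns instead of printing a single field (the sed pattern below assumes the default aws s3 ls output layout):
# keep everything after the date, time, and size columns
aws s3 ls "s3://$inputBucketName/" |
  sed -E 's/^[0-9-]+ +[0-9:]+ +[0-9]+ +//' > filesOrder.txt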
I'd like to ask for help with changing uppercase file names in AWS S3 to lowercase.
I have two files. One is a list of file names from an AWS S3 bucket with uppercase letters, like so (let's call it uppercase.txt):
ABc.txT
aBCd.pHp
AbCdE.jpg
and a second file with a list of the translation of the names to lower case (lowercase.txt, easily done with tr '[:upper:]' '[:lower:]'):
abc.txt
abcd.php
abcde.jpg
I tried a bunch of for loops; the command I wish to repeat is aws s3 mv $first_list_value $second_list_value.
Tried this:
for i in `cat uppercase_file.txt`; do aws s3 mv $i `cat lowercase_file.txt`; done
No dice :-( The AWS S3 CLI is limited and doesn't take well to most Linux commands.
Yelp?
Something like this should work:
paste uppercase_file.txt lowercase_file.txt | while IFS=$'\t' read -r uc lc
do
    aws s3 mv "$uc" "$lc"
done
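A hedged alternative that avoids the second file entirely: generate each lowercase name on the fly (this assumes uppercase_file.txt holds full s3:// paths):
while IFS= read -r src; do
    # lowercase the whole URI; bucket names are lowercase already
    dst=$(printf '%s' "$src" | tr '[:upper:]' '[:lower:]')
    aws s3 mv "$src" "$dst"
done < uppercase_file.txt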