I am using a script to generate a file with the current date appended, then exporting that file to Amazon S3. The script is:
#!/bin/bash
#python script that exports file in the form of "myfile{current date}.csv" to home directory
#example: myfile20150316.csv
python some_script.py
#recreate file name exported by some_script.py and save as a variable
now=$(date "+%Y%m%d")
file_name="myfile"
file_ext=".csv"
_file="$file_name$now$file_ext"
#export file created by python to S3
s3cmd put $_file S3://myS3/bucket/$_file
The file is created by the python script and exported to my home directory, but the file is not exported to S3. My assumption is that I am incorrectly referencing the file in my s3cmd command.
This is not really an S3 question. In a generic sense, how should I reference variables that contain file names so that I can use them in subsequent commands?
Please help. Thanks.
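For what it's worth, a minimal sketch of the usual pattern, reusing the names from the script above: quote each expansion and write the URI scheme lowercase, as the s3cmd documentation does.
now=$(date "+%Y%m%d")
_file="myfile${now}.csv"
# Quote the expansion so the command survives unusual file names,
# and use a lowercase s3:// scheme as in the s3cmd documentation.
s3cmd put "$_file" "s3://myS3/bucket/$_file"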
Hello, I'm trying to make a simple .bat file that modifies a file after downloading it from another machine.
The problem is that the python script needs the full name of the file, so filename* won't work. Is there a way to download a file via scp and then assign the downloaded file's name to a variable, so the script can find the full name?
scp user@192.168.1.X:"C:\Users\user\Downloads\filename*" ./
pythonscript.py filename*
pythonscript.py "%cd%\filename"
There are files in an AWS S3 bucket that I would like to download; they all have the same name but are in different subfolders. No credentials are required to connect to this bucket and download from it. I would like to download all the files called "B01.tif" in s3://sentinel-cogs/sentinel-s2-l2a-cogs/7/V/EG/, and save each one with the name of the subfolder it is in (for example: S2A_7VEG_20170205_0_L2AB01.tif).
Path example:
s3://sentinel-cogs/sentinel-s2-l2a-cogs/7/V/EG/2017/2/S2A_7VEG_20170205_0_L2A/B01.tif
I was thinking of using a bash script that reads the output of ls, downloads each file with cp, and saves it on my PC with a name generated from the path.
Command to use ls:
aws s3 ls s3://sentinel-cogs/sentinel-s2-l2a-cogs/7/V/EG/2017/2/ --no-sign-request
Command to download a single file:
aws s3 cp s3://sentinel-cogs/sentinel-s2-l2a-cogs/7/V/EG/2017/2/S2A_7VEG_20170205_0_L2A/B01.tif --no-sign-request B01.tif
Attempt to download multiple files:
VAR1=B01.tif
for a in s3://sentinel-cogs/sentinel-s2-l2a-cogs/7/V/EG/:
for b in s3://sentinel-cogs/sentinel-s2-l2a-cogs/7/V/EG/2017/:
for c in s3://sentinel-cogs/sentinel-s2-l2a-cogs/7/V/EG/2017/2/:
NAME=$(aws s3 ls s3://sentinel-cogs/sentinel-s2-l2a-cogs/7/V/EG/$a$b$c | head -1)
aws s3 cp s3://sentinel-cogs/sentinel-s2-l2a-cogs/7/V/EG/$NAME/B01.tif --no-sign-request $NAME$VAR1
done
done
done
I don't know if there is a simple way to go automatically through every subfolder and save the files directly. I know my ls command is broken, because if there are multiple subfolders it will only take the first one as a variable.
It's easier to do this in a programming language than as a shell script.
Here's a Python script that will do it for you:
import boto3
from botocore import UNSIGNED
from botocore.config import Config

BUCKET = 'sentinel-cogs'
PREFIX = 'sentinel-s2-l2a-cogs/7/V/EG/'
FILE = 'B01.tif'

# The bucket is public, so skip request signing (the boto3 equivalent
# of the CLI's --no-sign-request flag).
s3_resource = boto3.resource('s3', config=Config(signature_version=UNSIGNED))

for obj in s3_resource.Bucket(BUCKET).objects.filter(Prefix=PREFIX):
    if obj.key.endswith(FILE):
        # '2017/2/S2A_7VEG_20170205_0_L2A/B01.tif' -> '2017_2_S2A_7VEG_20170205_0_L2A_B01.tif'
        target = obj.key[len(PREFIX):].replace('/', '_')
        obj.Object().download_file(target)
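To run it, install boto3 first (pip install boto3). Note that objects.filter paginates through the whole prefix automatically, so there is no need for nested loops over the year/month subfolders.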
I want to create a zip file of the current working directory in a bash script, and redirect the stdout of the zip command to a file called zip-output.txt.
I have a current working directory called "music", and my script populates it with a lot of content. After it is populated I want to pack it into "music.zip", but the "music" directory and its contents shouldn't be altered or removed in the process.
/Users/xyz/Downloads/music
This is the path to the "music" directory, if needed, and the file "zip-output.txt" could be created at the path "/Users/xyz/Downloads".
I know this is easy, but I am new. Please guide me.
Do you mean something like this?
zip -r music.zip music/ | tee zip-output.txt
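tee writes zip's stdout to zip-output.txt while still echoing it to the terminal. If you only want the file, plain redirection works too; a sketch assuming the paths from the question, run from /Users/xyz/Downloads so the archive is created next to the music directory rather than inside it:
cd /Users/xyz/Downloads
# Plain redirection sends zip's stdout to the log file; zip only reads
# the music directory, it does not modify its contents.
zip -r music.zip music/ > zip-output.txt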
My OS is Ubuntu 16 in VirtualBox.
I'm trying to write a script to transfer multiple files (filenames: t01, t02, t03) with scp.
This is my code:
vim scriptname
#!/bin/bash
for a in {01..03}
do scp -i ~/home/username/.ssh/id_rsa -r t$a username@xx.xx.xx.xxx:/home/username/Desktop
done
And when I typed this in the terminal
./scriptname
I got this
Warning: Identity file /home/ian/home/ian/.ssh/id_rsa not accessible: No such file or directory.
t01: No such file or directory
Warning: Identity file /home/ian/home/ian/.ssh/id_rsa not accessible: No such file or directory.
t02: No such file or directory
Warning: Identity file /home/ian/home/ian/.ssh/id_rsa not accessible: No such file or directory.
t03: No such file or directory
One thing I couldn't understand is that I actually wrote "/home/ian/.ssh/id_rsa" in the script, but the error message showed "/home/ian/home/ian/.ssh/id_rsa". I have tried typing my ssh key path in different ways, such as "/.ssh/id_rsa", but it still didn't work.
What did I do wrong?
Thank you!
t01: No such file or directory
told you that it cannot access the file: the directory you run the bash script from is not the same as the one the files are in.
If you want to copy files that are not in the current directory, you have to give the full path for each of them.
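As for the doubled path in the warning: ~ already expands to /home/ian, so a key path written as ~/home/ian/.ssh/id_rsa becomes /home/ian/home/ian/.ssh/id_rsa, which is exactly what the error shows. A sketch of the corrected loop, assuming the t01..t03 files live in /home/ian (adjust that path to wherever they really are):
#!/bin/bash
for a in {01..03}
do
    # ~ already points at /home/ian, so the key path is just ~/.ssh/id_rsa;
    # absolute paths for the files let the script run from any directory.
    scp -i ~/.ssh/id_rsa -r "/home/ian/t$a" username@xx.xx.xx.xxx:/home/username/Desktop
done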
I've written a bash script that executes a python script to write a file to a directory, then sends that file to Amazon S3. When I execute the script from the command line it executes perfectly, but when I run it with cron, the file writes to the directory, but never gets sent to S3. I must be doing something wrong with cron.
Here is the bash script:
#!/bin/bash
#python script that exports file to home directory
python some_script.py
#export file created by python to S3
s3cmd put /home/bitnami/myfile.csv s3://location/to/put/file/myfile.csv
Like I said before, manually executing works fine using ./bash_script.sh. When I set up the cron job, the file writes to the directory, but never gets sent to S3.
my cron job is:
18 * * * * /home/bitnami/bash_script.sh
Am I using cron incorrectly? Please help.
The cron entry looks OK; however, the path to the .py file will not be found, because cron does not start in your home directory.
You will have to add a path or home, like:
location=/home/bitnami
python $location/some_script.py
s3cmd also needs to be called by its full path:
/bin/s3cmd
Alternatively, you might also need to load your user environment first, before executing the script, so that s3cmd can find the username/password/ssh key it needs.
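Putting that together, a sketch of a cron-safe version of the script (the /home/bitnami paths and the bucket URL come from the question; verify the real s3cmd location with: which s3cmd):
#!/bin/bash
# cron starts with a minimal environment and its own working directory,
# so give absolute paths for everything the script touches.
location=/home/bitnami
python "$location/some_script.py"
# Full path to s3cmd, as suggested above.
/bin/s3cmd put "$location/myfile.csv" s3://location/to/put/file/myfile.csv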