How do I change the order of a filename on Mac - bash

I've got a folder with a number of files in the following format:
Photo 31-12-2020, 00 08 09.jpg
i.e. dd-mm-yyyy, hr min sec.jpg
I would like to rename all the files to the following format:
2020-12-31 00.08.09.jpg
i.e. yyyy-mm-dd hr.mm.sec.jpg
The changes are: year and day moved around, comma removed, dots between hours, minutes and seconds.
However, there are a couple of hundred files in the folder, so I would like to automate this with a bash script.
I have looked into running a bash script to do this, but I’m unfortunately not very comfortable with scripting and wasn’t successful.
Could anyone help me find an easy method to resolve my issue?

Assuming you're in the directory containing the photos, this should work:
for i in Photo*jpg
do
echo mv -v "${i}" "$(echo "$i" | awk 'BEGIN{FS="[ ,.-]"}{printf "%s-%s-%s %s.%s.%s.jpg\n",$4,$3,$2,$6,$7,$8}')"
done
If the output looks sensible, remove the leading echo to perform the actual renames.
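If you'd rather avoid awk, bash's own regex matching can do the same split. This is only a sketch, assuming every filename matches the Photo dd-mm-yyyy, hh mm ss.jpg pattern from the question:

```shell
#!/bin/bash
# Pure-bash alternative: capture the date/time fields with a regex,
# then reassemble them in "yyyy-mm-dd hh.mm.ss.jpg" order.
re='([0-9]{2})-([0-9]{2})-([0-9]{4}), ([0-9]{2}) ([0-9]{2}) ([0-9]{2})'

rename_photo() {
    # Print the new name for one filename; fail if it doesn't match.
    [[ $1 =~ $re ]] || return 1
    printf '%s-%s-%s %s.%s.%s.jpg\n' \
        "${BASH_REMATCH[3]}" "${BASH_REMATCH[2]}" "${BASH_REMATCH[1]}" \
        "${BASH_REMATCH[4]}" "${BASH_REMATCH[5]}" "${BASH_REMATCH[6]}"
}

for f in Photo*jpg; do
    if new=$(rename_photo "$f"); then
        echo mv -v "$f" "$new"   # drop the echo to do the real rename
    fi
done
```

As with the awk version, do a dry run first and remove the echo once the output looks right.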

Related

Bash - File name change Date + 1

I have around 500 files that I need to rename with the date the report represents. The filenames are currently:
WUSR1453722998383.csv
WUSR1453723010659.csv
WUSR1453723023497.csv
And so on. The numbers in the filename have nothing to do with the date, so I cannot use the filename as a guide for what the file should be renamed to. The reports start from 02/12/2014 and there is a report for every day of the month up until yesterday (09/04/2016). Luckily the filenames are sequential, so 04/12/2014 will have a higher number than 03/12/2014, which will have a higher number than 02/12/2014. This means the files are automatically listed in chronological order when sorted by name.
There is however a date in the first line of the CSV before the data:
As at Date,2014-12-02
Now I've checked that I have all the files already and I do, so what's the best way to rename them to the date? I can either set the starting date as 02/12/2014 and rename each file with a +1 date, or the script can read the date on the first line of the file (As at Date,2014-12-02 for example) and use that date to rename the file.
I have no idea how to write either of the methods above in bash, so if you could help out with this, that would be really appreciated.
In terms of file output, I was hoping for:
02-12-2014.csv
03-12-2014.csv
And so on
Is this the answer you need? It assumes all the files are under the current directory. Do some testing before you run the real operation. It relies on the date string in each CSV file being unique; otherwise some files will be overwritten.
#!/bin/bash
for f in *.csv
do
o=$(sed '1q' "$f" | awk -F"[,-]" '{print $NF"-"$(NF-1)"-"$(NF-2)".csv"}')
# should we back up the file first?
# cp "$f" "$f.bak"
mv "$f" "$o"
done
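The other method from the question, starting at 02/12/2014 and incrementing by one day per file, could be sketched like this. It relies on the glob expanding in name (and therefore date) order, and on GNU date for the day arithmetic; the WUSR prefix is taken from the question:

```shell
#!/bin/bash
# Rename sequentially-numbered reports to dd-mm-yyyy.csv, one day per file.
shopt -s nullglob          # an empty match should expand to nothing
d="2014-12-02"             # date of the first report
for f in WUSR*.csv; do
    echo mv "$f" "$(date -d "$d" +%d-%m-%Y).csv"   # drop echo when happy
    d=$(date -d "$d + 1 day" +%Y-%m-%d)
done
```

The first-line approach above is safer, since it doesn't depend on every day actually being present; this one silently drifts if a report is missing.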

S3cmd move file and del folder

I'm trying to write a bash script to automate my backup plan. I use a script which creates an S3 folder each day with the day as folder name, and each hour it uploads a backup into this folder.
example: /Application1/20130513/dump.01
My backup plan is to keep 2 days of full backups (one per hour) and keep 1 backup per day for the latest 15 days in an S3 folder ("oldbackup").
What is wrong in my script?
#check and clean the S3 bucket
BUCKETNAME='application1';
FOLDERLIST = s3cmd ls s3://$BUCKETNAME
LIMITFOLDER = date --date='1 days ago' +'%Y%m%d'
for f in $FOLDERLIST
do
if [[ ${f} > $LIMITFOLDER && f != "oldbackup" ]]; then
s3cmd sync s3://$BUCKETNAME/$f/dump.rdb.0 s3://$BUCKETNAME/"oldbackup"/dump.rdb.$f
s3cmd del s3://$BUCKETNAME/$f --recursive;
fi
done
OLDBACKUP = s3cmd ls s3://$BUCKETNAME/"oldbackup"
LIMITOLDBACKUP = date --date='14 days ago' +'%Y%m%d'
for dump in $OLDBACKUP
if [${dump} > $LIMITOLDBACKUP]; then
s3cmd del s3://$BUCKETNAME/"oldbackup"/$dump
fi
done
Thanks
First, you probably want to store FOLDERLIST as an array. You can do so like this: FOLDERLIST=($(command)).
Next, shell assignments must not have spaces around the =, and capturing a command's output requires command substitution: OUTPUT="$(command)". As written, FOLDERLIST = s3cmd ls ... tries to run a command called FOLDERLIST.
So for example your first three lines should look like:
BUCKETNAME="application1"
FOLDERLIST=($(s3cmd ls s3://$BUCKETNAME))
LIMITFOLDER="$(date --date="1 days ago" +"%Y%m%d")"
Now your first for-loop should work.
That's the main thing wrong with your script (the second for-loop suffers the same problem). Also note that f != "oldbackup" in your first condition is missing a $, so it compares the literal string f rather than the loop variable.
Your second for-loop (besides not iterating over a proper array) has no do keyword, and the test [${dump} > $LIMITOLDBACKUP] needs spaces around the brackets plus an escaped operator, [ "${dump}" \> "$LIMITOLDBACKUP" ], since an unescaped > inside [ ] is a redirection. So you should do:
for dump in $OLDBACKUP
do
# rest of loop body
done
That could be another issue with your script.
Finally, you're only ever using OLDBACKUP and FOLDERLIST to iterate over. The same can be accomplished just by doing:
for f in $(s3cmd ls s3://$BUCKETNAME)
do
# loop body
done
There's no need to store the output in variables unless you plan to reuse it several times.
As a separate matter, there's no need to use variable names consisting entirely of capital letters. Lowercase names work too, as long as you avoid clashing with the names of commands and the shell's own variables.
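Putting those fixes together, the assignment and comparison patterns look like this. It's only a sketch with a stand-in list instead of real s3cmd output, with the missing $ on the oldbackup test fixed, and with the comparison flipped to < so it selects folders older than the limit, which appears to be the intent:

```shell
#!/bin/bash
# Demonstrate array capture and lexicographic date comparison with a
# stand-in list; in the real script the array would come from s3cmd ls.
folders=(20130511 20130512 20130513 oldbackup)
limit=$(date --date='1 day ago' +%Y%m%d)

old=()
for f in "${folders[@]}"; do
    # YYYYMMDD strings sort correctly, so [[ ... < ... ]] compares dates
    if [[ $f != "oldbackup" && $f < $limit ]]; then
        old+=("$f")        # would be synced to oldbackup, then deleted
    fi
done
printf '%s\n' "${old[@]}"
```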

Retrieving File name for bash/shell programing

I need to access two files in my shell script. The only issue is, I am not sure what the file name is going to be, as it is system generated. A part of the file name is always constant, but the rest of it is system generated, hence may vary. I am not sure how to access these files.
Sample File Names
Type 1
MyFile1.yyyy-mm-dd_xx:yy:zz.log
In this case, I know the MyFile1 portion is constant for all the files; the other portion varies based on date and time. I can use date +%Y-%m-%d to get as far as MyFile1.yyyy-mm-dd_, but I am not sure how to select the correct file. Please note each day will have just one file of this kind. In unix the below command gives me the correct file.
unix> ls MyFile1.yyyy-mm-dd*
Type 2
MyFile2.yyyymmddxxyyxx.RandomText.SomeNumber.txt
In this file, as you can see, the MyFile2 portion is common. I can use date +%Y%m%d to get as far as MyFile2.yyyymmdd (current date), but again it's not very clear how to go on from there. In unix the below command gives me the correct file. Also I need to have the previous date in the dd column for File 2.
unix> ls MyFile2.yyyymmdd*
Basically I'm looking for the following lines in my shell script:
#!/bin/ksh
timeA=$(date +%Y-%m-%d)
timeB=$(date +%Y%m)
sysD=$(date +%d)
sysD=$((sysD-1))
filename1=($Home/folder/MyFile1.$timeA*)
filename2=($Home/folder/MyFile2.$timeB$sysD*)
Just not sure how to get the RHS for these two files.
The result when running the above script is as below:
Script.ksh[8]: syntax error at line 8 : `(' unexpected
Perhaps this
$ file=(MyFile1.yyyy-mm-dd*)
$ echo $file
MyFile1.yyyy-mm-dd_xx:yy:zz.log
It should be noted that you must declare variables in this manner
foo=123
NOT
foo = 123
Notice carefully — bad ($(...) is command substitution, which tries to execute the expansion as a command):
filename1=$($HOME/folder/MyFile1.$timeA*)
good (plain parentheses make an array assignment, which expands the glob):
filename1=($HOME/folder/MyFile1.$timeA*)
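A sketch pulling this together for both files, assuming bash (the question uses ksh, where older versions need set -A for arrays, hence the syntax error at the array assignment) and GNU date for the previous-day arithmetic — the sysD=$((sysD-1)) in the question breaks on the 1st of the month and loses the zero padding:

```shell
#!/bin/bash
# Build both file patterns; paths and prefixes are from the question.
dir="$HOME/folder"                       # note: $HOME, not $Home
today=$(date +%Y-%m-%d)                  # yyyy-mm-dd for MyFile1
yesterday=$(date -d yesterday +%Y%m%d)   # yyyymmdd for MyFile2

files1=("$dir"/MyFile1."$today"*)        # glob into arrays
files2=("$dir"/MyFile2."$yesterday"*)

echo "file 1: ${files1[0]}"
echo "file 2: ${files2[0]}"
```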

Getting the name of the folder and comparing to integer in bash

Ok, so what I'm trying to do is: all my backup folders are named with dates, like 03-07-13. I'm trying to select the day, and if it's greater than or equal to 7 days old, delete the folder. This is what I have so far, but it's not working.
DATE=$(date +"%d")
for i in /media/backupdrive/*; do
DAY=${i:22:2}
if [ "$DAY" -ge "7" ]
then
echo "day greater than 7";
fi
done
the 22:2 cuts off the /media/backupdrive/00-
00 represents the month
Right now it's just checking whether the day is greater than 7 and, if it is, printing it out.
EDIT: The problem was resolved. I want to thank you all for helping a bash beginner. Thank you again!
Per a screenshot given in a comment, your actual code uses the following:
DAY=${i:22:2}
if [ "$day" -ge "7" ]
Emphasis on the capitalization difference between DAY and $day. Shell variables are case-sensitive, so $day is unset; when this runs, [ tries to compare an empty string as a number via -ge, and that causes the error you're receiving.
Try updating your if statement to use the uppercase version:
if [ "$DAY" -ge "7" ]
It seems you want to delete files that are older than 7 days. The find command can find those files for you, and optionally delete them:
find /media/backupdrive -mtime +7 # Files that are older than 7 days
find /media/backupdrive -mtime +7 -delete # ... and delete them
Using the 'DAY' variable opens you up to month-rollover ("just rolled over") issues.
Some alternatives:
change the folder format so that it is more descriptive.
add a meta file in each folder that gives a time value that is easier to parse and work with.
have an index for the backup folders containing said data.
The time format I generally use incorporates the following:
[epoch seconds]-[YYYY][MM][DD]-[HH]:[MM]:[SS]
This lets you do things like asking for backups that are 7 days old from right now. You would do the math against the epoch seconds, which avoids the confusion of days rolling over.
Basically, the epoch seconds is for making time calcs easier. The other time stamp bits makes it human readable. The ordering makes it so that it sorts correctly in a folder listing.
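A sketch of that epoch-seconds idea, assuming folder names that start with the timestamp in the format proposed above (the path is from the question):

```shell
#!/bin/bash
# Flag backup folders whose leading epoch-seconds field is 7+ days old.
now=$(date +%s)
week=$((7 * 24 * 60 * 60))

for dir in /media/backupdrive/*; do
    [ -d "$dir" ] || continue
    name=${dir##*/}          # strip the leading path
    created=${name%%-*}      # keep the epoch-seconds field
    if (( now - created >= week )); then
        echo "older than 7 days: $dir"   # candidate for deletion
    fi
done
```

Doing the math on epoch seconds sidesteps both the rollover problem and the octal pitfall of comparing zero-padded days like 08 with shell arithmetic.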
EDIT:
In the event your backup path ever changes:
DAYtmp=${i: -8:5}
DAY=${DAYtmp: -2}
This will yield DAY from the folder name even if the parent path changes in length.

bash not adding current date to file name

I have a bash script which backups my source code on a 10 minute basis thru crontab. Script was working until the end of August. It's not working since September 1st. This is the script:
#!/bin/sh
date=`date +%e-%m-%y`
cd /home/neky/python
tar -zcf lex.tar.gz lex/
echo $date
mv lex.tar.gz lex-$date.tar.gz
mv lex-$date.tar.gz /home/neky/Dropbox/lex/lex-$date.tar.gz
If I execute it manually, it prints the current date 4-09-12 and this error: mv: target ‘4-09-12.tar.gz’ is not a directory
What could be the problem?
Your date contains a leading space when the day of month is a single digit (which also explains why it only stopped working in the new month). That results in your command being split up, i.e.
# this is what you end up with
mv lex.tar.gz lex- 4-09-12.tar.gz
Use date +%d-%m-%y instead which will give you 04-09-12 (note %d instead of %e).
If you really want a space in the name, you'll need to quote your variables, i.e.:
mv lex.tar.gz "lex-$date.tar.gz"
mv "lex-$date.tar.gz" /home/neky/Dropbox/lex/
Also note that the character % (part of your date format) is special in crontab entries, so if you ever put the date command directly in the crontab line you need to escape it as \%.
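For example, if the tar-and-date pipeline were placed directly in the crontab rather than in the script, each % would need a backslash (a sketch; the 10-minute schedule and paths are from the question):

```
*/10 * * * * tar -zcf /home/neky/Dropbox/lex/lex-$(date +\%d-\%m-\%y).tar.gz -C /home/neky/python lex
```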
