Bash scripting: Deleting the oldest directory

I want to look for the oldest directory (inside a directory), and delete it. I am using the following:
rm -R $(ls -1t | tail -1)
ls -1t | tail -1 does indeed give me the oldest entry, but the problem is that it is not deleting the directory, and that it also lists files.
How can I fix that?

rm -R "$(find . -maxdepth 1 -type d -printf '%T#\t%p\n' | sort -r | tail -n 1 | sed 's/[0-9]*\.[0-9]*\t//')"
This also works for directories whose names contain spaces or tabs, or start with a "-".
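A hedged variation on the same idea, NUL-terminated throughout so it also survives newlines in directory names, with -mindepth 1 added so "." itself is never considered (a sketch, assuming GNU find and reasonably recent GNU coreutils):
rm -R -- "$(
  find . -mindepth 1 -maxdepth 1 -type d -printf '%T@\t%p\0' |
  sort -z -n | head -z -n 1 | cut -z -f2- | tr -d '\0'
)"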

This is not pretty but it works:
rm -R $(ls -lt | grep '^d' | tail -1 | tr " " "\n" | tail -1)

rm -R $(ls -tl | grep '^d' | tail -1 | cut -d' ' -f8)

find directory_name -type d -printf "%TY%Tm%Td%TH%TM%TS %p\n" | sort -nr | tail -1 | cut -d" " -f2 | xargs -n1 echo rm -Rf
Remove the echo before the rm once it produces the right results.
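If you'd rather not parse timestamps at all, here is a minimal pure-bash sketch: it compares modification times with the [[ ... -ot ... ]] test and only prints the command (drop the echo once you trust it):
oldest=
for d in ./*/; do
    [[ -d $d ]] || continue                          # skip if the glob matched nothing
    [[ -z $oldest || $d -ot $oldest ]] && oldest=$d  # keep the directory with the oldest mtime
done
echo rm -R -- "$oldest"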

Related

I want my script to echo "$1" into a file literally

This is part of my script
#!/bin/bash
echo "ls /SomeFolder | grep $1 | xargs cat | grep something | grep .txt | awk '{print $2}' | sed 's/;$//';" >> script2.sh
This echoes everything nicely into my script except $1 and $2. Instead it writes out the values of those variables, but I want the file to literally contain "$1" and "$2". Help?
Escape it:
echo "ls /SomeFolder | grep \$1 | xargs cat | grep something | grep .txt | awk '{print \$2}' | sed 's/;\$//';" >> script2.sh
Quote it:
echo "ls /SomeFolder | grep "'$'"1 | xargs cat | grep something | grep .txt | awk '{print "'$'"2}' | sed 's/;"'$'"//';" >> script2.sh
or like this:
echo 'ls /SomeFolder | grep $1 | xargs cat | grep something | grep .txt | awk '\''{print $2}'\'' | sed '\''s/;$//'\'';' >> script2.sh
Use a quoted here document:
cat << 'EOF' >> script2.sh
ls /SomeFolder | grep $1 | xargs cat | grep something | grep .txt | awk '{print $2}' | sed 's/;$//';
EOF
Basically you want to prevent expansion, i.e. take the string literally. You may want to read the BashFAQ entry on quotes.
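A quick illustration of the difference (a sketch; "foo" is just a placeholder argument):
set -- foo          # pretend the script was called with one argument, "foo"
echo "grep $1"      # double quotes expand:           grep foo
echo "grep \$1"     # escaped inside double quotes:   grep $1
echo 'grep $1'      # single quotes are literal:      grep $1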
First, you'd never write this (see https://mywiki.wooledge.org/ParsingLs and http://porkmail.org/era/unix/award.html; you also don't need greps + seds + pipes when you're using awk):
ls /SomeFolder | grep $1 | xargs cat | grep something | grep .txt | awk '{print $2}' | sed 's/;$//'
you'd write this instead:
find /SomeFolder -mindepth 1 -maxdepth 1 -type f -name "*$1*" -exec \
awk '/something/ && /.txt/{sub(/;$/,"",$2); print $2}' {} +
or, if you prefer using -print0 | xargs instead of -exec:
find /SomeFolder -mindepth 1 -maxdepth 1 -type f -name "*$1*" -print0 |
xargs -0 awk '/something/ && /.txt/{sub(/;$/,"",$2); print $2}'
and appending that script to a file would then be:
cat <<'EOF' >> script2.sh
find /SomeFolder -mindepth 1 -maxdepth 1 -type f -name "*$1*" -print0 |
xargs -0 awk '/something/ && /.txt/{sub(/;$/,"",$2); print $2}'
EOF
Btw, if you want the . in .txt to be treated literally instead of as a regexp metachar meaning "any character" then you should be using \.txt instead of .txt.
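For example, in the awk one-liner above that would be (a sketch):
# \.txt matches a literal ".txt"; the unescaped .txt would also match e.g. "file_txt"
awk '/something/ && /\.txt/{sub(/;$/,"",$2); print $2}'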

Output a find with xargs to a log file

I have some code that works. But I want to output it to a log file so that I know what is being copied from one location to another.
echo "find ${varSrcDirectory} -maxdepth 1 -type f -printf "%p\t%t\n" | sort -t $'\t' -k2 -nr | grep ${varFullYear} | grep ${month} | cut -f 1 | xargs -i cp '{}' -p -t ${varDstDirectory}/${varFullYear}/${monthNum} " >> $LOG
find ${varSrcDirectory} -maxdepth 1 -type f -printf "%p\t%t\n" | sort -t $'\t' -k2 -nr | grep ${varFullYear} | grep ${month} | cut -f 1 | xargs -i cp '{}' -p -t ${varDstDirectory}/${varFullYear}/${monthNum} >> $LOG
Here is the result in my log file
find /ftp/bondloans/transfers/out/ -maxdepth 1 -type f -printf %pt%tn | sort -t $'\t' -k2 -nr | grep 2008 | grep Jan | cut -f 1 | xargs -i cp '{}' -p -t /ftp/bondloans/transfers/out/testa/2008/01
But what I want to see is the actual file being copied from one location to another.
Add the -v option to cp, so it will print what it's copying.
find ${varSrcDirectory} -maxdepth 1 -type f -printf "%p\t%t\n" | sort -t $'\t' -k2 -nr | grep ${varFullYear} | grep ${month} | cut -f 1 | xargs -i cp -v '{}' -p -t ${varDstDirectory}/${varFullYear}/${monthNum} >> $LOG
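If you also want to watch the copies on the terminal while they are written to the log, a hedged variant is to pipe cp -v's output through tee -a instead of redirecting it:
find ${varSrcDirectory} -maxdepth 1 -type f -printf "%p\t%t\n" | sort -t $'\t' -k2 -nr | grep ${varFullYear} | grep ${month} | cut -f 1 | xargs -i cp -v '{}' -p -t ${varDstDirectory}/${varFullYear}/${monthNum} | tee -a "$LOG"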

Bash : Find and Remove duplicate files from different folders

I have two folders with some common files, and I want to delete the duplicate files from the xyz folder.
folder1:
/abc/file1.csv
/abc/file2.csv
/abc/file3.csv
/abc/file4.csv
folder2:
/xyz/file1.csv
/xyz/file5.csv
I want to compare both folders and remove the duplicates from the /xyz folder, so that afterwards /xyz contains only: file5.csv
For now I am using :
find "/xyz" "/abc" "/abc" -printf '%P\n' | sort | uniq -u | -exec rm {} \;
But it is failing with: if -exec is not a typo you can run the following command to lookup the package that contains the binary:
command-not-found -exec
-bash: -exec: command not found
-exec is an option to find; you've already left the find command once you started the pipes.
Try xargs instead: it takes all the data from stdin and appends it to the program's arguments.
UNTESTED
find "/xyz" "/abc" "/abc" -printf '%P\n' | sort | uniq -u | xargs rm
Find every file in the 234 and 123 directories, get the filenames with -printf, sort them, let uniq -d give the list of duplicates, add the path back with sed (using the 123 directory as the one to delete the duplicates from), and pass the files to xargs rm.
Command:
find ./234 ./123 -type f -printf '%P\n' | sort | uniq -d | sed 's/^/.\/123\//g' | xargs rm
The sed isn't needed if you are already in the ./123 directory and use full paths for the folders in find.
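A hedged, NUL-safe variant of the same pipeline (assumes GNU find, sort, uniq and xargs; %P prints paths relative to each starting point, so the /xyz/ prefix is re-added with -I{} instead of sed, and the echo gives you a dry run first):
find /xyz /abc -type f -printf '%P\0' | sort -z | uniq -z -d | xargs -0 -I{} echo rm -- /xyz/{}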
Another approach: just find the files in abc and attempt to remove them from xyz:
UNTESTED
find /abc -type f -printf 'rm -f /xyz/%P\n' | sh
Remove Duplicate Files From Particular Directory
FileList=$(ls)
for D1 in $FileList; do
    if [[ -f $D1 ]]; then
        for D2 in $FileList; do
            if [[ -f $D2 ]]; then
                if [[ $D1 == $D2 ]]; then
                    : 'Skip original file'
                else
                    if [[ $(md5sum "$D1" | cut -d' ' -f1) == $(md5sum "$D2" | cut -d' ' -f1) ]]; then
                        echo "Duplicate File Found : $D2"
                        rm -f "$D2"
                    fi # Detect duplicate using MD5
                fi # Skip original file
            fi # D2 file available, then next
        done
    fi # D1 file available, then next
done
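Going back to the original two-folder question: if "duplicate" should mean identical contents rather than just a matching name, here is a small sketch along those lines (removes from /xyz only when the same-named file in /abc compares equal; drop the echo once the output looks right):
for f in /xyz/*; do
    [[ -f $f ]] || continue
    twin="/abc/${f##*/}"                          # same-named file in the other folder
    if [[ -f $twin ]] && cmp -s -- "$f" "$twin"; then
        echo rm -- "$f"
    fi
done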

moving files with xargs

I want to pipe the output of ls into head and pipe it into mv.
I used the following command in the terminal but it isn't working properly.
ls -t Downloads/ | head -7 | xargs -i mv {} ~/cso/
Please do rectify the error. Thanks in advance!
It is well documented that parsing ls output is not recommended. You can use this safer approach with a find + sort + cut + head + xargs pipeline:
find . -maxdepth 1 -type f -printf '%T#\t%p\0' |
sort -z -rnk1 |
cut -z -f2 |
head -z -n 7 |
xargs -0 -I {} mv {} ~/cso/
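Note that the pipeline above starts in the current directory (.); a hedged adaptation pointed at the asker's Downloads/ folder, with -f2- so tabs in names don't truncate the path, would be:
find Downloads/ -maxdepth 1 -type f -printf '%T@\t%p\0' |
sort -z -rnk1 |
cut -z -f2- |
head -z -n 7 |
xargs -0 -I {} mv {} ~/cso/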
Use -I like this:
ls -t Downloads/* | head -7 | xargs -I '{}' mv '{}' ~/cso/

Getting file size in bytes with bash (Ubuntu)

Hi, I'm looking for a way to output a file size in bytes. Whatever I try, I get either 96 or 96k instead of 96000.
if [[ -d $1 ]]; then
    largestN=$(find $1 -depth -type f | tr '\n' '\0' | du -s --files0-from=- | sort | tail -n 1 | awk '{print $2}')
    largestS=$(find $1 -depth -type f | tr '\n' '\0' | du -h --files0-from=- | sort | tail -n 1 | awk '{print $1}')
    echo "The largest file is $largestN which is $largestS bytes."
else
    echo "$1 is not a directory..."
fi
This prints "The largest file [file] is 96k bytes"
There is a -b option for this:
$ du -b ...
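For example, a hedged rewrite of the size line from the script above, using du -b plus a numeric sort so the result comes out in plain bytes (it inherits the original's assumption that file names contain no newlines):
largestS=$(find "$1" -depth -type f | tr '\n' '\0' | du -b --files0-from=- | sort -n | tail -n 1 | awk '{print $1}')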
Looks like you're trying to find the largest file in a given directory. It's more efficient (and shorter) to let find do the heavy lifting for you:
find $1 -type f -printf '%s %p\n' | sort -n | tail -n1
Here, %s expands to the size in bytes of the file, and %p expands to the name of the file.
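Putting that back into the shape of the original script, a minimal sketch (assumes GNU find; read puts the size in the first variable and leaves the rest of the line, the file name, in the second):
if [[ -d $1 ]]; then
    read -r largestS largestN < <(find "$1" -type f -printf '%s %p\n' | sort -n | tail -n 1)
    echo "The largest file is $largestN which is $largestS bytes."
else
    echo "$1 is not a directory..."
fi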
