I want to delete some revisions of docker images, specifically the two oldest directories. I'm able to print them with:
ls -lt | tail -n 2
which gives me the 2 last lines:
drwxr-xr-x 2 root root 4096 Nov 9 10:56 541a303d3c82785293f89a401038ac33ef2b54b6aeb09efd3d3bda7xxxx
drwxr-xr-x 2 root root 4096 Oct 25 12:07 c74e1399c99de0c23517abc95bc9b16d09df5c4d518776e77d9ae67xxxx
Now my question: how do I delete them?
I tried ls -lt | tail -n 2 | rm -r *, but then I deleted everything (the whole output of ls).
You could get that to work. I would probably use something like rm -rf $(ls -t | tail -n2), but parsing ls is really not recommended.
A cleaner way to do this would be to use find, which can delete everything older than a certain time. Something like this: find . -mtime +180 -exec rm -rf {} \; would delete everything last modified more than 180 days ago.
I would highly recommend testing whatever you are planning to run before you actually do the delete!
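To make that testing habit concrete, here is a minimal sketch (assuming GNU find and touch, in a throwaway directory; the names olddir and newdir are invented for illustration) that previews what a -mtime based cleanup would match before any rm runs:

```shell
# Create a scratch directory with one backdated entry (hypothetical names).
tmp=$(mktemp -d)
mkdir "$tmp/olddir" "$tmp/newdir"
touch -d "200 days ago" "$tmp/olddir"   # make olddir look 200 days old

# Preview: list what -mtime +180 (older than 180 days) would select.
selected=$(find "$tmp" -mindepth 1 -maxdepth 1 -mtime +180)
echo "would delete: $selected"

rm -rf "$tmp"   # clean up the scratch directory
```

Only once the preview lists exactly the entries you expect would you swap the listing for the real rm.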
You have the right idea; however, rm does not read file names from its standard input, so the piped list was ignored and the * glob expanded to everything in the directory, which is why it deleted all instead of what you were trying to pipe into it.
This is a pretty clean way to do it, though (note the -t, since plain ls sorts by name rather than by time):
rm -r `ls -t | tail -n 2`
Just to test:
ls -t | tail -n 2 | xargs -I {} -t echo {}
-t, --verbose
Print the command line on the standard error output before executing it.
After the test, you can delete them with:
ls -t | tail -n 2 | xargs -I {} -t rm -fr {}
rm -fr 541a303d3c82785293f89a401038ac33ef2b54b6aeb09efd3d3bda7xxxx
rm -fr c74e1399c99de0c23517abc95bc9b16d09df5c4d518776e77d9ae67xxxx
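Putting the pieces together, here is a throwaway-directory sketch (GNU touch -d assumed; the directory names are invented) showing that ls -t | tail -n 2 really does pick the two oldest entries before anything is deleted:

```shell
tmp=$(mktemp -d)
cd "$tmp"
mkdir old1 old2 new1          # three directories...
touch -d "3 days ago" old1    # ...with distinct ages
touch -d "2 days ago" old2

# ls -t lists newest first, so the last two lines are the two oldest.
oldest_two=$(ls -t | tail -n 2)
echo "$oldest_two"

cd / && rm -rf "$tmp"
```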
Related
I set up a daily cron job to backup my server.
In my folder backup, the backup command generates 2 files : the archive itself .tar.gz and a file .info.json like the ones below:
-rw-r--r-- 1 root root 1617 Feb 2 16:17 20200202-161647.info.json
-rw-r--r-- 1 root root 48699726 Feb 2 16:17 20200202-161647.tar.gz
-rw-r--r-- 1 root root 1617 Feb 3 06:25 20200203-062501.info.json
-rw-r--r-- 1 root root 48737781 Feb 3 06:25 20200203-062501.tar.gz
-rw-r--r-- 1 root root 1618 Feb 4 06:25 20200204-062501.info.json
-rw-r--r-- 1 root root 48939569 Feb 4 06:25 20200204-062501.tar.gz
How do I write a bash script that keeps only the last 2 archives and deletes all the other backups (.tar.gz and .info.json)?
In this example, that would mean deleting 20200202-161647.info.json and 20200202-161647.tar.gz.
Edit:
I replaced -name with -wholename in the script, but when I run it, it apparently has no effect. The old archives are still there and have not been deleted.
the script :
#!/bin/bash
DEBUG="";
DEBUG="echo DEBUG..."; #put last to safely debug without deleting files
keep=2;
for suffix in /home/archives .json .tar; do
    list=( $( find . -wholename "*$suffix" ) ); #allow for zero names
    if [ ${#list[@]} -gt $keep ]; then
        # delete all but last $keep oldest files
        ${DEBUG}rm -f "$( ls -tr "${list[@]}" | head -n -$keep )";
    fi
done
Edit 2:
If I run sorin's script, does it actually delete everything, if I believe the script output?
The archive folder before running the script:
https://pastebin.com/7WtwVHCK
The script I run:
find home/archives/ \( -name '*.json' -o -name '*.tar.gz' \) -print0 |\
sort -zr |\
sed -z '3,$p' | \
xargs -0 echo rm -f
The script output:
https://pastebin.com/zd7a2zcq
Edit 3 :
The command find /home/archives/ -daystart \( -name '*.json' -o -name '*.tar.gz' \) -mtime +1 -exec echo rm -f {} + works and does the job.
Marked as solved
If the file is generated daily, a simple approach would be to take advantage of the -mtime find condition:
find /home/archives/ -daystart \( -name '*.json' -o -name '*.tar.gz' \) -mtime +1 -exec echo rm -f {} +
-daystart - use the start of the day for comparing modification times
\( -name '*.json' -o -name '*.tar.gz' \) - select files that end either in *.json or *.tar.gz
-mtime +1 - modification time is more than one full day before the start of today
-exec echo rm -f {} + - remove the files (remove the echo after testing and verifying the result is what you want)
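If it helps, the selection can be previewed on synthetic files first. This sketch (GNU find and touch assumed; the file names are copied from the question) backdates one backup pair and checks that only that pair matches:

```shell
tmp=$(mktemp -d)
# One old backup pair and one current pair.
touch -d "3 days ago" "$tmp/20200202-161647.tar.gz" "$tmp/20200202-161647.info.json"
touch "$tmp/20200204-062501.tar.gz" "$tmp/20200204-062501.info.json"

# -mtime +1: modified more than one day before the start of today.
old=$(find "$tmp" -daystart \( -name '*.json' -o -name '*.tar.gz' \) -mtime +1)
echo "$old"

rm -rf "$tmp"
```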
A simpler solution that avoids ls and its pitfalls and does not depend on the modification time of the files:
find /home/archives/ \( -name '*.json' -o -name '*.tar.gz' \) -print0 |\
sort -zr |\
sed -nz '3,$p' | \
xargs -0 echo rm -f
\( -name '*.json' -o -name '*.tar.gz' \) - find files that end in either *.json or *.tar.gz
-print0 - print them null separated
sort -zr - -z tells sort to use null as a line separator, -r sorts them in reverse
sed -nz '3,$p' - -z same as above; -n suppresses the automatic printing, and '3,$p' prints only the lines from the 3rd to the last ($)
xargs -0 echo rm -f - execute rm with the piped arguments (remove the echo after you tested and you are satisfied with the command)
Note: not all sort and sed implementations support -z, but most do. If you are stuck without it, you might have to use a higher-level language.
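As a quick way to see the pipeline's effect, this sketch (GNU sort, sed, and xargs assumed; the file names are invented) runs it over three dated files while keeping echo in front of rm:

```shell
tmp=$(mktemp -d)
touch "$tmp/20200202-161647.tar.gz" "$tmp/20200203-062501.tar.gz" "$tmp/20200204-062501.tar.gz"

# Reverse-sorted, null-separated list: newest names first, so records
# 3..$ are everything beyond the two most recent files.
doomed=$(find "$tmp" -name '*.tar.gz' -print0 | sort -zr | sed -nz '3,$p' | xargs -0 echo rm -f)
echo "$doomed"

rm -rf "$tmp"
```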
Find the two most recent files in path:
most_recent_json=$(ls -t *.json | head -1)
most_recent_tar_gz=$(ls -t *.tar.gz | head -1)
Remove everything else ignoring the found recent files:
rm -i $(ls -I $most_recent_json -I $most_recent_tar_gz)
Automatic deletion can be hazardous to your mental state if it deletes unwanted files or aborts long scripts early due to unexpected errors, say when there are fewer than 1+2 files as in your example. Be sure the script does not fail if there are no files at all.
tdir=/home/archives/; #target dir
DEBUG="";
DEBUG="echo DEBUG..."; #put last to safely debug without deleting files
keep=2;
for suffix in .json .tar.gz; do
    list=( $( find "$tdir" -name "*$suffix" ) ); #allow for zero names
    if [ ${#list[@]} -gt $keep ]; then
        # delete all but the $keep newest files
        ${DEBUG}rm -f $( ls -tr "${list[@]}" | head -n -$keep );
    fi
done
Assuming that you have fewer than 10 files and that they are created in pairs, you can do something straightforward like this:
files_to_delete=$(ls -t1 | tail -n +5)
rm $files_to_delete
The -t1 tells the ls command to list the files one per line, sorted by modification time with the newest first.
The tail -n +5 tells the tail command to start output at the fifth line, skipping the first four lines (the two newest .tar.gz/.info.json pairs).
If you have more than 10 files, a more complicated solution will be necessary, or you would need to run this multiple times.
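Under those assumptions, the ls/tail selection can be sketched on throwaway files first (GNU touch -d assumed; three invented files stand in for the backups, with tail -n +3 keeping the two newest):

```shell
tmp=$(mktemp -d)
cd "$tmp"
touch -d "3 days ago" a.tar.gz
touch -d "2 days ago" b.tar.gz
touch c.tar.gz                    # newest

# Newest first; tail -n +3 skips the first two lines (the two newest files).
to_delete=$(ls -t1 | tail -n +3)
echo "$to_delete"

cd / && rm -rf "$tmp"
```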
I've tried to write a little backup shell script.
When I run it inside the Backup_FILESERVER folder, it creates the tgz.
But from /root I get an error.
tar cvfz /NAS/for_tape/FILESERVER.tgz /NAS/Backup_FILESERVER/`ls -Art | tail -2 | head -n 1`
Error:
tar: Tuesday: Cannot stat: No such file or directory
tar: Exiting with failure status due to previous errors
In the folder "/NAS/Backup_FILESERVER" there are 5 folders, one for each weekday: Monday, Tuesday, ...
Is it possible to make it runnable from any directory?
Can you try
tar cvzf /NAS/for_tape/FILESERVER.tgz "/NAS/Backup_FILESERVER/$(ls -1rt /NAS/Backup_FILESERVER | tail -2 | head -n 1)"
Pointing ls -1rt at the absolute path sorts the folders by modification time, oldest first, so tail -2 | head -n 1 picks the second-newest one regardless of your current directory.
You can confirm that ls -1rt /NAS/Backup_FILESERVER | tail -2 | head -n 1 gives the folder you need before starting the compression.
this command allows me to login to a server, to a specific directory from my pc
ssh -t xxx.xxx.xxx.xxx "cd /directory_wanted ; bash"
How can I then do this operation in that directory? I want to be able to basically delete all files except the N most recent. My attempt:
find ./tmp/ -maxdepth 1 -type f -iname '*.tgz' | sort -n | head -n -10 | xargs rm -f
This command should work:
ls -t *.tgz | tail -n +11 | xargs rm -f
Warning: Before doing rm -f, confirm that the files being listed by ls -t *.tgz | tail -n +11 are as expected.
How it works:
ls lists the contents of the directory; the -t flag sorts by modification time, newest first. See the man page of ls.
tail -n +11 outputs lines starting from line 11. Please refer to the man page of tail for more details.
If the system is a Mac OS X then you can delete based on creation time too. Use ls with -Ut flag. This will sort the contents based on the creation time.
You can use this command:
ssh -t xxx.xxx.xxx.xxx "cd /directory_wanted; ls -t *.tgz | tail -n +11 | xargs rm -f; bash"
Inside the quotes you can add whatever operations are to be performed on the remote machine, with each command terminated by a semicolon (;).
Note: this includes the same command suggested by silentMonk. It is simple and it works, but verify it once before performing the operation.
I found the 5 most recent core files. I need to delete all core files except these 5.
ls -t /u01/1/bin/core.siebprocmw.* | head -n 5
is the command to find the 5 newest files by time, and
ls -t /u01/1/bin/core.siebprocmw.* | head -n 5 | xargs rm -r
is the command to remove those found files.
But I need to delete all core files except these last 5. Any ideas?
You could use sed to exclude the first five newest files, then delete the rest:
ls -t /u01/1/bin/core.siebprocmw.* | sed '1,5d' | xargs rm -r
You could also try
ls -tr /u01/1/bin/core.siebprocmw.* | head -n -5 | xargs rm -r
With the oldest files listed first (-r reverses -t's newest-first order), head -n -5 selects everything except the last 5 lines of the output, i.e. everything except the 5 newest files.
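Either variant can be previewed on throwaway files. This sketch (GNU coreutils assumed; the seven fake core files are invented) checks that the sed version selects everything except the five newest:

```shell
tmp=$(mktemp -d)
for i in 1 2 3 4 5 6 7; do
    touch -d "$i days ago" "$tmp/core.siebprocmw.$i"   # .1 is newest
done

# Newest first; sed '1,5d' drops the five newest, leaving the delete list.
doomed=$(ls -t "$tmp"/core.siebprocmw.* | sed '1,5d')
echo "$doomed"

rm -rf "$tmp"
```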
A coworker showed me a nifty way of using rm and xargs for deleting filenames listed in a .txt - but I can't remember what he did.
I ran
echo | xargs -a file.txt
where file.txt contained
1
2
3
4
And it printed
1 2 3 4
My logic says that
rm | xargs -a file.txt
should delete the files I created titled 1 and 2 and 3 and 4.
But that is not the behavior I get.
How do I form this simple command?
I believe you want:
xargs -a file.txt rm
The last argument to xargs should be the command you want it to run on all of the items in the file.
The solution proposed by Lynch is also valid and equivalent to this one.
Try this command:
xargs rm < file.txt
xargs takes every line of its input and appends it to the command you specify.
so if file.txt contains:
a
b
then xargs will execute rm a b
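A scratch-directory sketch of this (file names taken from the question) shows the whole round trip:

```shell
tmp=$(mktemp -d)
cd "$tmp"
touch 1 2 3 4                       # the files to delete
printf '%s\n' 1 2 3 4 > file.txt    # their names, one per line

xargs rm < file.txt                 # effectively runs: rm 1 2 3 4

remaining=$(ls)
echo "$remaining"
cd / && rm -rf "$tmp"
```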
Unless file.txt is really large, xargs is unnecessary and this is equivalent:
rm $(cat file.txt)
which is also portable (POSIX); in bash or ksh you can shorten it to rm $(<file.txt).