How to delete all files not in a set - bash

I have a plain text file with a list of file names. For example,
A.doc
E.doc
F.pdf
I would like to delete all files in the current directory except for those.
Can this be done in bash?

Let's say the list of files not to delete is goodfiles.txt. Then:
ls | grep -vx -f goodfiles.txt
This gives you the list of "other" files, the ones you want to delete. If you confirm those are the files you want to delete, then:
ls | grep -vx -f goodfiles.txt | xargs -d '\n' rm
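If you would rather confirm each file before it is removed, a variant like this should also work (a sketch assuming GNU xargs, since -d is a GNU extension; -p prompts before each rm and -n 1 passes one file per rm):
ls | grep -vx -f goodfiles.txt | xargs -d '\n' -p -n 1 rm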

Related

Batch rename all files in a directory to basename-sequentialnumber.extension

I have a directory containing .jpg files, currently named photo-1.jpg, photo-2.jpg etc. There are about 20,000 of these files, sequentially numbered.
Sometimes I delete some of these files, which creates gaps in the file naming convention.
Can you guys help me with a bash script that would sequentially rename all the files in the directory to eliminate the gaps? I have found many posts about renaming files and tried a bunch of things, but can't quite get exactly what I'm looking for.
For example:
photo-1.jpg
photo-2.jpg
photo-3.jpg
Delete photo-2.jpg
photo-1.jpg
photo-3.jpg
run script to sequentially rename all files
photo-1.jpg
photo-2.jpg
done
With find and sort.
First check the output of
find directory -type f -name '*.jpg' | sort -nk2 -t-
If the output is not what you expect, meaning the sort order is not correct, it might have something to do with your locale. Add LC_ALL=C before the sort.
find directory -type f -name '*.jpg' | LC_ALL=C sort -nk2 -t-
To record the output in a file as well, add | tee output.txt after the sort.
If it is needed, add LC_ALL=C before the sort in the code below too.
#!/bin/sh
counter=1
find directory -type f -name '*.jpg' |
  sort -nk2 -t- | while read -r file; do
    ext=${file##*[0-9]}
    filename=${file%-*}
    [ ! -e "$filename-$counter$ext" ] &&
      echo mv -v "$file" "$filename-$counter$ext"
    counter=$((counter+1))
  done # 2>&1 | tee log.txt
Change directory in the script to the actual name of the directory that contains the files you need to rename.
If your sort has the -V flag/option then that should work too.
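For example (assuming GNU sort, where -V means version sort):
find directory -type f -name '*.jpg' | sort -t- -k2 -V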
sort -nk2 -t-: the -n means sort numerically, -k2 means use the second field, and -t- means the delimiter/separator is a dash - (it can also be written as -t -). Caveat: if the directory name contains a dash - as well, the sort will fail; adjust the value of -k and it should work.
ext=${file##*[0-9]} is a parameter expansion that keeps only the .jpg extension.
filename=${file%-*} is also a parameter expansion; it keeps only the photo part, plus the directory name before it.
[ ! -e "$filename-$counter$ext" ] triggers the mv ONLY if the target file does not exist.
If you want a record or log, remove the comment marker # after the done.
Remove the echo once you have confirmed the output is correct, so the mv commands actually run.
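For example, if file is directory/photo-15.jpg, then ${file##*[0-9]} expands to .jpg and ${file%-*} expands to directory/photo, so the target name becomes directory/photo-$counter.jpg.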

Script for deleting files starting with texts listed in a file

I have a strings.txt file that contains a list of strings like this:
0100101
0100102
0100103
0100104
...
I need to create a script that deletes all the files in a directory whose names start with any of the strings contained in that file.
If there are files with these names in the directory, they should be deleted:
0100101.jpg
0100101-01.jpg
0100101A1.jpg
If there are files with these names in the directory, they should not be deleted:
40100101.jpg
570100102.jpg
340100104-02.jpg
Can you help me?
You can use a combination of cat and xargs to achieve this
cat strings.txt | xargs -t -I{} sh -c 'rm {}*'
More Info on xargs: http://man7.org/linux/man-pages/man1/xargs.1.html
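If any of the strings could contain spaces or shell metacharacters, passing them through sh -c is risky; a plain loop over the file is a safer sketch (it assumes the script runs in the directory holding the files):
#!/bin/bash
shopt -s nullglob                      # a prefix with no matches expands to nothing
while IFS= read -r prefix; do
    files=( "$prefix"* )               # files whose names start with this prefix
    [ ${#files[@]} -gt 0 ] && rm -- "${files[@]}"
done < strings.txt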

Bash script to separate files into directories, reverse sort and print in an HTML file works on some files but not others

Goal
Separate files into directories according to their filenames, run a Bash script that reverse sorts them and assembles the content into one file (I know steps to achieve this are already documented on Stack Overflow, but please keep reading...)
Problem
Scripts work on all files but two
State
Root directory
dos-18-1-18165-03-for-sql-server-2012---15-june-2018.html
dos-18-1-18165-03-for-sql-server-2016---15-june-2018.html
dos-18-1-18176-03-for-sql-server-2012---10-july-2018.html
dos-18-1-18197-01-for-sql-server-2012---23-july-2018.html
dos-18-1-18197-01-for-sql-server-2016---23-july-2018.html
dos-18-1-18232-01-for-sql-server-2012---21-august-2018.html
dos-18-1-18232-01-for-sql-server-2016---21-august-2018.html
dos-18-1-18240-01-for-sql-server-2012---5-september-2018.html
dos-18-1-18240-01-for-sql-server-2016---5-september-2018.html
dos-18-2-release-notes.html
dos-18-2-known-issues.html
Separate the files into directories according to their SQL Server version or name
ls | grep "^dos-18-1.*2012.*" | xargs -i cp {} dos181-2012
ls | grep "^dos-18-1.*2016.*" | xargs -i cp {} dos181-2016
ls | grep ".*notes.*" | xargs -i cp {} dos-18-2-release-notes
ls | grep ".*known.*" | xargs -i cp {} dos-18-2-known-issues
Result (success)
/dos181-2012:
dos-18-1-18165-03-for-sql-server-2012---15-june-2018.html
dos-18-1-18176-03-for-sql-server-2012---10-july-2018.html
dos-18-1-18197-01-for-sql-server-2012---23-july-2018.html
dos-18-1-18232-01-for-sql-server-2012---21-august-2018.html
dos-18-1-18240-01-for-sql-server-2012---5-september-2018.html
/dos181-2016:
dos-18-1-18165-03-for-sql-server-2016---15-june-2018.html
dos-18-1-18197-01-for-sql-server-2016---23-july-2018.html
dos-18-1-18232-01-for-sql-server-2016---21-august-2018.html
dos-18-1-18240-01-for-sql-server-2016---5-september-2018.html
/dos-18-2-known-issues
dos-18-2-known-issues.html
/dos-18-2-release-notes
dos-18-2-release-notes.html
Variables (all follow this pattern)
dos181-2012.sh
file="dos181-2012"
export
dos-18-2-known-issues
file="dos-18-2-known-issues"
export
Reverse sort and assemble (assumes /$file exists; after testing all lines of code I believe this is where the problem lies):
cat $( ls "$file"/* | sort -r ) > "$file"/"$file".html
Result (success and failure)
dos181-2012.html has the correct content in the correct order.
dos-18-2-known-issues.html is empty.
What I have tried
I tried to ignore the two files in the command:
cat $( ls "$file"/* -i (grep ".*known.*" ) | sort -r ) > "$file"/"$file".html
Result: The opposite occurs
dos181-2012.html is empty
dos-18-2-known-issues.html is not empty
I am completely baffled. Why do these scripts work on some files but not others? (I can share more information about the file contents if that will help, but the file contents are nearly identical.) Thank you for any insights.
First off, your question is quite incomplete. You start out well, showing the input files and directories, but then you talk about variables and $file without showing the code they come from. So I based my answer on the explanation in the first paragraph and on what I deduced from the rest of the question.
I did this:
#!/bin/bash
cp /etc/hosts dos-18-1-18165-03-for-sql-server-2012---15-june-2018.html
cp /etc/hosts dos-18-1-18165-03-for-sql-server-2016---15-june-2018.html
cp /etc/hosts dos-18-1-18176-03-for-sql-server-2012---10-july-2018.html
cp /etc/hosts dos-18-1-18197-01-for-sql-server-2012---23-july-2018.html
cp /etc/hosts dos-18-1-18197-01-for-sql-server-2016---23-july-2018.html
cp /etc/hosts dos-18-1-18232-01-for-sql-server-2012---21-august-2018.html
cp /etc/hosts dos-18-1-18232-01-for-sql-server-2016---21-august-2018.html
cp /etc/hosts dos-18-1-18240-01-for-sql-server-2012---5-september-2018.html
cp /etc/hosts dos-18-1-18240-01-for-sql-server-2016---5-september-2018.html
cp /etc/hosts dos-18-2-release-notes.html
cp /etc/hosts dos-18-2-known-issues.html
DIRS='dos181-2012 dos181-2016 dos-18-2-release-notes dos-18-2-known-issues'
for DIR in $DIRS
do
    if [ ! -d $DIR ]
    then
        mkdir $DIR
    fi
done
cp dos-18-1*2012* dos181-2012
cp dos-18-1*2016* dos181-2016
cp *notes* dos-18-2-release-notes
cp *known* dos-18-2-known-issues
for DIR in $DIRS
do
    /bin/ls -1r $DIR >$DIR.html
done
The cp commands are just to create the files with something in them.
You did not specify how the directory names were produced, so I went with the easy option and listed them in a variable ($DIRS). These could be built based on the filenames, but you did not mention that.
Then I created the directories (first for loop).
Then come the 4 cp commands. Your code is very complicated for something so basic: the shell does wildcard expansion for cp (and rm, mv, ls, ...), so there is no need for complex grep and xargs pipelines to copy files around.
Finally, the last for loop lists the files (ls), one per line (-1, strictly output formatting), with the sort order reversed (-r). The result of that ls is sent to a ".html" file with the same name as the directory.
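If the goal is to concatenate the file contents (as in the cat $(...) command from the question) rather than just list the names, a sketch along these lines should work; it assumes the filenames contain no whitespace (true for the names shown) and writes to a hypothetical $DIR.combined.html outside the directory so the output does not feed back into the listing:
for DIR in $DIRS
do
    ls -1r "$DIR"/*.html | xargs cat > "$DIR.combined.html"
done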

Bash - Create a for loop to find all values in directories

I am trying to find all the files that contain any of a set of values. The values are in another file I created.
The file with all the values in it is called value.txt.
The places I need to search are multiple directories containing .txt files.
I am trying something like this
find -name "*.txt" | xargs grep value.txt
I want to change this so it loops through all directories and lists all the values. I need to create a for loop to do this.
How about just:
grep -l -r -f value.txt base_directory
Explanation:
-l - Just print the names of the files where we found a match
-r - Recurse into subdirectories.
-f value.txt - Read patterns from value.txt
base_directory - Where to look for matching files.
If you want to search multiple directories that aren't hierarchically organized:
for dir in some_dir1 some_other_dir some_dir/in/some_dir
do
    grep -l -r -f value.txt "$dir"
done
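If you do want to keep a find-based pipeline like the one in the question, note that grep still needs -f so that value.txt is read as a list of patterns rather than treated as the pattern itself; something like this should work:
find . -name '*.txt' -exec grep -l -f value.txt {} +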

using xargs in an alias to remove svn unversioned files interactively?

I'm trying to create a bash alias to remove, interactively, all svn unversioned files. I've gotten as far as
svn st | grep '^\?'| sed 's/^? //'
to get a list of such files. Piping into xargs -p rm just gives me a single prompt for all the files, e.g.
rm fileA fileB fileC fileD ?...
whereas I want to confirm each file individually. On the command line, I can do
rm -i $(svn st | grep '^\?'| sed 's/^? //')
to get the desired behavior, but it doesn't work when I stick it in an alias or function.
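One approach that might give the per-file prompt is to have xargs run rm on one file at a time, so that -p asks once per file (a sketch assuming GNU xargs for -d):
svn st | grep '^\?' | sed 's/^? //' | xargs -d '\n' -p -n 1 rm
Wrapping the working command-line version in a shell function (the name unversioned_rm below is just an example) rather than an alias is another route, since the command substitution is then evaluated when you call it:
unversioned_rm() {
    rm -i $(svn st | grep '^\?' | sed 's/^? //')
}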
