How to find a specific file (txt) in a specific directory (-mmin -1)? - macOS

/usr/local/bin/growlnotify -m 'Looking for subtitles...'
found='find /Users -type d -mmin -1'
found1='find $found/*.txt'
if [ -d "$found1" ];
then
/usr/local/bin/growlnotify -m "Subtitles downloaded!"
else
/usr/local/bin/growlnotify -m "Could not download subtitles"
fi
I am trying to write a bash script that locates the folder into which an app downloaded subtitles, and informs the user via Growl whether they are present or not.
$found gives me a list of directories, but I do not know how to get to the one I want.
Please help =)
Sorry for my English

Thanks for the answers! This is what I used, and it seems to be working just fine:
for FILE in "$@"
do
if [ -e "${FILE%.*}.txt" ];
then
/usr/local/bin/growlnotify -a iNapi -m "Napisy zostały pobrane!"    # "Subtitles downloaded!"
else
/usr/local/bin/growlnotify -a iNapi -m "Nie udało się pobrać napisów."    # "Failed to download subtitles."
fi
done

Basically you have some errors in the script; besides those, I don't think it's the correct way to do it.
Anyway, first off, you should do:
found=`find /Users -type d`
(note the use of backticks ` for command substitution, not single quotes ')
That will store in $found a list of directories under /Users. The -mmin -1 parameter restricts the list to directories modified in the last minute; if that's what you want, just add it back.
After that, you need to loop over the results to look for txt files:
for d in $found; do
#here you do ll or find for .txt files, using $d as dir
done
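Filled in, that loop might look like the sketch below (note that the unquoted $found word-splits on whitespace, so paths containing spaces break; that is one reason this approach is fragile):

```shell
# assumes $found holds whitespace-separated directory paths
found=$(find /Users -type d -mmin -1)
for d in $found; do
    # list any .txt files directly inside $d
    find "$d" -maxdepth 1 -name '*.txt'
done
```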
That way isn't the best in my opinion; I think you can just do:
find /Users -name '*.txt'
and then see what you get. The find output prints the full path, including the directory where each txt file resides, which is what you are trying to do, but in a single step.
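Putting the one-step approach back into the original script, a sketch (assuming growlnotify lives at /usr/local/bin/growlnotify as in the question, and that the subtitles land somewhere under /Users within the last minute):

```shell
#!/bin/bash
/usr/local/bin/growlnotify -m 'Looking for subtitles...'
# -print -quit makes find stop at the first recent .txt file it sees
found=$(find /Users -type f -name '*.txt' -mmin -1 -print -quit 2>/dev/null)
if [ -n "$found" ]; then
    /usr/local/bin/growlnotify -m "Subtitles downloaded!"
else
    /usr/local/bin/growlnotify -m "Could not download subtitles"
fi
```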

Related

Bash: Go up and down directories

Dear stackoverflow community,
I am new to bash and I've got a problem regarding loops and directories (code below). I am working in the opensmile directory and want to look for .wav files in the subdirectory opensmile/test-audio/. If I change the pattern in the "for" section to test-audio/*.wav, it does find the .wav files, but then the main action no longer seems to have access to the necessary config file "IS10_paraling.conf". Within the main action, the directories have to be written as they are after "-C", i.e. relative paths without a leading "/".
My loop works if the wav files are inside the opensmile directory itself, but not if they are inside a sub-directory. I would like to look for files in the subdirectory test-audio while the main action still has access to the whole opensmile directory.
So basically: how do I go up and down directories within a for loop?
Thank you very much in advance!
This works
#! /bin/bash
cd /usr/local/opensmile/
for f in *.wav;
do
/usr/local/opensmile/build/progsrc/smilextract/SMILExtract -C config/is09-13/IS10_paraling.conf -I $f -D output/$f.csv ;
done
This does not work
#! /bin/bash
cd /usr/local/opensmile/
for f in test-audio/*.wav;
do
/usr/local/opensmile/build/progsrc/smilextract/SMILExtract -C config/is09-13/IS10_paraling.conf -I $f -D output/$f.csv ;
done
Saying "this does not work" doesn't tell us anything. What happens? Is there an error message?
Nevertheless, your question was "So basically: How do I go up and down directories within a for loop?"
If I'm tempted to go up and down directories within a loop, I'll do it in a subshell, so that I can be sure that the next time I enter the loop I'll be where I was originally. So I'll put all my commands in ( ).
#! /bin/bash
cd /usr/local/opensmile/
CONFIG=$PWD/config
OUTPUT=$PWD/output
for f in test-audio/*.wav;
do
(
cd test-audio
/usr/local/opensmile/build/progsrc/smilextract/SMILExtract -C "$CONFIG/is09-13/IS10_paraling.conf" -I "$(basename "$f")" -D "$OUTPUT/$(basename "$f").csv"
)
done
though why one would need to do it for this case, I can't fathom
Instead of using a for loop, could you use find for this? Something like:
find /usr/local/opensmile/test-audio -type f -name "*.wav" -exec sh -c '/usr/local/opensmile/build/progsrc/smilextract/SMILExtract -C config/is09-13/IS10_paraling.conf -I "$1" -D "output/$(basename "$1").csv"' _ {} \;
(bare $1 placeholders would never be substituted by find; wrapping the command in sh -c passes each match in as $1)

Returning True or False from find command

I need to check if a given file, with specific filename, exists in a directory (including sub-directories), and then perform an action if it exists, and another if not.
Is there a way to use find command, returning True or False, as a condition for an if statement? I'm struggling to get it to work, but I don't know if it's just a problem of syntax or if there is a better way to approach the problem.
My specific problem is that I have a folder with lots of pictures in it, altogether. And I have a second folder where part of the same pictures are organized by subject on subfolders.
So I want to delete, in the first folder, all the pictures that already are in the second folder.
What I'm trying to do is:
for file in "/first/folder"/*; do
filename=`basename $file`
if [find /second/folder -name $filename]
then
rm $file
fi
done
I don't know if it's just a problem of syntax or if there is a better way to approach the problem.
Yes, there are syntax errors in your program. But there is no need to run find over and over, once for each file. Here is a more efficient approach:
find /second/folder -type f -exec bash -c '
for fpath in "${@/#*\//"/first/folder/"}"; do
if [[ -f $fpath ]]; then
delete+=("$fpath")
fi
done
echo rm -- "${delete[@]}"' _ {} +
Drop echo if its output looks good.
Your if [ find ... ] block is only going to execute once, depending upon the overall exit status of find. find supports an option to execute a command on matches:
for file in "/first/folder"/* ; do
filename=$(basename "$file")
find /second/folder -name "$filename" -exec rm {} \;
done
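Careful, though: the -exec rm above deletes the copies in /second/folder, while the question asks to delete from the first folder. A sketch of that direction, testing find's output per file (find exits 0 even when nothing matches, so its exit status alone is not a usable test):

```shell
for file in /first/folder/*; do
    filename=$(basename "$file")
    # -print -quit makes find stop after the first match
    if [ -n "$(find /second/folder -name "$filename" -print -quit)" ]; then
        echo rm -- "$file"    # drop echo once the output looks right
    fi
done
```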

want to know when find command does not find anything

Is there a way to have the find command return a value when it does not find a match? Basically, I have an old backup, and I want to search for each of the files in it on my current computer. Here is a cheesy way I was going to do it:
first run the following from the home directory:
$ ls -lR * > AllFiles.txt;
This will build my listing of all of my files on the current computer.
Next run the following script for each file in the back up:
#! /bin/bash
if ! grep "$1" ~/AllFiles.txt; then
echo file $1 not found;
exit 1;
fi
Of course this is clunky, and it does not account for filename changes, but it's close enough. Alternatively, I'd like to do a find command for each of the backup files.
You can do this with standard GNU find, but beware: find exits 0 even when it finds no match; a non-zero status only signals an error (such as an unreadable directory). So instead of checking $? directly, have find print the first match and test whether the output is empty:
found=$(find . ! -readable -prune -o -name 'filenameToSearch.ext' -print -quit)
then check:
[ -n "$found" ]
which is true only when a match was found.
If I understood you correctly:
grep -r valueORpatternToSearchinTEXT $(find . -type f) | wc -l
This will, for every file in the working directory and its subdirectories, search for what you need, then count the matching lines; if nothing is found you will get 0 at the end. Remove the pipe and everything after it if you want to see what was found and where.
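Putting the pieces together for the original task, a sketch (the backup path is a placeholder; like the grep approach it matches by filename only, so renamed files are still missed):

```shell
#!/bin/bash
backup=/path/to/backup   # placeholder: set this to your backup root
# for each file in the backup, report it if no file with the same
# name exists anywhere under $HOME on the current machine
find "$backup" -type f -print0 | while IFS= read -r -d '' f; do
    name=$(basename "$f")
    if [ -z "$(find "$HOME" -name "$name" -print -quit 2>/dev/null)" ]; then
        echo "file $name not found"
    fi
done
```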

List directories not containing certain files?

I used this command to find all the directories containing .mp3 in the current directory, and filtered out only the directory names:
find . -iname "*.mp3" | sed -e 's!/[^/]*$!!' -e 's!^\./!!' | sort -u
I now want the opposite, but I found it a little harder. I can't just add a '!' to the find command, since that only excludes .mp3 files from the output; it doesn't find the directories that contain no .mp3 at all.
I googled this and searched on stackoverflow and unix.stackexchange.com.
I have tried these scripts so far, and they return these errors:
#!/bin/bash
find . -type d | while read dir
do
if [[! -f $dir/*.mp3 ]]
then
echo $dir
fi
done
/home/user/bin/try.sh: line 5: [[!: command not found
#!/bin/bash
find . -type d | while read dir
do
if [! -f $dir/*.mp3 ]
then
echo $dir
fi
done
/home/user/bin/try.sh: line 5: [!: command not found
#!/bin/bash
find . -type d | while read dir
do
if [[! -f "$dir/*.mp3" ]]
then
echo $dir
fi
done
/home/user/bin/try.sh: line 5: [!: command not found
I'm thinking it has to do with multiple arguments for the test command.
Since I'm testing all the directories the variable is going to change, and I use a wildcard for the filenames.
Any help is much appreciated. Thank You.
[ "$(echo $dir/*.mp3)" = "$dir/*.mp3" ]
should work.
Or simply add a space between '[' and '!'
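In bash specifically, globbing into an array avoids the word-splitting pitfalls of the echo comparison (a sketch for directories one level down; with no matches the pattern stays a literal string, so the -e test fails):

```shell
for dir in */; do
    files=("$dir"*.mp3)
    if [ ! -e "${files[0]}" ]; then
        echo "$dir"    # no .mp3 directly inside $dir
    fi
done
```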
A method that is probably significantly faster is to let find stop at the first match. Note that find exits 0 whether or not it matched anything, so test its output rather than its exit status:
if [ -n "$(find "$dir" -name '*.mp3' -print -quit)" ]; then
: # there are mp3 files in there
else
: # no mp3s
fi
Okay, I solved my own question by using a counter.
I don't know how efficient it is, but it works. I know it can be made better. Please feel free to critique.
find . -type d | while read dir
do
count=`ls -1 "$dir"/*.mp3 2>/dev/null | wc -l`
if [ "$count" -eq 0 ]
then
echo "$dir"
fi
done
This prints all directories not containing MP3s. It also shows sub-directories, thanks to the find command listing directories recursively.
I ran a script to automatically download cover art for my mp3 collection. It put a file called "cover.jpg" in the directory for each album for which it could retrieve the art. I needed to check for which albums the script had failed - i.e. which CDs (directories) did not contain a file called cover.jpg. This was my effort:
find . -maxdepth 1 -mindepth 1 -type d | while read dir; do [[ ! -f $dir/cover.jpg ]] && echo "$dir has no cover art"; done
The maxdepth 1 stops the find command from descending into a hidden directory which my WD My Cloud NAS server had created for each album, in which it placed a default generic disc image. (This got cleared during the next scan.)
Edit: cd to the MP3 directory and run it from there, or change the . in the command above to the path pointing to it.

deleting files in a given subdirectory

I have a few subdirectories in a given folder, in each of which a file d2.sh~ exists. I want to delete this file via the following shell script, which, rather than putting it in a .sh file, I wrote on the terminal, on one line. [Edit: it has been formatted properly here for clarity]
for i in `ls *`; do
if [ -d $i ]; then
cd $i
rm d2.sh~
cd ..
fi
done
This did not give me any errors, but it failed to delete d2.sh~ from the subdirectories. So I want to know: what mistake have I made above?
find /some/path -type f -name "d2.sh~" -delete
Your first mistake is trying to parse ls. See this link as to why.
Just use for i in *; do ....
If you need recursion then you need to look to find, or if you have Bash 4.x you can do:
shopt -s globstar; for i in **/d2.sh~; do rm "$i"; done
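A quick demonstration of why the original loop found nothing: when the arguments to ls are directories, ls prints their contents (with "name:" headers when there is more than one argument), never the directory names themselves, so [ -d $i ] is false for every word and the rm never runs:

```shell
mkdir -p demo/sub1 demo/sub2
touch demo/sub1/d2.sh~ demo/sub2/d2.sh~
cd demo
for i in `ls *`; do
    echo "word: $i"    # words like "sub1:" and "d2.sh~"; no directory names
done
```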
