Bash Script: find command getting stuck

I'm currently writing a bash script in which one part needs to look at a set of directory hierarchies and write out two text files: one listing all the directories and one listing all the files under a given directory.
As I understand the following should do the trick:
find $directory -type d >> alldirs.txt
where $directory is assigned different directory paths, since I'm supposed to check a number of them.
I have a for loop that iterates through my list of directories and uses the above command to complete the task. The command gets to a certain point and then gets stuck. When I investigated, it seemed to get stuck when it reached a directory that's empty, or when it started looking for directories that don't exist in the first place. Any ideas?
Is there something I'm missing? Did I misunderstand how this works? Is there a better alternative?

You haven't said where $directory gets its value. If it holds a path that doesn't exist, find will complain: "find: $directory: No such file or directory". If what you actually have is a directory name to search for rather than a starting path, pass it to -iname instead:
For example:
find . -iname "$directory" -type d >> alldirs.txt
Note: the above starts searching in the current directory, specified by the ".".
Change it to whatever directory you wish, e.g. /home/mys.celeste
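As a hedged aside, not part of the original answer: if the loop may feed find paths that do not exist, testing each one first avoids the error entirely. A minimal sketch of the loop the asker describes, with illustrative path names:
for directory in /path/one /path/two /path/three; do
    if [ ! -d "$directory" ]; then
        echo "skipping missing directory: $directory" >&2
        continue
    fi
    find "$directory" -type d >> alldirs.txt    # list of directories
    find "$directory" -type f >> allfiles.txt   # list of regular files
done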

I had a similar issue: find / -name blahblah got stuck somewhere.
While debugging I tried searching each top-level directory separately (/tmp, /var, /sbin, /usr and so on) and found that it was stuck on /media.
In /media I had a RHEL repo mounted. After unmounting it, find worked normally.
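A hedged addition, not part of the original answer: find's -xdev option keeps the search on a single filesystem, so it never descends into mount points like /media in the first place:
# Stay on the root filesystem; do not descend into other mounts such as /media
find / -xdev -name blahblah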

macOS find command behaving strangely

Example of the command as used in a bash script:
find '/Files' -type d -name temp* -depth -delete -print
This command should delete all folders whose names start with "temp" in the '/Files' folder and its subfolders ("temp0", "temp1", "temp2", etc.).
The script works as expected: the folders are found and properly deleted.
But sometimes, for some users, on some computers, the script does not work as expected, even though the folders and files are exactly the same.
The find command fails like this:
find /Files -type d -name tempta temptal -depth -delete -print
find: temptal: unknown primary or operator
I can't find out where "tempta" and "temptal" are coming from - I don't have files with those names anywhere in the folder. The temp* folders are present, but they are not deleted because of this error.
The only things that might be connected are two files named "AbcInstall.sh" and "AbcInstall.log" in the "AbcTemp" subfolder. So we have "ta" and "tal" plus "Temp". These elements are reminiscent of "tempta" and "temptal", but they make no real sense - it could be a coincidence.
How can the "find" command resolve into something like this!?
Sorry for the lack of a better explanation - this problem is really weird. The trouble is that I can't replicate the issue on my computer, so all I can do is experiment (so far without success).
Any hints or ideas are greatly appreciated.
Thx!
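A hedged observation, not part of the original thread: because temp* is unquoted, the shell expands it against the script's current working directory before find ever runs. If that directory happens to contain entries matching temp* (for instance, ones named "tempta" and "temptal"), find receives those names in place of the pattern, which yields exactly the "unknown primary or operator" error shown above. Quoting the pattern prevents the expansion:
find '/Files' -type d -name 'temp*' -depth -delete -print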

Is it not possible to use the CLI, bash in this case, to find a set of files, take that set, and move it, without having to write a library to do so?

I thought this would be more of a one-liner, to be honest.
Pretty simple in principle:
using find on Mac OS X, locate files that meet a criterion, in my case:
all files and directories in a directory older than a given age:
find ~/Downloads -mtime +1000s
That finds what I need. Then I run a conditional: if the dir exists, delete it; if not, create it:
mkdir -p ~/.Trash/safe-to-delete-these-old-files
This means I need to add -print0 to my find, as the files will have spaces. I want to move, not copy, but either way there is a source and a destination, and I am getting stuck:
https://gist.github.com/5c0tt/5a2c1fd39ae99d6fca05
Lines 26 and 27 seem to cause me issues; I am stuck.
Suggestions on everything from line 1 to the end are welcome. I am trying hard to do this with POSIX in mind, but I can't even get variables to work that way.
It seems BSD tools do not work exactly the same way as their GNU counterparts in what arguments they accept, which is why I am trying to be more POSIX, as I was told the script should then run anywhere.
Thank you.
I took a glance at your git link; a couple of remarks if I may (I'm still a noob, to be honest, so these may be irrelevant):
dir_to_clean="/Users/my_username/Downloads" should probably be dir_to_clean="/Users/$current_user/Downloads" unless you actually have a literal /Users/my_username/Downloads folder.
Instead of cd'ing into your user's directory, and since you have hardcoded the path to that directory, you could use pushd & popd instead in order to build a stack of directories.
To answer your question: to capture files with spaces in their names for removal, you could use something like:
find "$dir_to_clean" -not -name . -mtime +1000s -exec mv -- {} ~/.Trash/ \;
Altogether it could be something like this:
# Initialise variables: user, source to clean, destination
user=$(whoami)
src="/Users/$user/Downloads"
dest="$HOME/.Trash/safe_to_delete"
# Move to the directory to clean; not necessary, but if you really want to
pushd "$src"
# Check whether the destination exists; if not, create it
if [ ! -d "$dest" ]; then
    mkdir -p "$dest"
else
    echo "destination already exists"
fi
# Find all the files to move, then move them
find . -not -name . -mtime +1000s -exec mv -- {} "$dest/" \;
# Return to the previous working directory
popd
pushd pushes the $src directory onto the stack and changes into it. find lists all the files in the now-current directory .; -not -name . avoids trying to trash the . directory itself; -- tells mv to stop parsing command line options (in case a file/folder name starts with a dash, e.g. -myfile.txt); -exec mv moves each match to $dest. popd pops the still-current directory off the stack. See man find (search for /exec) for more details.
Note: it is also interesting to know that the difference in execution time between the -exec option and piping the results into xargs can often be quite dramatic. Also, if you are actually sure that those files are safe to delete, then delete them instead of moving them (-exec rm -r -- {} \;). With all that said, you were correct: it is a one-liner.
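For illustration, a sketch that is not part of the original answer: the xargs variant mentioned above batches many paths into a single command invocation, which is where the speed difference comes from. Combined with the delete-instead-of-move suggestion, it could look like this; -print0 and -0 keep names with spaces intact:
# Batch the matches into as few rm invocations as possible
find "$src" -not -name . -mtime +1000s -print0 | xargs -0 rm -r --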
Further reading :
http://www.softpanorama.org/Tools/Find/using_exec_option_and_xargs_in_find.shtml

Script to find files in subdirectories

I need a script that will find and fetch all files in all subdirectories (leaving them in the folder structure as it is now). I know how to find and print those files:
find . -name "something.extension"
The point is, those directories contain lots of files that were used before, and I don't want those; the script should only find files matching a path pattern like this:
xxx/trunk/xxx/src/main/resources
xxx is different every time, and after resources there are still some folders whose names differ depending on xxx.
Every top-level xxx folder contains a folder named 'tags' (at the same level as trunk) that stores previous releases of the module (every release has files with the name I am looking for, but I don't want the outdated files).
So I want to find all those files under the path pattern I specified and copy them to a new location, keeping the folder structure as it is right now.
I am using Windows and Cygwin.
Update
I combined the commands from the answer 'that other guy' posted below, and it works. Just to be clear, I have something like this:
find */trunk/*/src/main/resources -name "something.extension" -exec mkdir -p /absolute/target/path/{} \; -exec cp {} /absolute/target/path/{} \;
Thanks.
Instead of searching under the entire current directory (.), just search under the directories you care about:
find */trunk/*/src/main/resources -name "something.extension"
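A hedged addition: if GNU cp is available (Cygwin ships GNU coreutils), its --parents option recreates the source directory structure under the target, collapsing the two -exec steps from the combined command above into one:
find */trunk/*/src/main/resources -name "something.extension" -exec cp --parents {} /absolute/target/path \;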

Copying multiple files with the same name into the same folder with a terminal script

I have a lot of files with the same name, in a directory structure (simplified) like this:
../foo1/bar1/dir/file_1.ps
../foo1/bar2/dir/file_1.ps
../foo2/bar1/dir/file_1.ps
.... and many more
As it is extremely inefficient to view all of those ps files by going into each respective directory, I'd like to copy all of them into another directory, but include the names of the first two directories (which are the ones relevant to my purpose) in the file name.
I have previously tried this, but I cannot tell which file came from where, as they are all named consecutively:
#!/bin/bash -xv
cp -v --backup=numbered */*/dir/file* ../plots/
where ../plots is the folder I copy them to. However, the copies are now of the form file.ps.~x~ (x is a number), so I get rid of the ".ps.~*~" part and leave only the ps extension with:
rename 's/\.ps.~*~//g' *;
rename 's/\~/.ps/g' *;
Then, as the ps files sometimes have hundreds of points and take a long time to open, I just convert them to jpg:
for file in * ; do convert -density 150 -quality 70 "$file" "${file/.ps/}".jpg; done;
This is not really a working bash script, as I have to change the directory manually. I guess the best way would be to copy the files from the beginning with the names of the first two directories incorporated into the copied filenames.
How can I do this last part?
If you just have two levels of directories, you can use
for file in */*/*.ps
do
ln "$file" "${file//\//_}"
done
This goes over each ps file and hard-links it into the current directory, with the /s replaced by _. Use cp instead of ln if you intend to edit the files but don't want to modify the originals.
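For example, with one of the paths from the question:
file="foo1/bar1/dir/file_1.ps"
echo "${file//\//_}"    # prints foo1_bar1_dir_file_1.ps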
For arbitrary directory levels, you can use the bash-specific globstar option:
shopt -s globstar
for file in **/*.ps
do
ln "$file" "${file//\//_}"
done
But are you sure you need to copy them all into one directory? You might be able to open them all with yourreader */*/*.ps, which, depending on your reader, may let you browse through them one by one while still seeing the full path.
You should first run a find command and print the names, like
find . -name "file_1.ps" -print
Then iterate over each of them and do a string replacement of / with '-' (or any other character), like
${filename//\//-}
The general syntax is ${string/substring/replacement}; doubling the first slash, as above, replaces every occurrence instead of just the first. Then you can copy the file to the required directory. The complete script can be written as follows. I haven't tested it (I'm not on Linux at the moment), so you might need to tweak the code if you get any syntax errors ;)
for filename in $(find . -name "file_1.ps" -print)
do
    newFileName=${filename//\//-}
    cp "$filename" YourNewDirectory/"$newFileName"
done
You will need to place the script in the root directory you want to search, or change the find command to look at that particular directory if you are placing the script somewhere else.
References
string manipulation in bash
find man page

Bash script to find specific files in a hierarchy of files

I have a folder in which there are many, many folders, and in each of these I have lots and lots of files. I have no idea which folder each file might be located in. I will periodically receive a list of files that I need to copy to a predefined destination.
The script will run on a Unix machine.
So, my little script should:
read the received list
find all the files in the list
copy each file to a predefined destination via scp
Steps 1 and 3 I think I'll manage on my own, but how do I do step 2?
I was thinking about using "find" to locate each file and, when found, writing the location to a string array. Once all the files are found, I would loop through the array, running the "scp" command for each file location.
I think this should work, but I've never written a bash script before, so could anyone help me a little to get started? I just need a basic "find" command that finds a filename and returns the file's location.
find "$dir" -name "$name" -exec scp {} "$destination" \;
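Putting the three steps together, a minimal sketch; the list file name and the variable values are illustrative assumptions, not taken from the question:
#!/bin/sh
# Read a list of filenames (one per line) and scp every match to the destination.
dir=/path/to/search                   # root of the folder hierarchy (assumed)
destination=user@host:/remote/dir     # scp target (assumed)
while IFS= read -r name; do
    find "$dir" -name "$name" -exec scp {} "$destination" \;
done < filelist.txt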
