Winter bashing with find - bash

Recently (that is, a few days ago, in winter) I wrote a simple script which packs some folders; the script is listed below:
#!/bin/bash
for DIR in `find -name "MY_NAME*" -type d`
do
    tar -zcvf "$DIR".tar.gz "$DIR" &
done
echo "Packing is done" > packing.txt
It works fine, except that it searches for MY_NAME* in every sub-directory of the folder where it runs.
Because the MY_NAME* folders contain lots of files and packing takes long hours, I want to limit the time lost: I want the find command to locate those MY_NAME* directories only within the folder where the script is running (without descending into sub-directories). Is that possible with the find command?

If you want it only in the folder you are in, don't use find. Try this:
for DIR in MY_NAME*/
do
    DIR="${DIR%/}"    # strip the trailing slash the glob leaves, so the archive is not created inside the directory
    tar -zcvf "$DIR".tar.gz "$DIR" &
done
echo "Packing is done" > packing.txt

It seems you want to use the -maxdepth option on the find command (note that GNU find expects this global option before tests such as -name):
find . -maxdepth 1 -name "MY_NAME*" -type d
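If the directory names might contain spaces, feeding that into the loop null-delimited is safer than a backtick substitution; a sketch:
find . -maxdepth 1 -type d -name "MY_NAME*" -print0 |
while IFS= read -r -d '' DIR; do
    tar -zcvf "$DIR".tar.gz "$DIR" &
done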

Related

Mv files contained in directories to directories/new path

I'm working with macOS Sierra.
I have ~1000+ directories with lots of files in them: Word, Excel and zipped documents, only one sub-level deep. Important: there are spaces in the file names and in the folder names.
We decided to change the tree structure of the files; all the files in each directory need to be moved to a subdirectory of it called "Word & Excel" before merging with another directory tree.
I managed to create the Word & Excel directory with this command :
for dir in */; do mkdir -- "$dir/Word & Excel"; done
Basically, I just want to do
for dir in */; do mv $dir/* "./Word & Excel"; done
It does not work. I do not even understand whether the problem is with $dir — I need the double quotes to avoid the space problem, but the asterisk will not expand if it is inside the double quotes... — or with the asterisk itself.
I tried to get a cleaner version by following a previous answer found on the web to a similar problem, filtering the subfolder out of the results (and basically trying to avoid my wildcard problem):
for dir in */; do mv `ls -A "$dir" | grep -v "Word & Excel"` ./"Word & Excel" | cd ../ ; done
I am completely stuck.
Any idea how to handle this?
This should do it, even on Mac OS X. And yes, find sometimes needs the anchor directory.
while IFS= read -r dir; do
    mkdir -p "$dir/Word & Excel"
    find "$dir" -maxdepth 1 -type f -exec mv {} "$dir/Word & Excel" \;
done < <(find . -mindepth 1 -maxdepth 1 -type d)
This loops over the sub-directories of the current directory (one sub-level only), for each of them (dir), creates the dir/Word & Excel sub-sub-directory if it does not already exist, finds all regular files immediately inside dir and moves them in the dir/Word & Excel. And it should work even with crazy directory and file names.
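A pure-glob variant of the same idea, under the same assumptions (one sub-level, names may contain spaces), avoids find entirely:
for dir in */; do
    mkdir -p "$dir/Word & Excel"
    for f in "$dir"*; do
        [ -f "$f" ] && mv "$f" "$dir/Word & Excel/"    # move regular files only, skipping the target dir
    done
done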
This being said, if you could convince your boss not to use unusual file or directory names, your life with bash and the command line interface (CLI) would probably be much easier.
Okay, I will use "subfolder" as my subfolder name.
First, create the subfolder within all the dirs:
for dir in $(find -type d | grep "/"); do mkdir "$dir/subfolder"; done
In each one of those, I created a file. In order to move all files within the dirs to the subfolder, I will do something like:
for dir in $(find -type d | grep -vE 'subfolder' | grep '/'); do for file in $(find "$dir" -type f); do mv "$file" "$dir/subfolder"; done; done
You might want to experiment with -exec in find, but just creating a nested loop was the fastest solution for me.
Let me break it down for you. Here, I try to find all the directories in my path, excluding the subfolder directory and the current one. I could have used -maxdepth 1 with find, but since I only had these dirs, it wasn't necessary.
for dir in $(find -type d | grep -vE 'subfolder' | grep '/')
Now, in each of those dirs, we try to find all the files (in your case, the zip files and whatnot).
do for file in $(find "$dir" -type f)
Now, we just move the found files into the directories from the first loop, with the name of the subfolder appended.
do mv "$file" "$dir/subfolder"; done; done
Keep in mind that since the first loop is closed at the very end, it will do the move operation for one directory at a time, and for all the files in only that directory. Nested loops can be a bit tricky to understand, especially when someone else writes them their own way, I know :(
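Both loops word-split command substitution, so they assume directory and file names without spaces. The -exec route mentioned above can do the whole job in one space-safe pass; a sketch, reusing the "subfolder" name:
find . -mindepth 1 -maxdepth 1 -type d ! -name subfolder -exec sh -c '
    mkdir -p "$1/subfolder"
    find "$1" -maxdepth 1 -type f -exec mv {} "$1/subfolder/" \;
' _ {} \;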

How to delete all files and sub-folders in a folder, except 2 folders, with a shell script

I would like to know how to delete all the contents of a folder (it contains other folders and some files) except for 2 folders and their contents.
The command below keeps the folder conf but removes all the other folders:
find . ! -name 'conf' -type d -exec rm -rf {} +
I have tried piping it like below:
find . -maxdepth 1 -type d ! -name 'conf' |find . -maxdepth 1 -type d ! -name 'foldername2'
but it didn't work.
Is it possible to do with a single command?
You haven't specified which shell you're using, but if you're using bash then extended globs can help:
printf '%s\n' !(conf|foldername2)
If you're happy with the list of files and directories produced by that, then pass the same glob to rm -rf:
rm -rf !(conf|foldername2)
Inside a script, you may need to enable extglob using shopt -s extglob. Later, you can change -s to -u to unset the option.
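For instance, a script might do:
shopt -s extglob               # enable extended globs
rm -rf !(conf|foldername2)
shopt -u extglob               # optionally switch them off again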
If you're using a different shell, then you can add some more options to your find command:
find . -mindepth 1 -maxdepth 1 ! -name 'conf' ! -name 'foldername2' -exec rm -rf {} +
Try it without the -exec part first to print the matches rather than deleting everything.
Maybe my little utility program can help you. I hope so.
First of all, you should find the path of your .sh files,
then you should find the main folder that contains those files,
then remove anything except those folders.
I wrote drr for such a purpose; it can do such a task quite easily.
drr stands for: remove or rename files based on regular expressions, in the D language. So you must compile it before using it.
Please be careful, since this is not an appropriate tool for a beginner.

"for" loop in shell that go over directories

I have a directory with 24 sub-directories, in no chronological order.
I need to enter a sub-directory, unzip a file there, and then call the "tophat" command on the unzipped file, then move on to the next sub-directory. The loop should go over all the sub-directories with these commands.
I don't really know how to create this loop (I need it to run on a display and not according to numeric order).
(For sure, many of you who work with RNA-seq results are familiar with this issue.)
If anyone can help me with it, I'll be very thankful.
for d in "/path/to/"*/
do
cd "$d" || continue
unzip the_file.zip
tophat the_file
done
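Note that this works because "/path/to/"*/ expands to absolute paths, so every cd lands in the right place. With a relative pattern, running each iteration in a subshell keeps the outer working directory intact (a sketch):
for d in */
do
    ( cd "$d" && unzip the_file.zip && tophat the_file )    # the subshell's cd does not leak out
done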
Use find:
find /path -type d -print |
while IFS= read -r path ; do
    ...
done
Note: this loop still breaks when file names contain newline characters; IFS= and the -r flag at least protect against leading whitespace and backslashes in the names.
Same thing using a for loop:
for directory in `find /path -type d -print`
do
    cd "$directory"
    unzip zip_filename
    sh zip_dir/script_file.sh &
done
(The backtick form word-splits find's output, so it breaks as soon as a directory name contains spaces.)
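A space- and newline-safe combination of the two approaches, restricted to the immediate sub-directories as in the question, and using the_file.zip as the placeholder name from the first answer:
while IFS= read -r -d '' d; do
    ( cd "$d" && unzip the_file.zip && tophat the_file )
done < <(find /path -mindepth 1 -maxdepth 1 -type d -print0)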

Linux Find and execute

I need to write a Linux script that does the following:
Find all the .ear and .war files in a directory called ftpuser, even the new ones that are going to appear there, and then execute a command that produces some reports. When the command finishes, those files need to be moved to another directory.
Below, I think, is how the command starts. My question is: does anyone know how to pick up the new entries in the directory and then execute the command, so I can get the report?
find /directory -type f -name "*.ear" -or -type f -name "*.war"
It seems that you'd want the script to run indefinitely. Loop over the files that find reports in order to perform the desired operations:
while : ; do
    find /directory -type f -name "*.[ew]ar" -print0 |
    while IFS= read -r -d '' file; do
        some_command "$file"           # execute some command for the file
        mv "$file" /target/directory/  # move the file to some directory
    done
    sleep 60                           # maybe sleep for a while before searching again!
done
This might also help: Monitor Directory for Changes
If it is not time-critical, but you are not willing to start the script (like the one suggested by devnull) manually after each reboot or something, I suggest using a cron job.
You can set up a job with
crontab -e
and append a line like this:
* * * * * /usr/bin/find /some/path/ftpuser/ -type f -name "*.[ew]ar" -exec sh -c '/path/to/your_command "$1" && mv "$1" /target_dir/' _ {} \;
This runs the search every minute. You can change the interval, see https://en.wikipedia.org/wiki/Cron for an overview.
The && causes the move to be executed only if your_command succeeded. You can check the exit status by running the command manually, followed by
echo $?
0 means true or success. For more information, see http://tldp.org/LDP/abs/html/exit-status.html
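If the files should be handled as soon as they appear rather than on a schedule, and the inotify-tools package happens to be available (an assumption; your_command and /target_dir/ are the placeholders from above), a watcher along these lines could work:
inotifywait -m -e close_write --format '%w%f' /some/path/ftpuser/ |
while IFS= read -r file; do
    case "$file" in
        *.ear|*.war) /path/to/your_command "$file" && mv "$file" /target_dir/ ;;    # same report-then-move step
    esac
done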

BASH Simple - Run for all folders

I want to run a bash script, which I have written, in all the subfolders of the directory where it is called.
The script is simply named "all", and when I run it individually in each folder, it runs with no problems.
But when I run it with the code below, it doesn't work (it runs, but gives errors like "files not found"):
for D in *; do
    all
done
I found the mistake!!! I need to enter each folder in order to run the script "all". How do I do that?
Thanks!
You do not need to run ls or find to get the list of the files in a folder. There is a pure bash solution for this. You may try something like:
for D in *; do
    [ -d "$D" ] && ( cd "$D" && all )    # subshell: enter the folder, run "all", return automatically
done
This works well even if a directory name contains spaces. Parsing the output of ls or find fails in that case (if a directory is named a b, then D will be a and then b). Plain find also returns the . directory; you can avoid this with find -mindepth 1 -maxdepth 1 -type d. find does have one advantage: it discovers hidden directories as well (.somedir). With bash this can be forced using .*, but then the . and .. entries have to be skipped.
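As that last remark hints, hidden directories can also be handled in pure bash; a sketch using dotglob, which never matches . or ..:
shopt -s dotglob nullglob    # '*' now also matches hidden names; empty matches expand to nothing
for D in */; do
    ( cd "$D" && all )
done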
You can use find
for D in `find . -type d`; do
    "$D"/all
done
This will find every subfolder recursively, and it will not follow symlinks (that is find's default behaviour).
You can limit the recursion depth with the -maxdepth parameter:
for D in `find . -maxdepth 1 -type d`; do
    "$D"/all
done
This will only take the subfolders of the current working directory.
EDIT by future me: Don't use that!
Better way: Globbing
for i in *
do
    [[ -d $i ]] && ./"$i"/all
done
What about:
for D in `ls`; do
    $D/all
done
EDIT:
If you need to enter each folder:
for D in `ls`; do
    cd $D
    ./all
    cd ..
done
EDIT (spaces + directories only):
for D in */; do
    cd "$D" || continue
    ./all
    cd ..
done
