Unix: Batch move files from multiple subdirectories up 1 level - macOS

How can I move files from several subdirectories up 1 level in one terminal command?
File Structure:
path/to/files/A/remove/image.png
path/to/files/B/remove/image.png
path/to/files/C/remove/image.png
path/to/files/D/remove/image.png
path/to/files/E/remove/image.png
Desired Structure:
path/to/files/A/image.png
path/to/files/B/image.png
path/to/files/C/image.png
path/to/files/D/image.png
path/to/files/E/image.png
There are A LOT of directories, and each "letter" directory above includes several images. I would also like to delete the directory the files were moved from.

I had the same requirement (on a Mac), but in my case the subdirectories did not all share the same name (unlike "remove" in the original question), making it a bit more complicated. This is what worked for me:
from the "path/to/files" folder I ran
find . -mindepth 3 -maxdepth 3 -type f | awk -v sq="'" -F "/" '{print "mv -i " sq $0 sq " " sq "./" $2 sq}' > moveup.sh
and then
sh moveup.sh
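A roughly equivalent sketch that skips the intermediate moveup.sh is a null-delimited while-read loop; this assumes the same three-level layout and is run from the same folder:
find . -mindepth 3 -maxdepth 3 -type f -print0 |
while IFS= read -r -d '' f; do
    top=${f#./}       # e.g. A/remove/image.png
    top=${top%%/*}    # e.g. A
    mv -i "$f" "./$top/"
done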

There are many ways to do it.
This moves all files to their grandparent directory (the dirname calls have to run per file inside -exec; a plain $(dirname $(dirname {})) would be expanded by the shell once, before find even runs):
$ find path/to/files -type f -exec sh -c 'mv "$1" "$(dirname "$(dirname "$1")")"' _ {} \;
You can add a -name '*.png' (or similar) test, in addition to the -type f option, in order to be more specific.
This removes empty directories:
$ find . -type d -empty -exec rmdir {} \;
(although it generates benign errors, which I guess is because the directory structure has been altered while find is still working).
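If the benign errors bother you, adding -depth makes find visit a directory's contents before the directory itself, so it never tries to descend into something it has already removed; as a bonus, nested empty directories should get cleaned up in one pass:
$ find . -depth -type d -empty -exec rmdir {} \;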

I found the solution to my question...
Using the example directories in my original question, I used this command:
for a in $(find path/to/files -mindepth 1 -maxdepth 1 -type d); do mv "$a"/remove/* "$a"/; rmdir "$a"/remove; done
"remove" is the name of the subdirectory I wanted to move the files out of within each directory.
The answer was actually found here:
https://serverfault.com/questions/405146/how-to-mass-move-files-one-directory-up
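For directory names that might contain spaces, a null-delimited loop is a bit more robust than the for-loop above; this is only a sketch of the same idea (not from the linked answer), assuming the same "remove" subdirectory name:
find path/to/files -mindepth 1 -maxdepth 1 -type d -print0 |
while IFS= read -r -d '' a; do
    mv "$a"/remove/* "$a"/ && rmdir "$a"/remove
done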

Related

Find index.html files & Rename with folder name and move file a level up

I need a little help solving a workflow with a shell script using the find command.
First, find all index.html files in every folder. We can use the find command for that:
find ./ -type f -name 'index.html'
Next, rename each index.html file with its folder name. After renaming the files, I want to move them one level up.
I'm stuck at renaming and moving the files one level up.
As I have more than 100k files, xargs will be handy for this.
Here is the code I have so far:
find ./ -type f -name 'index.html' | xargs -P 4
Any help in renaming the index.html files and moving them one level up?
Instead of xargs you can use find -exec. Inside it, you can run a small sh script:
find . -mindepth 2 -type f -name 'index.html' -exec sh -c '
d="$(dirname "$1")";
mv "$1" "$d/../$(basename "$d").html";
rmdir "$d";
' find-sh {} \;
I received help from the Ask Ubuntu forum for this :)
Reference : https://askubuntu.com/questions/1236564/find-index-html-files-rename-with-folder-name-and-move-file-a-level-up
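Since the question specifically mentions xargs -P for the 100k files, the same per-file script can also be driven by xargs; this is just a sketch of that variant, not something from the linked answer:
find . -mindepth 2 -type f -name 'index.html' -print0 |
xargs -0 -P 4 -I{} sh -c '
d="$(dirname "$1")";
mv "$1" "$d/../$(basename "$d").html";
rmdir "$d";
' find-sh {}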

copying files from subfolders via grep shell script

I want to write a shell script to do the following:
I have a folder with many subfolders. Each of these subfolders has a *.gz file and some other files which I don't need. I want to move all .gz files into a new subfolder called needed_files (I have already created this subfolder). So I did the following:
I went to the parent folder with all the subfolders and ran cp */*.gz > needed_files/., but this did not work. Can you suggest what I should be doing?
grep is irrelevant here. Use find:
find . ! \( -type d -name needed_files -prune \) -type f -name '*.gz' \
-exec echo mv -t needed_files {} +
POSIX equivalent of that -exec is
-exec sh -c 'echo mv "$@" needed_files' _ {} +
If its output looks good, remove echo.
Btw, I noticed that the title says copy but you also say "I want to move", so decide on what exactly you want to do and let me know so I can edit my answer.
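And if copying really is what's wanted (as the title says), the same find works with cp; a sketch, again with echo as a dry run to remove once the output looks right:
find . ! \( -type d -name needed_files -prune \) -type f -name '*.gz' \
    -exec sh -c 'echo cp "$@" needed_files' _ {} +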

shell script to delete Files and sub directories from a directory in linux

I want to delete all files and subdirectories in a directory except for the specified files. I am currently using these commands to delete files and directories:
find . ! -name file.txt -type d -exec rm -r {} + #this is for sub directories
find . ! -name file.txt -type f -exec rm -f {} + #this is for files
These commands only delete all the files and subdirectories when I run them twice, but I want to delete everything except the one file in a single pass. Any help is highly appreciated.
Regards
Jitendra
GNU find can directly delete files and directories:
find ! -name file.txt -delete
It will give error messages, because it cannot delete the directories up to file.txt, but everything else will still be deleted.
If you want to keep the files file1.txt, file2.txt, and file3.txt, chain the conditions like this:
find ! \( -name file1.txt -o -name file2.txt -o -name file3.txt \) -delete
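Either way, it's worth previewing what would go before committing: run the same expression with -print first, and only add -delete once the list looks right (the set of matches is the same, though -delete processes it depth-first):
find ! -name file.txt -print
find ! -name file.txt -delete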

A script that iterates over all files in folder

There is a script on a server that I need to run over all the files in a folder. To run this script over one file I use this shell script:
for input in /home/arashsa/duo-bokmaal/Bokmaal/DUO_BM_28042.txt ; do
name=$(basename "$input")
/corpora/bokm/tools/The-Oslo-Bergen-Tagger/./tag-lbk.sh "$input" > "/home/arashsa/duo-bokmaal-obt/$name"
done
I'm terrible at writing shell scripts, and have not managed to find out how to iterate over files. What I want is for the script to iterate over all files in a given folder that end with .txt but not those that end with _metadata.txt. So I'm thinking I would give it the folder path as an argument, make it iterate over all the files in that folder, and run the script on files ending with .txt and not _metadata.txt.
Use find and the exec option.
$ find /path/to/dir -exec <command here> \;
Each file or directory can be obtained by using {}.
Example usage: $ find . -exec echo {} \; will echo each file or directory name in the current directory, recursively. You can use some other options to further specify the desired files and directories you wish to handle; I will briefly explain some of them. Note that the echo is redundant, because find prints its results anyway, but I'll leave it there to illustrate the working of exec. That being said, the following commands yield the same result: $ find . -exec echo {} \; and $ find .
maxdepth and mindepth
Specifying maxdepth and mindepth allows you to control how deep down the directory structure find goes. maxdepth limits how many levels find will descend, and mindepth sets how many levels must be descended before a file or dir is selected.
Example usages:
(1) listing only elements from this dir, including . (= the current dir)
(2) listing only elements from the current dir, excluding .
(3) listing elements from the root dir down to two levels deep
(1)$ find . -maxdepth 1 -exec echo {} \;
(2)$ find . -mindepth 1 -maxdepth 1 -exec echo {} \;
# or, alternatively
(2)$ find . ! -path . -maxdepth 1 -exec echo {} \;
(3)$ find / -maxdepth 2 -exec echo {} \;
type
Specifying a type option allows you to filter files or directories only. Example usage:
(1) list all files in this dir
(2) call the shell script function func on every directory in the root dir.
(1)$ find . -maxdepth 1 -type f -exec echo {} \;
(2)$ find / -maxdepth 1 -type d -exec func {} \;
name & regex
The name option allows you to search for specific filenames; you can also look for files and dirs using a regex format.
Example usage: find all movies in a certain directory
$ find /path/to/dir -maxdepth 1 -regextype sed -regex ".*\.\(avi\|mp4\|mkv\)"
size
Another filter is the file size: -size n matches files of exactly n units, while +n matches larger and -n matches smaller files. Example usage:
(1) find all empty files in the current dir
(2) find all non-empty files in the current dir
(1)$ find . -maxdepth 1 -type f -size 0
(2)$ find . -maxdepth 1 -type f ! -size 0
Further examples
Move all files of this dir to a directory tmp present in .
$ find . -type f -maxdepth 1 -exec mv {} tmp \;
Convert all mkv files to mp4 files in a dir /path/to/dir and child directories
$ find /path/to/dir -maxdepth 2 -regextype sed -regex ".*\.mkv" -exec ffmpeg -i {} {}.mp4 \;
Convert all your jpeg files to png (don't do this, it will take very long to both find them and convert them).
$ find ~ -maxdepth 420 -regextype sed -regex '.*\.jpeg' -exec mogrify -format png {} \;
Note
The find command is a powerful tool, and it can prove fruitful to pipe its output to xargs. It's important to note that this method is superior to the following construction:
for file in $(ls)
do
    some commands
done
as the latter will handle files and directories containing spaces the wrong way.
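A sketch of a safe loop shape, for comparison (null-delimited, so names with spaces or newlines survive; some_command is just a placeholder for whatever you want to run per file):
find /path/to/dir -maxdepth 1 -type f -print0 |
while IFS= read -r -d '' file; do
    some_command "$file"   # placeholder: your per-file commands go here
done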
In bash:
shopt -s extglob
for input in /dir/goes/here/!(*_metadata).txt
do
...
done
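Put together with the loop body from the question, the whole thing might look like this sketch (the folder is passed as the first argument; the tagger path and output directory are copied from the question and are specific to that setup):
#!/bin/bash
shopt -s extglob
dir=$1    # folder given as the first argument
for input in "$dir"/!(*_metadata).txt
do
    name=$(basename "$input")
    /corpora/bokm/tools/The-Oslo-Bergen-Tagger/./tag-lbk.sh "$input" > "/home/arashsa/duo-bokmaal-obt/$name"
done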

How to copy files recursively, rename them but keep the same extension in Bash?

I have a folder with tens of thousands of files of different types. I'd like to copy them all to a new folder (Copy1) but also rename them all to $RANDOM while keeping the extension intact. I realize I can write a line specifying which extension to find and how to name it, but there has got to be a way to do it dynamically, because there are at least 100 file types and there may be more in the future.
I have the following so far:
find ./ -name '*.*' -type f -exec bash -c 'cp "$1" "${1/\/123_//_$RANDOM}"' -- {} \;
but that puts the random number after the extension, and it also puts them all in the same folder. I can't figure out how to do the following 2 things:
1 - Keep all paths intact, but in a new root folder (Copy1)
2 - How to have the name be $RANDOM.extension, instead of .extension.$RANDOM
PS - by $RANDOM I mean an actual randomly generated number. I am interested in keeping the folder structure, so we are dealing with a few hundred files at most per directory, but all directories/files need to be renamed to $RANDOM. Another way to look at what I need to do: copy all contents of Folder1 with all subdirectories and files to Folder2 (where Folder2 is a $RANDOM name), then rename all folders and files to random names but keep all extensions.
EDIT: OK, I figured out how to rename and keep the extension. But I have a problem where it's dumping all of the files into the root directory the script is run from. How do I keep them in their respective folders? The command I'm using is:
find ./ -name '*.*' -type f -exec bash -c 'mv "$1" $RANDOM.${1##*.}' -- {} \;
Thanks!
Change your command to:
PATH=/bin:/usr/bin find . -name '*.*' -type f -execdir bash -c 'mv "$1" $RANDOM.${1##*.}' -- {} \;
Or alternatively using uuids instead of random numbers:
PATH=/bin:/usr/bin find . -name '*.*' -type f -execdir bash -c 'mv "$1" $(uuidgen).${1##*.}' -- {} \;
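As elsewhere, replacing mv with echo mv first gives a cheap dry run of either variant (the PATH=/bin:/usr/bin prefix stays, presumably because GNU find refuses to run -execdir when $PATH contains the current or another relative directory):
PATH=/bin:/usr/bin find . -name '*.*' -type f -execdir bash -c 'echo mv "$1" $RANDOM.${1##*.}' -- {} \;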
Here's what I came up with:
i=1
random="whatever"
find . -name "*.*" -type f | while IFS= read -r f
do
    newbase=${f/*./$random$i.}    # added counter to the filename
    cp "$f" /Path/Name/"$newbase"
    ((i++))
done
I had to add a counter to random (i), otherwise, if the extensions are similar, your files would overwrite themselves when copied.
In your new folder, your files should look like this:
whatever1.txt
whatever2.txt
etc etc
I hope this is what you were looking for.
Here is the command that worked for me.
find . -name '*.pdf' -type f -exec bash -c 'echo "{}" && cp "$1" ./$RANDOM.${1##*.}' -- {} \;
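For the fuller original ask (mirror the tree under a new root such as Copy1, random basenames, original extensions), something along these lines might do; it is only a sketch, with Copy1 and the '*.*' filter taken from the question and everything else assumed:
src=.          # source root (assumed)
dest=Copy1     # destination root, from the question
find "$src" -path "./$dest" -prune -o -name '*.*' -type f -print0 |
while IFS= read -r -d '' f; do
    rel=${f#"$src"/}              # path relative to the source root
    ext=${f##*.}                  # extension after the last dot
    mkdir -p "$dest/$(dirname "$rel")"
    cp "$f" "$dest/$(dirname "$rel")/$RANDOM.$ext"   # note: $RANDOM can collide; uuidgen would be safer
done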
