Move files from several folders to subfolders - macOS

I have many folders with files in them, and in each of them I have a subfolder called wvfm. What I want to do is move all the files in each folder into its wvfm subfolder.
I tried the line below, but it didn't work quite right:
for i in "./*"; mv $i "./wvfm"; done

Use a for loop (for i in */; do) to iterate over the directories, then cd into each one and save its file listing in a variable (F=$(ls)).
Then move all your files, excluding the wvfm folder itself (mv ${F/wvfm/} wvfm), like this:
#!/bin/bash
subdirectory="wvfm"
for i in */; do
    cd "$i" || continue
    F=$(ls)                               # everything in this folder
    mv ${F/$subdirectory/} $subdirectory  # unquoted on purpose: word splitting supplies the file list
    cd ..
done
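Note that the ls-based listing breaks on names containing whitespace; a minimal whitespace-safe sketch (assuming, as in the question, that every folder already contains wvfm):
#!/bin/bash
for d in */; do
    [ -d "$d/wvfm" ] || continue    # skip folders without the subfolder
    find "$d" -maxdepth 1 -type f -exec mv -- {} "$d/wvfm/" \;
done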

find * -maxdepth 0 -type d | xargs -n 1 -I {} mv {}/* {}/wvfm
should do the trick; quick and dirty. Not a macOS user, but it works in bash.
Explanation:
find * -maxdepth 0 -type d finds all directories at depth 0 (i.e. it does not descend into them),
piped to xargs, using the options -n 1, operate on one value at a time, and -I replace-str, string replacement (see man xargs).
action command: mv {}/* {}/wvfm substitutes to mv dirA/* dirA/wvfm for each dir match.
You will get an "error", mv: cannot move 'dirA/wvfm' to a subdirectory of itself, 'dirA/wvfm/wvfm', but you can ignore / take advantage of it (quick and dirty).
You could cover all bases with:
for entry in $(find * -maxdepth 0 -type d); do
    (cd "$entry" || exit
     [[ -d wvfm ]] && \
        find * -maxdepth 0 ! -name wvfm -exec mv {} wvfm \;
    )
done
find * -maxdepth 0 -type d, again, finds only the top-level directories;
in a subshell, change to the input directory; if there's a directory wvfm,
look for all contents except the wvfm directory itself and -exec the mv command;
exiting the subshell leaves you back in the starting (root) directory, ready for the next input.
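If the directory names may contain spaces, a hedged null-delimited variant (both GNU and macOS/BSD find and xargs support -print0/-0):
find . -mindepth 1 -maxdepth 1 -type d -print0 |
    xargs -0 -n 1 sh -c 'mv -- "$1"/* "$1/wvfm" 2>/dev/null' sh
The trailing sh fills $0 of the inline script, so each directory arrives as "$1"; 2>/dev/null hides the same wvfm-into-itself complaint as above.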

Related

Sorting loop function creates infinite subdirectories

I routinely have to sort large amounts of images in multiple folders into two file types, ".JPG" and ".CR2". I'm fairly new to bash, but I have created a basic script that will sort through one individual folder successfully and divide these file types into distinct folders.
I'm having problems scaling this up to automatically loop through subdirectories. My current script creates an infinite loop of new subfolders until terminal times out.
How can I use the loop function without having it cycle through new folders?
function f {
    cd "$1"
    count=`ls -1 *.JPG 2>/dev/null | wc -l`
    if [ $count != 0 ]; then
        echo true
        mkdir JPG; mv *.JPG jpg
    else
        echo false
    fi
    count=`ls -1 *.CR2 2>/dev/null | wc -l`
    if [ $count != 0 ]; then
        echo true
        mkdir RAW; mv *.CR2 raw;
    else
        echo false
    fi
    for d in * .[!.]* ..?*; do
        cd "$1"
        test -d "$1/$d" && f "$1/$d"
    done
}
f "`pwd`"
I still advise people to use find instead of globbing with * in scripts. The * may not always work reliably; it can fail and confuse.
First we create directories to move to:
mkdir -p ./jpg ./cr2
Note that -p in mkdir will make mkdir not fail in case the directory already exists.
Use find. Find all files named *.JPG and move each file to jpg:
find . -maxdepth 1 -mindepth 1 -name '*.JPG' -exec mv {} ./jpg \;
# and similarly for the raw files
find . -maxdepth 1 -mindepth 1 -name '*.CR2' -exec mv {} ./cr2 \;
The -maxdepth 1 -mindepth 1 options are there so that find does not scan the directory recursively, which is the default; you can remove them if you do want recursion. If you want, you can also add -type f to match regular files only.
Notes to your script:
Don't parse ls output
You can use find . -mindepth 1 -maxdepth 1 -name '*.jpg' -printf . | wc -c to get the number of matching files in a directory instead (GNU find: -printf . prints one dot per match, and wc -c counts the dots, which is safe even for filenames containing newlines).
for d in * .[!.]* ..?*; do - I have a vague idea of what this is supposed to do: some kind of recursive scan of the directory. But if the directory JPG is inside $(pwd), then you will scan infinitely into yourself and move the files into yourself, and so on. If the destination folder is outside the current directory, just modify the find commands by removing -maxdepth 1; they will then scan recursively.
Don't use backticks, they are less readable and are deprecated. Use $( .. ) instead.
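Putting these notes together, a hedged sketch of a recursive sorter that cannot descend into its own output folders (assumes bash plus a find with -print0, and per-folder jpg/ and raw/ destinations as in the question):
#!/bin/bash
# prune the destination directories so find never walks into them
find . \( -type d \( -name jpg -o -name raw \) -prune \) -o \
       -type f \( -name '*.JPG' -o -name '*.CR2' \) -print0 |
while IFS= read -r -d '' f; do
    dir=$(dirname "$f")
    case "$f" in
        *.JPG) dest=$dir/jpg ;;
        *.CR2) dest=$dir/raw ;;
    esac
    mkdir -p "$dest" && mv -- "$f" "$dest/"
done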

"find -mtime +5 | xargs rm -rf" destroying whole directories, even when files newer than 5 days exist

I have a series of folders, subfolders and files like this :
year_folder
month1_folder
day1_folder
filea, fileb
day2_folder
filea
month2_folder
I want to delete the folders and files older than X days.
I have tried
find /c/Documents/year_folder -mtime +5 | xargs rm -rf
This command line works perfectly on my test folders (locally on my computer).
But when I run the script on the Synology NAS, somehow it deletes the whole year_folder.
Unfortunately I do not know how to test my script on the Synology server to understand what I am doing wrong.
Using GNU Extensions
Split this into two pieces:
Delete files (only!) older than five days
# for GNU find; see below for POSIX
find "$root" -type f -mtime +5 -delete
Delete empty directories
# for GNU find; see below for POSIX
find "$root" -depth -type d -empty -delete
When you use rm -rf, you're deleting the entire directory when the directory itself hasn't been updated in five days. However, if you create or modify a/b/c, that doesn't update the modification time of a (or, in the case of modifications that don't require the directory itself to be updated, not even that of a/b) -- thus, your "modification time older than five days" rule is destructive when you apply it recursively.
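A quick demonstration of that mtime behavior (hypothetical paths; the -d option needs GNU touch):
mkdir -p a/b
touch -d '10 days ago' a        # backdate the top-level directory
touch a/b/c                     # create a file two levels down
find a -maxdepth 0 -mtime +5    # still prints "a" -- an rm -rf here would take a/b/c with it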
The only caveat to the above is that it may not delete multiple layers of empty directories at a run -- that is, if a/b/c is empty, and a/b is empty other than c, then only c may be deleted on the first run, and it may require another invocation before a/b is removed as well.
Supporting Baseline POSIX
POSIX find doesn't support -delete. Thus, the first command becomes:
find "$root" -type f -mtime +5 -exec rm -rf -- {} +
Similarly, it doesn't support -empty. Because rmdir will fail when passed a non-empty directory, however, it's easy enough to just let those instances referring to non-empty directories fail:
find "$root" -depth -type d -exec rmdir -- {} +
If you aren't comfortable doing that, then things get stickier. An implementation that uses a shell to test whether each directory is empty may look like:
find "$root" -depth -type d -exec sh -c '
rmdir_if_empty() {
dir=$1
set -- "$dir"/* # replace argument list w/ glob result
[ "$#" -gt 1 ] && return # globbed to multiple results: nonempty
{ [ -e "$1" ] || [ -L "$1" ]; } && return # globbed to one result that exists: nonempty
rmdir -- "$dir" # neither of the above: empty, so delete.
}
for arg; do
rmdir_if_empty "$arg"
done
' _ {} +
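For the asker's layout, the two simple POSIX-friendly commands would be invoked along these lines (a sketch; the path is taken from the question):
root=/c/Documents/year_folder
find "$root" -type f -mtime +5 -exec rm -rf -- {} +
find "$root" -depth -type d -exec rmdir -- {} + 2>/dev/null
The 2>/dev/null merely hides the expected rmdir complaints about directories that are not empty, year_folder itself included.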

How to move files en-masse while skipping a few files and directories

I'm trying to write a shell script that moves all files except for the ones that end with .sh and .py. I also don't want to move directories.
This is what I've got so far:
cd FILES/user/folder
shopt -s extglob
mv !(*.sh|*.py) MoveFolder/ 2>/dev/null
shopt -u extglob
This moves all files except the ones that contain .sh or .py, but all directories are moved into MoveFolder as well.
I guess I could rename the folders, but other scripts already rely on those folder names for their work, so renaming might give me more trouble. I could also list the folder names to exclude, but whenever someone else creates a folder, I would have to add its name to the script or it would be moved as well.
How can I improve this script to skip all folders?
Use find for this:
find -maxdepth 1 \! -type d \! -name "*.py" \! -name "*.sh" -exec mv -t MoveFolder {} +
What it does:
find: find things...
-maxdepth 1: that are in the current directory...
\! -type d: and that are not a directory...
\! -name "*.py": and whose name does not end with .py...
\! -name "*.sh": and whose name does not end with .sh...
-exec mv -t MoveFolder {} +: and move them to directory MoveFolder
The -exec flag is special: contrary to the prior flags, which were conditions, this one is an action. The + that ends the command directs find to aggregate each matched file name at the end of the command, at the place marked with {}. When all the files are found, find executes the resulting command (i.e. mv -t MoveFolder file1 file2 ... fileN).
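Note that mv -t is a GNU coreutils extension; on macOS/BSD, whose mv lacks -t, a hedged equivalent is:
find . -maxdepth 1 \! -type d \! -name "*.py" \! -name "*.sh" -exec sh -c 'mv -- "$@" MoveFolder/' sh {} +
Here sh receives the aggregated file names as "$@" and hands them to a single mv with the destination last.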
You'll have to check every element to see if it is a directory or not, as well as its extension:
for f in FILES/user/folder/*
do
    extension="${f##*.}"
    if [ ! -d "$f" ] && [[ ! "$extension" =~ ^(sh|py)$ ]]; then
        mv "$f" MoveFolder
    fi
done
Otherwise, you can also use find -type f with -maxdepth and a regexp; a sketch follows the references below.
Regexp for the file name based on Check if a string matches a regex in Bash script, extension extracted through the solution to Extract filename and extension in Bash.
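That find-plus-regex route could look like this (a hedged sketch; -regex and mv -t are GNU extensions):
find . -maxdepth 1 -type f ! -regex '.*\.\(sh\|py\)' -exec mv -t MoveFolder {} +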

A script that iterates over all files in folder

There is a script on a server that I need to run over all the files in a folder. To run this script over one file I use this shell script:
for input in /home/arashsa/duo-bokmaal/Bokmaal/DUO_BM_28042.txt; do
    name=$(basename "$input")
    /corpora/bokm/tools/The-Oslo-Bergen-Tagger/./tag-lbk.sh "$input" > "/home/arashsa/duo-bokmaal-obt/$name"
done
I'm terrible at writing shell scripts and have not managed to find out how to iterate over files. What I want is to make the script iterate over all files in a given folder that end with .txt, but not those that end with _metadata.txt. So I'm thinking I would give it the folder path as an argument, make it iterate over all the files in that folder, and run the script on files ending with .txt but not _metadata.txt.
Use find and the exec option.
$ find /path/to/dir -exec <command here> \;
Each file or directory can be obtained by using {}.
Example usage: $ find . -exec echo {} \; echoes each file or directory name in the current directory, recursively. You can use some other options to further specify the desired files and directories you wish to handle; I will briefly explain some of them below. Note that the echo is redundant, because find prints its matches by default; I leave it in to illustrate the working of exec. That being said, the following commands yield the same result: $ find . -exec echo {} \; and $ find .
maxdepth and mindepth
Specifying maxdepth and mindepth allows you to control how deep down the directory structure find goes. -maxdepth limits how many directory levels find will descend, and -mindepth sets how many levels must be descended before a file or dir can be selected.
Example usages:
(1) listing only elements from this dir, including . (= current dir).
(2) listing only elements from current dir excluding .
(3) listing elements from root dir and all dirs in this dir
(1)$ find . -maxdepth 1 -exec echo {} \;
(2)$ find . -mindepth 1 -maxdepth 1 -exec echo {} \;
# or, alternatively
(2)$ find . ! -path . -maxdepth 1 -exec echo {} \;
(3)$ find / -maxdepth 2 -exec echo {} \;
type
Specifying a type option allows you to filter for files or directories only. Example usage:
(1) list all files in this dir
(2) call the shell function func on every directory in the root dir.
(1)$ find . -maxdepth 1 -type f -exec echo {} \;
(2)$ find / -maxdepth 1 -type d -exec func {} \;
name & regex
The name option allows you to search for specific filenames, you can also look for files and dirs using a regex format.
Example usage: find all movies in a certain directory
$ find /path/to/dir -maxdepth 1 -regextype sed -regex ".*\.\(avi\|mp4\|mkv\)"
size
Another filter is the file size: -size matches an exact size, and a leading + or - matches anything greater or smaller. Example usage:
(1) find all empty files in current dir.
(2) find all non empty files in current dir.
(1)$ find . -maxdepth 1 -type f -size 0
(2)$ find . -maxdepth 1 -type f ! -size 0
Further examples
Move all files of this dir to a directory tmp present in .
$ find . -type f -maxdepth 1 -exec mv {} tmp \;
Convert all mkv files to mp4 files in a dir /path/to/dir and child directories
$ find /path/to/dir -maxdepth 2 -regextype sed -regex ".*\.mkv" -exec ffmpeg -i {} {}.mp4 \;
Convert all your jpeg files to png (don't do this, it will take very long to both find them and convert them).
$ find ~ -maxdepth 420 -regextype sed -regex '.*\.jpeg' -exec mogrify -format png {} \;
Note
The find command is a strong tool, and it can prove fruitful to pipe its output to xargs. It's important to note that this method is superior to the following construction:
for file in $(ls)
do
    some commands
done
as the latter will handle files and directories containing spaces the wrong way.
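The safer find-based shape of that loop (a sketch; some_command is a hypothetical placeholder for "some commands"):
find . -maxdepth 1 -type f -exec sh -c '
    for file do
        some_command "$file"    # hypothetical placeholder
    done' sh {} +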
In bash:
shopt -s extglob
for input in /dir/goes/here/!(*_metadata).txt    # !(*_metadata) excludes names ending in _metadata
do
...
done
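Applied to the loop from the question, that would look something like this (a sketch; here the folder is passed as the script's first argument):
#!/bin/bash
shopt -s extglob
for input in "$1"/!(*_metadata).txt; do
    name=$(basename "$input")
    /corpora/bokm/tools/The-Oslo-Bergen-Tagger/tag-lbk.sh "$input" > "/home/arashsa/duo-bokmaal-obt/$name"
done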

Unix: Batch move files from multiple subdirectories up 1 level

How can I move files from several subdirectories up 1 level in one terminal command?
File Structure:
path/to/files/A/remove/image.png
path/to/files/B/remove/image.png
path/to/files/C/remove/image.png
path/to/files/D/remove/image.png
path/to/files/E/remove/image.png
Desired Structure:
path/to/files/A/image.png
path/to/files/B/image.png
path/to/files/C/image.png
path/to/files/D/image.png
path/to/files/E/image.png
There are A LOT of directories, and each "letter" directory above includes several images. I would also like to delete the directory the files were moved out of.
I had the same requirement (on a Mac), but in my case the subdirectory did not have the same name as in the original question, which made it a bit more complicated. This is what worked for me:
from the "path/to" folder I ran
find . -mindepth 3 -maxdepth 3 -type f | awk -v sq="'" -F "/" '{print "mv -i " sq $0 sq " " sq "./" $2 sq}' > moveup.sh
and then
sh moveup.sh
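A hedged alternative that skips the intermediate script and tolerates quotes in file names (POSIX sh; run from the same "path/to" folder):
find . -mindepth 3 -maxdepth 3 -type f -exec sh -c '
    for f do
        top=${f#./}; top=${top%%/*}    # first path component, e.g. "A"
        mv -i -- "$f" "./$top/"
    done' sh {} +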
There are many ways to do it.
This moves all files to their grand-parent directory (the dirname calls have to run per file, inside sh -c, because a plain $(dirname {}) would be expanded by the shell before find ever runs):
$ find path/to/files -type f -exec sh -c 'mv -- "$1" "$(dirname "$(dirname "$1")")"' sh {} \;
You can add a -name \*.type test or whatever, instead of the -type f option, in order to be more specific.
This removes empty directories:
$ find . -type d -empty -exec rmdir {} \;
(although it generates benign errors, which I guess is because the directory structure is being altered while find is still working).
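Adding -depth makes find process children before their parents, which avoids most of those errors (a hedged tweak):
$ find . -depth -mindepth 1 -type d -empty -exec rmdir {} \;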
I found the solution to my question...
Using the example directories in my original question, I used this command:
for a in $(find path/to/files -mindepth 1 -maxdepth 1 -type d); do mv "$a"/remove/* "$a"/; rmdir "$a"/remove; done
"remove" is the name of the subdirectory within each directory that I wanted to move the files out of.
The answer was actually found here:
https://serverfault.com/questions/405146/how-to-mass-move-files-one-directory-up
