I have a bunch of files in separate folders that have the same name.
For example,
/path/to/your/directory/1/results.pdf
/path/to/your/directory/2/results.pdf
But I would like to copy them all into one directory:
/path/to/your/directory/results/
so I have:
/path/to/your/directory/results/results-1.pdf
/path/to/your/directory/results/results-2.pdf
etc
The trouble is that with the scripts I write, the files overwrite each other.
Thanks
I do this regularly
#!/usr/bin/sh
# Build a new name of the form <name>_<parent_dir>.<ext> for each file
# found under the source tree, then copy it into the target directory.
find <source_dir_path> -type f | while read -r i
do
new=$(echo "$i" | nawk -F"/" '{split($NF,a,".");print "<target_dir_path>"a[1]"_"$(NF-1)"."a[2]}')
cp "$i" "$new"
done
Perhaps something like this will help you:
find . -type f -exec sh -c 'if [ ! -e "../aaa/$1" ]; then cp "$1" ../aaa/; else i=0; while [ -e "../aaa/${1}_${i}" ]; do i=$((i+1)); done; cp "$1" "../aaa/${1}_${i}"; fi' _ {} \;
So basically it searches from the current directory, and copies every file it finds to the ../aaa/ directory. If such a file already exists there, it appends _0 or _1 or ... whatever makes the name unique. Ideally this should be written as a script that find -exec calls if any further refinement is needed, but I think this one-liner should be enough.
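For reference, the suggested standalone-script version of the same logic might look like this (a sketch; copy-unique.sh is a hypothetical name, ../aaa is the target directory from above, and basename is used so files from subdirectories also land directly in the target):
#!/bin/sh
# copy-unique.sh: copy the file given as $1 into ../aaa/,
# appending _0, _1, ... to the name if it is already taken.
f=$(basename "$1")
if [ ! -e "../aaa/$f" ]; then
    cp "$1" "../aaa/$f"
else
    i=0
    while [ -e "../aaa/${f}_${i}" ]; do i=$((i+1)); done
    cp "$1" "../aaa/${f}_${i}"
fi
It would then be invoked as: find . -type f -exec sh copy-unique.sh {} \;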
To rename the files in place as results-0001.pdf, results-0002.pdf etc., so you can then move them without any problems:
rename --no-act --verbose 's/[^\/]+$/sprintf "results-%04d.pdf", ++$main::Mad/e' /path/to/your/directory/*/results.pdf
Remove --no-act if you're happy with the result.
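Once every file has a unique name, moving them all into the target directory from the question is safe (a sketch, assuming the layout above):
mv /path/to/your/directory/*/results-*.pdf /path/to/your/directory/results/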
I'm writing a script that will perform some actions, and one of those actions is to find all occurrences of a string in both file names and directory names, and replace it with another string.
I have this so far
find . -name "*foo*" -type f -depth | while read file; do
newpath=${file//foo/bar}
mv "$file" "$newpath"
done
This works fine as long as the path to the file doesn't also contain foo, but that isn't guaranteed.
I feel like the way to approach this is to ONLY change the file names first, then go back through and change the directory names, but even then, if you have a structure that has more than one directory with foo in it, it will not work properly.
Is there a way to do this with built in macOS tools? (I say built-in, because this script is going to be distributed to some other folks in our organization and it can't rely on any packages to be installed).
Separate the path_name from the file_name, something like this:
#!/usr/bin/env bash
while read -r file; do
path_name="${file%/*}"; printf 'Path is %s\n' "$path_name"
file_name="${file#"$path_name"}"; printf 'Filename is %s\n' "$file_name"
newpath="$path_name${file_name//foo/bar}"
echo mv -v "$file" "$newpath"
done < <(find . -name "*foo*" -type f)
Have a look at basename and dirname as well.
The printf's are just there to show which part is the path and which is the filename.
The script just replaces foo with bar in the file_name. It can be done with the path_name as well; just use the same syntax.
newpath="${path_name//bar/more}${file_name//foo/bar}"
So renaming both path_name and file_name.
Renaming the path_name first and then the file_name, as you suggest, is also an option:
path_name="${file%/*}"
file_name="${file#"$path_name"}"
new_pathname="${path_name//bar/more}"
mv -v "$path_name" "$new_pathname"
new_filename="${file_name//foo/bar}"
mv -v "${new_pathname%/*}$file_name" "$new_pathname$new_filename"
No additional external tools/utilities are used, apart from the ones your script already uses.
Remove the echo if you're satisfied with the output.
You can use -execdir to run a command on just the filename (basename) in the relevant directory:
find . -depth -name '*foo*' -execdir bash -c 'mv -- "${1}" "${1//foo/bar}"' _ {} \;
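Proof of concept (a sketch with made-up names; -depth makes find rename the contents of a directory before the directory itself):
$ mkdir -p foodir/foosub && touch foodir/foosub/myfoo.txt
$ find . -depth -name '*foo*' -execdir bash -c 'mv -- "${1}" "${1//foo/bar}"' _ {} \;
$ # the tree is now bardir/barsub/mybar.txt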
The scenario is that I want to convert all of my music files from .mp3 to .ogg. They are in a folder called "Music". In this folder there are folders and files. The files are .mp3s. The directories may contain .mp3s or directories which further contain .mp3s or directories, and so on. This is because some artists have albums which have parts and some do not, etc.
I want to write a script that converts each file using avconv.
Basically, what I am going to do is manually cd into every directory and run the following:
for file in $(ls); do avconv -i $file `echo \`basename $file .mp3\`.ogg`; done
This successfully gets me what I want. However, this is not great as I have a lot of folders, and manually going into each of them and executing this is slow.
My question, then, is how do I write a script that runs this in any directory that has .mp3s, and then goes into any subdirectory it finds and recursively calls itself? My intuition tells me to use Perl or Python because of the complex nature of this.
Thanks for any suggestions!
I'm not familiar with avconv but assuming your command is:
avconv -i inputname outputname
and that you want to convert all inputname.mp3 files to inputname.ogg in their original directories below Music, then the following should work in bash:
#!/bin/bash
while read -r fname; do
avconv -i "$fname" "${fname%.mp3}.ogg"
done < <(find /path/to/Music -type f -name "*.mp3")
Note: this does not remove the original .mp3, and the space between < < is required. Also note, for file in $(ls) is filled with potential for errors.
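If you do want to get rid of the originals, a cautious variant (just a sketch) removes each .mp3 only when avconv exits successfully:
#!/bin/bash
while read -r fname; do
    avconv -i "$fname" "${fname%.mp3}.ogg" && rm -- "$fname"
done < <(find /path/to/Music -type f -name "*.mp3")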
You can do it with bash in a one-liner:
First you find all files (of type file, -type f) that match the pattern "*.mp3". To read each one you use 'while', then invoke avconv.
To change the extension I prefer the 'sed' command, which keeps the folder part of the path so you don't need 'cd'.
Notice that you must put quotes around the $FN variable because it can contain spaces.
find -type f -iname "*.mp3" | while read -r FN ; do avconv -i "$FN" "$(echo "$FN" | sed 's/\.mp3$/.ogg/')" ; done
find <music-folder> -type f -name '*.mp3' | \
xargs -I{} bash -c 'mp3="$0"; ogg="${mp3%.mp3}.ogg"; avconv -i "$mp3" "$ogg";' {}
This should survive "weird" filenames with spaces and other strange symbols in them (though a lone quote character can still confuse xargs's default input parsing).
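A stricter variant (assuming GNU find and xargs) uses null separators, so even quotes and newlines in filenames are handled:
find <music-folder> -type f -name '*.mp3' -print0 | \
xargs -0 -I{} bash -c 'mp3="$0"; ogg="${mp3%.mp3}.ogg"; avconv -i "$mp3" "$ogg";' {}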
You can list directories with absolute paths and recursively cd into every directory using find $PWD -type d syntax:
From just inside the Music directory, run:
for d in $(find "$PWD" -type d)
do
cd "$d"
for file in $(find . -maxdepth 1 -type f -name '*.mp3')
do
echo "$file"
avconv -i "$file" "$(basename "$file" .mp3).ogg"
done
done
Say I have a folder with many files and directories (these are NOT the actual filenames; this is like a trash-can directory, so the filenames are completely random, and some files have no extension):
dir1/
dir2/
...
dirN/
file1
file2
...
fileM
Now I need to move all the files in this directory into the dir1/. That is, move file1, file2 ... fileM into dir1/. What's the easiest way to do that?
If they were all files with extensions then the problem would be simple: just mv *.* dir1/. But I don't know what to do with the files that have no extension.
find . -maxdepth 1 -type f -exec mv {} dir1/ \;
Although find is a good solution, here is another solution using bash only:
for file in *; do [[ -f $file ]] && mv "$file" dir1; done
Bash globs have no direct way to select only regular files; you have to test each match (or use find) for that.
Maybe zsh is also an option for you. With zsh you could simply write
mv *(.) dir1/
One way (note that this breaks on filenames containing whitespace):
mv $(find * -maxdepth 0 -type f) dir1
Another:
for file in *; do
if [ -f "$file" ]
then mv "$file" dir1
fi
done
My directory structure is as follows
Directory1/file1.jpg
Directory1/file2.jpg
Directory1/file3.jpg
Directory2/anotherfile1.jpg
Directory2/anotherfile2.jpg
Directory2/anotherfile3.jpg
Directory3/yetanotherfile1.jpg
Directory3/yetanotherfile2.jpg
Directory3/yetanotherfile3.jpg
I'm trying to use the command line in a bash shell on Ubuntu to take the first file from each directory, rename it to the directory name, and move it up one level so it sits alongside the directory.
In the above example:
file1.jpg would be renamed to Directory1.jpg and placed alongside the folder Directory1
anotherfile1.jpg would be renamed to Directory2.jpg and placed alongside the folder Directory2
yetanotherfile1.jpg would be renamed to Directory3.jpg and placed alongside the folder Directory3
I've tried using:
find . -name "*.jpg"
but it does not list the files in sequential order (I need the first file).
This line:
find . -name "*.jpg" -type f -exec ls "{}" +;
lists the files in the correct order but how do I pick just the first file in each directory and move it up one level?
Any help would be appreciated!
Edit: When I refer to the first file, what I mean is that the jpgs in each folder are numbered from 0 up to however many files are in that folder; for example: file1, file2 ... file34, file35, etc. Another thing to mention: the format of the filenames is random, so the numbering might start at 0 or 1a or 1b, etc.
You can go inside each dir and run:
$ mv `ls | head -n 1` ..
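To also rename the file to the directory name, as the question asks, something like this per directory (a sketch, run from the parent directory):
$ ( cd Directory1 && mv "$(ls | head -n 1)" ../Directory1.jpg )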
If first means whatever the shell glob finds first (lexical, but probably affected by LC_COLLATE), then this should work:
for dir in */; do
for file in "$dir"*.jpg; do
echo mv "$file" "${file%/*}.jpg" # If it does what you want, remove the echo
break 1
done
done
Proof of concept:
$ mkdir dir{1,2,3} && touch dir{1,2,3}/file{1,2,3}.jpg
$ for dir in */; do for file in "$dir"*.jpg; do echo mv "$file" "${file%/*}.jpg"; break 1; done; done
mv dir1/file1.jpg dir1.jpg
mv dir2/file1.jpg dir2.jpg
mv dir3/file1.jpg dir3.jpg
Look for all first-level directories, identify the first file in each of them, and then move it one level up:
find . -type d \! -name . -prune | while read -r d; do
    f=$(ls "$d" | head -1)
    mv "$d/$f" .
done
Building on the top answer, here is a general-use bash function that simply returns the first path that resolves to a file within the given directory:
getFirstFile() {
for dir in "$1"; do
for file in "$dir"*; do
if [ -f "$file" ]; then
echo "$file"
break 1
fi
done
done
}
Usage:
# don't forget the trailing slash
getFirstFile ~/documents/
NOTE: it will silently return nothing if you pass it an invalid path.
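Combined with the original first-file task, it could be used like this (a sketch; the echo makes it a dry run):
for dir in */; do
    first=$(getFirstFile "$dir")
    [ -n "$first" ] && echo mv "$first" "${dir%/}.jpg"
done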
When using sudo rm -r, how can I delete all files, with the exception of the following:
textfile.txt
backup.tar.gz
script.php
database.sql
info.txt
find [path] -type f -not -name 'textfile.txt' -not -name 'backup.tar.gz' -delete
If you don't specify -type f, find will also list directories, which you may not want.
Or a more general solution using the very useful combination find | xargs:
find [path] -type f -not -name 'EXPR' -print0 | xargs -0 rm --
For example, to delete all non-txt files in the current directory:
find . -type f -not -name '*txt' -print0 | xargs -0 rm --
The -print0 and -0 combination is needed if there are spaces in any of the filenames that should be deleted.
rm !(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)
The extglob (Extended Pattern Matching) needs to be enabled in BASH (if it's not enabled):
shopt -s extglob
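To preview what would be removed before actually running rm, the same pattern can be tested with ls first (a sketch):
shopt -s extglob
ls -d !(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)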
find . | grep -v "excluded files criteria" | xargs rm
This will list everything in the current directory, filter out the entries matching your criteria (beware of it also matching directory names), and then remove the rest.
Update: based on your edit, if you really want to delete everything from current directory except files you listed, this can be used:
mkdir /tmp_backup && mv textfile.txt backup.tar.gz script.php database.sql info.txt /tmp_backup/ && rm -r * && mv /tmp_backup/* . && rmdir /tmp_backup
It will create a backup directory /tmp_backup (you've got root privileges, right?), move the files you listed to that directory, recursively delete everything in the current directory (you know that you're in the right directory, right?), move everything back from /tmp_backup, and finally delete /tmp_backup.
I chose the backup directory to be in the root, because if you're trying to delete everything recursively from root, your system has bigger problems anyway.
Surely there are more elegant ways to do this, but this one is pretty straightforward.
I prefer to use a subquery list:
rm -r `ls | grep -v "textfile.txt\|backup.tar.gz\|script.php\|database.sql\|info.txt"`
-v, --invert-match select non-matching lines
\| Separator
Assuming that files with those names exist in multiple places in the directory tree and you want to preserve all of them:
find . -type f ! -regex ".*/\(textfile.txt\|backup.tar.gz\|script.php\|database.sql\|info.txt\)" -delete
You can use the GLOBIGNORE shell variable in Bash.
Suppose you want to delete all files except php and sql; then you can do the following:
export GLOBIGNORE=*.php:*.sql
rm *
export GLOBIGNORE=
Setting GLOBIGNORE like this excludes php and sql files from wildcard expansion in commands like "ls *" or "rm *". So, using "rm *" after setting the variable will delete only the txt and tar.gz files.
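To avoid having to reset the variable afterwards, the whole thing can also run in a subshell (a sketch; note that setting GLOBIGNORE to a non-null value also makes globs match dotfiles):
( GLOBIGNORE='*.php:*.sql'; rm * )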
Since nobody mentioned it:
copy the files you don't want to delete to a safe place
delete all the files
move the copied files back into place
You can write a for loop for this... %)
for x in *
do
if [ "$x" != "exclude_criteria" ]
then
rm -f "$x";
fi
done;
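Extended to the file list from the question, a case statement keeps the exclusions readable (a sketch, plain POSIX sh):
for x in *
do
    case "$x" in
        textfile.txt|backup.tar.gz|script.php|database.sql|info.txt) ;; # keep these
        *) rm -f "$x" ;;
    esac
done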
A little late for the OP, but hopefully useful for anyone who gets here much later via Google...
I found the answer by @awi and the comment on -delete by @Jamie Bullock really useful. Here is a simple utility so you can do this in different directories, ignoring different file names/types each time, with minimal typing:
rm_except (or whatever you want to name it)
#!/bin/bash
ignore=""
for fignore in "$#"; do
ignore=${ignore}"-not -name ${fignore} "
done
find . -type f $ignore -delete
e.g. to delete everything except for text files and foo.bar (note the quotes, so the shell doesn't expand the pattern before the script sees it):
rm_except '*.txt' foo.bar
Similar to @mishunika, but without the if clause.
If you're using zsh, which I highly recommend:
rm -rf ^file-or-folder-pattern-to-avoid
With extended_glob
setopt extended_glob
rm -- ^*.txt
rm -- ^*.(sql|txt)
For me it worked with:
rm -r !(Applications|"Virtualbox VMs"|Downloads|Documents|Desktop|Public)
but names with spaces are (as always) tough. I also tried Virtualbox\ VMs instead of the quotes; it always deletes that directory (Virtualbox VMs).
Just:
rm $(ls -I "*.txt" ) #Deletes file type except *.txt
Or:
rm $(ls -I "*.txt" -I "*.pdf" ) #Deletes file types except *.txt & *.pdf
Make the files immutable. Not even root will be allowed to delete them.
chattr +i textfile.txt backup.tar.gz script.php database.sql info.txt
rm *
All other files have been deleted.
Afterwards you can make them mutable again.
chattr -i *
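You can check which files carry the immutable flag with lsattr (a sketch):
lsattr textfile.txt backup.tar.gz script.php database.sql info.txt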
I believe you can use
rm -v !(filename)
Except for that filename, all the other files in the directory will be deleted; make sure you are using it in bash with extglob enabled (shopt -s extglob).
This is similar to the comment from @siwei-shen, but to handle multiple patterns you need to group them with the -o flag. The -o flag stands for 'or':
find . -type f -not \( -name '*ignore1' -o -name '*ignore2' \) | xargs rm
You can do this with two command sequences.
First define an array with the names of the files you do not want to delete:
files=( backup.tar.gz script.php database.sql info.txt )
After that, loop through all the files in the directory, checking if each filename is in the array; if it's not, delete the file.
for file in *; do
if [[ ! " ${files[#]} " ~= "$file" ]];then
rm "$file"
fi
done
The answer I was looking for was one to run from a script, but I wanted to avoid deleting the script itself. So in case someone is looking for a similar answer, do the following.
Create a .sh file and write the following code:
cp my_run_build.sh ../../
rm -rf *
cp ../../my_run_build.sh .
# amend rest of the script
Since no one yet mentioned this, in one particular case:
OLD_FILES=`echo *`
... create new files ...
rm -r $OLD_FILES
(or just rm $OLD_FILES)
or
OLD_FILES=`ls *`
... create new files ...
rm -r $OLD_FILES
You may need to use shopt -s nullglob if some of the files may or may not be there:
SET_OLD_NULLGLOB=`shopt -p nullglob`
shopt -s nullglob
FILES=`echo *.sh *.bash`
$SET_OLD_NULLGLOB
Without nullglob, echo *.sh *.bash may give you "a.sh b.sh *.bash".
(Having said all that, I myself prefer this answer, even though it does not work in OSX)
Rather than going for a direct command, move the required files to a temp directory outside the current directory, then delete all files using rm * or rm -r *.
Then move the required files back into the current directory.
Remove everything except file.name:
ls -d /path/to/your/files/* | grep -v file.name | xargs rm -rf