I'm trying to add white noise to hundreds of files. I found a command but don't know how to adapt it.
There is a source folder with files and an empty destination folder. Could someone help me with the right command?
I'm using OS X.
http://linguistics.berkeley.edu/plab/guestwiki/index.php?title=Sox_in_phonetic_research#Add_noise_to_an_audio_file
You should put the command in a for loop, then refer to the loop variable as the file name.
I assume that the source folder and the destination folder are at the same level.
for file in *.wav; do sox "$file" -p synth whitenoise vol 0.02 | sox -m "$file" - "../DESTINATION/$file"; done
In addition, if you want to add a suffix to the names of the output files, use basename with a suffix argument to strip the extension:
for file in *.wav; do sox "$file" -p synth whitenoise vol 0.02 | sox -m "$file" - "../DESTINATION/$(basename "$file" .wav)_output.wav"; done
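For example, basename with a suffix argument strips both the leading path and the extension (an illustration with a hypothetical input.wav):
basename SOURCE/input.wav .wav   # prints: input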
Can someone point me in the right direction if I need this
exiftool -tagsfromfile XYZ.xmp -all:all XYZ.jpg
to work for hundreds of JPGs? I have a folder with hundreds of JPEGs and XMPs that share the same name but have different file extensions (.xmp and .jpg). What would be an elegant way to go through all of them and replace XYZ with the actual filename?
I want/need to do this in a shell on OS X.
Do I need something like a for loop, or is there a direct way in the shell?
Thank you so much in advance!
Your command will be
exiftool -r --ext xmp -tagsfromfile %d%f.xmp -all:all /path/to/files/
See Metadata Sidecar Files example #15.
The -r (-recurse) option allows recursion into subdirectories. Remove it if recursion is not desired.
The --ext xmp option (the leading double dash excludes that extension) prevents copying from the XMP files back onto themselves.
The %d variable is the directory of the file currently being processed. The %f variable is the base filename without the extension of that file. Then xmp is used as the extension. The result creates a file path to a corresponding XMP file in the same directory for every image file found. This will work for any writable file found (see FAQ #16).
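For a hypothetical file /path/to/files/shoot/IMG_0001.jpg, the template expands like this (illustration only):
# %d = /path/to/files/shoot/   (directory of the current file)
# %f = IMG_0001                (filename without extension)
# => tags are copied from /path/to/files/shoot/IMG_0001.xmp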
This command creates backup files. Add -overwrite_original to suppress the creation of backup files.
You do not want to loop exiftool as shown in the other answers. Exiftool's biggest performance hit is the startup time and looping it will increase the processing time. This is Exiftool Common Mistake #3.
Solved it by doing this:
#!/bin/bash
FILES="$1/*.jpg"
for f in $FILES; do
    if [ -f "$f" ]; then
        echo "Processing $f file..."
        #cat "$f"
        FILENAME="${f%.*}"    # strip the extension (shortest match from the end, so dots in directory names are safe)
        echo "$FILENAME"
        # exiftool -tagsfromfile "$FILENAME".xmp -all:all "$FILENAME".jpg
        exiftool -overwrite_original_in_place -ext jpg -tagsFromFile "$FILENAME".xmp -@ xmp2exif.args -@ xmp2iptc.args '-all:all' '-FileCreateDate<XMP-photoshop:DateCreated' '-FileModifyDate<XMP-photoshop:DateCreated' "$FILENAME".jpg
    else
        echo "Warning: Some problem with \"$f\""
    fi
done
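Usage, assuming the script is saved as, say, fixmeta.sh (a hypothetical name) and made executable; the folder with the JPG/XMP pairs is passed as the first argument:
chmod +x fixmeta.sh
./fixmeta.sh /path/to/files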
An elegant and easy way, IMHO, is to use GNU Parallel to do them all in parallel:
parallel --dry-run exiftool -tagsfromfile {.}.xmp -all:all {} ::: *.jpg
If that looks correct, remove --dry-run and run again to do it for real.
{} just means "the current file"
{.} just means "the current file without its extension"
::: is just a separator followed by the names of the files you want GNU Parallel to process
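So for a hypothetical IMG_0001.jpg, the template expands to:
exiftool -tagsfromfile IMG_0001.xmp -all:all IMG_0001.jpg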
You can install GNU Parallel on macOS with homebrew:
brew install parallel
I am making a bash script that converts all .webp files into .png files using ffmpeg, but it only works in the directory it is run from. It would be great to be able to place the script in a /home/username folder and let it scan all folders for .webp files. Here is what I have written so far:
for i in *.webp; do ffmpeg -i "$i" "$i.png"; done
rm *.webp
for i in *.webp.png
do
    mv "$i" "${i%.webp.png}.png"
done
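One possible way to extend this to scan all folders is a find-based sketch (untested; it removes each .webp only if ffmpeg succeeds, rather than a blanket rm):
find "$HOME" -type f -name '*.webp' -exec sh -c '
    for f do
        ffmpeg -i "$f" "${f%.webp}.png" && rm "$f"
    done
' sh {} +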
Consider a list of files (e.g. files.txt) similar (but not limited) to
/root/
/root/lib/
/root/lib/dir1/
/root/lib/dir1/file1
/root/lib/dir1/file2
/root/lib/dir2/
...
How can I copy the specified files (not any other content from the folders which are also specified) to a location of my choice (e.g. ~/destination) with a) intact folder structure but b) N folder components (in the example just /root/) stripped from the path?
I already managed to use
cp --parents `cat files.txt` ~/destination
to copy the files with an intact folder structure, however this results in all files ending up in ~/destination/root/... when I'd like to have them in ~/destination/...
I think I found a really nice and concise solution by using GNU tar:
tar cf - -T files.txt | tar xf - -C ~/destination --strip-components=1
Note the --strip-components option that allows to remove an arbitrary number of path components from the beginning of the file name.
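Concretely, GNU tar strips the leading slash from member names, so with --strip-components=1 the first path component (root) is also removed on extraction:
# /root/lib/dir1/file1  ->  ~/destination/lib/dir1/file1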
One minor problem though: tar always archives the whole content of directories mentioned in files.txt (at least I couldn't find an option to ignore directories), but that is easily solved by filtering out the entries that end in a slash and pointing -T at the filtered list:
grep -v '/$' files.txt > files2.txt
This might not be the most graceful solution - but it works:
while IFS= read -r file; do
    echo "checking for $file"
    if [[ -f "$file" ]]; then
        file_folder=$(dirname "$file")
        destination_folder=/destination/${file_folder#/root/}
        echo "copying file $file to $destination_folder"
        mkdir -p "$destination_folder"
        cp "$file" "$destination_folder"
    fi
done < files.txt
I had a look at cp and rsync, but it looks like they would work better if you cd into /root first.
However, if you cd to the correct directory beforehand, you can run the command in a subshell so that you are returned to your original location once the subshell has finished.
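For illustration, a sketch of an alternative using rsync's --files-from option (untested; the list must be relative to the source directory, so the /root/ prefix is stripped and directory entries are dropped first, which sidesteps the cd entirely):
sed 's|^/root/||' files.txt | grep -v '/$' > relative.txt
rsync -a --files-from=relative.txt /root/ ~/destination/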
I have about 500+ ZIP files which I need to re-compress using the best compression possible.
I'm looking at creating a script (a one-liner if possible) to extract the contents of each ZIP file (some of them contain multiple files inside, but no folders) and pipe the output so that 7zip can re-compress them into maximum-compression ZIP files.
The script that I created below uses an intermediate stage by extracting into a temporary folder and then re-compressing, but I would like to skip that stage if possible.
Could anyone help? Thanks in advance.
This is my script at the moment, which works OK. But I am sure there must be a more efficient manner to achieve this.
predir=$(pwd)
for i in *.zip; do
    dirname="${i%.zip}"    # archive name without the .zip extension
    mkdir "$dirname"
    7z x "$i" -o"$dirname"/ && rm "$i"
    7za a -mm=Deflate -mfb=258 -mpass=15 -r "$i" "$predir"/"$dirname"/"*" && rm -rf "$dirname"/
done
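For what it's worth, if installing an extra tool is an option, AdvanceCOMP's advzip can recompress ZIP files in place, which skips the intermediate folder entirely (a possible alternative rather than a drop-in replacement for the script above; advzip must be installed separately, e.g. via Homebrew):
advzip --recompress --shrink-insane *.zip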
I have a large repository of media files that follow torrent naming conventions, something unpleasant to read. At one point, I had properly named the folders that contain said files, and now I want to dump all the .avi, .mkv, etc. files into my main media directory using a bash script.
Overview:
Current directory tree:
Proper Movie Title/
->Proper.Movie.Title.2013.avi
->Proper.Movie.Title.2013.srt
Title 2/
->Title2[proper].mkv
Movie- Epilogue/
->MOVIE EPILOGUE .AVI
Media Movie/
->MEDIAMOVIE.CD1.mkv
->MEDIAMOVIE.CD2.mkv
.
.
.
Desired directory tree:
Proper Movie Title/
->Proper Movie Title.avi
->Proper Movie Title.srt
Title 2.mkv
Movie- Epilogue.avi
Media Movie/
->Media Movie.cd1.mkv
->Media Movie.cd2.mkv
Though this would be the ideal, my main wish is for the directories containing only a single movie file to have that file renamed and moved into the parent directory.
My current approach is to use a double for loop in a .sh file, but I'm currently having a hard time keeping new bash knowledge in my head.
Help would be appreciated.
My current code (Just to get access to the internal movie files):
#!/bin/bash
FILES=./*
for f in $FILES
do
if [[ -d $f ]]; then
INFILES=$f/*
for file in $INFILES
do
echo "Processing >$file< folder..."
done
#cat $f
fi
done
Here's something simple:
find * -maxdepth 1 -type f | while IFS= read -r file
do
    dirname="$(dirname "$file")"
    new_name="${dirname##*/}"
    file_ext=${file##*.}
    if [ -n "$file_ext" ] && [ -n "$dirname" ] && [ -n "$new_name" ]
    then
        echo "mv '$file' '$dirname/$new_name.$file_ext'"
    fi
done
The find * says to run find on all items in the current directory. The -maxdepth 1 option limits the search to the immediate directory, and -type f says you are only interested in files.
The ${file##*.} is using a pattern match. The ## removes the longest left-hand match of *., which strips everything up to and including the last dot, leaving just the file extension.
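For instance (illustration):
file='Title 2/Title2[proper].mkv'
echo "${file##*.}"   # prints: mkv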
The dirname="$(dirname "$file")" line gets the directory name.
Note quotes everywhere! You have to be careful about whitespace.
By the way, I echo instead of doing the actual move. I can redirect the output to a file, examine that file to make sure everything looks okay, then run that file as a shell script.
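For example (the script and file names here are hypothetical):
./rename.sh > moves.sh   # capture the generated mv commands
less moves.sh            # check them over
sh moves.sh              # run once satisfied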