Concatenate wav files, ignoring any with 2 channels - bash

I need to concatenate all the wav files in a folder.
I tried using the command
sox folder_name/*.wav folder_name.wav
But got the error
sox FAIL sox: Input files must have the same # channels
Turns out only 21 of 2864 wav files in that folder have 2 channels instead of 1.
How can I just ignore the 21 files with 2 channels and concatenate all the 2843 wavs with 1 channel?

Use soxi to get all the one-channel (mono) files, put them in an array, and then call sox:
files=()
for file in folder_name/*.wav; do
    # soxi prints a "Channels : N" line for each file; keep only mono files
    if soxi "$file" | grep -q 'Channels.*: 1$'; then
        files+=("$file")
    fi
done
sox "${files[@]}" folder_name.wav

Related

Taking 1 second from each mp3 file and combining them into 1 file using ffmpeg

I'm using this code to combine all the wanted mp3 files into one, and it's working fine, but I need to cut each file and then combine them.
Files: a.mp3, b.mp3, c.mp3 -> each of these files has a duration of 3 seconds.
Expected output: 1s of a.mp3 + 1s of b.mp3 + 1s of c.mp3.
The code I'm using:
ffmpeg -f concat -i file.txt -c copy full.mp3
file.txt:
file 'melodies/a.mp3'
file 'melodies/b.mp3'
file 'melodies/c.mp3'
file 'melodies/d.mp3'
file 'melodies/e.mp3'
Output: full.mp3, whose duration is 9 seconds.
The documentation of the concat demuxer indicates that it supports several directives to auto-cut input streams: duration, inpoint, and outpoint. Follow the example in the doc.
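For instance, a minimal sketch of file.txt using the outpoint directive to keep only the first second of each entry (note that with -c copy the cut lands on an mp3 frame boundary, so the resulting durations may be slightly approximate):
file 'melodies/a.mp3'
outpoint 1
file 'melodies/b.mp3'
outpoint 1
file 'melodies/c.mp3'
outpoint 1
Running the same command as before, ffmpeg -f concat -i file.txt -c copy full.mp3, should then produce a roughly 3 second full.mp3.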

Merging video in a specific order using ffmpeg

I use the following to merge video in numeric order.
for f in *; do mv "$f" "${f: -17}"; done && find *.ts | sed 's:\:\ :g' | sed 's/^/file /' > fraglist.txt && ffmpeg -f concat -safe 0 -i fraglist.txt -c copy output.ts; rm fraglist.txt
This works great for files named like the following...
000001
000002
000003
000004
000005
000006
000007
000008
000009
000010
But if I need something like the following merged, the order ends up based on how many digits there are in the file name...
1708.ts 9803.ts 13798.ts 17815.ts 21804.ts 25819.ts 29832.ts
What command could I use to get the second group of files merged in that order? Thank you for your help!
Simplify your whole process by using printf alone to make the txt file contents, and use sort to provide natural/version sorting:
printf "file '%s'\n" *.ts | sort -V > fraglist.txt
ffmpeg -f concat -i fraglist.txt -c copy output.ts
Result:
file '1708.ts'
file '9803.ts'
file '13798.ts'
file '17815.ts'
file '21804.ts'
file '25819.ts'
file '29832.ts'
No need to rm fraglist.txt as the sort redirect (>) will overwrite fraglist.txt.
Note that I removed -safe 0 (an option specific to the concat demuxer) from the ffmpeg command because you don't need it in this exact example. But if you get the Unsafe file name error (such as due to special characters including spaces) then you will need to add it.

downloading and concatenating parts of videos from youtube

I'm trying to create a video quiz that will contain small parts of other videos concatenated together (the purpose being that people will identify where these short snippets are taken from).
For this purpose I created a file that contains the URL of the video, the starting time of the "snip", and its length. For example:
https://www.youtube.com/watch?v=5-j6LLkpQYY 00:00 01:00
https://www.youtube.com/watch?v=b-DqO_D1g1g 14:44 01:20
https://www.youtube.com/watch?v=DPAgWKseVhg 12:53 01:00
Meaning that the first part should take the video from the first URL from its beginning and last for a minute; the second part should be taken from the second URL starting at 14:44 (minutes:seconds) and last one minute and 20 seconds; and so forth.
Then all these parts should be concatenated to a single video.
I'm trying to write a script that does that (I use Ubuntu and am fluent in several scripting languages), and I tried to use the youtube-dl command-line package and ffmpeg, but I couldn't find the right options to achieve what I need.
Any suggestions will be appreciated.
Assuming the list of videos is in foo.txt and the output video is to be foo.mp4, this bash script should do the job:
eval $(cat foo.txt | while read u s d; do echo "cat <(youtube-dl -q -o - $u | ffmpeg -v error -hide_banner -i - -ss 00:$s -t 00:$d -c copy -f mpegts -);"; done | tee /dev/tty) | ffmpeg -i - -c copy foo.mp4
This uses a little trick with process substitution and eval to avoid intermediate files, the mpegts container to enable the simple concat protocol, and tee /dev/tty just for debugging.
I have tested with youtube-dl 2018.09.26-1 and ffmpeg 1:4.0.2-3.
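If you find the one-liner hard to follow or debug, a more readable sketch of the same idea is to write each snippet to an intermediate MPEG-TS file and then join them with ffmpeg's concat protocol (the clipN.ts filenames below are just placeholders I chose):
#!/bin/bash
i=0
list=""
while read -r u s d; do
    # </dev/null keeps youtube-dl from consuming the loop's stdin (foo.txt)
    youtube-dl -q -o - "$u" </dev/null \
        | ffmpeg -v error -hide_banner -i - -ss "00:$s" -t "00:$d" -c copy -f mpegts "clip$i.ts"
    list="$list${list:+|}clip$i.ts"
    i=$((i+1))
done < foo.txt
# Join the MPEG-TS pieces with the concat protocol and remux to mp4
ffmpeg -i "concat:$list" -c copy foo.mp4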

timelapse images into a movie, 500 at a time

I am trying to make a script to turn a bunch of timelapse images into a movie, using ffmpeg.
The latest problem is how to loop thru the images in, say, batches of 500.
There could be 100 images from the day, or there could be 5000 images.
The reason for breaking this apart is due to running out of memory.
Afterwards I would need to cat them using MP4Box to join all together...
I am entirely new to bash, but not entirely new to programming.
What I think needs to happen is this:
1) read in the folder's contents, as the images may not be consecutively named
2) send ffmpeg a list of 500 at a time to process (https://trac.ffmpeg.org/wiki/Concatenate)
2b) while you're looping thru this, set a counter to determine how many loops you've done
3) use the number of loops to create the MP4Box cat command line to join them all at the end.
The basic script that works if there are only, say, 500 images is:
#!/bin/bash
dy=$(date '+%Y-%m-%d')
ffmpeg -framerate 24 -s hd1080 -pattern_type glob -i "/mnt/cams/Camera1/$dy/*.jpg" -vcodec libx264 -pix_fmt yuv420p Cam1-"$dy".mp4
MP4Box's cat command looks like:
MP4Box -cat Cam1-$dy7.mp4 -cat Cam1-$dy6.mp4 -cat Cam1-$dy5.mp4 -cat Cam1-$dy4.mp4 -cat Cam1-$dy3.mp4 -cat Cam1-$dy2.mp4 -cat Cam1-$dy1.mp4 "Cam1 - $dy1 to $dy7.mp4"
Needless to say, help is immensely appreciated for my project.
Here is something to get you started. It sorts the individual frames into time order, then splits them into chunks of 500 and loops through all the chunks:
#!/bin/bash
# User-changeable number of frames per chunk
chunksize=500
# Rename files by date/time so they collate in order
jhead -n%Y-%m-%d_%H-%M-%S *.jpg
# Remove any remnants from previous runs (which may have been longer)
rm -f chunk* sub-*mp4
# Split filename list into chunks - chunkaa, chunkab, chunkac ...
ls *jpg | split -l $chunksize - chunk
# Put 'file' keyword before each filename
sed -i.bak 's/^/file /' chunk*
n=0
for c in chunk*; do
# Generate zero-padded output filename so that collates for final assembly too
out=$(printf "sub-%03d.mp4" $n)
echo Processing chunk $c into sequence $out
ffmpeg -f concat -i "$c" ... "$out"
((n+=1))
done
# Final assembly of "sub-*.mp4"
ffmpeg ... sub-*mp4 ...
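For that final assembly, one possible sketch is the same concat-demuxer trick again (the output name final.mp4 is a placeholder, and -c copy assumes every chunk was encoded with identical settings):
printf "file '%s'\n" sub-*.mp4 > sublist.txt
ffmpeg -f concat -i sublist.txt -c copy final.mp4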

Cross fading several audio files using sox

I'm trying to cross-fade several audio files together with a 3 second cross-fade and join them together in to one file with sox.
I can join several files together with the command below, but I'm not sure how to cross-fade between each one:
sox $(ls /tmp/a*.wav | sort -n) /tmp/out/out.wav
I can cross-fade two files with the command below, but I'm not sure how to combine the first line, which joins several files together, with the second line, which splices / cross-fades:
sox 100hz.wav 440hz.wav out.wav splice $(soxi -D 100hz.wav),3
I found this question but the answer doesn't work for me.
crossfading a group of audio files with sox splice
I don't know if you are aware of the crossfade_cat.sh script offered by sox. You could just use it successively:
./crossfade_cat.sh 1 440.wav 660.wav auto auto && ./crossfade_cat.sh 1 mix.wav 880.wav auto auto
Or, if you want to crossfade a large number of wav files, e.g. all files in a directory, you could use a shell loop, something like this:
crossfade_dur=1
i=0
for file in *.wav
do
    i=$((i+1))
    if [ $i -eq 1 ]
    then
        # First file: start the running mix
        cp "$file" mix.wav
    else
        # Cross-fade the running mix with the next file
        crossfade_cat.sh "$crossfade_dur" mix.wav "$file" auto auto
    fi
done
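Note that this relies on crossfade_cat.sh writing its result to mix.wav (as the first example above implies), so each pass crossfades the running mix with the next file, and the final result ends up in mix.wav.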
