I'm trying to find a way to record an HD IP camera feed in a 3rd party Windows 7 program which uses DirectShow.
The 3rd party program records other specific input signals so I'm stuck with it. In addition, I need to have the video feed sync with the other signals so a delay in the video feed is unacceptable.
I've attempted this with over a dozen cameras using various methods, including GStreamer, without success; every method other than the following has resulted in video lag.
I'm using a Raspberry Pi to transmit the video feed from an IP camera to a Win 7 PC.
Raspivid runs on the Pi with the following parameters, piping its output to netcat: -t 0 -w 1280 -h 720 -hf -ih -fps 20 -o - | nc -k -l 2222
I can view the video feed through Mplayer with no lag by running the command: C:\mplayer\mplayer -fps 200 -demuxer h264es ffmpeg://tcp://192.168.1.6:2222
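For reference, here is the whole pipeline in one place (a sketch collected from the commands above; raspivid and nc run on the Pi, MPlayer on the PC, and 192.168.1.6 is the Pi's address in my setup):

```shell
# Pi side: stream 1280x720 H.264 at 20 fps. -ih inserts inline SPS/PPS
# headers so a client that connects late can still start decoding, and
# nc -k keeps listening after a client disconnects.
raspivid -t 0 -w 1280 -h 720 -hf -ih -fps 20 -o - | nc -k -l 2222

# PC side: pull the raw H.264 elementary stream over TCP and play it with
# minimal buffering (-fps 200 stops MPlayer from throttling the stream).
C:\mplayer\mplayer -fps 200 -demuxer h264es ffmpeg://tcp://192.168.1.6:2222
```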
My problem is that I cannot pipe the feed into the 3rd party program. I believe I need to have a DirectShow source filter created to do this.
Would anyone be able to create a filter to do this or could you direct me to someone who can?
I would be more than happy to compensate someone for the development time.
Thanks much,
Chuck
Trying to play a 3840x2160 video recorded by an iPhone 7 (@30 fps), I get frequent pauses -- in the video; the music keeps playing.
This happens both in Firefox and when ffplay is invoked to play the file directly from the command line. The CPU is a dual-core E6700 @3.20 GHz -- not super fast, but it should be able to play smoothly, shouldn't it? Video is Intel's "series 4" integrated chipset -- again, not a speed demon, but it should be adequate... Support for Intel's VA API is included.
I built ffmpeg-4.1 from source using the FreeBSD port. The port has a maddening number of options -- including several different ones for the H.264 codec.
Any suggestions for improving the decoding speed to the point where it is watchable by a human? Thank you!
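One hedged diagnostic (my suggestion, not from the original thread): benchmark pure decode speed with ffmpeg's null muxer, with and without VA-API, to see whether hardware decode engages at all. The file name and DRI device path are assumptions; adjust for your system:

```shell
# Software decode benchmark: how many fps can the CPU alone manage?
ffmpeg -benchmark -i clip.mov -f null -

# Same benchmark with VA-API hardware decode requested. If the "series 4"
# chipset cannot decode this profile/resolution, ffmpeg will report it and
# fall back to software (or fail) here.
ffmpeg -benchmark -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
       -i clip.mov -f null -
```

If the two runs finish in roughly the same time, the hardware path is not actually being used and the build options are worth revisiting.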
Is it possible to have a named pipe on the PS side of my Zedboard that leads to a FIFO on the PL side (using DMA, AXI, I2S, etc.), which I then route to the audio-out port -- so that I can play songs from the PS side and listen on the audio-out port on the PL side?
If yes, what steps need to be followed on the PS side?
I'm guessing it involves mapping user space into kernel space.
Yes, it turns out that Analog Devices has just the stuff you need.
Analog Devices maintains a separate kernel tree that includes ALSA drivers for both the audio chip (ADAU1761) and the HDMI output (ADV7511):
https://github.com/analogdevicesinc/linux
There are a few Zynq branches in there. Normally Xilinx pulls drivers from there for their own kernels, but anyone can do the same.
The build instructions (if that's the sort of thing you want to do) are at:
http://wiki.analog.com/resources/eval/user-guides/ad-fmcomms2-ebz/software/linux/zynq_2014r2
Or, alternatively you could just download the ready-made image for your particular board from this dropbox link:
https://www.dropbox.com/sh/yfbpj63pcenqatr/AAAt0s3xFXs47I7q5pNopheHa?dl=0
After you download the file, uncompress it with this command:
unxz -d sdimage-8G-zedboard.direct.xz
Find out the name of your SD Card with this command:
dmesg|tail
And then write the resulting image to your 8GB SD Card with this command:
sudo dd if=sdimage-8G-zedboard.direct of=/dev/sdX
where sdX is your particular SD card, which you noted from the dmesg | tail output.
This command will erase all pre-existing data on the SD card, so make sure you have a backup if that data is important to you.
WARNING: Please be VERY careful when using the dd command. Writing the image to the wrong /dev/sdX device will overwrite that device, which can destroy data on another drive or leave your OS unbootable.
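If you'd rather not keep the uncompressed 8 GB image around, the decompress and write steps can be combined in one pipeline (a sketch; the same caveats about /dev/sdX apply, and status=progress assumes GNU dd):

```shell
# Stream the decompressed image straight into dd; no intermediate file
# is written to the filesystem.
unxz -c sdimage-8G-zedboard.direct.xz | sudo dd of=/dev/sdX bs=4M status=progress
sync   # flush write caches before removing the card
```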
After you burn the image, you're good to go! A full-blown graphical Linux environment will come up. (You need to connect an HDMI display, and use the USB OTG port for the mouse and keyboard.)
NOTE: You can also choose which path your sound plays through: the headphone jack or the HDMI cable.
ffmpeg has all kinds of options in it to record video off of a webcam, transcode video, send video up to streaming servers. Is there a way to loop over a file and make it look like a webcam?
I found this for Linux:
https://gist.github.com/zburgermeiszter/42b651a11f17578d9787
I've searched around a lot to try to find something for Windows, but have not yet found anything.
No, that's not part of FFmpeg so you'll need to create this "virtual video device" yourself. See e.g. How to create virtual webcam in Windows 10?.
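For contrast, on Linux the approach in the gist above boils down to a v4l2loopback device plus ffmpeg's -stream_loop option. A sketch (the module must be installed separately, and the device node and file name are assumptions):

```shell
# Create a virtual /dev/video* node backed by the v4l2loopback kernel module.
sudo modprobe v4l2loopback

# Feed the file into it forever: -stream_loop -1 loops the input, and -re
# paces it at native speed so it behaves like a live camera.
ffmpeg -re -stream_loop -1 -i input.mp4 -vf format=yuv420p -f v4l2 /dev/video0
```

On Windows there is no v4l2 equivalent built into FFmpeg, which is why a separate virtual-camera driver is needed there.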
I wish to simultaneously play sounds through up to 12 mono speakers.
I could connect these to my MacBook using 6 USB soundcards, and use the left and right channel of each.
But how can I get the MacBook to play sound out of speaker #5, for example?
PS If anyone can see a smarter way to wire up 12 speakers to a MacBook, please do say!
You can set up an Aggregate Device (Audio MIDI Setup > Create Aggregate Device), which lets you combine multiple devices of the same model, or combine multiple inputs and outputs for apps that don't support separate input and output devices. This Apple guide shows how it works, and it is surprisingly easy to set up.
Another way to route audio to multiple channels and outputs (up to 64) is the free app/plug-in Soundflower. You can download a compiled version, or compile the source code yourself if you need something the current compiled version doesn't do.
So I was just wondering: is there anything like this on a Mac with the iSight, where you can record video using the iSight via the command line? Thanks in advance!
On Linux, 'ffmpeg' uses the 'video4linux2' capture API, and on Windows it can capture through 'vfwcap' (Video for Windows). Neither of these capture APIs exists for the Mac.
If you wish to record video from your iSight camera from the commandline, use this free software instead:
Wacaw - Webcam Tools for Mac OS X
Here is an example of its usage.
Step 1) See what video hardware is present:
wacaw -L
Step 2) Capture your video to a file. On my MacBook, wacaw reports the internal iSight camera as USB device ID '2' with input ID '0'. The 'video-device' may differ on your computer, and you might also be able to omit the '--video-input 0' option:
wacaw --video --video-device 2 --video-input 0 --duration 3 --VGA ~/MyMovie
Another alternative is to install Linux, and 'isight-firmware-tools', on your Mac hardware. There are details on this blog, though I did not test them. Unlike the Wacaw method above, you would have to boot into an entirely different OS to use this second alternative.
Hope this helps!