My app saves an Opus live stream to a file. The file is a capture from the middle of a broadcast, with a header attached to the front. It plays back, but the reported playback time is incorrect.
Is there any way to fix this?
(Screenshots omitted: pages 1-3 of the file, the end of the file, and VLC Media Player.)
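If the capture is Ogg-encapsulated Opus, the wrong duration usually comes from the granule positions: players compute the length from the last page's granule position, and pages captured mid-broadcast still carry the broadcast's running sample counts. Remuxing (for example with ffmpeg's stream copy) often regenerates sane timestamps; alternatively the granule positions can be rebased in place. A minimal sketch under that assumption; `rebase_granules` is my own name, and rebasing the first audio page to 0 (ignoring Opus pre-skip) is a simplification:

```python
import struct

def _ogg_crc_table():
    # CRC-32 as used by Ogg (RFC 3533): polynomial 0x04C11DB7, init 0,
    # no bit reflection, no final XOR.
    table = []
    for i in range(256):
        r = i << 24
        for _ in range(8):
            r = ((r << 1) ^ 0x04C11DB7) if r & 0x80000000 else (r << 1)
            r &= 0xFFFFFFFF
        table.append(r)
    return table

_CRC_TABLE = _ogg_crc_table()

def ogg_crc(page: bytes) -> int:
    crc = 0
    for byte in page:
        crc = ((crc << 8) & 0xFFFFFFFF) ^ _CRC_TABLE[(crc >> 24) ^ byte]
    return crc

def rebase_granules(data: bytes) -> bytes:
    """Shift every audio page's granule position so the stream starts at 0.

    Simplification: the first audio page is rebased to granule 0, i.e. its
    own samples become pre-roll; a production tool would also honour the
    pre-skip value from the OpusHead packet.
    """
    out = bytearray(data)
    pos, offset = 0, None
    while pos + 27 <= len(out):
        if out[pos:pos + 4] != b"OggS":
            pos += 1                          # resync on damaged input
            continue
        nsegs = out[pos + 26]
        page_len = 27 + nsegs + sum(out[pos + 27:pos + 27 + nsegs])
        granule = struct.unpack_from("<q", out, pos + 6)[0]
        if granule > 0:                       # header pages carry 0 or -1: skip
            if offset is None:
                offset = granule              # first mid-stream granule seen
            struct.pack_into("<q", out, pos + 6, granule - offset)
            # The granule field is covered by the page checksum, so the CRC
            # must be recomputed over the page with the CRC field zeroed.
            out[pos + 22:pos + 26] = b"\x00\x00\x00\x00"
            struct.pack_into("<I", out, pos + 22,
                             ogg_crc(bytes(out[pos:pos + page_len])))
        pos += page_len
    return bytes(out)
```

Every rewritten page needs its checksum recomputed, which is why the sketch carries its own CRC implementation rather than reusing `zlib.crc32` (Ogg's CRC variant is not bit-reflected).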
I am developing an H.265 DirectShow decoder to handle a live stream. I am facing an issue while rendering it: only one frame is shown in the renderer's active window, even though the function I use to fill the output buffer is filling it continuously.
For testing purposes I stored the output buffer in a file and rendered it with a YUV player, and that file doesn't have the issue. So the buffer is receiving frames; why does the renderer show only one? Where could the issue be?
Thanks.
I am trying to capture a series of images from an RTSP stream in order to make a timelapse video, and I would like the images to show an on-screen time label. I have been using this command:
vlc.exe rtsp://192.168.1.49/live/main --video-filter=scene --marq-marquee=Time:%H:%M:%S --marq-position=9 --sub-filter=marq --scene-prefix=Timelapse- --scene-format=jpg --scene-path="c:\Timelapse" --scene-ratio 200 --sout-x264-lookahead=10 --sout-x264-tune=stillimage --run-time 43200
I can see the time label in the VLC interface, but the saved images do not show the marquee.
Any suggestions?
Thanks in advance.
Maybe it's too late, but I spent a long time finding the solution.
This is the part that loads the marq module and adds the time overlay:
--sub-filter=marq --marq-marquee='%Y-%m-%d %H:%M:%S' --marq-color=32768 --marq-position=20 --marq-opacity=25 --marq-refresh=-1 --marq-size=15
You also need to add the module to the transcode chain:
#transcode{vcodec=h264,vb=2000,acodec=mpga,ab=128,channels=2,samplerate=44100,sfilter=marq}:duplicate{dst=http{dst=:8080/stream.wmv},dst=file{dst=stream.mp4,no-overwrite}}
This is my full command:
cvlc v4l2:///dev/video0 --quiet-synchro --no-osd --sub-filter=marq --marq-marquee='%Y-%m-%d %H:%M:%S' --marq-color=32768 --marq-position=20 --marq-opacity=25 --marq-refresh=-1 --marq-size=15 :v4l2-standard= :input-slave=alsa://hw:0,0 :live-caching=200 :sout='#transcode{vcodec=h264,vb=2000,acodec=mpga,ab=128,channels=2,samplerate=44100,sfilter=marq}:duplicate{dst=http{dst=:8080/stream.wmv},dst=file{dst=stream.mp4,no-overwrite}}' :sout-keep
VLC then streams via HTTP and records the video to a file with a timestamp overlay.
Hope it helps other people who are looking for a way to do this.
I'm trying to make a batch of videos for uploading to YouTube. My emphasis is on the audio (mostly MP3, with some WMA). I have several image files that should be picked at random to go with the audio, i.e. display one image for 5 seconds before showing the next. I want the video to stop when the audio stream ends. How should I use ffmpeg to achieve this?
Ref:
http://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images
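Following the concat-demuxer approach described on that wiki page, one way to get a random image order with a fixed per-image duration is to generate the demuxer's input list yourself, then mux it with the audio using `-shortest` so the video stops when the audio ends. A sketch under those assumptions (the file names and `make_concat_list` are my own, not from the question):

```python
# Sketch: build an ffmpeg concat-demuxer input list that shows images in
# random order for a fixed number of seconds each.
import random

def make_concat_list(images, seconds_per_image=5):
    """Return concat-demuxer input text: each image shown for a fixed duration."""
    order = list(images)
    random.shuffle(order)
    lines = []
    for img in order:
        lines.append(f"file '{img}'")
        lines.append(f"duration {seconds_per_image}")
    # Per the ffmpeg wiki, the last image must be listed twice or its
    # duration directive is ignored.
    if order:
        lines.append(f"file '{order[-1]}'")
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    with open("slides.txt", "w") as f:
        f.write(make_concat_list(["img1.jpg", "img2.jpg", "img3.jpg"]))
    # Then, roughly (paths are placeholders):
    #   ffmpeg -f concat -safe 0 -i slides.txt -i audio.mp3 \
    #          -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest out.mp4
```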
My video is in a .raw file and I want to read it frame by frame using the bitmap method. But the video is very big, around 1.6 GB. What can I do in order to display it on my phone?
You need to transcode your video into a more suitable format for the phone, MP4 for example. You can transcode with software such as VLC.
I have an NC541 IP camera, which supposedly has an MJPEG stream; the manual says "The video is compressed by MJPEG". But I cannot find a way to get that stream from the camera. It seems to want to work only with the built-in program, while I need the raw MJPEG stream instead.
Any ideas? Thanks!
I don't have this camera, but on many cameras you can simply right-click on the video window in your browser, select Properties, and it will tell you the URL of the raw stream. If this is a multi-codec camera you may or may not get the MJPEG stream, depending on which codec is chosen for the camera's home page. This often works for me.
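If the browser page doesn't reveal the URL, another approach is to probe the handful of paths that MJPEG cameras commonly serve and check the Content-Type, since MJPEG streams are delivered as `multipart/x-mixed-replace`. A sketch with that assumption; the candidate paths are generic guesses, not taken from NC541 documentation, and the function names are my own:

```python
# Sketch: probe common MJPEG endpoint paths on an IP camera and check the
# Content-Type header. The paths are generic guesses for illustration.
import urllib.request

CANDIDATE_PATHS = [
    "/video.mjpg", "/mjpg/video.mjpg", "/videostream.cgi",
    "/video.cgi", "/mjpeg", "/cgi-bin/mjpg/video.cgi",
]

def is_mjpeg_content_type(content_type):
    """MJPEG streams are served as multipart/x-mixed-replace."""
    return (content_type is not None
            and "multipart/x-mixed-replace" in content_type.lower())

def find_mjpeg_url(host, timeout=3):
    """Return the first URL on the host that answers with an MJPEG stream."""
    for path in CANDIDATE_PATHS:
        url = f"http://{host}{path}"
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if is_mjpeg_content_type(resp.headers.get("Content-Type")):
                    return url
        except OSError:                    # connection refused, timeout, 404...
            continue
    return None

# Usage (host is a placeholder): find_mjpeg_url("192.168.1.49")
```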