I have been trying to play FLV videos in Shadowbox.
According to the FAQ, I am using Shadowbox.init(); on page load, and then later on:
Shadowbox.open({
    content: 'http://mysite.com/video.flv',
    title: 'video',
    player: 'flv'
});
But to no avail. The video doesn't even stream in the background; I only see a black box. However, when I use player: 'swf', the video at least streams in the background, but doesn't play.
Any help?
It's working now, with the same code written above!
I am building a screen recording application using fluent-ffmpeg.
const ffmpeg = require('fluent-ffmpeg');

// avfoundation input '1:0' = video device index 1, audio device index 0
let command = ffmpeg({ source: '1:0' })
command.inputFormat('avfoundation')
command.fps(60)
command.videoCodec('libx264')
command.audioBitrate('320k')
command.audioCodec('libmp3lame')
command.addOption('-pix_fmt', 'yuv420p')
command.save('output.mp4')
I am able to record the screen, but the system audio is not being recorded. For example, when I play a YouTube clip and record my screen, the video is there, but the audio from the YouTube clip is not.
Am I missing an option in ffmpeg? I can't get the screen audio (I am not talking about a microphone).
Edit: I want to use ffmpeg instead of Electron's desktopCapturer, as ffmpeg has more options and is more powerful.
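On macOS, ffmpeg's avfoundation input cannot capture system audio on its own; the usual workaround is to install a virtual loopback audio device (e.g. BlackHole or Soundflower), route the system output to it, and capture that device as the audio half of the avfoundation input. A minimal sketch of that idea, assuming (purely as an example) that the loopback device shows up as avfoundation audio index 2; check the real indices with ffmpeg -f avfoundation -list_devices true -i "":

const ffmpeg = require('fluent-ffmpeg');

// Assumption: video device 1 is the screen, audio device 2 is the virtual
// loopback device (e.g. BlackHole) carrying the system audio.
let command = ffmpeg({ source: '1:2' })
command.inputFormat('avfoundation')
command.fps(60)
command.videoCodec('libx264')
command.audioCodec('aac')          // AAC is the usual choice for .mp4 output
command.audioBitrate('320k')
command.addOption('-pix_fmt', 'yuv420p')
command.save('output.mp4')

If you also need the microphone, it would have to be added as a second input and mixed with the loopback audio, which is beyond this sketch.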
I'm trying to use GStreamer to generate an HLS video from frames within an existing pipeline. Once I get the frame as a numpy array, I use the following pipeline to create the .ts and .m3u8 files:
appsrc emit-signals=True do-timestamp=true is-live=True caps={DEFAULT_CAPS} !
queue !
videoconvert !
x264enc !
mpegtsmux !
hlssink location={playlist}.%04d.ts playlist-location={playlist}.m3u8
where DEFAULT_CAPS = "video/x-raw,format={VIDEO_FORMAT},width={WIDTH},height={HEIGHT},framerate={FPS_STR}" and the {placeholders} are filled in from Python variables.
Here's an example of the generated m3u8 file:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:NO
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-TARGETDURATION:15
#EXTINF:15.000000953674316,
20201014_103647.0000.ts
#EXTINF:15.000000953674316,
20201014_103647.0001.ts
#EXTINF:15.000000953674316,
20201014_103647.0002.ts
#EXTINF:7.8000001907348633,
20201014_103647.0003.ts
#EXT-X-ENDLIST
It plays fine in my Ubuntu video player and in Chrome, but not in Safari or Firefox. I've tried changing the pipeline a little, but nothing worked, and I don't really know what the problem is.
Does anyone have any idea?
Following the advice in the comments, I tried changing the profile, but it didn't change anything.
I also found that adding a silent audio track could resolve the problem, because the browser might be expecting one.
EDIT
So the combination of audio + profile makes it work, but since I'm using appsrc to get the frames, I don't know in advance how long the video is going to be. How can I generate an audio track without that information?
Thanks.
So, to make it work, I set the profile to high and added an audio track over the video.
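For reference, a sketch of what such a pipeline could look like, with the profile forced through a caps filter and a live, silent audio branch muxed in. The audio element names (avenc_aac, aacparse) are assumptions and depend on which GStreamer plugin packages are installed; the rest mirrors the pipeline above. Because audiotestsrc is live, it keeps producing silence for as long as frames arrive from appsrc, so the total duration never needs to be known up front:

appsrc emit-signals=True do-timestamp=true is-live=True caps={DEFAULT_CAPS} !
queue !
videoconvert !
x264enc !
video/x-h264,profile=high !
mpegtsmux name=mux !
hlssink location={playlist}.%04d.ts playlist-location={playlist}.m3u8
audiotestsrc is-live=true wave=silence !
audioconvert !
avenc_aac !
aacparse !
queue !
mux.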
I am trying to capture images from an RTSP stream in order to make a timelapse video, and I would like the images to show an on-screen time label. I have been using this command:
vlc.exe rtsp://192.168.1.49/live/main --video-filter=scene --marq-marquee=Time:%H:%M:%S --marq-position=9 --sub-filter=marq --scene-prefix=Timelapse- --scene-format=jpg --scene-path="c:\Timelapse" --scene-ratio 200 --sout-x264-lookahead=10 --sout-x264-tune=stillimage --run-time 43200
I can see the time label in the VLC interface, but when the images are saved they do not show this marquee.
Any suggestions?
Thanks in advance.
Maybe it's too late, but I spent a long time finding the solution.
This is the part that loads the marq module and adds the time overlay:
--sub-filter=marq --marq-marquee='%Y-%m-%d %H:%M:%S' --marq-color=32768 --marq-position=20 --marq-opacity=25 --marq-refresh=-1 --marq-size=15
You also need to add the module to the transcode chain:
#transcode{vcodec=h264,vb=2000,acodec=mpga,ab=128,channels=2,samplerate=44100,sfilter=marq}:duplicate{dst=http{dst=:8080/stream.wmv},dst=file{dst=stream.mp4,no-overwrite}}
This is my full code:
cvlc v4l2:///dev/video0 --quiet-synchro --no-osd --sub-filter=marq --marq-marquee='%Y-%m-%d %H:%M:%S' --marq-color=32768 --marq-position=20 --marq-opacity=25 --marq-refresh=-1 --marq-size=15 :v4l2-standard= :input-slave=alsa://hw:0,0 :live-caching=200 :sout='#transcode{vcodec=h264,vb=2000,acodec=mpga,ab=128,channels=2,samplerate=44100,sfilter=marq}:duplicate{dst=http{dst=:8080/stream.wmv},dst=file{dst=stream.mp4,no-overwrite}}' :sout-keep
VLC streams over HTTP and records the video to a file with a timestamp overlay.
Hope it helps other people who are looking for a way to do this.
I'm trying to make a batch of videos for uploading to YouTube. My emphasis is on the audio (mostly MP3, with some WMA). I have several image files that need to be picked at random to go with the audio, i.e. display an image for 5 seconds before showing the next. I want the video to stop when the audio stream ends. How should I use ffmpeg to achieve this?
Ref:
http://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images
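Building on that wiki page, one possible sketch is the following; it assumes the randomly chosen images have already been copied into a numbered sequence such as img001.jpg, img002.jpg, ... (the file names and the shuffling step are assumptions, not part of the question). -framerate 1/5 shows each image for 5 seconds, -loop 1 repeats the sequence if the audio is longer, and -shortest stops the output when the audio ends:

ffmpeg -loop 1 -framerate 1/5 -i img%03d.jpg -i audio.mp3 -c:v libx264 -r 25 -pix_fmt yuv420p -c:a aac -shortest out.mp4

Re-encoding the audio (here to AAC) rather than copying it also sidesteps the WMA sources, since WMA inside an MP4 container is poorly supported.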
I want to play a video in WP7.
This is my code:
MediaPlayerLauncher player = new MediaPlayerLauncher();
player.Media = new Uri("video link", UriKind.RelativeOrAbsolute);
player.Location = MediaLocationType.Data;
player.Controls = MediaPlaybackControls.All;
player.Show();
This is working fine.
After this video finishes, I want to continue playing another video; I want to play two videos one after the other.
Is this possible in WP7? How can I accomplish this?
The title asks how to play videos in general. Are you aware of MediaElement? It can be used to play back video as well, it has an event that tells you when video playback ends, and it can also give you the video length.
This blog post has an example of both MediaElement and MediaPlayerLauncher.
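A minimal sketch of that approach; the element name, event handler, and file names below are placeholders rather than anything from the original post:

// XAML (assumed): <MediaElement x:Name="player" AutoPlay="True" MediaEnded="Player_MediaEnded" />
private readonly Queue<Uri> playlist = new Queue<Uri>(new[]
{
    new Uri("Videos/first.wmv", UriKind.Relative),
    new Uri("Videos/second.wmv", UriKind.Relative)
});

// Call this once (e.g. from the page's Loaded handler) to start the first video.
private void PlayNext()
{
    // With AutoPlay="True", assigning Source starts playback automatically.
    if (playlist.Count > 0)
        player.Source = playlist.Dequeue();
}

// Raised when the current video reaches its end; move on to the next one.
private void Player_MediaEnded(object sender, RoutedEventArgs e)
{
    PlayNext();
}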
The MediaPlayerLauncher does not expose an event or callback for you to find out if and/or when the video has ended. I am afraid it is not possible to hook into these events.