Can anyone share the code and say which ActiveX control you used to integrate an IP camera in VB6?
This is a very broadly scoped question. There are various third-party controls that can receive and render Motion JPEG/H.264, such as the DTK video capture control.
Alternatively, you can set up a TCP/HTTP connection to receive the video stream yourself, split it into its MIME parts, and decode and render each frame in turn.
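The MIME-splitting approach boils down to cutting the multipart/x-mixed-replace stream at its boundary markers. A minimal sketch in JavaScript, using strings for clarity (real code would work on binary buffers, and would read the boundary string from the Content-Type response header rather than hard-coding it):

```javascript
// Split a multipart (motion JPEG) stream into individual frame payloads.
// Each MIME part consists of headers, a blank line, then the JPEG data.
function splitMjpegStream(stream, boundary) {
  const frames = [];
  for (const part of stream.split('--' + boundary)) {
    const headerEnd = part.indexOf('\r\n\r\n'); // blank line ends the headers
    if (headerEnd === -1) continue;             // no body in this part
    const body = part.slice(headerEnd + 4).trim();
    if (body.length > 0) frames.push(body);
  }
  return frames;
}
```

Each extracted frame is then a complete JPEG that can be handed to whatever decoder/renderer you use.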
The Media Foundation documentation says:
A capture device is represented in Media Foundation by a media source object, which exposes the IMFMediaSource interface. In most cases, the application will not use this interface directly, but will use a higher-level API such as the Source Reader to control the capture device.
When I have an IMFMediaSource I can use MFCreateSourceReaderFromMediaSource to create the source reader. However, this function fails with MF_E_MULTIPLE_SUBSCRIBERS when I'm also previewing the video I want to capture, which is exactly what I'd want to do.
hr = MFCreateSourceReaderFromMediaSource(t.source, 0, &t.rdr); // Fails if I'm previewing, succeeds when I'm not.
Is there a way to acquire a Source Reader in order to capture video I'm already previewing?
I'm previewing with the Media Session.
Or, if this is not possible, how do I use the IMFMediaSource directly without a source reader? Much like what the embedded Camera application does.
Thanks a lot.
The MF_E_MULTIPLE_SUBSCRIBERS part is covered by another question: you cannot have the source managed by the Media Session and additionally work with it some other way.
If you need both, you can either use a tee in that media session and use the two legs of the tee to preview and capture,
or manage the source yourself and use a custom-developed proxy source in the media session to accept data from it.
Or, if this is not possible, how do I use the IMFMediaSource directly without a source reader? Much like what the embedded Camera application does.
Or just get rid of the Media Session, read from the media source directly, and use the data outside of Media Foundation.
The usage pattern is rather straightforward (and repeats what the Media Session or Source Reader would do on your behalf): create a presentation descriptor, set it up, subscribe to events, start the source, and receive samples. This API is fully documented.
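The steps above can be sketched roughly as follows (pseudocode only, not compilable code; the names mirror the Media Foundation C++ interfaces, and all error handling is omitted):

```
source.CreatePresentationDescriptor(&pd)
// select/deselect the streams you want on pd
source.BeginGetEvent(callback)              // subscribe to source events
source.Start(pd, GUID_NULL, startPosition)

on MENewStream event:                       // one per selected stream
    stream = event value (IMFMediaStream)
    stream.BeginGetEvent(streamCallback)    // subscribe to stream events
    stream.RequestSample(NULL)              // ask for the first sample

on MEMediaSample event:
    sample = event value (IMFSample)        // use it outside Media Foundation
    stream.RequestSample(NULL)              // keep the pipeline fed
```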
This is my first time trying chromecast apps.
I started with CastVideos-android paired with a Styled Media Player with a custom skin URL. After some hurdles, I was able to get the custom skin to work, and the video clips from the sender app play nicely.
Now I'm attempting a custom media player, using the sample CastHelloVideoPlayer receiver app from Google's sample list paired with the CastVideos-android sender app. After creating a new application ID and recompiling CastVideos-android, I tried to cast some videos to the cast device.
1) The first thing I noticed is that the TV is completely blank. No default app name or anything, just a plain black screen. I didn't think much about it, since this is a custom media player and a lot of things such as the logo/splash/watermark may not be set.
2) The main issue I encountered: when I tried to play a video clip, the cast device remained blank. Looking at the Chrome debugging console, I noticed this error message:
[ 32.941s] [cast.receiver.MediaManager] Load metadata error: [object Object]
pd @ cast_receiver.js:formatted:2249
nd.Zc @ cast_receiver.js:formatted:2234
tb.log @ cast_receiver.js:formatted:675
G @ cast_receiver.js:formatted:710
W.Yb @ cast_receiver.js:formatted:4855
g.Yb @ cast_receiver.js:formatted:3660
Jc @ cast_receiver.js:formatted:1500
Gc @ cast_receiver.js:formatted:1550
(anonymous function) @ cast_receiver.js:formatted:1447
cast_receiver.js:formatted:2249 [ 32.955s] [cast.receiver.MediaManager] Sending error message to b5d9d1e6-f6d6-a0bd-440c-fe7255ebfcbc.11:com.google.sample.cast.refplayer-172
Now I'm surprised to encounter this, because the same video clips played nicely when I was using the Styled Media Player. Why do they fail when I use the sample CastHelloVideoPlayer?
I just found out that the sample receiver app I took from the Chromecast samples page is limited to the default media containers defined under the Custom Receiver supported media formats, while the sample sender app sends m3u8 (HLS) containers. After changing the sender app to select the correct target media (mp4), everything started working.
To support HLS, you need to use the Google Cast Media Player Library, which is not part of any of the sample apps.
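This container mismatch can be caught on the sender side before loading. A small sketch of the idea (the extension-to-MIME map is an assumption and only covers the two containers discussed here):

```javascript
// Pick the contentType for a load request from the media URL's extension.
// Anything unmapped returns null so the caller can reject it before casting.
function contentTypeFor(url) {
  const ext = url.split('?')[0].split('.').pop().toLowerCase();
  const map = {
    mp4: 'video/mp4',             // plays on the default receiver sample
    m3u8: 'application/x-mpegurl' // HLS: requires the Media Player Library
  };
  return map[ext] || null;
}
```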
I need to implement continuous playback in my Google Cast custom receiver. For that I am handling the video element's 'ended' event after the first content finishes playing, and I have to make an API call there to get the next content's media URL. I am now confused about how to restart playback with the new content.
Please advise.
Thanks in advance.
You can utilize MediaQueueItem (and its Builder) to create queue items before playback begins (essentially a playlist). RemoteMediaPlayer.queueLoad is already used by the VideoCastManager to load and start a new queue of media items.
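If you stay with the receiver-side 'ended' handler instead, the logic reduces to a small queue that the handler consults. A minimal sketch (the queue contents are placeholders; in practice they would come from your API call for the next content's media URL):

```javascript
// Tiny queue helper a receiver's 'ended' handler could consult.
function createQueue(urls) {
  let index = -1;
  return {
    next() {
      index += 1;
      return index < urls.length ? urls[index] : null; // null = queue done
    }
  };
}

// The receiver wiring would look roughly like this:
//   videoElement.addEventListener('ended', () => {
//     const url = queue.next();
//     if (url) { videoElement.src = url; videoElement.play(); }
//   });
```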
I found an example on GitHub for closed captions on a custom receiver, which is about two years old. This example uses the sender-receiver messageBus to tell the receiver to add a track element that shows captions. At that time, Chromecast's default media receiver did not support caption tracks, but as of today it does, and they can be enabled using chrome.cast.media.Media.editTracksInfo. I tried using the editTracksInfo API method to enable captions on my custom receiver, which is built using the Media Player Library, but it did not work. Can someone please confirm whether I still have to use the messageBus to tell my receiver app to create/insert track elements for caption support, or whether I can leverage MPL to do it automatically?
You shouldn't need any custom message; a fully UX-compliant receiver sample that supports closed captions is available on our GitHub repo. You can use that receiver as the base for your own, or you can read through it to see how MPL is utilized to provide closed captions for adaptive media. If you have a simple mp4 + side-loaded vtt file, then MPL doesn't get involved. On the sender side, we now support tracks through the SDK directly; if you want to see a sample of that, take a look at CCL.
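For reference, the sender-side editTracksInfo call ultimately sends a media-namespace message like the one this helper builds. Treat the exact field names as an assumption to verify against the SDK docs; the helper is only an illustration of what travels over the wire:

```javascript
// Build the media-namespace payload that enables a set of text tracks.
// requestId/mediaSessionId are supplied by the SDK session in real code;
// here they are plain parameters for illustration.
function buildEditTracksInfoRequest(requestId, mediaSessionId, activeTrackIds) {
  return {
    type: 'EDIT_TRACKS_INFO',
    requestId: requestId,
    mediaSessionId: mediaSessionId,
    activeTrackIds: activeTrackIds // e.g. [1] to turn on caption track 1
  };
}
```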
We're using Flowplayer with Wowza. We've managed to get the stream to start on opening the page; however, the stream is made up of several individual videos, and between videos the user has to click play to resume the stream. Is there any way to get Flowplayer to automatically play the next video? Thanks.
Alright.
Assuming you are using the Flash version of Flowplayer (not the HTML5 one): Flowplayer offers a complete API, with events and properties to modify every aspect of the original configuration. For instance, you could declare an 'onFinish' handler in the clip and then load another clip automatically.
Check the flowplayer documentation:
http://flash.flowplayer.org/documentation/api/clip.html
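A sketch of that onFinish approach (the clip URLs and player id are placeholders, and the Flowplayer wiring is shown in comments since it needs the Flash player to run):

```javascript
// Pure helper: given an ordered list of clips and the clip that just
// finished, return the next one to play (or null at the end).
function nextClip(clips, currentUrl) {
  const i = clips.indexOf(currentUrl);
  return i >= 0 && i + 1 < clips.length ? clips[i + 1] : null;
}

// Hypothetical wiring against the Flash Flowplayer API:
//   var clips = ['part1.mp4', 'part2.mp4', 'part3.mp4'];
//   flowplayer('player', 'flowplayer.swf', {
//     clip: {
//       onFinish: function (clip) {
//         var next = nextClip(clips, clip.url);
//         if (next) { this.play(next); } // load the next clip automatically
//       }
//     }
//   });
```

Note that the Flash version of Flowplayer also supports a playlist configuration that advances through clips on its own, which may remove the need for a custom handler entirely.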