I am working on a WPF scoreboard project and have created my dashboard. I capture my WPF window using a chroma key technique with OBS (Open Broadcaster Software), and I can stream it over the internet, e.g. to YouTube.
I would like to use the same WPF project as a TV source, i.e. capture my WPF window and send it to the TV source through the PC's HDMI output with an alpha channel.
Is this possible without using any hardware? If it is, where should I start learning?
I want to create a Wear application that has two modes: a connected mode and a disconnected mode. The connected mode is used when the watch is connected to the phone over Bluetooth: you can control a media player and have the basic controls for what is being played on the phone.
The disconnected mode is used when the phone is not paired to the watch: you can listen to content on the watch with a Bluetooth headset.
My question is the following: in the connected mode, I created my own interface with the basic controls (play/pause/next/previous), and I can synchronize the controls with the phone by sending messages via the Message API.
Is there a better way to do this, such as using notifications?
If you want the basic media controls on your Wear device (controlling the playback of your media app on your phone), you can use MediaSessionCompat to handle that for you; take a look at the UniversalMusicPlayer sample. Basically, if your media app uses MediaSessionCompat (or MediaSession if you are not concerned with earlier versions of Android), the basic controls should appear on your watch, and if you implement MediaSessionCompat.Callback in your media app, the framework will send the control commands from your watch to your app.
I wish to develop a Windows application that can stream video media to a TV. What protocol/interface allows two-way communication between a TV and a computer?
I'm aware that HDMI is one-way communication (from the source to the TV), so the computer's application cannot query the current volume or the current source, which is not useful. Is there a protocol that allows two-way communication?
Also, using HDMI, is it possible to:
Change a TV's source? For example, change to/from HDMI, AV1, AV2, etc.?
Change a TV's volume?
Or is it only possible to use HDMI to transmit video/audio data?
HDMI won't be able to control other aspects of the TV, like changing the volume or the source. You can just stream via HDMI. If you wish to manage other things, I would suggest doing it with a Chromecast, Apple TV, or Roku. Chromecast is easy to start off with.
Does anybody know if it is possible to capture the media you are playing on Windows Phone, so you can stream it to another device, just like Apple does with AirPlay?
No, it is not possible to write a 3rd party app that can capture audio or video from other apps and stream it somewhere else.
Airplay on iOS is a system level service that apps can use.
The closest thing Windows Phone has is Play To, which allows you to pick existing media on the phone (pictures, music, video, etc) and share it to a compatible device. Currently, this technology isn't available to developers.
I am looking for different solutions to capture the video stream from the monitor screen and send it to a video streaming server to broadcast on the web. It must happen live.
I would not like to use external services like "Procaster" for broadcasting.
OS: Windows.
It would be great to know the ideas and experience people have for accomplishing this.
Thanks all.
Recently, I built a Go project called ScreenStreamer. It is a tool to stream the current active window or the full screen (on Linux or Windows) to another device, like a phone or another PC, as MJPEG over HTTP or FLV over RTMP, and it is very close to real time (delay < 100 ms). It works on Windows and Linux.
After building it, you can run it as:
# enter the project root directory
cd ./src/ScreenStreamer
# run it
./mjpeg or .\mjpeg.exe
# use a web browser or other video player, open http://host:port/mjpeg
./rtmp or .\rtmp.exe
# use a video player, open rtmp://host:port/live/screen
The Windows SDK includes the Push Source Filters Sample, which in turn contains the CPushSourceDesktop filter/class.
CPushSourceDesktop: Copy of current desktop image (GDI only)
It captures the desktop image and pushes it into a DirectShow pipeline. From there you can process it with a video compression codec and stream it to a remote location. A decent screen image compression codec is included with the Windows Media subsystem; network streaming will have to be a custom or third-party component. Alternatively, it is possible to make the capture class a virtual camera and have Windows Media Encoder broadcast it (or use the similar feature it already has built in).
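To make the capture step concrete, here is a minimal, assumed C++ sketch (not taken from the sample) of the GDI copy that a filter like CPushSourceDesktop performs for each frame: the primary desktop is copied into a top-down 24-bit DIB with BitBlt, and the resulting buffer is what would be timestamped and pushed downstream.

// Minimal GDI grab of the primary monitor (assumed sketch, not the sample code).
#include <windows.h>

int main()
{
    HDC hdcScreen = GetDC(NULL);                      // device context of the desktop
    int width  = GetSystemMetrics(SM_CXSCREEN);
    int height = GetSystemMetrics(SM_CYSCREEN);

    // Describe a top-down 24-bit RGB bitmap to receive the copy.
    BITMAPINFO bmi = {};
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = width;
    bmi.bmiHeader.biHeight      = -height;            // negative = top-down rows
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 24;
    bmi.bmiHeader.biCompression = BI_RGB;

    void*   pixels = NULL;
    HDC     hdcMem = CreateCompatibleDC(hdcScreen);
    HBITMAP hbmp   = CreateDIBSection(hdcScreen, &bmi, DIB_RGB_COLORS, &pixels, NULL, 0);
    SelectObject(hdcMem, hbmp);

    // One frame; a push source filter does this in a loop, timestamps the
    // buffer, and delivers it to the downstream (compressor) filter.
    BitBlt(hdcMem, 0, 0, width, height, hdcScreen, 0, 0, SRCCOPY | CAPTUREBLT);

    // ... compress/send the pixel rows at `pixels` here (each row is DWORD-aligned) ...

    DeleteObject(hbmp);
    DeleteDC(hdcMem);
    ReleaseDC(NULL, hdcScreen);
    return 0;
}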
Alternatively, you can check the VNC (or one of its clones) source code and see how it hooks windows and captures image updates, then compresses them and makes them available to remote applications.
Note that you will have to specifically capture non-GDI images (such as those coming from video/gaming applications, which use hardware acceleration and non-RGB surfaces).
What's the best and easiest way to play an incoming live video stream in a C++ Windows application (Visual Studio 2010) and write some notes (e.g. "this is a blue ball") on the stream display? ActiveX? DirectX? Flash?
I have Windows SDK 7.1 installed. Do I need to install any other software?
Appreciate any pointers.
In the simplest case, you can do everything you ask with just DirectShow. There is the DirectShow.NET managed library that wraps it for you.
So, try to find an example that just gets video from the capture device to the renderer. Then insert a SampleGrabber filter between those and modify the frame data accordingly. I use this technique to draw a timestamp on the recorded video in my recorder; I even draw it with simple GDI+ calls.
One thing to consider: you'll have to watch out for the picture format - some webcams have YUY2 as their default or ONLY format. You'll want the RGB24 format so you can wrap a Bitmap and then a Graphics around it.
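As an illustration of that idea, here is a hedged C++ sketch of an ISampleGrabberCB callback that overlays text on each RGB24 frame with GDI+. It assumes the graph is built elsewhere, the SampleGrabber's media type has been forced to RGB24, GDI+ has been initialized with GdiplusStartup, and the frame size is whatever the connected pin negotiated; the class name and overlay text are illustrative, not from the original answer.

#include <windows.h>
#include <dshow.h>
#include <qedit.h>      // ISampleGrabberCB (Windows SDK 7.1; may need the usual dxtrans.h workaround)
#include <gdiplus.h>
#pragma comment(lib, "gdiplus.lib")

// Callback invoked by the SampleGrabber for every frame passing through the graph.
class OverlayCallback : public ISampleGrabberCB
{
public:
    OverlayCallback(int width, int height) : m_width(width), m_height(height) {}

    // Minimal COM plumbing; the object lives on the stack for the graph's lifetime.
    STDMETHODIMP_(ULONG) AddRef()  { return 2; }
    STDMETHODIMP_(ULONG) Release() { return 1; }
    STDMETHODIMP QueryInterface(REFIID riid, void** ppv)
    {
        if (riid == __uuidof(IUnknown) || riid == __uuidof(ISampleGrabberCB))
        {
            *ppv = this;
            return S_OK;
        }
        return E_NOINTERFACE;
    }

    STDMETHODIMP SampleCB(double, IMediaSample*) { return E_NOTIMPL; } // unused here

    // Called with the raw RGB24 buffer; whatever we draw ends up on the rendered frame.
    STDMETHODIMP BufferCB(double /*sampleTime*/, BYTE* pBuffer, long /*bufferLen*/)
    {
        // RGB24 rows are DWORD-aligned and stored bottom-up, so wrap the buffer
        // with a negative stride (pointing at the last row) to get a normal
        // top-down Bitmap for drawing.
        const int stride = ((m_width * 3) + 3) & ~3;
        Gdiplus::Bitmap bmp(m_width, m_height, -stride, PixelFormat24bppRGB,
                            pBuffer + (m_height - 1) * stride);

        Gdiplus::Graphics g(&bmp);
        Gdiplus::SolidBrush brush(Gdiplus::Color(255, 255, 255, 0)); // opaque yellow
        Gdiplus::Font font(L"Arial", 20.0f);
        g.DrawString(L"this is a blue ball", -1, &font,
                     Gdiplus::PointF(10.0f, 10.0f), &brush);
        return S_OK; // the modified frame continues on to the renderer
    }

private:
    int m_width;
    int m_height;
};

// After building the graph, attach the callback to the grabber with something like:
//   OverlayCallback cb(width, height);
//   pSampleGrabber->SetCallback(&cb, 1);   // 1 = deliver frames via BufferCB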