NativeScript Audio and Video channels

I am new to NativeScript and was wondering if there are any plugins that stream audio and video data, just like navigator.getUserMedia in the browser.
The use case: I am trying to implement voice security (not recognition) and facial recognition using TensorFlow.

It looks like you are already using TensorFlow for the machine learning part. If you are looking just for camera access to capture a picture or video, you could use nativescript-camera or nativescript-videorecorder.
There is also nativescript-plugin-firebase; its ML Kit already supports face detection.

Related

How to convert a sequence of image files to video using C programming?

I am working on a V4L2 camera driver. The webcam captures a sequence of image files, and now I want to convert them into a video (MP4) file. How can this be done with FFmpeg/GStreamer using pure C source code instead of an Ubuntu terminal command?
GStreamer can be used to write applications in C. There are application guides available on their documentation site, but you will also need a few extra topics related to GObject to start using the framework.
I would recommend going through their documentation site:
https://gstreamer.freedesktop.org/documentation/tutorials/index.html?gi-language=c
It is not that hard if you read through the documentation, and there are lots of plugins available to achieve quite a lot of audio/video processing tasks.
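As a minimal sketch of that approach in plain C (the file pattern, frame rate, and encoder choice are assumptions; adapt them to your actual frames), a pipeline that muxes a numbered JPEG sequence into an MP4 could look like this:

    #include <gst/gst.h>

    int main (int argc, char *argv[])
    {
      gst_init (&argc, &argv);

      /* Assumed input: frames named img_0000.jpg, img_0001.jpg, ... at 30 fps. */
      GError *error = NULL;
      GstElement *pipeline = gst_parse_launch (
          "multifilesrc location=img_%04d.jpg index=0 "
          "caps=\"image/jpeg,framerate=30/1\" ! jpegdec ! videoconvert "
          "! x264enc ! mp4mux ! filesink location=out.mp4",
          &error);
      if (pipeline == NULL) {
        g_printerr ("Failed to build pipeline: %s\n", error->message);
        g_clear_error (&error);
        return 1;
      }

      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* Block until the stream finishes or an error is posted on the bus. */
      GstBus *bus = gst_element_get_bus (pipeline);
      GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
          (GstMessageType) (GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

      if (msg != NULL)
        gst_message_unref (msg);
      gst_object_unref (bus);
      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }

Build it against the gstreamer-1.0 pkg-config package; the same pipeline string also works with gst-launch-1.0 for quick testing.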

How to get image data with the DJI OSDK using a DJI Matrice 600 Pro

I'm trying to do some image processing using a Matrice 600 Pro (drone) with a Jetson Xavier (companion computer) attached. A camera with HDMI output is attached to a Ronin-MX (gimbal), and the data is transmitted through an SRW-60G (a wireless video link using the HDMI port). I thought Onboard SDK interfaces such as "/dji_sdk/main_camera_images (sensor_msgs/Image)"
(http://wiki.ros.org/dji_sdk)
could get image data easily, but I found that they are only available for the M210, so I may not be able to use them on my Matrice 600 Pro.
Using an HDMI-to-USB converter might solve this problem (making the camera-transmitter-receiver chain act as a USB camera), but the converter is quite expensive and I'm not sure if there is a better way to do this.
Any clue would be very helpful. Thank you!
As far as I know, the OSDK does not support video streaming for the M600 series. What you can do is use the Ronin gimbal with a third-party camera (e.g. an IDS/FLIR/MatrixVision camera) connected directly to the Xavier, then do your processing on that stream.
If you need to stream the third-party video source down, it's easier: just use OpenCV's imshow in full screen for the current frame (see the sketch below). This will output to the desktop HDMI; connect that HDMI to the M600 video port and it will be streamed down to your remote controller. This is like a cheap version of the PSDK.
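A minimal sketch of that trick in C++ (the device index and window name are assumptions; error handling is reduced to the essentials):

    #include <opencv2/opencv.hpp>

    int main() {
      // Assumed: the third-party camera enumerates as video device 0 on the Xavier.
      cv::VideoCapture cap(0);
      if (!cap.isOpened()) return 1;

      // A full-screen window, so each frame fills the HDMI output to the M600 video port.
      cv::namedWindow("out", cv::WINDOW_NORMAL);
      cv::setWindowProperty("out", cv::WND_PROP_FULLSCREEN, cv::WINDOW_FULLSCREEN);

      cv::Mat frame;
      while (cap.read(frame)) {
        // ... run your image processing on `frame` here ...
        cv::imshow("out", frame);
        if (cv::waitKey(1) == 27) break;  // Esc quits
      }
      return 0;
    }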
Hope this helps you with your development work.
Regards
Dr Yuan Shenghai

How to use Windows Media Foundation instead of DirectShow Editing Services?

I am developing a non-linear video editor. I need support for a timeline, mixing of audio streams, transitions between videos, etc. All of these features are in DirectShow Editing Services, but it is no longer supported in new versions of Windows; Microsoft suggests using Media Foundation instead. Is it possible to implement the same functionality in MF, or should I use another SDK, for example GStreamer? Maybe someone can recommend an SDK for video editing built on top of MF?
With Media Foundation you have to implement it all by yourself. For instance, video trimming could be implemented by feeding a Source Reader into a Sink Writer, manipulating the samples manually and comparing their timestamps with the required range. Trimming has already been implemented in the MFCopy Media Foundation example; MFCopy uses the Source Reader/Sink Writer approach because it gives the app more control over the timestamps.
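A heavily simplified C++ sketch of that Source Reader/Sink Writer trimming loop (assumptions: a single video stream, a pass-through native media type that the output container accepts, a hard-coded trim range, and no HRESULT checking, all of which real code must handle):

    #include <windows.h>
    #include <mfapi.h>
    #include <mfidl.h>
    #include <mfreadwrite.h>

    int wmain(int argc, wchar_t* argv[]) {
        if (argc < 3) return 1;
        const LONGLONG start = 0;           // trim range in 100-ns units (assumption)
        const LONGLONG end   = 100000000;   // keep the first 10 seconds (assumption)

        CoInitializeEx(nullptr, COINIT_MULTITHREADED);
        MFStartup(MF_VERSION);

        IMFSourceReader* reader = nullptr;
        IMFSinkWriter*   writer = nullptr;
        MFCreateSourceReaderFromURL(argv[1], nullptr, &reader);
        MFCreateSinkWriterFromURL(argv[2], nullptr, nullptr, &writer);

        // Mirror the source's native (compressed) type onto the writer: stream 0 only.
        IMFMediaType* type = nullptr;
        reader->GetNativeMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, &type);
        DWORD outStream = 0;
        writer->AddStream(type, &outStream);
        writer->SetInputMediaType(outStream, type, nullptr);
        type->Release();

        writer->BeginWriting();
        for (;;) {
            DWORD streamIndex = 0, flags = 0;
            LONGLONG ts = 0;
            IMFSample* sample = nullptr;
            reader->ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0,
                               &streamIndex, &flags, &ts, &sample);
            if (flags & MF_SOURCE_READERF_ENDOFSTREAM) break;
            if (sample) {
                // The actual trimming: compare timestamps with the required range.
                if (ts >= start && ts < end) {
                    sample->SetSampleTime(ts - start);   // rebase to start at zero
                    writer->WriteSample(outStream, sample);
                }
                sample->Release();
                if (ts >= end) break;
            }
        }
        writer->Finalize();

        writer->Release();
        reader->Release();
        MFShutdown();
        CoUninitialize();
        return 0;
    }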
For a Windows 10 UWP app you can use the Windows.Media.Editing.MediaComposition class.

How to get a live video stream from Epiphan to VLC player?

Is there any way to get the live stream from an Epiphan device using the Epiphan SDK and play it in VLC or ffplay?
Using the Epiphan SDK, I am able to grab frames, and the SDK also provides a way to render the frames in a GUI, but I am not able to find a way to get a live video stream.
Does the ffmpeg library provide a way to do so?
NitinG,
You don't need the SDK itself for that. The Epiphan driver already exposes the device via DirectShow/V4Lx interfaces (depending on the OS), so it basically looks like a camera to VLC or ffmpeg. Start VLC and go to "Media" -> "Open Capture Device".
You need the SDK only if you want programmatic access to more advanced, low-level features that are difficult to expose over the DirectShow or V4Lx interfaces.
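For example, because the device enumerates like an ordinary capture card, ffplay can usually open it directly (the DirectShow device name below is a placeholder; list the real names first):

    ffmpeg -list_devices true -f dshow -i dummy
    ffplay -f dshow -i video="Epiphan device name"
    ffplay -f v4l2 /dev/video0

The first two commands are for Windows/DirectShow, the last for Linux/V4L2.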
Cheers.

Xbox 360 XNA HLS Streaming

I am interested in creating a simple HLS viewer for the Xbox 360 that works kind of like SilverLive. I have read about as much documentation on HLS as I can find and am not sure where to continue. I feel like I could probably write the code to handle HLS myself; it doesn't sound overly complex (unless there is something I am missing). I do not, however, have any idea where to start with playing the video/audio segments on the 360 once they get there.
What does everyone think? Would I need to implement some sort of codec pack and use it to render the frames as a texture? If so, are there any functional codecs supported by .NET for MPEG-2 and AAC that don't use anything the 360 lacks support for?
Unfortunately XNA Framework games are not able to access the Internet (or perform any network communication at all other than through Xbox LIVE or System Link), so creating a streaming application isn't possible.
What you want to achieve is possible using the professional development tools, but your game/application wouldn't pass the certification requirements for release on the Xbox 360 (because of its use of the Internet/LAN).
