FFMPEG - RTSP: where does sprop-parameter-sets come from? - ffmpeg

I'm encoding .mp4 video files (H.265) with ffmpeg to create an RTSP stream, publishing it to rtsp-simple-server, and after aler9's restreamer the stream is available to watch. I can watch the result in VLC (adding the network source rtsp://my.server/stream), but I can't watch the stream in an Android application: no sprop-parameter-sets headers exist.
So, the question is: where can I get these headers and how can I add them to the stream so that everything works?
P.S. if I add the stream to Wowza, these headers (sprop-vps, sprop-sps, sprop-pps) are added by Wowza to the stream, but I need to use the server directly.

OK, found an answer: -bsf:v dump_extra (the dump_extra bitstream filter copies the codec extradata, i.e. the parameter sets, into the stream packets).
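For reference, a minimal sketch of how the filter might be wired in (file name, port, and URL are placeholders; only -bsf:v dump_extra comes from the answer above):

```shell
# Sketch, not a verified pipeline: stream an H.265 file over RTSP while
# dump_extra copies the codec extradata (VPS/SPS/PPS) into the packets,
# so players that need in-band parameter sets can find them.
ffmpeg -re -i input.mp4 \
  -c:v copy -bsf:v dump_extra \
  -f rtsp rtsp://my.server:8554/stream
```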

Related

DASH playback for encrypted .webm video files in Shaka Player

I have been trying to play encrypted .WEBM media files in Shaka Player without much success and I am here to seek advice from anybody who has been through this. It would be great if somebody in this awesome developer community could guide me here.
Round 1 - What I tried (Encoded & Dashed):
Encoded .MP4 file to multiple-stream video .WEBM (VP9) and single-stream audio .WEBM (Vorbis) files using FFMPEG.
Created DASH MANIFEST.MPD file with WEBM_TOOLS/WEBM_DASH_MANIFEST
Outcome: I am able to play this in Shaka Player without any issues.
Round 2 - What I tried (Encoded, Encrypted & Dashed):
Encoded .MP4 file to multiple-streams Video .WEBM (VP9) & single-stream Audio .WEBM (Vorbis) files using FFMPEG.
Encrypted generated .WEBM files with WEBM_TOOLS/WEBM_CRYPT
Created DASH MANIFEST.MPD file with WEBM_TOOLS/WEBM_DASH_MANIFEST
Outcome: I don't know how I should play this content in Shaka Player. Where and how should I provide the .key file generated in step 2 above to Shaka Player? I would like to use ClearKey with CENC in the browser. I don't want to encode to multi-stream .MP4, only .WEBM.
Thanks so much!
If you just want to test the content then you can configure the clear keys directly in the Shaka player itself. From their documentation at https://github.com/google/shaka-player/blob/master/docs/tutorials/drm-config.md:
player.configure({
  drm: {
    clearKeys: {
      'deadbeefdeadbeefdeadbeefdeadbeef': '18675309186753091867530918675309',
      '02030507011013017019023029031037': '03050701302303204201080425098033'
    }
  }
});
If you want to have the player request the keys from a key server, which is the typical DRM interaction, then you need a license server (key server) to request the key from. You don't really need to do this if all you want is to make sure that you are packaging and encrypting the content correctly; the local clear-key config above will probably do fine for you.
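If you later go the license-server route, a minimal sketch of the config (the URL is a placeholder; 'org.w3.clearkey' is the standard ClearKey key-system ID):

```javascript
// Sketch: have Shaka fetch ClearKey licenses from a (hypothetical) server
// instead of hard-coding the keys in the player config.
player.configure({
  drm: {
    servers: {
      'org.w3.clearkey': 'https://license.example.com/clearkey'
    }
  }
});
```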

Retrieving Timed Metadata on Chromecast

We are developing a custom receiver for HLS video playback on Chromecast.
Our stream has Timed ID3 Metadata embedded in the MPEG-2 Transport Stream (TS files). I need the exact position in the stream that these ID3 tags are located for our app to function properly.
In my Custom Receiver, I am registering for the Host.processMetadata event, and am receiving the metadata tags as the fragments are processed, but I am unable to determine the position of the tags in the stream.
I am looking for the best way to determine the position in the stream that the Timed Metadata is located. Is there an API Call I am missing?
Notes:
We are able to stream our HLS video, using the proper CORS headers.
We are getting the accurate position of the Timed Metadata when playing this stream on iOS and Android players.
We are working with an Android Sender.
Working with:
Cast Receiver 2.0.0
Media Player 1.0.0
Chromecast Firmware Version 26653
Thank you!
We are working on adding a new feature to MPL to address this exact same issue, to make the media time corresponding to the ID3 data available in processMetadata. I'll try to update this post when that is implemented and released.
Google updated the Host API's processMetadata method a week or two after I posted this question. The callback now includes the time in the stream at which the metadata is located.
See the docs for more:
https://developers.google.com/cast/docs/reference/player/cast.player.api.Host#processMetadata
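As a sketch of how the updated callback could be consumed (the (type, data, timestamp) argument shape is an assumption based on the description above; check the linked reference for the exact signature):

```javascript
// Hypothetical recorder for ID3 cues: each tag is stored together with
// the media time the updated processMetadata callback now reports.
function makeMetadataRecorder() {
  var cues = [];
  return {
    cues: cues,
    processMetadata: function (type, data, timestamp) {
      cues.push({ type: type, data: data, time: timestamp });
    }
  };
}

var recorder = makeMetadataRecorder();
// In a real receiver you would assign it:
//   host.processMetadata = recorder.processMetadata;
// Simulated call with a fake ID3 payload at media time 12.5s:
recorder.processMetadata('ID3', [0x49, 0x44, 0x33], 12.5);
```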

Audio track switching impossible with MPL 0.7.0 and smoothstream protocol

What steps will reproduce the problem?
1. On mediaManager loadData, stream info (audio) is sent to sender.
2. Selected audio track index is sent back to receiver.
3. Receiver disables the existing audio stream and enables the new selected audio stream and reloads the player.
What is the expected output? What do you see instead?
Expected output: the audio stream should be enabled on the protocol, and no duplicate stream info should be present.
Current output: no new audio stream is enabled and duplicate stream info is present, and if you change the audio stream again on the client and send it to the receiver, the stream info count is tripled.
What version of the product are you using? On what operating system?
Mac, MPL 0.7.0
Smooth streaming playready DRM.
Please provide any additional information below.
Also something very strange: my asset has Swedish as the default audio language, but while fetching the audio stream info from the smooth streaming protocol, the first language is selected by default (in this case Finnish), no matter what. Even if one tries to modify the protocol manually, MPL throws an exception: "Uncaught exception: cannot call method Ra of null".
And it always remains the first language of the protocol's audio streams, no matter how many times the player is reloaded.
P.S. - There is no API document for the smoothstream protocol; the one present in the API references slapped me with a 404 error.
https://developers.google.com/cast/docs/reference/player/player.StreamingProtocol

Uploading an mp3 audio file to Http Server

I am trying to record an audio file from the microphone and upload it to a server using a multipart form-data POST. I would appreciate some help.
I am currently working on C# and making the code for Windows Phone 7.
Would deeply appreciate some help and, if possible, some code reference, as I am a newbie on the platform.
When recording from the microphone, your data is raw PCM. You will have to encode it manually, which may not be an easy task if you want to encode it into an MP3. You could upload the file (example in the link ba__friend posted) as a WAV, or simply as the raw data, and convert it to an MP3 on your server.

Windows Phone 7 ASX Streaming

I'm trying a small app which plays an ASX streaming file. My understanding was that I should parse the ASX and get the URL. But in my case, the REF HREF in the ASX points to something like www.website.com:8084. Does the server configuration need to be modified? I'm totally new to these audio streaming protocols. Any suggestion would be much appreciated.
My code streams audio fine when I test with www.website.com/file.MP3.
The URL from REF HREF will most probably lead you to another ASX file, which needs to be parsed again; I would advise parsing recursively until you reach a valid playable stream. WP7 supports ASX streams but throws an exception for some tags (check here),
so you will have to parse the ASX yourself, extract the URL, and process it further.
Good luck! Post your findings too!
