Uploading an MP3 audio file to an HTTP server - windows-phone-7

I am trying to record audio from the microphone and upload it to a server using a multipart form-data POST request.
I am currently working in C# on Windows Phone 7.
I would deeply appreciate some help and, if possible, a code reference, as I am a newbie on the platform.

When recording from the Microphone, your data is raw PCM. You will have to encode it manually, which may not be an easy task if you want to encode it into an MP3. You could upload the file (example in the link ba__friend posted) as a WAV, or simply as raw data, and convert it to an MP3 on your server.
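For illustration, a minimal sketch of that approach on Windows Phone 7 might look like the following. It captures raw PCM from the XNA Microphone into memory and posts it as multipart/form-data with HttpWebRequest. The endpoint http://example.com/upload and the form field name "file" are placeholders, and writing a proper WAV header before upload is omitted:

// Minimal sketch: capture raw PCM from the microphone and upload it
// via a multipart/form-data POST. The URL and the "file" field name
// are placeholders for your own server.
using System;
using System.IO;
using System.Net;
using System.Text;
using Microsoft.Xna.Framework.Audio;

public class RecorderUploader
{
    private readonly Microphone mic = Microphone.Default;
    private readonly MemoryStream recording = new MemoryStream();
    private byte[] buffer;

    public void StartRecording()
    {
        mic.BufferDuration = TimeSpan.FromMilliseconds(100);
        buffer = new byte[mic.GetSampleSizeInBytes(mic.BufferDuration)];
        mic.BufferReady += (s, e) =>
        {
            // Copy each chunk of raw PCM into the in-memory recording.
            int bytesRead = mic.GetData(buffer);
            recording.Write(buffer, 0, bytesRead);
        };
        mic.Start();
        // Note: in a Silverlight app, FrameworkDispatcher.Update() must be
        // called regularly (e.g. from a DispatcherTimer) or BufferReady never fires.
    }

    public void StopAndUpload()
    {
        mic.Stop();
        Upload(recording.ToArray());
    }

    private void Upload(byte[] audioData)
    {
        string boundary = "---------------" + DateTime.Now.Ticks.ToString("x");
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/upload");
        request.Method = "POST";
        request.ContentType = "multipart/form-data; boundary=" + boundary;

        // WP7 only exposes the asynchronous Begin/End pattern.
        request.BeginGetRequestStream(ar =>
        {
            using (Stream body = request.EndGetRequestStream(ar))
            {
                string header = "--" + boundary + "\r\n" +
                    "Content-Disposition: form-data; name=\"file\"; filename=\"recording.pcm\"\r\n" +
                    "Content-Type: application/octet-stream\r\n\r\n";
                byte[] headerBytes = Encoding.UTF8.GetBytes(header);
                byte[] footerBytes = Encoding.UTF8.GetBytes("\r\n--" + boundary + "--\r\n");

                body.Write(headerBytes, 0, headerBytes.Length);
                body.Write(audioData, 0, audioData.Length);
                body.Write(footerBytes, 0, footerBytes.Length);
            }
            request.BeginGetResponse(rar => request.EndGetResponse(rar).Close(), null);
        }, null);
    }
}

The conversion to MP3 (or wrapping the PCM in a WAV container) would then happen on the server, as described above.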

Related

FFMPEG - RTSP: where does sprop-parameter-sets come from?

I'm encoding .mp4 video files (H.265) with ffmpeg to create an RTSP stream, publishing it to rtsp-simple-server, and after aler9's restreamer the stream is available to watch. I can watch the result in VLC (adding the network source rtsp://my.server/stream), but I can't watch the stream in an Android application - no sprop-parameter-sets headers exist.
So, the question is: where can I get these headers, and how can I add them to the stream so that everything works?
P.S. If I add the stream to Wowza, these headers (sprop-vps, sprop-sps, sprop-pps) are added by Wowza to the stream, but I need to use the server directly.
OK, found an answer: add the dump_extra bitstream filter to the ffmpeg command ('-bsf:v', 'dump_extra'), which re-inserts the codec extradata (the parameter sets) into the outgoing packets.

How do I test a webm videostream using jmeter?

AFAIK there are 2 video streaming plugins available for JMeter:
BlazeMeter - HLS Plugin and
UbikLoadPack Video Streaming plugin
UbikLoadPack has a prohibitive price tag and the HLS Plugin doesn't test the format I need. Also, I want a FOSS solution, not a paid one.
Does anyone know of another plugin or method I could use to test a WebM video stream?
Edit
#dmitri-t when I try to do this it just hangs. Here I found this script that shows how to test a video, but when I changed the parameters to my video and the range to 0-100 it hung.
Also, the example uses HTTP and my video uses HTTPS.
I tried to include a timer; it hangs as well.
Yet the video loads perfectly in Chrome with the same URL I used in JMeter.
I also tested the request with Postman, and the Range header is ignored. So what's probably happening in JMeter is that it's trying to load the whole continuous stream. How do I make it respect the Range header?
To rule out a server problem, I also tested with Postman against an image on the same server; for the image, the range was respected correctly.
The Content-Range header doesn't work either. Please check this related question I asked about the range problem with streams: Request to a webm stream ignores range header
I don't think you need any form of plugin; you can simulate the browser playing the video using a normal HTTP Request sampler sending a simple HTTP GET request.
Here is the evidence that "playing" a WebM "stream" is nothing more than downloading it.
It would be a good idea to add Timers to simulate users watching the video to the end (or according to your test case).
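As a quick cross-check of the Range behaviour discussed above (outside JMeter), a small standalone client can show whether the server honours byte-range requests at all. Here is a minimal sketch with a placeholder URL; a 206 Partial Content status together with a Content-Range header means ranges are supported, while 200 OK means the whole file is being sent:

// Quick console check that a server honours Range requests.
// "https://example.com/video.webm" is a placeholder URL.
using System;
using System.Net;

class RangeCheck
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("https://example.com/video.webm");
        request.Method = "GET";
        request.AddRange(0, 99); // ask for the first 100 bytes only

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            // 206 Partial Content + Content-Range => byte ranges supported.
            Console.WriteLine((int)response.StatusCode + " " + response.StatusCode);
            Console.WriteLine("Content-Range: " + response.Headers["Content-Range"]);
            Console.WriteLine("Content-Length: " + response.ContentLength);
        }
    }
}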

DASH playback for encrypted .webm video files in Shaka Player

I have been trying to play encrypted .WEBM media files in Shaka Player without much success, and I am here to seek advice from anybody who has been through this. It would be great if somebody in this awesome developer community could guide me here.
Round 1 - What I tried (Encoded & Dashed):
Encoded an .MP4 file to multi-stream video .WEBM (VP9) and single-stream audio .WEBM (Vorbis) files using FFMPEG.
Created DASH MANIFEST.MPD file with WEBM_TOOLS/WEBM_DASH_MANIFEST
Outcome: I am able to play this in Shaka Player without any issues.
Round 2 - What I tried (Encoded, Encrypted & Dashed):
Encoded an .MP4 file to multi-stream video .WEBM (VP9) and single-stream audio .WEBM (Vorbis) files using FFMPEG.
Encrypted generated .WEBM files with WEBM_TOOLS/WEBM_CRYPT
Created DASH MANIFEST.MPD file with WEBM_TOOLS/WEBM_DASH_MANIFEST
Outcome: I don't know how I should play this content in Shaka Player. Where and how should I provide the .key file generated in step 2 above to Shaka Player? I would like to use ClearKey with CENC in the browser. I don't want to encode to multi-stream .MP4, only .WEBM.
Thanks so much!
If you just want to test the content, then you can configure the clear keys directly in Shaka Player itself. From their documentation at https://github.com/google/shaka-player/blob/master/docs/tutorials/drm-config.md:
player.configure({
  drm: {
    clearKeys: {
      'deadbeefdeadbeefdeadbeefdeadbeef': '18675309186753091867530918675309',
      '02030507011013017019023029031037': '03050701302303204201080425098033'
    }
  }
});
If you want to have the player request the keys from a key server, which is the typical DRM interaction, then you need a license server (key server) to request the key from. You don't really need to do this if all you want is to make sure that you are packaging and encrypting the content correctly - the local ClearKey config above will probably do fine for you.

Retrieving Timed Metadata on Chromecast

We are developing a custom receiver for HLS video playback on Chromecast.
Our stream has Timed ID3 Metadata embedded in the MPEG-2 Transport Stream (TS files). I need the exact position in the stream at which these ID3 tags are located for our app to function properly.
In my Custom Receiver, I am registering for the Host.processMetadata event, and am receiving the metadata tags as the fragments are processed, but I am unable to determine the position of the tags in the stream.
I am looking for the best way to determine the position in the stream that the Timed Metadata is located. Is there an API Call I am missing?
Notes:
We are able to stream our HLS video, using the proper CORS headers.
We are getting the accurate position of the Timed Metadata when playing this stream on iOS and Android players.
We are working with an Android Sender.
Working with:
Cast Receiver 2.0.0
Media Player 1.0.0
Chromecast Firmware Version 26653
Thank you!
We are working on adding a new feature to MPL to address this exact same issue, to make the media time corresponding to the ID3 data available in processMetadata. I'll try to update this post when that is implemented and released.
Google updated the Host API's processMetadata method a week or two after I posted this question. The callback now includes the time in the stream at which the metadata is located.
See the docs for more:
https://developers.google.com/cast/docs/reference/player/cast.player.api.Host#processMetadata

Windows Phone 7 ASX Streaming

I'm trying a small app which plays an ASX streaming file. My understanding was that I should parse the ASX and get the URL. But in my case, the REF HREF in the ASX points to something like www.website.com:8084. Does this mean the server configuration needs to be modified? I'm totally new to these audio streaming protocols. Any suggestion would be much appreciated...
My code streams audio fine when I test with www.website.com/file.MP3.
The URL from REF HREF will most probably lead you to another ASX file, which needs to be parsed again; I would advise parsing recursively until you reach a valid playable stream. WP7 supports ASX streams, but it disallows (throws an exception for) some tags - check here.
So you will have to parse the ASX yourself, extract the URL, and process it further.
Good luck! Post your findings too!
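For illustration only, a rough sketch of that recursive parsing could look like this; the regex-based extraction and the WebClient download are just one possible approach, and absolute REF HREF URLs are assumed:

// Sketch: follow REF HREF entries in an ASX playlist until the URL no
// longer points at another .asx file, then hand it to the media player.
using System;
using System.Net;
using System.Text.RegularExpressions;

public class AsxResolver
{
    // ASX files are often not well-formed XML, so a case-insensitive
    // regex on the REF tag is more forgiving than an XML parser.
    private static readonly Regex RefHref = new Regex(
        "<ref\\s+href\\s*=\\s*\"([^\"]+)\"", RegexOptions.IgnoreCase);

    public void Resolve(Uri asxUri, Action<Uri> onStreamFound)
    {
        var client = new WebClient();
        client.DownloadStringCompleted += (s, e) =>
        {
            if (e.Error != null) return;          // download failed
            Match m = RefHref.Match(e.Result);
            if (!m.Success) return;               // no REF HREF found

            var target = new Uri(m.Groups[1].Value, UriKind.Absolute);
            if (target.AbsolutePath.EndsWith(".asx", StringComparison.OrdinalIgnoreCase))
            {
                // The playlist points at another playlist: recurse.
                Resolve(target, onStreamFound);
            }
            else
            {
                // Looks like a playable stream URL: pass it to MediaElement etc.
                onStreamFound(target);
            }
        };
        client.DownloadStringAsync(asxUri);
    }
}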
