Playing a Media File from Isolated Storage - windows-phone-7

I am reading a WAV file, saved as a byte stream, from a web service, and I want to play it back when my record is displayed. This is a Windows Phone 7 app.
My approach has been to save the byte stream to a WAV file in isolated storage upon navigating to the record, then set the source of my media player (mediaElement1) to that file when a button is clicked and play it back.
Below is my current code in my "PlayButton" handler (the size matches the byte stream, but no audio results). If I set the source to a WAV file stored as a resource it does work, so perhaps I just need to know how to set the Uri to the isolated storage file.
For example, the following code works:
mediaElement1.Source = new Uri("SampleData\\MyMedia.wav", UriKind.Relative);
mediaElement1.Position = new TimeSpan(0, 0, 0, 0);
mediaElement1.Play();
Here is my code sample... any ideas?
IsolatedStorageFile isf = IsolatedStorageFile.GetUserStoreForApplication();
IsolatedStorageFileStream str = new IsolatedStorageFileStream(
    "MyMedia.wav", FileMode.Open, isf);
long size = str.Length; // matches the size of the downloaded byte stream

// Build a source on a throwaway MediaElement, then copy it across
// (this is the part that does not work)
MediaElement mediaElement = new MediaElement();
mediaElement.SetSource(str);
mediaElement1.Source = mediaElement.Source;
mediaElement1.Position = new TimeSpan(0, 0, 0, 0);
mediaElement1.Play();
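For reference, here is how the byte stream gets written to isolated storage in the first place, as a minimal sketch (audioBytes is a stand-in for the array returned by the web service):
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
using (var file = store.CreateFile("MyMedia.wav"))
{
    // Persist the raw WAV bytes exactly as received
    file.Write(audioBytes, 0, audioBytes.Length);
}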

You shouldn't need to create two MediaElements. Just call .SetSource on mediaElement1 directly.
I have similar code which sets the MediaElement source to a movie in isolated storage and that works fine:
using (var isf = IsolatedStorageFile.GetUserStoreForApplication())
{
    using (var isfs = new IsolatedStorageFileStream("trailer.wmv", FileMode.Open, isf))
    {
        this.movie.SetSource(isfs);
    }
}
With the above, movie is a MediaElement I've already created in XAML with AutoPlay set to true.
I did have a few issues with the above when first getting it working.
I suggest trying the following to help debug:
Ensure that the file has been written to isolated storage correctly and in its entirety.
Handle the MediaFailed event to find out why it isn't working.
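Applied to the question's PlayButton handler, that looks something like the sketch below (the MediaFailed handler is there purely for diagnostics, and mediaElement1 is assumed to be declared in XAML):
using (var isf = IsolatedStorageFile.GetUserStoreForApplication())
using (var str = new IsolatedStorageFileStream("MyMedia.wav", FileMode.Open, isf))
{
    mediaElement1.MediaFailed += (s, e) => Debug.WriteLine(e.ErrorException);
    mediaElement1.SetSource(str); // hand the stream to the element directly
    mediaElement1.Position = new TimeSpan(0, 0, 0, 0);
    mediaElement1.Play();
}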

One thing I noticed is that when the device is tethered to the computer, audio doesn't play. I spent a couple of hours on this while trying to listen to MP3 files.

Related

Add a chapter track while creating a video with AVFoundation

I'm creating a video (QuickTime .mov format, H.264 encoded) from a bunch of still images, and I want to add a chapter track in the process. The video is being created fine, and I am not detecting any errors, but QuickTime Player does not show any chapters. I am aware of this question but it does not solve my problem.
The old QuickTime Player 7, unlike recent versions, can show information about the tracks of a movie. When I open a movie with working chapters (created using old QuickTime code), I see a video track and a text track, and the video track knows that the text track is providing chapters for the video. Whereas, if I examine a movie created by my new code, there is a metadata track along with the video track, but QuickTime does not know that the metadata track is supposed to be providing chapters. Things I've read have led me to believe that one is supposed to use metadata for chapters, but has anyone actually gotten that to work? Would a text track work?
Here's how I am creating the AVAssetWriterInput for the metadata.
// Make a dummy AVMetadataItem to get its format
AVMutableMetadataItem* dummyMetaItem = [AVMutableMetadataItem metadataItem];
dummyMetaItem.identifier = AVMetadataIdentifierQuickTimeUserDataChapter;
dummyMetaItem.dataType = (NSString*) kCMMetadataBaseDataType_UTF8;
dummyMetaItem.value = @"foo";
AVTimedMetadataGroup* dummyGroup = [[[AVTimedMetadataGroup alloc]
    initWithItems: @[dummyMetaItem]
    timeRange: CMTimeRangeMake( kCMTimeZero, kCMTimeInvalid )] autorelease];
CMMetadataFormatDescriptionRef metaFmt = [dummyGroup copyFormatDescription];
// Make the input
AVAssetWriterInput* metaWriterInput = [AVAssetWriterInput
    assetWriterInputWithMediaType: AVMediaTypeMetadata
    outputSettings: nil
    sourceFormatHint: metaFmt];
CFRelease( metaFmt );
// Associate the metadata input with the video input
[videoInput addTrackAssociationWithTrackOfInput: metaWriterInput
    type: AVTrackAssociationTypeChapterList];
// Add the metadata input to the AVAssetWriter
[writer addInput: metaWriterInput];
// Create a metadata adaptor
AVAssetWriterInputMetadataAdaptor* metaAdaptor = [AVAssetWriterInputMetadataAdaptor
    assetWriterInputMetadataAdaptorWithAssetWriterInput: metaWriterInput];
P.S. I tried using a text track instead (an AVAssetWriterInput of type AVMediaTypeText) and QuickTime Player says the result is "not a movie". Not sure what I'm doing wrong.
I managed to use a text track to provide chapters. I spent an Apple Developer Technical Support incident on this and was told that it is the right way to do it.
Setup:
I assume that the AVAssetWriter has been created, and an AVAssetWriterInput for the video track has been assigned to it.
The trickiest part here is creating the text format description. The docs say that CMTextFormatDescriptionCreateFromBigEndianTextDescriptionData takes as input a TextDescription structure, but neglects to say where that structure is defined. It is in Movies.h, which is in QuickTime.framework, which is no longer part of the Mac OS SDK. Thanks, Apple.
// Create the AVAssetWriterInput
AVAssetWriterInput* textWriterInput = [AVAssetWriterInput
    assetWriterInputWithMediaType: AVMediaTypeText
    outputSettings: nil];
textWriterInput.marksOutputTrackAsEnabled = NO;
// Connect the input to the writer
[writer addInput: textWriterInput];
// Mark the text track as providing chapters for the video
[videoWriterInput addTrackAssociationWithTrackOfInput: textWriterInput
    type: AVTrackAssociationTypeChapterList];
// Create the text format description, which we will need
// when creating each sample.
CMFormatDescriptionRef textFmt = NULL;
TextDescription textDesc;
memset( &textDesc, 0, sizeof(textDesc) );
textDesc.descSize = OSSwapHostToBigInt32( sizeof(textDesc) );
textDesc.dataFormat = OSSwapHostToBigInt32( 'text' );
CMTextFormatDescriptionCreateFromBigEndianTextDescriptionData( NULL,
    (const uint8_t*)&textDesc, sizeof(textDesc), NULL, kCMMediaType_Text,
    &textFmt );
Writing a Sample:
CMSampleTimingInfo timing =
{
    CMTimeMakeWithSeconds( endTime - startTime, timeScale ), // duration
    CMTimeMakeWithSeconds( startTime, timeScale ),           // presentation time stamp
    kCMTimeInvalid                                           // decode time stamp
};
CMSampleBufferRef textSample = NULL;
CMPSampleBufferCreateWithText( NULL, (CFStringRef)theTitle, true, NULL, NULL,
    textFmt, &timing, &textSample );
[textWriterInput appendSampleBuffer: textSample];
CFRelease( textSample ); // the writer input retains the sample
The function CMPSampleBufferCreateWithText is taken from the open source CoreMediaPlus.
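For completeness, once every chapter sample has been appended, the writer is wrapped up in the usual way (a minimal sketch, reusing writer and textWriterInput from the setup above):
[textWriterInput markAsFinished];
[writer finishWritingWithCompletionHandler: ^{
    // Check writer.status and writer.error here
}];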

Firefox 37 throwing error when trying to add microphone volume control for WebRTC audio context

Since Firefox 37 I cannot add a volume control to the input (the microphone). I get the error:
IndexSizeError: Index or size is negative or greater than the allowed amount
It works fine on Chrome.
Here is the code sample:
var audioContext = new (window.AudioContext || window.webkitAudioContext)(); // define audio context
var microphone = audioContext.createMediaStreamDestination();
var gain = audioContext.createGain();
var speaker = audioContext.createMediaStreamDestination(gain);
gain.gain.value = 1;
microphone.connect(gain);
gain.connect(speaker);
The error is thrown here:
microphone.connect(gain);
Weirdly, it works on Firefox Nightly.
This error is similar to this Stack Overflow question: link
Related link: link on StackOverflow
Shouldn't you use this for the microphone?
var microphone = audioContext.createMediaStreamSource(stream);
instead of this
var microphone = audioContext.createMediaStreamDestination();
A microphone is not a destination. It is a source.
Firstly I think it should be
var microphone = audioContext.createMediaStreamSource(stream);
Here stream is the microphone audio stream. Find more info here.
Also check out this demo, with elaboration here. It is similar to what you are trying to do; replacing createMediaElementSource with createMediaStreamSource will work.
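Putting that together, a minimal sketch of the corrected node graph (the getUserMedia wiring is an assumption, since the question doesn't show where stream comes from):
var audioContext = new (window.AudioContext || window.webkitAudioContext)();
navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
    var microphone = audioContext.createMediaStreamSource(stream); // source, not destination
    var gain = audioContext.createGain();
    var speaker = audioContext.createMediaStreamDestination(); // takes no arguments
    gain.gain.value = 1;
    microphone.connect(gain);
    gain.connect(speaker);
    // speaker.stream is the processed MediaStream to hand to the peer connection
});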

how to create a photo unique filename for isolated storage

I'm adding an Astronomy Picture of the Day feature to my Windows Phone astronomy app, and I want to allow users to save the displayed photo to their media library. All of the examples I have found show how to do this, but the filenames are hard-coded and overwrite any file with the existing name. So I need a way to create a unique filename. How can I adjust this example to do that?
// Create a filename for JPEG file in isolated storage.
String tempJPEG = "fl.jpg";
// Create virtual store and file stream. Check for duplicate tempJPEG files.
var store = IsolatedStorageFile.GetUserStoreForApplication();
if (store.FileExists(tempJPEG))
{
    store.DeleteFile(tempJPEG);
}
IsolatedStorageFileStream fileStream = store.CreateFile(tempJPEG);
StreamResourceInfo sri = null;
Uri uri = new Uri("fl.jpg", UriKind.Relative);
sri = Application.GetResourceStream(uri);
BitmapImage bitmap = new BitmapImage();
bitmap.SetSource(sri.Stream);
WriteableBitmap wb = new WriteableBitmap(bitmap);
Thanks in advance for any help.
Provided you don't expect multiple saves per second:
String tempJPEG = DateTime.Now.ToString("yyyy-MM-dd-HH-mm-ss") + ".jpg";
Or some variant of that.
Just one way.
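If you need names that are unique regardless of timing, a GUID-based name is another option (a one-line sketch; GUIDs are effectively collision-free, at the cost of unreadable filenames):
String tempJPEG = Guid.NewGuid().ToString("N") + ".jpg";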

wp7 record and play video concurrently

I would like to record and play a video on Windows Phone 7 simultaneously.
For recording I use:
CaptureSource captureSource = new CaptureSource();
VideoBrush videoBrush = new VideoBrush();
videoBrush.SetSource(captureSource);
uxScreen.Fill = videoBrush;
captureSource.Start();
For playing:
IsolatedStorageFileStream isoVideoFile;
isoVideoFile = new IsolatedStorageFileStream("aaa.mp4",FileMode.Open, FileAccess.Read, IsolatedStorageFile.GetUserStoreForApplication());
uxScreen2.SetSource(isoVideoFile);
Separately they work like they should, but if I try to play and record simultaneously I get an unhandled NotSupportedException (0x80131515).
Is it possible to play and record video at the same time, or is this restricted by the hardware?

Base64String in Windows 8

I've got a Windows 8 program that uses an image picker and uploads the selected image to a server.
The server provides an API which needs the image converted to a Base64 string, and the image must be less than 7 MB.
I'm using the code below:
FileOpenPicker openPicker = new FileOpenPicker();
openPicker.ViewMode = PickerViewMode.Thumbnail;
openPicker.SuggestedStartLocation = PickerLocationId.PicturesLibrary;
openPicker.FileTypeFilter.Add(".jpg");
openPicker.FileTypeFilter.Add(".jpeg");
openPicker.FileTypeFilter.Add(".png");
StorageFile file = await openPicker.PickSingleFileAsync();
if (file != null)
{
    // Application now has read/write access to the picked file
    bitmap = new BitmapImage();
    byte[] buf;
    using (var stream = await file.OpenStreamForReadAsync())
    {
        buf = ReadToEnd(stream); // helper that drains the stream into a byte[]
    }
    using (var stream = await file.OpenAsync(FileAccessMode.Read))
    {
        base64String = Convert.ToBase64String(buf);
        bitmap.SetSource(stream);
    }
}
And the Base64 string goes to the server.
But there is a problem: the bitmap version is much bigger than the JPEG, and none of the small JPEGs reach the server because their bitmap version exceeds 7 MB.
Can I convert an image to a Base64 string without converting it to a bitmap?
In this code, you read the image (encoded as JPEG) and convert it to a Base64 string.
You cannot reduce the size of the Base64 string without reducing the size of the image itself.
To do so, you can use a BitmapDecoder/BitmapEncoder pair and resize the image to smaller dimensions before encoding, as sketched below.
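A minimal sketch of that resize-then-encode approach (the 50% scale and the InMemoryRandomAccessStream plumbing are illustrative assumptions; AsBuffer comes from System.Runtime.InteropServices.WindowsRuntime):
StorageFile file = await openPicker.PickSingleFileAsync();
using (IRandomAccessStream input = await file.OpenAsync(FileAccessMode.Read))
using (var resized = new InMemoryRandomAccessStream())
{
    BitmapDecoder decoder = await BitmapDecoder.CreateAsync(input);
    BitmapEncoder encoder = await BitmapEncoder.CreateForTranscodingAsync(resized, decoder);
    encoder.BitmapTransform.ScaledWidth = decoder.PixelWidth / 2;   // pick dimensions that keep
    encoder.BitmapTransform.ScaledHeight = decoder.PixelHeight / 2; // the payload under 7 MB
    await encoder.FlushAsync();

    // Rewind and read the re-encoded bytes back out, then Base64 them
    resized.Seek(0);
    var bytes = new byte[resized.Size];
    await resized.ReadAsync(bytes.AsBuffer(), (uint)resized.Size, InputStreamOptions.None);
    string base64String = Convert.ToBase64String(bytes);
}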
