I would like to record and play a video on Windows Phone 7 simultaneously.
For recording I use:
CaptureSource captureSource = new CaptureSource();
VideoBrush videoBrush = new VideoBrush();
videoBrush.SetSource(captureSource);
uxScreen.Fill = videoBrush;
captureSource.Start();
For playing:
IsolatedStorageFileStream isoVideoFile;
isoVideoFile = new IsolatedStorageFileStream("aaa.mp4", FileMode.Open, FileAccess.Read, IsolatedStorageFile.GetUserStoreForApplication());
uxScreen2.SetSource(isoVideoFile);
Separately they work like they should, but if I try to play and record simultaneously I get a "NotSupportedException was unhandled" error (0x80131515).
Is it possible to play and record video at the same time, or is this a hardware restriction?
I'm creating a video (QuickTime .mov format, H.264 encoded) from a bunch of still images, and I want to add a chapter track in the process. The video is being created fine, and I am not detecting any errors, but QuickTime Player does not show any chapters. I am aware of this question but it does not solve my problem.
The old QuickTime Player 7, unlike recent versions, can show information about the tracks of a movie. When I open a movie with working chapters (created using old QuickTime code), I see a video track and a text track, and the video track knows that the text track is providing chapters for the video. Whereas, if I examine a movie created by my new code, there is a metadata track along with the video track, but QuickTime does not know that the metadata track is supposed to be providing chapters. Things I've read have led me to believe that one is supposed to use metadata for chapters, but has anyone actually gotten that to work? Would a text track work?
Here's how I am creating the AVAssetWriterInput for the metadata.
// Make dummy AVMetadataItem to get its format
AVMutableMetadataItem* dummyMetaItem = [AVMutableMetadataItem metadataItem];
dummyMetaItem.identifier = AVMetadataIdentifierQuickTimeUserDataChapter;
dummyMetaItem.dataType = (NSString*) kCMMetadataBaseDataType_UTF8;
dummyMetaItem.value = @"foo";
AVTimedMetadataGroup* dummyGroup = [[[AVTimedMetadataGroup alloc]
    initWithItems: @[dummyMetaItem]
    timeRange: CMTimeRangeMake( kCMTimeZero, kCMTimeInvalid )] autorelease];
CMMetadataFormatDescriptionRef metaFmt = [dummyGroup copyFormatDescription];
// Make the input
AVAssetWriterInput* metaWriterInput = [AVAssetWriterInput
    assetWriterInputWithMediaType: AVMediaTypeMetadata
    outputSettings: nil
    sourceFormatHint: metaFmt];
CFRelease( metaFmt );
// Associate metadata input with video input
[videoInput addTrackAssociationWithTrackOfInput: metaWriterInput
    type: AVTrackAssociationTypeChapterList];
// Associate metadata input with AVAssetWriter
[writer addInput: metaWriterInput];
// Create a metadata adaptor
AVAssetWriterInputMetadataAdaptor* metaAdaptor = [AVAssetWriterInputMetadataAdaptor
    assetWriterInputMetadataAdaptorWithAssetWriterInput: metaWriterInput];
P.S. I tried using a text track instead (an AVAssetWriterInput of type AVMediaTypeText) and QuickTime Player says the result is "not a movie". Not sure what I'm doing wrong.
I managed to use a text track to provide chapters. I spent an Apple developer tech support incident and was told that this is the right way to do it.
Setup:
I assume that the AVAssetWriter has been created, and an AVAssetWriterInput for the video track has been assigned to it.
The trickiest part here is creating the text format description. The docs say that CMTextFormatDescriptionCreateFromBigEndianTextDescriptionData takes as input a TextDescription structure, but neglects to say where that structure is defined. It is in Movies.h, which is in QuickTime.framework, which is no longer part of the Mac OS SDK. Thanks, Apple.
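For reference, here is a minimal sketch of that structure, pieced together from the old QuickTime headers; treat it as an assumption on my part and verify the layout against a real Movies.h if you can find one. Note the 2-byte packing, which the QuickTime headers imposed:
#pragma pack(push, 2)
typedef struct { UInt16 red, green, blue; } QTRGBColor;     // stand-in for QuickDraw's RGBColor
typedef struct { SInt16 top, left, bottom, right; } QTRect; // stand-in for QuickDraw's Rect
typedef struct
{
    SInt32     descSize;           // total size of the TextDescription
    SInt32     dataFormat;         // 'text'
    SInt32     resvd1;
    SInt16     resvd2;
    SInt16     dataRefIndex;
    SInt32     displayFlags;
    SInt32     textJustification;
    QTRGBColor bgColor;            // background color
    QTRect     defaultTextBox;     // where to draw the text within the track bounds
    SInt32     scrpStartChar;      // default style, flattened from TextEdit.h's ScrpSTElement
    SInt16     scrpHeight;
    SInt16     scrpAscent;
    SInt16     scrpFont;
    UInt16     scrpFace;
    SInt16     scrpSize;
    QTRGBColor scrpColor;
    char       defaultFontName[1]; // Pascal string; the struct is extended to fit
} TextDescription;
#pragma pack(pop)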
// Create AVAssetWriterInput
AVAssetWriterInput* textWriterInput = [AVAssetWriterInput
    assetWriterInputWithMediaType: AVMediaTypeText
    outputSettings: nil ];
textWriterInput.marksOutputTrackAsEnabled = NO;
// Connect input to writer
[writer addInput: textWriterInput];
// Mark the text track as providing chapters for the video
[videoWriterInput addTrackAssociationWithTrackOfInput: textWriterInput
    type: AVTrackAssociationTypeChapterList];
// Create the text format description, which we will need
// when creating each sample.
CMFormatDescriptionRef textFmt = NULL;
TextDescription textDesc;
memset( &textDesc, 0, sizeof(textDesc) );
textDesc.descSize = OSSwapHostToBigInt32( sizeof(textDesc) );
textDesc.dataFormat = OSSwapHostToBigInt32( 'text' );
CMTextFormatDescriptionCreateFromBigEndianTextDescriptionData( NULL,
    (const uint8_t*)&textDesc, sizeof(textDesc), NULL, kCMMediaType_Text,
    &textFmt );
Writing a Sample:
CMSampleTimingInfo timing =
{
    CMTimeMakeWithSeconds( endTime - startTime, timeScale ), // duration
    CMTimeMakeWithSeconds( startTime, timeScale ),           // presentation time
    kCMTimeInvalid                                           // decode time
};
CMSampleBufferRef textSample = NULL;
CMPSampleBufferCreateWithText( NULL, (CFStringRef)theTitle, true, NULL, NULL,
    textFmt, &timing, &textSample );
[textWriterInput appendSampleBuffer: textSample];
CFRelease( textSample ); // appendSampleBuffer does not consume the reference
The function CMPSampleBufferCreateWithText is taken from the open source CoreMediaPlus.
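If you would rather avoid that dependency, a QuickTime text sample is simply a big-endian 16-bit length followed by the text bytes, so you can assemble the sample buffer by hand with plain Core Media calls. Here is a rough sketch of that idea, reusing the textFmt and timing variables from above; it is an illustration under those assumptions, not drop-in code:
// Build the payload: big-endian UInt16 length, then the UTF-8 text.
const char* utf8 = [theTitle UTF8String];
uint16_t textLen = (uint16_t)strlen( utf8 );
size_t payloadLen = sizeof(uint16_t) + textLen;
uint8_t* payload = malloc( payloadLen );
uint16_t bigLen = OSSwapHostToBigInt16( textLen );
memcpy( payload, &bigLen, sizeof(uint16_t) );
memcpy( payload + sizeof(uint16_t), utf8, textLen );
// Wrap the payload in a CMBlockBuffer, which takes ownership of the malloc'd block.
CMBlockBufferRef block = NULL;
CMBlockBufferCreateWithMemoryBlock( kCFAllocatorDefault, payload, payloadLen,
    kCFAllocatorMalloc, NULL, 0, payloadLen, 0, &block );
// Make a sample buffer carrying one text sample with our timing.
CMSampleBufferRef textSample = NULL;
const size_t sampleSize = payloadLen;
CMSampleBufferCreate( kCFAllocatorDefault, block, true, NULL, NULL, textFmt,
    1, 1, &timing, 1, &sampleSize, &textSample );
CFRelease( block );
[textWriterInput appendSampleBuffer: textSample];
CFRelease( textSample );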
I'm just getting started with tvOS and was wondering if anyone has addressed looping in tvOS / TVJS / TVML. I'm using this tutorial to create a simple interface for playing videos. My wrinkle is that I want these videos to play in a continuous seamless loop - basically a moving screensaver type effect - see example videos at art.chrisbaily.com.
Is there a simple way to do this, or do I need to build some kind of event listener to do the looping manually? I'd like the videos to be fairly hi-res, and each would be somewhere between 1 and 3 minutes long.
I was looking for an answer to this question as well, and found that the only way to do this is to create an event listener and add a duplicate media item to the playlist. Honestly, it's not that hard, especially if you followed the tutorial you listed in your post.
So the code would be something like:
player.playlist = playlist;
player.playlist.push(mediaItem);
// Push the same media item again, so the playlist now holds two copies.
player.playlist.push(mediaItem);
player.present();
This makes sure that once your first video ends, the second one starts playing, which is essentially a loop. From the third playback onward, you implement an event listener for the "mediaItemWillChange" event, which makes sure that once a video ends a new copy of the same video is added to the playlist.
player.addEventListener("mediaItemWillChange", function(e) {
    player.playlist.push(mediaItem);
});
Just put the event listener before you start the player using
player.present();
Note that this idea was already raised on Apple's discussion board; I merely took it, implemented it in my own project, and am posting the solution now that I know it works. I found that if I popped the first video off the playlist and then pushed a new one, as suggested in the linked thread below, my videos did not loop. Below is the link to the original post.
How to repeat video with TVJS Player?
You could also set repeatMode on the playlist object:
player.playlist.repeatMode = 1;
// 0 = no repeat
// 1 = repeat all items in playlist
// 2 = repeat current item
There's really no need to push a second media item onto the playlist. Simply listen for the media to reach its end, then set the seek time back to zero. Here's the code.
Play the media.
private func playVideo(name: String) {
    guard name != "" else {
        return
    }
    let bundle = NSBundle(forClass: object_getClass(self))
    let path = bundle.pathForResource(name, ofType: "mp4")
    let url = NSURL.fileURLWithPath(path!)
    self.avPlayer = AVPlayer(URL: url)
    if (self.avPlayerLayer == nil) {
        self.avPlayerLayer = AVPlayerLayer(player: self.avPlayer)
        self.avPlayerLayer.frame = previewViews[1].frame
        previewViews[1].layer.addSublayer(self.avPlayerLayer)
        avPlayer.actionAtItemEnd = .None
        // Listen for AVPlayerItemDidPlayToEndTimeNotification
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: self.avPlayer.currentItem)
    }
    avPlayer.play()
}
Play Again
func playerItemDidReachEnd(notification: NSNotification) {
    let item = notification.object as? AVPlayerItem
    item?.seekToTime(kCMTimeZero)
}
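One housekeeping note: since this registers self as an observer with NSNotificationCenter, it is good practice to unregister when the object goes away; a minimal sketch:
deinit {
    // Stop receiving notifications once this object is deallocated.
    NSNotificationCenter.defaultCenter().removeObserver(self)
}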
Since Firefox 37 I cannot add volume control to the input (microphone); I get the error:
IndexSizeError: Index or size is negative or greater than the allowed amount
It works fine in Chrome.
Here is the code sample:
var audioContext = new (window.AudioContext || window.webkitAudioContext)(); // define audio context
var microphone = audioContext.createMediaStreamDestination();
var gain = audioContext.createGain();
var speaker = audioContext.createMediaStreamDestination(gain);
gain.gain.value = 1;
microphone.connect(gain);
gain.connect(speaker);
The error is thrown here:
microphone.connect(gain);
Weirdly, it works in Firefox Nightly.
This error is similar to this Stack Overflow question: link
Related link:
link on StackOverflow
Shouldn't you use this for the microphone?
var microphone = audioContext.createMediaStreamSource(stream);
instead of this
var microphone = audioContext.createMediaStreamDestination();
A microphone is not a destination. It is a source.
Firstly I think it should be
var microphone = audioContext.createMediaStreamSource(stream);
Here stream is the microphone audio stream. Find more info here.
Also check out this demo with elaboration here; it is similar to what you are trying. Replacing createMediaElementSource with createMediaStreamSource will work.
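To make that concrete, here is a minimal sketch of the corrected graph; the stream acquisition is my assumption (older builds may need the prefixed getUserMedia variants):
var audioContext = new (window.AudioContext || window.webkitAudioContext)();
navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
    var microphone = audioContext.createMediaStreamSource(stream); // a source, not a destination
    var gain = audioContext.createGain();
    gain.gain.value = 0.5; // volume control on the input
    microphone.connect(gain);
    gain.connect(audioContext.destination); // route to the speakers
});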
How can I get the current playing list that is being played by the "music + video" app? The only thing I can get so far is the currently playing song, but users will have no idea what the next song is or which list is being played.
I checked http://msdn.microsoft.com/en-us/library/ff769558%28VS.92%29.aspx but it's about updating the hub; I want to update my app.
For example, if I let the user choose to play the 3rd album in the hub, I do it like this:
mySongCollection = library.Albums[3].Songs;
and to update my lists I use:
listbox.ItemsSource = mySongCollection;
How can I get the currently playing list from the hub into my mySongCollection list?
One way would be to loop through the MediaPlayer.Queue:
for (int i = 0; i < MediaPlayer.Queue.Count; i++)
{
    Debug.WriteLine(MediaPlayer.Queue[i].Artist.Name);
    // Add the song to the listbox
    playListLb.Items.Add(MediaPlayer.Queue[i]);
}
I am reading a WAV file saved as a byte stream from a web service and want to play it back when my record is displayed. This is a Windows Phone 7 app.
My approach has been to save the byte stream to a WAV file in isolated storage upon navigating to the record, then set the source of my media player (MediaElement1) to that file when a button is clicked and play it back.
Below is the current code in my "PlayButton" handler (the size matches the byte stream, but no audio results). If I set the source to a WAV file stored as a resource it does work, so perhaps I just need to know how to set the Uri for the isolated storage file.
(e.g. the following code works)
mediaElement1.Source = new Uri("SampleData\\MyMedia.wav", UriKind.Relative);
mediaElement1.Position = new TimeSpan(0, 0, 0, 0);
mediaElement1.Play();
Here is my code sample... any ideas?
IsolatedStorageFile isf = IsolatedStorageFile.GetUserStoreForApplication();
IsolatedStorageFileStream str = new IsolatedStorageFileStream(
    "MyMedia.wav", FileMode.Open, isf);
long size = str.Length;
MediaElement mediaelement = new MediaElement();
mediaelement.SetSource(str);
mediaElement1.Source = mediaelement.Source;
mediaElement1.Position = new TimeSpan(0, 0, 0, 0);
mediaElement1.Play();
You shouldn't have to create two media elements. Just call .SetSource on mediaElement1 directly.
I have similar code which sets the MediaElement source to a movie in isolated storage and that works fine:
using (var isf = IsolatedStorageFile.GetUserStoreForApplication())
{
    using (var isfs = new IsolatedStorageFileStream("trailer.wmv", FileMode.Open, isf))
    {
        this.movie.SetSource(isfs);
    }
}
With the above, movie is a MediaElement I've already created in XAML with AutoPlay set to true.
I did have a few issues with the above when first getting it working.
I suggest trying the following to help debug:
Ensure that the file has been written to isolated storage correctly and in its entirety.
Handle the MediaFailed event to find out why it isn't working; see the sketch after this list.
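For instance, a minimal MediaFailed handler might look like this (a sketch; in Silverlight for Windows Phone the event args are ExceptionRoutedEventArgs):
mediaElement1.MediaFailed += (s, e) =>
{
    // Surface the underlying error so you can see why playback failed.
    System.Diagnostics.Debug.WriteLine("Media failed: " + e.ErrorException.Message);
};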
One thing I noticed is that when the device is tethered to the computer, the audio doesn't work... I spent a couple of hours on this one when trying to listen to MP3 files.