I have a Xamarin.iOS app that needs to play a video from the camera roll. The app records a video from the camera and then saves it to the camera roll. I then need to play this recording in the app. I have the OutputFileUrl from the recording, e.g.
file:///private/var/mobile/Containers/Data/Application/872214F0-8C50-46ED-854C-D0C51AF11508/tmp/64E56883-701E-451D-8964-D974C17CAE7E-294-0000001438002E09.mov
However, if I pass this to the constructor of MPMoviePlayerController, it does not play, e.g.
moviePlayer = new MPMoviePlayerController(new NSUrl("file:///private/var/mobile/Containers/Data/Application/872214F0-8C50-46ED-854C-D0C51AF11508/tmp/64E56883-701E-451D-8964-D974C17CAE7E-294-0000001438002E09.mov"));
What am I doing wrong here?
Not sure, but it looks like that might be a temp directory rather than the URL of the video in the Camera Roll. I am not very familiar with this, but it might be that once you save the video to the camera roll, the temp file is deleted. The only way I can see to get the URL for the video in the camera roll is to use a UIImagePickerController, which lets the user select a video to play from their photos library. If you don't want to use a UIImagePickerController, then perhaps you should save the video to the app's Documents folder as well as saving it to the camera roll.
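Something along these lines might work, a minimal sketch assuming outputFileUrl is the NSUrl your recording callback hands you (the class and method names here are illustrative, Unified API namespaces assumed):

using System;
using System.IO;
using Foundation;
using MediaPlayer;
using UIKit;

public class PlaybackViewController : UIViewController
{
    MPMoviePlayerController moviePlayer;

    void PlayRecordedVideo(NSUrl outputFileUrl)
    {
        // Copy the recording out of tmp/ before iOS reclaims it.
        var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
        var destPath = Path.Combine(documents, Path.GetFileName(outputFileUrl.Path));
        File.Copy(outputFileUrl.Path, destPath, overwrite: true);

        // Keep the player in a field so it isn't collected while playing.
        moviePlayer = new MPMoviePlayerController(NSUrl.FromFilename(destPath));
        moviePlayer.View.Frame = View.Bounds;
        View.AddSubview(moviePlayer.View);
        moviePlayer.Play();
    }
}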
How can I overlay an image onto a video without changing the video file?
I have many videos and I want to be able to open them, overlay a ruler onto them, and then visually measure the distance an individual moved. All I want is to play a video and then open an image with some transparency and position the image over the video. This way I would be able to look at the video and see how far the individual moved.
I would like to do this without having to embed the image like a watermark, because that is computationally expensive: I would need to copy the video, embed the ruler into it, watch it, and then delete that video file. This seems unnecessary. I would like to just watch the video and have a transparent image over it while I am watching.
Is there a program that does this all together?
Alternatively, is there a program which I can use to open an image and make it transparent and then move it over the video that is playing?
Note: I am using Windows.
It sounds from your requirements that simply overlaying a separate image layer over the video will meet your needs.
Implementing this approach will depend on the video player client you are using, but you could implement an HTML5-based solution and play the videos locally with this (or even from a URL on the web if you have them there).
There is a nice answer with a working fiddle which shows how to do this with HTML5 here: https://stackoverflow.com/a/31175193/334402
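As a rough illustration of that approach (the file names here are placeholders), a small local page can position a semi-transparent ruler image over a video element:

<!-- Sketch: clip.mp4 and ruler.png are placeholder file names. -->
<div style="position: relative; width: 640px;">
  <video src="clip.mp4" width="640" controls></video>
  <!-- The overlay sits above the video; pointer-events: none keeps the controls clickable. -->
  <img src="ruler.png"
       style="position: absolute; top: 0; left: 0; width: 640px; opacity: 0.5; pointer-events: none;">
</div>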
One thing to note - you have not mentioned scale in your question. If you need to measure how far the person has moved in real distance, rather than just in centimetres across the video screen, then you will need to somehow work out the scale of the video. This makes things considerably harder, as the video may zoom in and out during the sequence you want to measure, so you would need some reference to calculate the scale for each frame. One approach would be to use the individual as a reference, assuming they are in all the frames you are interested in.
What about using good old VLC for that?
Open VLC, go to Tools→Effects and Filters→Video Effects→Overlay and select the Add logo checkbox.
Then add your transparent overlay image and play any video with VLC.
I need to be able to take a video from Photos and re-render it: clipping it in time, and changing the width, height, and frame rate. Certainly I need to start with:
PHContentEditingInputRequestOptions *options = [[PHContentEditingInputRequestOptions alloc] init];
[self.asset requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
// Get full image
NSURL *url = [contentEditingInput fullSizeImageURL];
}];
And I should be able to adjust width, height, and duration, grab an NSData from that, and write it out to the file system.
But the url is nil, which implies to me that I can't edit videos with the new Photos framework. (ALAsset didn't have a problem with this using AVAssetExportSession.) This makes sense since the Apple Dev sample code can't edit videos either.
Now, to make life easier I could just pass that url to an AVAssetExportSession but I can't, because it is nil. If I just modified width, height and duration I'd still need to grab an NSData from it, write that out to the file system.
I do not need to write the modified video back to Photos, I actually need the video on the file system since I'll be uploading it to our servers.
fullSizeImageURL is for working with photo assets. You want the avAsset property when working with a video. You modify the actual video, not the metadata, by writing a new video file.
To do that, you could use that avAsset in an AVMutableComposition (a rough sketch follows these two steps):
Insert the appropriate time range of the avAsset's video track (AVAssetTrack) into an AVMutableCompositionTrack. That'll do your trimming.
Place/size it appropriately using layer instructions (AVMutableVideoCompositionLayerInstruction) to do your cropping and scaling.
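A minimal sketch of that flow, assuming the contentEditingInput from above, with placeholder values for the time range, render size, and output path, and error handling omitted:

// Trim and scale contentEditingInput.avAsset, then export to the file system.
AVAsset *avAsset = contentEditingInput.avAsset;
AVAssetTrack *videoTrack = [[avAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];

// Trimming: keep only the first 5 seconds (placeholder range).
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(5.0, 600));
[compTrack insertTimeRange:range ofTrack:videoTrack atTime:kCMTimeZero error:nil];

// Cropping/scaling: a layer instruction plus a render size on the video composition.
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compTrack];
[layerInstruction setTransform:CGAffineTransformMakeScale(0.5, 0.5) atTime:kCMTimeZero];

AVMutableVideoCompositionInstruction *instruction =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
instruction.layerInstructions = @[layerInstruction];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = @[instruction];
videoComposition.renderSize = CGSizeMake(640, 360);   // placeholder width/height
videoComposition.frameDuration = CMTimeMake(1, 30);   // placeholder frame rate

// Export to a file you can upload; this never touches the Photos library.
NSURL *outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"rerendered.mp4"]];
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetMediumQuality];
export.videoComposition = videoComposition;
export.outputURL = outputURL;
export.outputFileType = AVFileTypeMPEG4;
[export exportAsynchronouslyWithCompletionHandler:^{
    // When status is AVAssetExportSessionStatusCompleted, outputURL is ready to upload.
}];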
I'd like to add a poster to my audio player previews just like with the video mode of the mediaelement.js player.
I've checked all the shortcodes and there doesn't seem to be anything for this. Basically all I want is a static image immediately above the audio player that is visible before and during playing.
The reason I want this to be an 'all-in-one' player is because my WP theme allows me to create posts in the 'audio' format so that the audio player appears at the top of my category view of recent posts. It'd be awesome to add that image for each post and have it be part of the audio player. Kind of like video, but with less overhead.
Thanks in advance.
I have an application which is simply an animation (some circles moving around).
I want to know how I can save this animation as a video, like MP4?
Or is it possible to record (capture) what happens inside a node and save it in a video format?
There is no built-in functionality for that.
If you just want to record how your application runs, there are several tools for that, e.g. Fraps.
If you want to create your own video programmatically, you need to use some 3rd party software (or write one) which allows you to encode a set of images into a video, e.g. Xuggle. Here you can find how to take screenshots in JavaFX: Taking a screenshot of a scene or a portion of a scene in JavaFx 2.2
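A minimal sketch of the screenshot half, assuming you drive it from something like an AnimationTimer (the class and file names are placeholders, and the resulting frame files would still need to be fed to an encoder such as Xuggle):

// Snapshot a JavaFX node once per frame and write it out as a numbered PNG.
import java.io.File;
import javax.imageio.ImageIO;
import javafx.embed.swing.SwingFXUtils;
import javafx.scene.Node;
import javafx.scene.image.WritableImage;

public class FrameGrabber {
    private int frameIndex = 0;

    // Must be called on the JavaFX Application Thread.
    public void captureFrame(Node node) throws Exception {
        WritableImage snapshot = node.snapshot(null, null);
        File out = new File(String.format("frame_%05d.png", frameIndex++));
        ImageIO.write(SwingFXUtils.fromFXImage(snapshot, null), "png", out);
    }
}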
Can I use this code to save a recorded video in the Camera Roll?
MediaLibrary.SavePictureToCameraRoll(fileName, stream);
where stream is a photo stream or a video stream.
You can't save videos to the camera roll from a third-party application at the moment. That API can be used only to save pictures.
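For pictures, a minimal sketch of that usage (the stream here stands in for whatever capture stream you already have, and the capability note is an assumption about a typical WP8 manifest):

// Save a captured photo stream to the camera roll on Windows Phone.
// The ID_CAP_MEDIALIB_PHOTO capability is typically required in the app manifest.
using System.IO;
using Microsoft.Xna.Framework.Media;

public static class CameraRollHelper
{
    public static void SavePhoto(Stream photoStream, string fileName)
    {
        var library = new MediaLibrary();
        photoStream.Seek(0, SeekOrigin.Begin);   // rewind before handing the stream off
        library.SavePictureToCameraRoll(fileName, photoStream);
    }
}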
Apparently it is. Have you tried it yet?
A quick Google search came up with this:
http://wp.qmatteoq.com/how-to-save-a-picture-captured-with-the-new-cameras-api-in-the-camera-roll-in-windows-phone-8/