AVAssetWriter audio with video together - cocoa

I tried to write a simple demo that captures video with audio on the iPhone (like an in-game recorder). After searching for solutions, I came up with the following:
-(void)startScreenRecording
{
    NSLog(@"start screen recording");
    // Create the AVAssetWriter
    NSString *documentPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *moviePath = [documentPath stringByAppendingPathComponent:@"/video.mov"];
    NSLog(@"moviePath:%@", moviePath);
    if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
    }
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    NSError *movieError = nil;
    [assetWriter release];
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&movieError];
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                              [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                              nil];
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];
    [assetWriterPixelBufferAdaptor release];
    assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                                     initWithAssetWriterInput:assetWriterInput
                                     sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];
    firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    [assetWriter startSessionAtSourceTime:CMTimeMake(0, TIME_SCALE)];
    // Start writing samples to it
    [assetWriterTimer release];
    assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                        target:self
                                                      selector:@selector(writeSample:)
                                                      userInfo:nil
                                                       repeats:YES];
}
-(void)writeSample:(NSTimer *)_timer
{
    if (assetWriterInput.readyForMoreMediaData)
    {
        CVReturn cvErr = kCVReturnSuccess;
        // Get the screenshot image
        CGImageRef image = (CGImageRef)[[self createARGBImageFromRGBAImage:[AWScreenshot takeAsImage]] CGImage];
        // Prepare the pixel buffer
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32ARGB,
                                             (void *)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             NULL,
                                             NULL,
                                             NULL,
                                             &pixelBuffer);
        // Calculate the presentation time
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        //NSLog(@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime = CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE);
        // Write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
        // Release what was created above; both leaked in the original code
        CVPixelBufferRelease(pixelBuffer);
        CFRelease(imageData);
        if (appended)
        {
            NSLog(@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
        }
        else
        {
            NSLog(@"failed to append");
            [self stopScreenRecording];
        }
    }
}
And the video (.mov) file is generated successfully.
But now I'd like to capture the audio from the iPhone (say, sound effects and background music while playing a game) along with the video.
I searched the net, and all I found were solutions for merging an already-existing sound file with an already-existing movie.
Do I have to record the audio and video separately and merge them after recording? Or is there a way to capture them together?
Any suggestion would be much appreciated, thanks :)
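For what it's worth, AVAssetWriter accepts multiple inputs, so audio does not have to be recorded separately and merged afterwards: an AVMediaTypeAudio input can be attached to the same writer and fed CMSampleBuffers as they arrive. A minimal sketch, assuming the app produces or captures its own audio; the AAC settings shown are illustrative assumptions, not taken from the code above:

    // Sketch: add a second input, for audio, on the same writer,
    // before calling [assetWriter startWriting].
    NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                                   [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                                   [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                                   nil];
    AVAssetWriterInput *audioWriterInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                           outputSettings:audioSettings];
    audioWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:audioWriterInput];
    // Then, whenever an audio CMSampleBufferRef arrives (e.g. from an
    // AVCaptureAudioDataOutput delegate or an Audio Unit render callback):
    if (audioWriterInput.readyForMoreMediaData) {
        [audioWriterInput appendSampleBuffer:audioSampleBuffer]; // audioSampleBuffer: hypothetical
    }

The catch is obtaining the audio sample buffers in the first place: for a game's own sound effects and background music you have to tap your own audio pipeline (e.g. an Audio Unit render callback), since there is no public API for recording the device's mixed audio output directly.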

Related

Capturing blank stills from an AVCaptureScreenInput?

I'm working on sampling the screen using AVCaptureScreenInput and outputting it through an AVCaptureVideoDataOutput, and it's not working: the images it outputs are blank, even though I appear to be doing everything right according to all the documentation I've read.
I've made sure the AVCaptureVideoDataOutput emits something a CGImage can read (kCVPixelFormatType_32BGRA). When I run this same code with an AVCaptureMovieFileOutput instead, the movie renders fine and everything looks good, but what I really want is a series of images.
#import "ScreenRecorder.h"
#import <QuartzCore/QuartzCore.h>
@interface ScreenRecorder() <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate> {
    BOOL _isRecording;
@private
    AVCaptureSession *_session;
    AVCaptureOutput *_movieFileOutput;
    AVCaptureStillImageOutput *_imageFileOutput;
    NSUInteger _frameIndex;
    NSTimer *_timer;
    NSString *_outputDirectory;
}
@end
@implementation ScreenRecorder
- (BOOL)recordDisplayImages:(CGDirectDisplayID)displayId toURL:(NSURL *)fileURL windowBounds:(CGRect)windowBounds duration:(NSTimeInterval)duration {
    if (_isRecording) {
        return NO;
    }
    _frameIndex = 0;
    // Create a capture session
    _session = [[AVCaptureSession alloc] init];
    // Set the session preset as you wish
    _session.sessionPreset = AVCaptureSessionPresetHigh;
    // Create a ScreenInput with the display and add it to the session
    AVCaptureScreenInput *input = [[[AVCaptureScreenInput alloc] initWithDisplayID:displayId] autorelease];
    if (!input) {
        [_session release];
        _session = nil;
        return NO;
    }
    if ([_session canAddInput:input]) {
        [_session addInput:input];
    }
    input.cropRect = windowBounds;
    // Create a VideoDataOutput and add it to the session
    _movieFileOutput = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [((AVCaptureVideoDataOutput *)_movieFileOutput) setVideoSettings:[NSDictionary dictionaryWithObjectsAndKeys:@(kCVPixelFormatType_32BGRA), kCVPixelBufferPixelFormatTypeKey, nil]];
    // ((AVCaptureVideoDataOutput *)_movieFileOutput).alwaysDiscardsLateVideoFrames = YES;
    if ([_session canAddOutput:_movieFileOutput])
        [_session addOutput:_movieFileOutput];
    // Start running the session
    [_session startRunning];
    // Delete any existing movie file first
    if ([[NSFileManager defaultManager] fileExistsAtPath:[fileURL path]])
    {
        NSError *err;
        if (![[NSFileManager defaultManager] removeItemAtPath:[fileURL path] error:&err])
        {
            NSLog(@"Error deleting existing movie %@", [err localizedDescription]);
        }
    }
    _outputDirectory = [[fileURL path] retain];
    [[NSFileManager defaultManager] createDirectoryAtPath:_outputDirectory withIntermediateDirectories:YES attributes:nil error:nil];
    // Set the sample buffer delegate to self
    dispatch_queue_t queue = dispatch_queue_create("com.schaefer.lolz", 0);
    [(AVCaptureVideoDataOutput *)_movieFileOutput setSampleBufferDelegate:self queue:queue];
    //dispatch_release(queue);
    if (0 != duration) {
        _timer = [[NSTimer scheduledTimerWithTimeInterval:duration target:self selector:@selector(finishRecord:) userInfo:nil repeats:NO] retain];
    }
    _isRecording = YES;
    return _isRecording;
}
- (void)dealloc
{
    if (nil != _session) {
        [_session stopRunning];
        [_session release];
    }
    [_outputDirectory release];
    _outputDirectory = nil;
    [super dealloc];
}
- (void)stopRecording {
    if (!_isRecording) {
        return;
    }
    _isRecording = NO;
    // Stop recording to the destination movie file
    if ([_movieFileOutput isKindOfClass:[AVCaptureFileOutput class]]) {
        [_movieFileOutput performSelector:@selector(stopRecording)];
    }
    [_session stopRunning];
    [_session release];
    _session = nil;
    [_timer release];
    _timer = nil;
}
-(void)finishRecord:(NSTimer *)timer
{
    [self stopRecording];
}
//AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0); // Lock the image buffer
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0); // Get information about the image
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef image = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    _frameIndex++;
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    dispatch_async(dispatch_get_main_queue(), ^{
        NSURL *URL = [NSURL fileURLWithPath:[_outputDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%d.jpg", (int)_frameIndex]]];
        CGImageDestinationRef destination = CGImageDestinationCreateWithURL((CFURLRef)URL, kUTTypeJPEG, 1, NULL);
        CGImageDestinationAddImage(destination, image, nil);
        if (!CGImageDestinationFinalize(destination)) {
            NSLog(@"Failed to write image to %@", URL);
        }
        CFRelease(destination);
        CFRelease(image);
    });
}
@end
Your data isn't planar, so there is no base address for plane 0--there's no plane 0. (To be sure, you can check with CVPixelBufferIsPlanar.) You'll need CVPixelBufferGetBaseAddress to get a pointer to the first pixel. All the data will be interleaved.
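A sketch of the corresponding fix in the delegate callback, assuming the buffer really is non-planar BGRA as the video settings request:

    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Non-planar ("chunky") buffers expose a single base address:
    if (!CVPixelBufferIsPlanar(imageBuffer)) {
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
        // ... build the CGBitmapContext from baseAddress as before ...
    }
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);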

How to take a screenshot when a webview has finished rendering

I want to take a screenshot when a WebView has finished rendering. The following is my code:
-(void)webView:(WebView *)sender didFinishLoadForFrame:(WebFrame *)frame
{
    if (frame != [sender mainFrame]) {
        return;
    }
    NSBitmapImageRep *rep = [[[NSBitmapImageRep alloc] initWithFocusedViewRect:[sender bounds]] autorelease];
    if (rep) {
        NSImage *img = [[NSImage alloc] initWithData:[rep TIFFRepresentation]];
        NSData *imgData = [img TIFFRepresentation];
        NSArray *deskTopArrayPaths = NSSearchPathForDirectoriesInDomains(NSDesktopDirectory, NSUserDomainMask, YES);
        NSString *deskTopPath = [deskTopArrayPaths objectAtIndex:0];
        NSString *pngPath = [NSString stringWithFormat:@"%@/SaveWebPage.png", deskTopPath];
        [[NSFileManager defaultManager] createFileAtPath:pngPath contents:imgData attributes:nil];
    }
}
(The question included two screenshots, the expected page rendering and the broken capture actually produced; the images are not reproduced here.)
Can anyone tell me why this happens and how I could fix it? Thank you!
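One likely culprit: -initWithFocusedViewRect: reads pixels from the currently focused view, and inside a delegate callback there is usually no focus lock in place, so the capture is undefined. The code above also writes TIFF bytes ([img TIFFRepresentation]) into a file named .png. A sketch of a variant that avoids both issues (same delegate method, same desktop destination as in the question):

    -(void)webView:(WebView *)sender didFinishLoadForFrame:(WebFrame *)frame
    {
        if (frame != [sender mainFrame]) {
            return;
        }
        // Render the view into a bitmap without relying on the focused view.
        NSBitmapImageRep *rep = [sender bitmapImageRepForCachingDisplayInRect:[sender bounds]];
        [sender cacheDisplayInRect:[sender bounds] toBitmapImageRep:rep];
        // Ask the rep for actual PNG data rather than writing TIFF bytes to a .png file.
        NSData *pngData = [rep representationUsingType:NSPNGFileType properties:nil];
        NSString *deskTopPath = [NSSearchPathForDirectoriesInDomains(NSDesktopDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        [pngData writeToFile:[deskTopPath stringByAppendingPathComponent:@"SaveWebPage.png"] atomically:YES];
    }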

Copy partial screenshot to Pasteboard

So, the code to copy part of my screen to the pasteboard works, since it was successfully copying into my photo album. But I want to be able to paste the partial screenshot into a new SMS message. I know that has to be done manually (long press on the message and Paste), but it either pastes nothing or doesn't offer the Paste option at all (when it was saved as a string). The middle portion of the code is the part I'm struggling with; any help would be great. I've changed the forPasteboardType: to "image", but that doesn't work either.
//Capture part of the screen
UIGraphicsBeginImageContext(self.view.bounds.size);
CGContextRef c = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(c, 0, 98);
[self.view.layer renderInContext:c];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//Send the screenshot to the pasteboard
UIPasteboard *pasteBoard = [UIPasteboard pasteboardWithName:UIPasteboardNameGeneral create:YES];
pasteBoard.persistent = YES;
NSData *data = UIImagePNGRepresentation(viewImage);
[pasteBoard setData:data forPasteboardType:(NSString *)kUTTypePNG];
//Open the SMS composer
MFMessageComposeViewController *controller = [[[MFMessageComposeViewController alloc] init] autorelease];
if ([MFMessageComposeViewController canSendText])
{
    controller.body = @"Hello from me, paste image here -->";
    controller.recipients = [NSArray arrayWithObjects:@"123456789", nil];
    controller.messageComposeDelegate = self;
    [self presentModalViewController:controller animated:YES];
}
//End SMS
}
//Variant 2: same capture and pasteboard code, but opening Messages via the sms: URL scheme
UIGraphicsBeginImageContext(self.view.bounds.size);
CGContextRef c = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(c, 0, 98);
[self.view.layer renderInContext:c];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//Send the screenshot to the pasteboard
UIPasteboard *pasteBoard = [UIPasteboard pasteboardWithName:UIPasteboardNameGeneral create:YES];
pasteBoard.persistent = YES;
NSData *data = UIImagePNGRepresentation(viewImage);
[pasteBoard setData:data forPasteboardType:(NSString *)kUTTypePNG];
NSString *stringURL = @"sms:";
NSURL *url = [NSURL URLWithString:stringURL];
[[UIApplication sharedApplication] openURL:url];
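One thing that may help (a sketch, not verified against every iOS version): UIPasteboard has a convenience image property that stores the image under the standard image types the Paste menu in Messages looks for, whereas raw PNG data set under kUTTypePNG is not always recognized.

    // Sketch: use the general pasteboard's image property instead of raw PNG data.
    UIPasteboard *pasteBoard = [UIPasteboard generalPasteboard];
    pasteBoard.persistent = YES;
    pasteBoard.image = viewImage; // stored under standard image types Messages can paste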

desktop wallpaper [duplicate]

I am trying to change the desktop image; the procedure I've come up with is below. The first time this code runs, the resized image is set as the wallpaper, but on subsequent runs nothing happens. What am I doing wrong?
-(IBAction)click:(id)sender
{
    NSData *sourceData;
    NSError *error;
    NSFileManager *filemgr;
    filemgr = [NSFileManager defaultManager];
    screenArray = [NSScreen screens];
    screenCount = [screenArray count];
    unsigned index = 0;
    for (index; index < screenCount; index++)
    {
        screenz = [screenArray objectAtIndex:index];
        screenRect = [screenz visibleFrame];
    }
    NSLog(@"%fx%f", screenRect.size.width, screenRect.size.height);
    arrCatDetails = [strCatDetails componentsSeparatedByString:appDelegate.strColDelimiter];
    NSString *imageURL = [NSString stringWithFormat:@"upload/product/image/%@_%@_%d.jpg", [arrCatDetails objectAtIndex:0], appDelegate.str104by157Name, iSelectedImgIndex];
    NSString *ima = [imageURL lastPathComponent];
    NSString *str = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSString *dataFilePath = [str stringByAppendingPathComponent:ima];
    NSString *imagePath = [NSString stringWithFormat:@"file://localhost%@", dataFilePath];
    NSURL *url = [[NSURL alloc] init];
    url = [NSURL URLWithString:imagePath];
    sourceData = [NSData dataWithContentsOfURL:url];
    sourceImage = [[NSImage alloc] initWithData:sourceData];
    resizedImage = [[NSImage alloc] initWithSize:NSMakeSize(screenRect.size.width, screenRect.size.height)];
    NSSize originalSize = [sourceImage size];
    [resizedImage lockFocus];
    [sourceImage drawInRect:NSMakeRect(0, 0, screenRect.size.width, screenRect.size.height) fromRect:NSMakeRect(0, 0, originalSize.width, originalSize.height) operation:NSCompositeSourceOver fraction:1.0];
    [resizedImage unlockFocus];
    NSData *resizedData = [resizedImage TIFFRepresentation];
    NSBitmapImageRep *theImageRepresentation = [NSBitmapImageRep imageRepWithData:resizedData];
    newimage = @"editwall.jpg";
    newFilePath = [str stringByAppendingPathComponent:newimage];
    NSData *theImageData = [theImageRepresentation representationUsingType:NSJPEGFileType properties:nil];
    [theImageData writeToFile:newFilePath atomically:YES];
    if ([filemgr fileExistsAtPath:newFilePath] == YES)
    {
        imagePath1 = [NSString stringWithFormat:@"file://localhost%@", newFilePath];
        urlz = [NSURL URLWithString:imagePath1];
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:nil, NSWorkspaceDesktopImageFillColorKey, [NSNumber numberWithBool:NO], NSWorkspaceDesktopImageAllowClippingKey, [NSNumber numberWithInteger:NSImageScaleProportionallyUpOrDown], NSWorkspaceDesktopImageScalingKey, nil];
        [[NSWorkspace sharedWorkspace] setDesktopImageURL:urlz forScreen:[[NSScreen screens] lastObject] options:options error:&error];
    }
    else
    {
        NSLog(@"No");
    }
    [sourceImage release];
    [resizedImage release];
}
Why not try -[NSWorkspace setDesktopImageURL:forScreen:options:error:]? Apple has a sample project called DesktopImage to give you some idea how to use it.
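The basic call looks like this (a sketch; the image path is a hypothetical placeholder):

    NSError *error = nil;
    NSURL *imageURL = [NSURL fileURLWithPath:@"/path/to/wallpaper.jpg"]; // hypothetical path
    [[NSWorkspace sharedWorkspace] setDesktopImageURL:imageURL
                                            forScreen:[NSScreen mainScreen]
                                              options:[NSDictionary dictionary]
                                                error:&error];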
Edit (after reading your code more carefully):
The problem may be your call to +[NSDictionary dictionaryWithObjectsAndKeys:]. See the nil at the end of the list of arguments? That's how you tell NSDictionary that your argument list is done. You can't put nil in the list as a value, because reading stops at that point; since nil is the very first object in your list, the options dictionary ends up empty. If you want to specify a key that has no value, you have to use [NSNull null].
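So the corrected options could simply drop the fill-color key, for example (a sketch, keeping the other keys from the question's code):

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:NO], NSWorkspaceDesktopImageAllowClippingKey,
                             [NSNumber numberWithInteger:NSImageScaleProportionallyUpOrDown], NSWorkspaceDesktopImageScalingKey,
                             nil];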
An aside: you've got a memory management issue in your code:
// allocates memory for an NSURL
NSURL * url = [[NSURL alloc] init];
// allocates more memory for an NSURL, and leaks
// the earlier allocation
url = [NSURL URLWithString:imagePath];
Just do one or the other:
// If you do it this way, you will have to call
// [url release] later
NSURL * url = [[NSURL alloc] initWithString:imagePath];
// This memory will be released automatically
NSURL * otherUrl = [NSURL URLWithString:imagePath];

xcode iphone - jerky scroll UITableView CellForRowAtIndexPath

Almost sorted with my first app, just a simple news app, but when I load it onto my iPhone the scrolling seems jerky. Can someone have a look at my function and see if I'm doing something wrong?
I need the image on the right-hand side; that's why I'm using custom cells.
Thanks for any help.
#define DATELABEL_TAG 1
#define MAINLABEL_TAG 2
#define PHOTO_TAG 3
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    static NSString *MainNewsCellIdentifier = @"MainNewsCellIdentifier";
    UILabel *mainLabel, *dateLabel;
    UIImageView *photo;
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:MainNewsCellIdentifier];
    if (cell == nil)
    {
        cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleSubtitle reuseIdentifier:MainNewsCellIdentifier] autorelease];
        cell.accessoryType = UITableViewCellAccessoryDetailDisclosureButton;
        dateLabel = [[[UILabel alloc] initWithFrame:CGRectMake(15.0, 15.0, 170.0, 15.0)] autorelease];
        dateLabel.tag = DATELABEL_TAG;
        dateLabel.font = [UIFont systemFontOfSize:10.0];
        dateLabel.textAlignment = UITextAlignmentLeft;
        dateLabel.textColor = [UIColor darkGrayColor];
        dateLabel.autoresizingMask = UIViewAutoresizingFlexibleRightMargin; //| UIViewAutoresizingFlexibleHeight;
        [cell.contentView addSubview:dateLabel];
        mainLabel = [[[UILabel alloc] initWithFrame:CGRectMake(15.0, 28.0, 170.0, 60.0)] autorelease];
        mainLabel.tag = MAINLABEL_TAG;
        mainLabel.font = [UIFont boldSystemFontOfSize:14.0];
        mainLabel.textColor = [UIColor blackColor];
        mainLabel.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleRightMargin;
        mainLabel.numberOfLines = 0;
        //mainLabel.backgroundColor = [UIColor greenColor];
        [cell.contentView addSubview:mainLabel];
        photo = [[[UIImageView alloc] initWithFrame:CGRectMake(190.0, 15.0, 85.0, 85.0)] autorelease];
        photo.tag = PHOTO_TAG;
        photo.contentMode = UIViewContentModeScaleAspectFit;
        [cell.contentView addSubview:photo];
    }
    else {
        dateLabel = (UILabel *)[cell.contentView viewWithTag:DATELABEL_TAG];
        mainLabel = (UILabel *)[cell.contentView viewWithTag:MAINLABEL_TAG];
        photo = (UIImageView *)[cell.contentView viewWithTag:PHOTO_TAG];
    }
    NSUInteger row = [indexPath row];
    NSDictionary *stream = (NSDictionary *)[dataList objectAtIndex:row];
    NSString *title = [stream valueForKey:@"title"];
    NSString *titleString = @"";
    if (![title isKindOfClass:[NSString class]])
    {
        titleString = @"";
    }
    else
    {
        titleString = title;
    }
    CGSize maximumSize = CGSizeMake(180, 9999);
    UIFont *dateFont = [UIFont fontWithName:@"Helvetica" size:14];
    CGSize dateStringSize = [titleString sizeWithFont:dateFont
                                    constrainedToSize:maximumSize
                                        lineBreakMode:mainLabel.lineBreakMode];
    CGRect dateFrame = CGRectMake(15.0, 28.0, 170.0, dateStringSize.height);
    mainLabel.frame = dateFrame;
    mainLabel.text = titleString;
    dateLabel.text = [stream valueForKey:@"created"];
    // Synchronous download on the main thread; this is what the answers below address
    NSString *i = [NSString stringWithFormat:@"http://www.website.co.uk/images/%@", [stream valueForKey:@"image"]];
    NSData *imageURL = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:i]];
    UIImage *newsImage = [[UIImage alloc] initWithData:imageURL];
    photo.image = newsImage;
    [imageURL release];
    [newsImage release];
    return cell;
}
The problem is this:
NSString *i = [NSString stringWithFormat:@"http://www.website.co.uk/images/%@", [stream valueForKey:@"image"]];
NSData *imageURL = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:i]];
UIImage *newsImage = [[UIImage alloc] initWithData:imageURL];
You effectively say here that as soon as the cell needs to be displayed, the image must be fetched and presented. That costs time, and too much of it for a good user experience.
You should fetch the images before or while you present the table view, and cache them, e.g. in an array. Or handle things asynchronously: do the loading in the background, and don't hold up return cell; until the image is actually downloaded. This is a little harder to get right; a minimal sketch follows.
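For illustration, a GCD sketch of the asynchronous variant (using the names from the question's code; re-fetching the cell on completion guards against cell reuse):

    // Inside tableView:cellForRowAtIndexPath:, replacing the synchronous load:
    photo.image = nil; // or a placeholder image
    NSString *i = [NSString stringWithFormat:@"http://www.website.co.uk/images/%@", [stream valueForKey:@"image"]];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:i]];
        UIImage *newsImage = [UIImage imageWithData:imageData];
        dispatch_async(dispatch_get_main_queue(), ^{
            // Re-fetch the cell; it may have scrolled off screen or been reused.
            UITableViewCell *visibleCell = [tableView cellForRowAtIndexPath:indexPath];
            UIImageView *photoView = (UIImageView *)[visibleCell.contentView viewWithTag:PHOTO_TAG];
            photoView.image = newsImage;
        });
    });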
Even if you download images asynchronously, you'll still have jerky scrolling.
Why? It's lazy image decompression: iOS decompresses an image at the moment it is first drawn to the screen, so you have to decompress images manually on a background thread.
Decompression does not merely mean instantiating a UIImage object; it can be somewhat involved. The simplest solution is to download SDWebImage and use the image decompressor that's included. SDWebImage will asynchronously download images and perform the decompression for you.
To read more about the issue see: http://www.cocoanetics.com/2011/10/avoiding-image-decompression-sickness/
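The underlying technique, if you'd rather not pull in a library, is to draw the image into your own bitmap context on a background queue, which forces the decode immediately; a sketch of the general idea (a hypothetical helper, not SDWebImage's exact code):

    UIImage *ForceDecompressedImage(UIImage *image) {
        CGImageRef cgImage = image.CGImage;
        size_t width = CGImageGetWidth(cgImage);
        size_t height = CGImageGetHeight(cgImage);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        // Drawing into a bitmap context performs the JPEG/PNG decode now,
        // on the current (background) thread, instead of at display time.
        CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
                                                 kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
        CGColorSpaceRelease(colorSpace);
        if (!ctx) return image;
        CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);
        CGImageRef decompressed = CGBitmapContextCreateImage(ctx);
        CGContextRelease(ctx);
        UIImage *result = [UIImage imageWithCGImage:decompressed];
        CGImageRelease(decompressed);
        return result;
    }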
