CABasicAnimation not always working - macOS

I'm just beginning to use CABasicAnimations. So far it seems to me like the same code won't necessarily work twice on anything. In one particular instance (the solution for which may cure all my ills!) I have made my own (indeterminate) progress indicator: just a PNG from Photoshop which is rotated until a task is done. It's set up in the view's initWithFrame: method:
CALayer *mainLayer = [CALayer layer];
[myView setWantsLayer:YES];
[myView setLayer:mainLayer];
progressLayer = [CALayer layer];
progressLayer.opacity = 0;
progressLayer.cornerRadius = 0.0;
progressLayer.bounds = CGRectMake(0.0,0.0,50.0,50.0);
NSDictionary* options = [NSDictionary dictionaryWithObjectsAndKeys:
    (id)kCFBooleanTrue, (id)kCGImageSourceShouldCache,
    (id)kCFBooleanTrue, (id)kCGImageSourceShouldAllowFloat,
    (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
    nil];
CGImageSourceRef isr = CGImageSourceCreateWithURL((__bridge CFURLRef)[[NSBundle mainBundle] URLForImageResource:@"progress_indicator.png"], NULL);
progressLayer.contents = (__bridge_transfer id)CGImageSourceCreateImageAtIndex(isr, 0, (__bridge CFDictionaryRef)options); // transfer ownership so the CGImage is not leaked
CFRelease(isr);
[mainLayer addSublayer:progressLayer];
And then brought 'onscreen' in a separate method with:
[CATransaction begin]; //I did this block to snap the indicator to the centre
[CATransaction setValue:(id)kCFBooleanTrue forKey:kCATransactionDisableActions];
progressLayer.anchorPoint = anchorMiddle; //make sure the png is in the view centre
progressLayer.position = viewCentre;
progressLayer.opacity = 1.0;
[CATransaction setValue:(id)kCFBooleanFalse forKey:kCATransactionDisableActions];
[CATransaction commit];
[CATransaction flush];
CABasicAnimation* rotationAnim = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
rotationAnim.fromValue = [NSNumber numberWithFloat:0.0];
rotationAnim.toValue = [NSNumber numberWithFloat:-2 * M_PI];
rotationAnim.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
rotationAnim.duration = 5;
rotationAnim.repeatCount = 10000;
rotationAnim.removedOnCompletion = NO;
rotationAnim.autoreverses = NO;
[progressLayer addAnimation:rotationAnim forKey:@"transform.rotation.z"];
It often works - but not always. In general, CABasicAnimations are driving me slightly mad: I cut and paste code from the internet and sometimes it works, sometimes it doesn't. My only thought is that it's being blocked by other threads. I have a minimum of four tasks despatched using GCD. Is it just the case that I've blocked up my MacBook Pro?
Thanks,
Todd.

Oh dear. I think I just found the problem: I was calling the progress indicator from within a GCD block. I took the call out and into the main body of the code (as it were) and all seems good now....
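For reference, the usual pattern when the surrounding work runs on a background queue is to hop back to the main queue before touching the layer, since AppKit and a view's layer tree should only be manipulated on the main thread. A minimal sketch (startProgressIndicator is a hypothetical helper wrapping the animation code above):
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // ... long-running background work ...
    dispatch_async(dispatch_get_main_queue(), ^{
        [self startProgressIndicator]; // hypothetical helper that runs the animation code above on the main thread
    });
});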

Related

Centering text in OS X Screensaver

I'm working on implementing a (very) simple screensaver using the ScreenSaver framework in OS X 10.10. Positioning the text in the center of the screen and displaying two lines works without problem, but setting the alignment to NSCenterTextAlignment somehow doesn't work (the text is always displayed left-aligned).
- (void)animateOneFrame
{
// calculate font size based on screen
NSSize size = [self bounds].size;
CGFloat fontsize = size.height / 11;
// set text
NSMutableParagraphStyle *centredStyle = [[NSParagraphStyle defaultParagraphStyle] mutableCopy];
[centredStyle setAlignment:NSCenterTextAlignment];
NSDictionary *textAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
[NSFont fontWithName:@"Futura" size:fontsize], NSFontAttributeName,
[NSColor orangeColor], NSForegroundColorAttributeName,
centredStyle, NSParagraphStyleAttributeName,
nil];
NSString *theText = @"Simple text spanning\ntwo lines";
// position text on screen
NSRect boundingRect = [theText boundingRectWithSize:size options:0 attributes:textAttributes];
NSPoint point = NSMakePoint((size.width - boundingRect.size.width) / 2.0,
(size.height - boundingRect.size.height) / 2.0);
[theText drawAtPoint: point withAttributes: textAttributes];
}
Any pointers on how to solve this are appreciated.
PS: I know that I don't need to put everything into animateOneFrame, but for the moment the goal is to get it working at all :-)
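One possible direction, not a confirmed fix: with drawAtPoint: each line gets an unconstrained line fragment, so the paragraph alignment has nothing to center against; drawing into a rect of the full view width should let NSCenterTextAlignment take effect. A sketch under that assumption:
NSRect textRect = NSMakeRect(0.0,
                             (size.height - boundingRect.size.height) / 2.0,
                             size.width,
                             boundingRect.size.height);
[theText drawInRect:textRect withAttributes:textAttributes];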

CABasicAnimation animates but the view's frame is not persistent

I have been playing with CABasicAnimation and CAAnimationGroup today, and I am already in love with it. I have a couple of basic animations happening, in which the circle shape shrinks and also scales down to a rounded-square shape, just like the Voice Memos application in iOS 7.
Below is the code for it.
CABasicAnimation *corner = [CABasicAnimation animationWithKeyPath:@"cornerRadius"];
corner.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
corner.fromValue = [NSNumber numberWithFloat:recordingShape.layer.cornerRadius];
corner.toValue = [NSNumber numberWithFloat:30.0f];
corner.duration = 1.0;
//shrinking - scaling
CABasicAnimation* shrink = [CABasicAnimation animationWithKeyPath:@"transform.scale"];
shrink.toValue = [NSNumber numberWithDouble:0.5];
shrink.duration = 0.5;
// Two animations concurrently so set up CAAnimationGroup
CAAnimationGroup *group = [CAAnimationGroup animation];
[group setDuration:0.5];
[group setAnimations:[NSArray arrayWithObjects:shrink, corner, nil]];
// Animate the layer
[[recordingShape layer] addAnimation:group forKey:@"bounceAndFade"];
The animation runs nicely as expected, but afterwards the layer goes back to its original state as a round circle. Could anyone guide me on how to persist the layer's final frame?
Thanks.
Well, I didn't know you could set up a delegate for the CAAnimationGroup, so I now change the actual cornerRadius and transform in the animationDidStop: callback.
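A minimal sketch of that delegate approach (assumed, not the poster's exact code): set the group's delegate before adding the animation, then write the end values into the model layer once it finishes.
group.delegate = self;  // assumes self implements the CAAnimation delegate callback
[[recordingShape layer] addAnimation:group forKey:@"bounceAndFade"];

// ... then, elsewhere in the same class:
- (void)animationDidStop:(CAAnimation *)anim finished:(BOOL)flag {
    // Commit the end values to the model layer so they persist after the
    // (purely visual) animation is removed.
    recordingShape.layer.cornerRadius = 30.0f;
    recordingShape.layer.transform = CATransform3DMakeScale(0.5, 0.5, 1.0);
}
The same result can also be had without a delegate by setting those model values on the layer just before adding the animation.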

AVMutableComposition issue: black frame at the end

I am capturing a video using AVCaptureConnection in my iOS app. After that I add some images into the video as CALayers. Everything works fine, but I get a black frame at the very end of the resulting video after adding the images. No actual audio/video frames are affected by this. For the audio, I extract it, change its pitch, and then add it using AVMutableComposition. Here is the code I am using. Please help me see what I am doing wrong, or whether I need to add something else.
cmp = [AVMutableComposition composition];
AVMutableCompositionTrack *videoComposition = [cmp addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioComposition = [cmp addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *sourceVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *sourceAudioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[videoComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil] ;
[audioComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceAudioTrack atTime:kCMTimeZero error:nil];
animComp = [AVMutableVideoComposition videoComposition];
animComp.renderSize = CGSizeMake(320, 320);
animComp.frameDuration = CMTimeMake(1,30);
animComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
// to gather the audio part of the video
NSArray *tracksToDuck = [cmp tracksWithMediaType:AVMediaTypeAudio];
NSMutableArray *trackMixArray = [NSMutableArray array];
for (NSInteger i = 0; i < [tracksToDuck count]; i++) {
AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[tracksToDuck objectAtIndex:i]];
[trackMix setVolume:5 atTime:kCMTimeZero];
[trackMixArray addObject:trackMix];
}
audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = trackMixArray;
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [asset duration]);
AVMutableVideoCompositionLayerInstruction *layerVideoInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoComposition];
[layerVideoInstruction setOpacity:1.0 atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:layerVideoInstruction] ;
animComp.instructions = [NSArray arrayWithObject:instruction];
[self exportMovie:self];
This is my method for exporting the video
- (IBAction)exportMovie:(id)sender {
    //successCheck = NO;
    NSArray *docPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *tempPath = [docPaths objectAtIndex:0];
    //NSLog(@"Temp Path: %@", tempPath);
    NSString *fileName = [NSString stringWithFormat:@"%@/Final.MP4", tempPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:fileName]) {
        NSError *ferror = nil;
        [fileManager removeItemAtPath:fileName error:&ferror];
    }
    NSURL *exportURL = [NSURL fileURLWithPath:fileName];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:cmp presetName:AVAssetExportPresetMediumQuality];
    exporter.outputURL = exportURL;
    exporter.videoComposition = animComp;
    //exporter.audioMix = audioMix;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    [exporter exportAsynchronouslyWithCompletionHandler:^(void){
        switch (exporter.status) {
            case AVAssetExportSessionStatusFailed: {
                NSLog(@"Fail");
                break;
            }
            case AVAssetExportSessionStatusCompleted: {
                NSLog(@"Success video");
                break;
            }
            default:
                break;
        }
    }];
    NSLog(@"outside");
}
There is a timeRange property on the export session for specifying the time range to export.
Try giving it a time range a little shorter than the actual duration (a few nanoseconds less).
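In code, that suggestion might look something like the following (the exact amount to trim is an arbitrary guess):
CMTime trimmedDuration = CMTimeSubtract([asset duration], CMTimeMake(1, 600)); // shave a tiny slice off the end
exporter.timeRange = CMTimeRangeMake(kCMTimeZero, trimmedDuration);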
You can get the true video duration from the AVAssetTrack. The duration of the AVAsset is sometimes longer than the AVAssetTrack's.
Check the durations like this:
print(asset.duration.seconds.description)
print(videoTrack.timeRange.duration.description)
So you can change this line.
[videoComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil] ;
To like this.
[videoComposition insertTimeRange:sourceVideoTrack.timeRange ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];
For Swift 5:
try videoComposition.insertTimeRange(sourceVideoTrack.timeRange, of: sourceVideoTrack, at: CMTime.zero)
Then you will avoid the last black frame :)
Hope this helps someone who is still struggling with this.
Just wanted to write this for people with my specific issue.
I was taking a video and trying to speed it up / slow it down by taking an AVMutableComposition and scaling the time range of the audio and video components via scaleTimeRange.
Scaling the time range to 2x or 3x speed sometimes caused the last few frames of the video to be black. Fortunately, @Khushboo's answer fixed my problem as well.
However, instead of decreasing the exporter's timeRange by a few nanoseconds, I just made it the same as the composition's duration, which ended up working perfectly.
exporter?.timeRange = CMTimeRangeMake(start: CMTime.zero, duration: composition.duration)
Hope this helps!

CALayer thin white lines

I am animating a CALayer to move, and for some reason I keep getting these thin white lines all across the screen... it's not my device, I can assure you. Here is how I set up the CALayer:
dodgelayer=nil;
dodgelayer= [CALayer layer];
dodgelayer.backgroundColor = [UIColor blueColor].CGColor;
dodgelayer.frame = CGRectMake(190, 80, 50, 50);
dodgelayer.borderColor = [UIColor whiteColor].CGColor;
dodgelayer.borderWidth = 2.0;
dodgelayer.cornerRadius = 50.0;
and how I animate it:
CAKeyframeAnimation *anim = [CAKeyframeAnimation animationWithKeyPath:@"position"];
values = [NSArray arrayWithObjects:[NSValue valueWithCGPoint: CGPointMake(aLayer.frame.origin.x+25,aLayer.frame.origin.y+25)], [NSValue valueWithCGPoint: CGPointMake(point.x,point.y)], nil];
[anim setValues:values];
[anim setDuration:0.7];
anim.removedOnCompletion=NO;
anim.fillMode = kCAFillModeForwards;
anim.timingFunction=[CAMediaTimingFunction functionWithName: kCAMediaTimingFunctionEaseInEaseOut];
sublayer.shouldRasterize=YES;
[sublayer addAnimation:anim forKey:@"position"];
I honestly have no idea what is making these thin white lines appear, but I can tell you that they appear during animations... I'm stumped on this one; any help would be appreciated.
I just bumped into this problem myself.
The cause is shouldRasterize = YES.
Before calling this, you should set
sublayer.rasterizationScale = [[UIScreen mainScreen] scale];
and the lines will go away.
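Applied to the animation code from the question, that would look roughly like this:
sublayer.rasterizationScale = [[UIScreen mainScreen] scale]; // match the screen scale before rasterizing
sublayer.shouldRasterize = YES;
[sublayer addAnimation:anim forKey:@"position"];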

CALayer scroll view slowdown with many items

Hey, I'm having a performance problem with CALayers in a layer-backed NSView inside an NSScrollView. I've got a scroll view that I populate with a bunch of CALayer instances, stacked one on top of the next. Right now all I am doing is putting a border around them so I can see them. There is nothing else in the layers.
Performance seems fine until I have around 1500 or so in the scroll view. When I put 1500 in, the performance is great as I scroll down the list, until I get to around item 1000. Then, very suddenly, the app starts to hang. It's not a gradual slowdown, which is what I would expect if it was just reaching its capacity. It's like the app hits a brick wall.
When I call CGContextFillRect in the draw method of the layers, the slowdown happens around item 300. I'm assuming this has something to do with maybe the video card memory filling up or something? Do I need to do something to free the resources of the CALayers when they are offscreen in my scroll view?
I've noticed that if I don't setNeedsDisplay on my layers, I can get to the end of 1500 items without slowdowns. This is not a solution however, as I have some custom drawing that I must perform in the layer. I'm not sure if that solves the problem, or just makes it show up with a greater number of items in the layer. Ideally I would like this to be fully scalable with thousands of items in the scroll view (within reason of course). Realistically, how many of these empty items should I expect to be able to display in this way?
#import "ShelfView.h"
#import <Quartz/Quartz.h>
@implementation ShelfView
- (void) awakeFromNib
{
CALayer *rootLayer = [CALayer layer];
rootLayer.layoutManager = self;
rootLayer.geometryFlipped = YES;
[self setLayer:rootLayer];
[self setWantsLayer:YES];
int numItemsOnShelf = 1500;
for(NSUInteger i = 0; i < numItemsOnShelf; i++) {
CALayer* shelfItem = [CALayer layer];
CGColorRef borderColor = CGColorCreateGenericRGB(1.0, 0.0, 0.0, 1.0);
[shelfItem setBorderColor:borderColor];
CGColorRelease(borderColor); // the layer keeps its own reference
[shelfItem setBorderWidth:1];
[shelfItem setNeedsDisplay];
[rootLayer addSublayer:shelfItem];
}
[rootLayer setNeedsLayout];
}
- (void)layoutSublayersOfLayer:(CALayer *)layer
{
float y = 10;
int totalItems = (int)[[layer sublayers] count];
for(int i = 0; i < totalItems; i++)
{
CALayer* item = [[layer sublayers] objectAtIndex:i];
CGRect frame = [item frame];
frame.origin.x = self.frame.size.width / 2 - 200;
frame.origin.y = y;
frame.size.width = 400;
frame.size.height = 400;
[CATransaction begin];
[CATransaction setAnimationDuration:0.0];
[item setFrame:CGRectIntegral(frame)];
[CATransaction commit];
y += 410;
}
NSRect thisFrame = [self frame];
thisFrame.size.height = y;
if(thisFrame.size.height < self.superview.frame.size.height)
thisFrame.size.height = self.superview.frame.size.height;
[self setFrame:thisFrame];
}
- (BOOL) isFlipped
{
return YES;
}
@end
I found out it was because I was filling each layer with custom drawing; they all seemed to be cached as separate images, even though they shared a lot of common data. So I switched to just creating a dozen CALayers, filling their contents property, and adding them as sublayers to a main layer. This made things MUCH zippier.
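A rough sketch of that shared-contents idea (newShelfItemImage is a hypothetical helper that renders the common artwork once):
CGImageRef sharedImage = [self newShelfItemImage]; // hypothetical: draw the artwork a single time
for (NSUInteger i = 0; i < numItemsOnShelf; i++) {
    CALayer *shelfItem = [CALayer layer];
    shelfItem.contents = (__bridge id)sharedImage;  // every layer reuses the same bitmap
    [rootLayer addSublayer:shelfItem];
}
CGImageRelease(sharedImage);
Because the layers no longer implement their own drawing, Core Animation keeps one backing image instead of caching a separate one per layer.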