I am using two UIView animations to move two balls to a particular position, but the timing of the collision varies. The first time they collide, but the second time the first ball arrives early (or vice versa). Can anybody explain how to make them collide at the same point using UIView animation? Is this a thread-scheduling issue with UIView animation?
Make sure you're animating the two views in the same block, as in
[UIView beginAnimations:nil context:nil];
viewOne.center = CGPointMake(40,40);
viewTwo.center = CGPointMake(80,40);
[UIView commitAnimations];
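On iOS 4 and later the same idea can be expressed with the block-based API; here is a sketch assuming `viewOne` and `viewTwo` are your two ball views and the target points are placeholders:

```objc
// Both views are animated inside one animation block, so they share
// the same start time and duration and will arrive together.
[UIView animateWithDuration:0.5 animations:^{
    viewOne.center = CGPointMake(40, 40);
    viewTwo.center = CGPointMake(80, 40);
} completion:^(BOOL finished) {
    // Both animations end together; handle the collision here.
}];
```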
I want to draw an image with the HardLight composite operation. I've created an NSImageView subclass with the following draw code:
- (void)drawRect:(NSRect)dirtyRect {
    //[super drawRect:dirtyRect];
    if (self.image != nil) {
        [self.image drawInRect:self.bounds fromRect:NSZeroRect operation:NSCompositeHardLight fraction:1.0];
    }
}
In the usual case it works well, but it does not work over an NSVisualEffectView.
How can I blend HardLight over NSVisualEffectView?
In the image linked below you can see a rounded rectangle which blends HardLight over the window background and a colour image. But over the NSVisualEffectView (the red bottom rectangle) it draws just grey.
https://www.dropbox.com/s/bcpe6vdha6xfc5t/Screenshot%202015-03-27%2000.32.53.png?dl=0
Roughly speaking, image compositing takes none, one or both pixels from the source and destination, applies some composite operation and writes it to the destination. To get any effect that takes into account the destination pixel, that pixel’s color information must be known when the compositing operation takes place, which is in your implementation of -drawRect:.
I’m assuming you’re talking about behind window blending (NSVisualEffectBlendingModeBehindWindow) here. The problem with NSVisualEffectView is that it does not draw anything. Instead, it defines a region that tells the WindowServer process to do its vibrancy stuff in that region. This happens after your app draws its views.
Therefore a compositing operation in your app cannot take into account the pixels that the window server draws later. In short, this cannot be done.
I know how to draw a circle in cocos2d, and I know how to do a cocos2d animation (scale and fade) with a CCSprite loaded from a PNG file.
But I am wondering: is it possible to store a circle drawn in the draw function somehow, and animate it just as we normally do with a CCSprite?
Thanks
Have a look at the inner workings of your animations and you should be able to piece together the rest.
Take a look at CCScaleTo for example. If you look at its update: function, all it does is change the scale of the CCNode it links to over time.
You should create your circle by subclassing CCSprite (or CCNode) and overriding the draw method. There you can call super to handle the transform, or, if you need more control, modify the transformation matrix yourself to take the position, rotation and scale into account (e.g. glScalef(x, y, z) with OpenGL ES).
-(void) draw
{
    [super draw];
    // Your draw code for the circle.
}
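Once the subclass exists, it can run the standard actions like any sprite. A minimal sketch, assuming your subclass is called CircleSprite (a hypothetical name):

```objc
// Because CircleSprite is a CCNode, standard actions apply to it.
CircleSprite *circle = [CircleSprite node];
[self addChild:circle];

// Scale up and fade out simultaneously over one second.
id scale = [CCScaleTo actionWithDuration:1.0f scale:2.0f];
id fade  = [CCFadeOut actionWithDuration:1.0f];
[circle runAction:[CCSpawn actions:scale, fade, nil]];
```

Note that CCFadeOut animates the node's opacity property, so your draw code must actually use that opacity when rendering the circle for the fade to be visible.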
I am writing a Cocoa application for OS X, where the user can draw squares on an NSView instance by clicking with the mouse. Currently I am making the squares disappear after 2 seconds, using the performSelector:withObject:afterDelay: method of NSObject, to force a redraw of the view, with no square included.
However, instead of just disappearing, I would like the squares to fade out gradually. I've tried using an NSTimer to periodically force a redraw, with the opacity of the square decreasing to 0 over 2 seconds, but this seems rather inelegant and probably inefficient, especially if I have a lot of squares.
Is there an idiomatic way to do this?
UPDATE: just to clarify, I want each square drawn in the view to have an independent fade starting from the point at which it's drawn, I'm not looking to fade out the entire view.
The solution I've ended up using is to create a CALayer instance for each square, rather than using NSRectFill to draw the squares. The opacity of each CALayer instance can then be independently animated using a CABasicAnimation instance. E.g.
CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"opacity"];
[animation setFromValue:[NSNumber numberWithFloat:1.]];
[animation setToValue:[NSNumber numberWithFloat:0.]];
[animation setDuration:2.];
[layer setOpacity:0.];
[layer addAnimation:animation forKey:@"opacity"];
NSAnimationContext is probably what you're looking for.
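For reference, a minimal NSAnimationContext sketch, assuming a hypothetical `squareView` (an NSView) whose alpha you want to fade over two seconds via its animator proxy:

```objc
// Changes made through the animator proxy inside a grouping are
// animated with the grouping's duration.
[NSAnimationContext beginGrouping];
[[NSAnimationContext currentContext] setDuration:2.0];
[[squareView animator] setAlphaValue:0.0];
[NSAnimationContext endGrouping];
```

With one view per square, each square's fade runs independently from the moment it is triggered, which matches the per-square fade in the question.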
I am resizing a control via
[UIView beginAnimations:nil context:NULL];
[self setTransform:CGAffineTransformMakeScale(0.5f, 0.5f)];
[UIView commitAnimations];
The animation looks beautiful on the iPhone simulator but on my 2G test device it is unbearably slow and choppy. How can I improve the animation on an older iPhone? Or do I have to disable animated transitions on older devices? Thanks.
My problem was too many non-opaque UIViews on the screen. I radically reduced the number through some redesign and it is very zippy now.
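For anyone hitting the same problem, marking fully covered views as opaque is the usual fix; a sketch (the view and colour are placeholders):

```objc
// Opaque views with a solid background colour let the compositor skip
// per-pixel blending, which matters a lot on older hardware.
view.opaque = YES;                           // promise to fill every pixel
view.backgroundColor = [UIColor whiteColor]; // no transparent background
```

Setting `opaque = YES` is only valid when the view really does draw every pixel of its bounds; otherwise you get rendering artifacts.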
I have a situation where I have many CALayers which animate in a "turn based" fashion. I animate the position on each of those CALayers, but they have the exact same duration. Once all of those CALayers are finished animating, a new "turn" is initiated and they animate changing positions again.
The whole idea is that, with linear interpolation between positions at a constant speed, a turn-based transition from state to state looks like a real-time animation. This, however, is hard to achieve with many different CALayers.
CAAnimationGroup is used to group together animations on a single CALayer. But I was wondering, is there a simple solution to group animations, which are supposed to have the same durations, on several CALayers together?
Edited to include a reply to Kevin Ballard's question
My problem lies in this: I'm creating animations for each of my CALayers, then putting those in an NSArray. Once I get the callback that an individual animation has ended, I remove it from the NSArray. Once the array is empty, I create animations for them all again.
With more than a few layers, there's a noticeable delay between the point where all of the animations finish and the new ones start.
I imagine that if I could group all of these animations into a single one, many more layers could be animated without a delay between animations, thereby not ruining the illusion of a continuous animation.
If you attach animations to multiple CALayers within a single method, they will all commence at (effectively) the same time. I use this approach in a puzzle game with dropping balls; at the end of the animations I attach the next stage of the animation to any ball that needs further animation.
I'm animating up to 60 CALayers at a time and not experiencing any delays between stages of the animation, but I don't cache the animations in an array of any sort, so I'm not sure of the overhead you have there.
My animations are relatively simple, created and attached to each CALayer on the fly. My sprites are 60px square and use a couple dozen possible images for their content.
In some cases there are multiple animations that I can create with different starting times (using beginTime); I bundle them up with a CAAnimationGroup, but you may not be able to precalculate subsequent animations.
If you wrap your animations in a CATransaction, Core Animation will make sure that they all run during the same iteration of the run loop.
[CATransaction begin];
// all your animations
[CATransaction commit];
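Concretely, a sketch of one "turn" driven by a single explicit transaction, assuming `layers` is an array of standalone CALayers (not view-backing layers, whose implicit animations are disabled) and the position offset stands in for your own turn logic:

```objc
// All layers start animating in the same run-loop iteration and share
// one duration, so the turn ends for every layer at the same moment.
[CATransaction begin];
[CATransaction setAnimationDuration:1.0];
for (CALayer *layer in layers) {
    CGPoint p = layer.position;
    layer.position = CGPointMake(p.x + 60.0, p.y); // implicitly animated
}
[CATransaction commit];
```

You can also set a completion block on the transaction with `+setCompletionBlock:` to kick off the next turn once every layer has finished.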
If you add 3 animations to 3 different layers all at the same time, and they have the same duration, I would expect them all to animate together. What behavior are you seeing?
cp21yos: can you elaborate on your method? I am trying to do something similar, which involves animating several layers at a time, more than once. You said: "at the end of the animations I attach the next stage of the animation". Can you explain that? When I put the logic that performs the additional animations in an animationDidStop: callback, only the last animation occurs, instead of the whole sequence of animations.