Cocos2d: Animation based on accelerometer

Does anyone know of any good, up-to-date tutorials that show how to animate a sprite based on accelerometer movement? I want to animate a bird so it sways in the direction the device is tilted. For example, if the player tilts the device to move the bird to the left, I would like the bird to play an animation of it swaying to the left.
// Accelerometer
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    birdSpeedY = 9.0 + acceleration.x * 15;
    birdSpeedX = -acceleration.y * 20;
}

// Updating bird based on accelerometer
- (void)updateBird {
    float maxY = winSize.height - bird.contentSize.height/2;
    float minY = bird.contentSize.height/2;
    float newY = bird.position.y + birdSpeedY;
    newY = MIN(MAX(newY, minY), maxY);

    float maxX = winSize.width - bird.contentSize.width/2;
    float minX = bird.contentSize.width/2;
    float newX = bird.position.x + birdSpeedX;
    newX = MIN(MAX(newX, minX), maxX);

    bird.position = ccp(newX, newY);
}

// Making background scroll automatically
- (void)update:(ccTime)dt {
    [self updateBird];
    CGPoint backgroundScrollVel = ccp(-100, 0);
    parallaxNode.position = ccpAdd(parallaxNode.position, ccpMult(backgroundScrollVel, dt));
}

- (id)init {
    self = [super init];
    if (self != nil) {
        winSize = [CCDirector sharedDirector].winSize;

        CCSpriteFrameCache *cache = [CCSpriteFrameCache sharedSpriteFrameCache];
        [cache addSpriteFramesWithFile:@"birdAtlas.plist"];

        NSMutableArray *framesArray = [NSMutableArray array];
        for (int i = 1; i < 10; i++) {
            NSString *frameName = [NSString stringWithFormat:@"bird%d.png", i];
            id frameObject = [cache spriteFrameByName:frameName];
            [framesArray addObject:frameObject];
        }

        // animation object
        id animObject = [CCAnimation animationWithFrames:framesArray delay:0.1];
        // animation action
        id animAction = [CCAnimate actionWithAnimation:animObject restoreOriginalFrame:NO];
        animAction = [CCRepeatForever actionWithAction:animAction];

        bird = [CCSprite spriteWithSpriteFrameName:@"bird1.png"];
        bird.position = ccp(60, 160);

        CCSpriteBatchNode *batchNode = [CCSpriteBatchNode batchNodeWithFile:@"birdAtlas.png"];
        [self addChild:batchNode z:100];
        [batchNode addChild:bird];
        [bird runAction:animAction];

        self.isAccelerometerEnabled = YES;
        [self scheduleUpdate];
        [self addScrollingBackgroundWithTileMapInsideParallax];
    }
    return self;
}

- (void)dealloc {
    [super dealloc];
}
@end

You can try the accelerometer methods with it and change the position of the sprite using ccp(). You also need to know whether the project runs in landscape or portrait mode, because the axes are swapped between the two.
You can try the code below:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    [lbl setString:[NSString stringWithFormat:@"X=>%.2lf Y=>%.2lf", (double)acceleration.x, (double)acceleration.y]];

    double x1 = -acceleration.y * 10;
    double y1 = acceleration.x * 15;

    if (acceleration.x > 0.05)
    {
        y1 *= speed_incr; // make the movement here (speed multiplier)
    }

    [Sprite_Name runAction:[CCMoveTo actionWithDuration:0.1f position:ccpAdd(ccp(x1, y1), Sprite_Name.position)]];
}
The code above is for landscape mode. If you need portrait mode, swap the axes and tune the values by trial and error.
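To get the sway animation the question asks about, one option is to keep separate left/right sway animations and switch between them whenever the tilt direction changes. The sketch below is only an illustration: the frame names (birdSwayLeft1.png, birdSwayRight1.png, ...) and the currentSway instance variable are assumptions, not part of the original code; adapt them to your own atlas.
// Hypothetical sway-direction state, kept as an instance variable (currentSway).
typedef enum { BirdSwayNone, BirdSwayLeft, BirdSwayRight } BirdSway;

// Build a repeating animation from frames named <prefix>1.png ... <prefix>4.png.
- (CCAction *)swayActionWithPrefix:(NSString *)prefix {
    CCSpriteFrameCache *cache = [CCSpriteFrameCache sharedSpriteFrameCache];
    NSMutableArray *frames = [NSMutableArray array];
    for (int i = 1; i <= 4; i++) {
        NSString *name = [NSString stringWithFormat:@"%@%d.png", prefix, i];
        [frames addObject:[cache spriteFrameByName:name]];
    }
    CCAnimation *anim = [CCAnimation animationWithFrames:frames delay:0.1];
    return [CCRepeatForever actionWithAction:
               [CCAnimate actionWithAnimation:anim restoreOriginalFrame:NO]];
}

// Call this from didAccelerate: (or updateBird) after birdSpeedX has been set.
- (void)updateSwayForHorizontalSpeed:(float)horizontalSpeed {
    BirdSway newSway = BirdSwayNone;
    if (horizontalSpeed < -0.5) newSway = BirdSwayLeft;
    if (horizontalSpeed >  0.5) newSway = BirdSwayRight;

    if (newSway == currentSway) return; // only restart the action when the direction changes
    currentSway = newSway;

    [bird stopAllActions];
    if (newSway == BirdSwayLeft) {
        [bird runAction:[self swayActionWithPrefix:@"birdSwayLeft"]];
    } else if (newSway == BirdSwayRight) {
        [bird runAction:[self swayActionWithPrefix:@"birdSwayRight"]];
    } else {
        [bird runAction:[self swayActionWithPrefix:@"bird"]]; // back to the neutral flap
    }
}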

Related

Detecting and ignoring touch on non-transparent part of MKOverlay drawn Image in iOS8

I have an overlay image (.png) on my map that consists of a transparent part in the middle and colored sides, so the user can only focus on the middle part. However, due to the shape of that middle part, quite a bit of it is visible at some of the sides.
I'm trying to detect a tap on the overlay view so I can ignore it and only accept touches in the designated area.
I followed a tutorial on Ray Wenderlich's site for adding the overlay.
The image overlay is drawn like this:
@implementation PVParkOverlayView

- (instancetype)initWithOverlay:(id<MKOverlay>)overlay overlayImage:(UIImage *)overlayImage {
    self = [super initWithOverlay:overlay];
    if (self) {
        _overlayImage = overlayImage;
    }
    return self;
}

- (void)drawMapRect:(MKMapRect)mapRect zoomScale:(MKZoomScale)zoomScale inContext:(CGContextRef)context {
    CGImageRef imageReference = self.overlayImage.CGImage;
    //UIImage *imageTest = _overlayImage;
    MKMapRect theMapRect = self.overlay.boundingMapRect;
    CGRect theRect = [self rectForMapRect:theMapRect];

    //orientation testing
    //CGContextRotateCTM(context, 0);
    CGContextScaleCTM(context, 1.0, -1.0);
    CGContextTranslateCTM(context, 0.0, -theRect.size.height);
    CGContextDrawImage(context, theRect, imageReference);
}
I have a gesture recognizer on my mapview and am trying to detect the tap there:
- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint tapPoint = [gestureRecognizer locationInView:self.mapView];
    CLLocationCoordinate2D tapCoord = [self.mapView convertPoint:tapPoint toCoordinateFromView:self.mapView];
    MKMapPoint mapPoint = MKMapPointForCoordinate(tapCoord);
    CGPoint mapPointAsCGP = CGPointMake(mapPoint.x, mapPoint.y);

    for (id<MKOverlay> overlay in self.mapView.overlays) {
        if ([overlay isKindOfClass:[PVParkOverlay class]]) {
            NSLog(@"overlay is present");
            /*
            MKPolygon *polygon = (MKPolygon *)overlay;
            CGMutablePathRef mpr = CGPathCreateMutable();
            MKMapPoint *polygonPoints = polygon.points;

            for (int p = 0; p < polygon.pointCount; p++) {
                MKMapPoint mp = polygonPoints[p];
                if (p == 0)
                    CGPathMoveToPoint(mpr, NULL, mp.x, mp.y);
                else
                    CGPathAddLineToPoint(mpr, NULL, mp.x, mp.y);
            }

            if (CGPathContainsPoint(mpr, NULL, mapPointAsCGP, FALSE)) {
                // ... found it!
                NSLog(@"I've found it!");
            }
            //CGPathRelease(mpr);
            */
        }
    }
}
I know that the overlay is there, but since it is a drawn image I can't find a way to convert it to polygon points to use this code (if that's even possible).
Are there any other methods I can use for this?
I also found the following sample code, but the viewForOverlay: method is deprecated:
- (void)mapTapped:(UITapGestureRecognizer *)recognizer
{
    MKMapView *mapView = (MKMapView *)recognizer.view;
    id<MKOverlay> tappedOverlay = nil;

    for (id<MKOverlay> overlay in mapView.overlays)
    {
        MKOverlayView *view = [mapView viewForOverlay:overlay];
        if (view)
        {
            // Get view frame rect in the mapView's coordinate system
            CGRect viewFrameInMapView = [view.superview convertRect:view.frame toView:mapView];
            // Get touch point in the mapView's coordinate system
            CGPoint point = [recognizer locationInView:mapView];
            // Check if the touch is within the view bounds
            if (CGRectContainsPoint(viewFrameInMapView, point))
            {
                tappedOverlay = overlay;
                break;
            }
        }
    }
    NSLog(@"Tapped view: %@", [mapView viewForOverlay:tappedOverlay]);
}
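Since the overlay is an image rather than a polygon, one possible approach (not from the original post) is to hit-test the alpha channel of the overlay image at the tapped point: map the tap into the overlay's boundingMapRect, convert that to a pixel coordinate, and read the pixel's alpha from a 1x1 bitmap context. The sketch below assumes hypothetical parkOverlay and overlayImage properties on the controller, and the vertical pixel coordinate may need flipping depending on how the image is drawn.
// Sketch: decide whether a tap hit a non-transparent part of the image overlay.
// parkOverlay and overlayImage are assumed properties; the names are illustrative.
- (BOOL)tapIsOnVisiblePartOfOverlay:(CGPoint)tapPoint {
    CLLocationCoordinate2D coord = [self.mapView convertPoint:tapPoint
                                         toCoordinateFromView:self.mapView];
    MKMapPoint mapPoint = MKMapPointForCoordinate(coord);
    MKMapRect bounds = self.parkOverlay.boundingMapRect;
    if (!MKMapRectContainsPoint(bounds, mapPoint)) {
        return NO; // outside the overlay entirely
    }

    // Fractional position of the tap inside the overlay's bounding rect.
    CGFloat u = (mapPoint.x - bounds.origin.x) / bounds.size.width;
    CGFloat v = (mapPoint.y - bounds.origin.y) / bounds.size.height;

    CGImageRef image = self.overlayImage.CGImage;
    CGFloat width  = CGImageGetWidth(image);
    CGFloat height = CGImageGetHeight(image);
    CGFloat px = u * width;   // pixel column from the image's left edge
    CGFloat py = v * height;  // pixel row from the image's top edge

    // Draw just that pixel into a 1x1 RGBA context and read its alpha.
    unsigned char pixel[4] = {0, 0, 0, 0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                           kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextSetBlendMode(ctx, kCGBlendModeCopy);
    // Depending on how the overlay image is drawn (drawMapRect: flips the CTM),
    // py may need to be flipped to (height - py).
    CGContextTranslateCTM(ctx, -px, py - height);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), image);
    CGContextRelease(ctx);

    return pixel[3] > 0; // alpha 0 means the transparent middle: let the tap through
}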

Cocos2d - only one scheduler called

I'm trying to trace 2 paths on screen through a series of vertices (connect the dots style). Each should be a different color, and each has its own list of vertices.
I started out by creating a class which can trace a path, then creating two instances of this class, one for each path. I overrode the draw method. It worked just fine except that, for some reason, only the first instance of the class called the draw method. I figured it was a problem with OpenGL, so I did it again using CCDrawNode, and it still had the same bug.
Only one instance (blackPath) draws any objects on screen. In fact the scheduled updateEndpoint: method is not even called for the whitePath object, although it is successfully created.
My Drawer.m Class:
const float size = 10;
const float speed = 5;

ccColor4F pathColor;
int numPoints;
NSArray *path;
CGPoint endPoint;

@implementation Drawer

- (id)initWithPath:(NSArray *)p andColorIsBlack:(BOOL)isBlack {
    self = [super init];

    // Record input
    path = p.copy;
    pathColor = ccc4f(1.0f, 1.0f, 1.0f, 1.0f);
    if (isBlack) {
        pathColor = ccc4f(0.0f, 0.0f, 0.0f, 1.0f);
    }

    // Set variables
    numPoints = 1;
    endPoint = [[path firstObject] position];

    NSLog(@"Drawer initialized with path of length %u and color %hhd (isblack)", p.count, isBlack);
    [self schedule:@selector(updateEndpoint:)];
    return self;
}

- (void)updateEndpoint:(ccTime)dt {
    NSLog(@"(%f, %f, %f, %f) Path", pathColor.r, pathColor.g, pathColor.b, pathColor.a);
    [self drawDot:endPoint radius:size color:pathColor];

    CGPoint dest = [[path objectAtIndex:numPoints] position];
    float dx = dest.x - endPoint.x;
    float dy = dest.y - endPoint.y;

    // new coords are current + distance * sign of distance
    float newX = endPoint.x + MIN(speed, fabsf(dx)) * ((dx > 0) - (dx < 0));
    float newY = endPoint.y + MIN(speed, fabsf(dy)) * ((dy > 0) - (dy < 0));
    endPoint = ccp(newX, newY);

    if (endPoint.x == dest.x && endPoint.y == dest.y) {
        if (numPoints < path.count - 1) {
            numPoints += 1;
        }
        else {
            [self unschedule:@selector(updateEndpoint:)];
        }
    }
}
And here is where I instantiate the objects:
- (id)init {
    self = [super init];
    [self addAllCards];
    [self addScore];

    xShrinkRate = [[Grid getInstance] sqWidth] / shrinkTime;
    yShrinkRate = [[Grid getInstance] sqHeight] / shrinkTime;

    dropList = [NSMutableArray new];
    notDropList = [NSMutableArray new];

    [self schedule:@selector(dropCard:) interval:0.075];
    [self schedule:@selector(shrinkCards:)];

    Drawer *whitePath = [[Drawer alloc] initWithPath:[[Score getInstance] whitePath] andColorIsBlack:false];
    [self addChild:whitePath];

    Drawer *blackPath = [[Drawer alloc] initWithPath:[[Score getInstance] blackPath] andColorIsBlack:true];
    [self addChild:blackPath];

    return self;
}
Change the (non-const) global variables to instance variables of the class. Defined at file scope, they are shared by every Drawer instance, so the two objects overwrite each other's path, endPoint and numPoints. Declare them like so:
@implementation Drawer
{
    ccColor4F pathColor;
    int numPoints;
    NSArray *path;
    CGPoint endPoint;
}

How do I implement scrollWheel elastic scrolling on OSX

I am trying to create a custom view, similar to NSScrollView, that is based on Core Animation's CAScrollLayer/CATiledLayer. Basically, my app requires a lot of near-realtime CGPath drawing, and it animates these paths using shape layers (similar to the way GarageBand animates while recording). The first prototype I created used NSScrollView, but I could not get more than 20 frames per second with it. (The reason was that NSRulerView redraws on every scroll event, and the entire call flow from -[NSClipView scrollToPoint:] to -[NSScrollView reflectScrolledClipView:] is extremely expensive and inefficient.)
I've created a custom view that uses CAScrollLayer as the scrolling mechanism and CATiledLayer as the traditional document view (for the infinite scrolling option), and now I get close to 60 fps. However, I'm having a hard time implementing scrollWheel elastic scrolling, and I'm not sure how to do it. Here's the code I have so far; I would really appreciate it if someone could tell me how to implement elastic scrolling.
- (void)scrollWheel:(NSEvent *)theEvent {
    NSCAssert(mDocumentScrollLayer, @"The Scroll Layer Cannot be nil");
    NSCAssert(mDocumentLayer, @"The tiled layer cannot be nil");
    NSCAssert(self.layer, @"The base layer of view cannot be nil");
    NSCAssert(mRulerLayer, @"The ScrollLayer for ruler cannot be nil");

    NSPoint locationInWindow = [theEvent locationInWindow];
    NSPoint locationInBaseLayer = [self convertPoint:locationInWindow fromView:nil];
    NSPoint locationInRuler = [mRulerLayer convertPoint:locationInBaseLayer fromLayer:self.layer];
    if ([mRulerLayer containsPoint:locationInRuler]) {
        return;
    }

    CGRect docRect = [mDocumentScrollLayer convertRect:[mDocumentLayer bounds] fromLayer:mDocumentLayer];
    CGRect scrollRect = [mDocumentScrollLayer visibleRect];
    CGPoint newOrigin = scrollRect.origin;

    CGFloat deltaX = [theEvent scrollingDeltaX];
    CGFloat deltaY = [theEvent scrollingDeltaY];
    if ([self isFlipped]) {
        deltaY *= -1;
    }

    scrollRect.origin.x -= deltaX;
    scrollRect.origin.y += deltaY;
    if ((NSMinX(scrollRect) < NSMinX(docRect)) ||
        (NSMaxX(scrollRect) > NSMaxX(docRect)) ||
        (NSMinY(scrollRect) < NSMinY(docRect)) ||
        (NSMaxY(scrollRect) > NSMaxY(docRect))) {

        mIsScrollingPastEdge = YES;
        CGFloat heightPhase = 0.0;
        CGFloat widthPhase = 0.0;
        CGSize size = [self frame].size;

        if (NSMinX(scrollRect) < NSMinX(docRect)) {
            widthPhase = ABS(NSMinX(scrollRect) - NSMinX(docRect));
        }
        if (NSMaxX(scrollRect) > NSMaxX(docRect)) {
            widthPhase = ABS(NSMaxX(scrollRect) - NSMaxX(docRect));
        }
        if (NSMinY(scrollRect) < NSMinY(docRect)) {
            heightPhase = ABS(NSMinY(scrollRect) - NSMinY(docRect));
        }
        if (NSMaxY(scrollRect) > NSMaxY(docRect)) {
            heightPhase = ABS(NSMaxY(scrollRect) - NSMaxY(docRect));
        }

        if (widthPhase > size.width / 2.0) {
            widthPhase = size.width / 2.0;
        }
        if (heightPhase > size.height / 2.0) {
            heightPhase = size.height / 2.0;
        }

        // Dampen the deltas the further the origin is past the document edge.
        deltaX = deltaX * (1 - (2 * widthPhase / size.width));
        deltaY = deltaY * (1 - (2 * heightPhase / size.height));
    }
    newOrigin.x -= deltaX;
    newOrigin.y += deltaY;

    if (mIsScrollingPastEdge &&
        (([theEvent phase] == NSEventPhaseEnded) ||
         ([theEvent momentumPhase] == NSEventPhaseEnded))) {

        CGPoint confinedScrollPoint = [mDocumentScrollLayer bounds].origin;
        mIsScrollingPastEdge = NO;
        CGRect visibleRect = [mDocumentScrollLayer visibleRect];

        if (NSMinX(scrollRect) < NSMinX(docRect)) {
            confinedScrollPoint.x = docRect.origin.x;
        }
        if (NSMinY(scrollRect) < NSMinY(docRect)) {
            confinedScrollPoint.y = docRect.origin.y;
        }
        if (NSMaxX(scrollRect) > NSMaxX(docRect)) {
            confinedScrollPoint.x = NSMaxX(docRect) - visibleRect.size.width;
        }
        if (NSMaxY(scrollRect) > NSMaxY(docRect)) {
            confinedScrollPoint.y = NSMaxY(docRect) - visibleRect.size.height;
        }

        [mDocumentScrollLayer scrollToPoint:confinedScrollPoint];

        CGPoint rulerPoint = [mRulerLayer bounds].origin;
        rulerPoint.x = [mDocumentLayer bounds].origin.x;
        [mRulerLayer scrollToPoint:rulerPoint];
        return;
    }
    CGPoint rulerPoint = [mDocumentScrollLayer convertPoint:newOrigin toLayer:mRulerLayer];
    rulerPoint.y = [mRulerLayer bounds].origin.y;

    if (!mIsScrollingPastEdge) {
        [CATransaction begin];
        [CATransaction setDisableActions:YES];
        [mDocumentScrollLayer scrollToPoint:newOrigin];
        [CATransaction commit];
    } else {
        [mDocumentScrollLayer scrollToPoint:newOrigin];
        [mRulerLayer scrollToPoint:rulerPoint];
    }
}
Take a look at TUIScrollView at https://github.com/twitter/twui.
The core concept behind elastic scrolling is to use a spring solver.
The Element repository also has an ElasticScrollView you can reuse:
https://github.com/eonist/Element
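For reference, here is a minimal sketch of the spring idea (not taken from either repository): once the wheel phase ends while the origin is past the document edge, step a damped spring each frame that pulls the scroll origin back to the confined point. mSpringTarget, mSpringVelocity and stopSpringTimer are assumed names, not part of the question's code.
// Minimal damped-spring snap-back sketch. Call -stepSpringWithDeltaTime: from a
// CVDisplayLink or NSTimer after the wheel/momentum phase ends, with mSpringTarget
// set to the confined scroll point computed above.
- (void)stepSpringWithDeltaTime:(CGFloat)dt {
    const CGFloat stiffness = 300.0;                  // spring constant, tune to taste
    const CGFloat damping   = 2.0 * sqrt(stiffness);  // critical damping: no oscillation

    CGPoint origin = [mDocumentScrollLayer visibleRect].origin;
    CGPoint displacement = CGPointMake(origin.x - mSpringTarget.x,
                                       origin.y - mSpringTarget.y);

    // a = -k*x - c*v, integrated with a simple Euler step.
    mSpringVelocity.x += (-stiffness * displacement.x - damping * mSpringVelocity.x) * dt;
    mSpringVelocity.y += (-stiffness * displacement.y - damping * mSpringVelocity.y) * dt;
    origin.x += mSpringVelocity.x * dt;
    origin.y += mSpringVelocity.y * dt;

    [CATransaction begin];
    [CATransaction setDisableActions:YES];
    [mDocumentScrollLayer scrollToPoint:origin];
    [CATransaction commit];

    // Stop stepping once the layer is essentially at rest on the target.
    if (fabs(displacement.x) < 0.5 && fabs(displacement.y) < 0.5 &&
        fabs(mSpringVelocity.x) < 0.5 && fabs(mSpringVelocity.y) < 0.5) {
        [self stopSpringTimer]; // hypothetical helper that invalidates the timer/display link
    }
}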

Rotating UIView with 1 finger iPhone , iPad

I implemented a UIPanGestureRecognizer since I want to use one finger to rotate a UIView around its axis. A button within the UIView begins the gesture, at which point the UIView rotates. The problem is that it only rotates correctly if the button is in the first quadrant (top left). In any other quadrant it rotates erratically. Can someone tell me what is wrong with my math? By the way, ang calculates the angle using the superview's coordinates since the user's finger might be outside the rotating view's bounds, but that might not be necessary.
Thank you.
- (void)rotateItem:(UIPanGestureRecognizer *)recognizer
{
    NSLog(@"Rotate Item");
    float ang = atan2([recognizer locationInView:self.superview].y - self.center.y,
                      [recognizer locationInView:self.superview].x - self.center.x);
    float angleDiff = deltaAngle - ang;
    self.transform = CGAffineTransformRotate(startTransform, -angleDiff);

    CGFloat radians = atan2f(self.transform.b, self.transform.a);
    NSLog(@"rad is %f", radians);
}

#pragma mark - Touch Methods

- (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer == rotateGesture) {
        NSLog(@"rotate gesture started");
        deltaAngle = atan2([recognizer locationInView:self].y - self.center.y,
                           [recognizer locationInView:self].x - self.center.x);
        startTransform = self.transform;
    }
    return YES;
}
I did some logging and it seems that the center of my UIView was changing during the touch-drag event. Hence I stored the center of the UIView in the touches-began method and used that instead.
- (void)rotateItem:(UIPanGestureRecognizer *)recognizer
{
    NSLog(@"Rotate Item");
    CGPoint superPoint = [self convertPoint:itemCenter toView:self.superview];
    float ang = atan2([recognizer locationInView:self.superview].y - superPoint.y,
                      [recognizer locationInView:self.superview].x - superPoint.x);
    float angleDiff = deltaAngle - ang;
    self.transform = CGAffineTransformRotate(startTransform, -angleDiff);
}

#pragma mark - Touch Methods

- (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer == rotateGesture) {
        NSLog(@"rotate gesture started");
        deltaAngle = atan2([recognizer locationInView:self.superview].y - self.center.y,
                           [recognizer locationInView:self.superview].x - self.center.x);
        startTransform = self.transform;
    }
    return YES;
}

How would I get a UIImageView to always face up based off the accelerometer?

I would like to make it so that when the user rotates the device (to any angle, not just landscape/portrait), the UIImageView always faces upwards. How would I do this?
Thanks in advance
@interface FirstViewController : UIViewController <UIAccelerometerDelegate> {
    IBOutlet UIImageView *imageView;
    CGPoint delta;
    CGPoint translation;
    float ballRadius;
}
@end

// implementation
@implementation FirstViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    // for the line in the middle of the camera
    UIAccelerometer *accel = [UIAccelerometer sharedAccelerometer];
    accel.delegate = self;
    accel.updateInterval = 1.0f / 60.0f;
}

- (void)accelerometer:(UIAccelerometer *)acel didAccelerate:(UIAcceleration *)acceleration {
    // Get the current device angle
    float xx = -[acceleration x];
    float yy = [acceleration y];
    float angle = atan2(yy, xx);

    // Add pi/2 to the angle to keep the image upright relative to the viewer.
    [imageView setTransform:CGAffineTransformMakeRotation(angle + M_PI_2)];
}

@end
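UIAccelerometer has since been deprecated; a rough equivalent of the same idea using Core Motion's CMMotionManager is sketched below. The motionManager property and startMotionUpdates method are assumptions, not part of the original snippet.
#import <CoreMotion/CoreMotion.h>

// Sketch: same upright-image behaviour with CMMotionManager instead of UIAccelerometer.
// self.motionManager is assumed to be a strongly held property on the view controller.
- (void)startMotionUpdates {
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.accelerometerUpdateInterval = 1.0 / 60.0;
    [self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                             withHandler:^(CMAccelerometerData *data, NSError *error) {
        if (error != nil || data == nil) {
            return;
        }
        float xx = -data.acceleration.x;
        float yy = data.acceleration.y;
        float angle = atan2f(yy, xx);
        // Rotate the image view so it stays upright relative to gravity.
        imageView.transform = CGAffineTransformMakeRotation(angle + M_PI_2);
    }];
}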
