I'm attempting to rotate a box2d body that's tied to a cocos2d sprite via box2d's GetUserData() in my iPhone application. Specifically, I'm attempting to grab the latest touch location and rotate my box2d body in that direction.
I'm fairly inexperienced when it comes to box2d, so any advice would be appreciated. Below is a quick stab at how I imagine I'd manipulate the player's box2d body. I'd like clarification on:
1) If this is the correct way of doing things.
2) How I'd calculate the angle between the player and the last touch location in order to rotate my player in that direction.
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    b2Body *pBody = self.playerBody;
    if (pBody != NULL) {
        for (UITouch *touch in touches) {
            CGPoint location = [touch locationInView:[touch view]];
            location = [[CCDirector sharedDirector] convertToGL:location];
            CCSprite *myActor = (CCSprite *)pBody->GetUserData();
            pBody->SetTransform(pBody->GetPosition(), angleToRotateByInRadians);
        }
    }
}
Get the angle (in radians) between two points; note that atan2 takes the y difference first, then the x difference:
atan2(pointOne.y - pointTwo.y, pointOne.x - pointTwo.x)
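Putting the two together, here is a minimal sketch of rotating the body toward the touch, assuming the usual PTM_RATIO points-to-meters constant from the cocos2d/box2d template and that the sprite's artwork faces along the positive x axis by default (the helper name is just illustrative):

// Sketch only: assumes self.playerBody and the standard PTM_RATIO constant.
- (void)rotatePlayerTowardTouch:(CGPoint)touchLocation
{
    b2Body *pBody = self.playerBody;
    if (pBody == NULL) return;

    // The body position is in box2d meters; convert to screen points to match the touch.
    b2Vec2 bodyPos = pBody->GetPosition();
    CGPoint playerLocation = ccp(bodyPos.x * PTM_RATIO, bodyPos.y * PTM_RATIO);

    // atan2(dy, dx) measures counter-clockwise from the positive x axis,
    // which is the same convention box2d uses for a body's angle.
    float angle = atan2f(touchLocation.y - playerLocation.y,
                         touchLocation.x - playerLocation.x);

    // Keep the position, change only the rotation.
    pBody->SetTransform(bodyPos, angle);
}

You would call this from ccTouchesBegan with the already-converted GL location.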
So I have sprites, which are generated randomly on the screen and are then given random movements via SKAction.
For example:
SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithImageNamed:@"Icicle.png"];
sprite.position = CGPointMake (randomX, 0); //randomX is a random integer between 0 and screen width
SKAction *action = [SKAction moveToY:-200 duration:2];
[sprite runAction:action];
[self addChild:sprite];
Now, in my touchesBegan method:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    SKNode *node = [self nodeAtPoint:location];
}
What I am seeing is that my nodes are not getting selected very accurately. The Icicle images have a buffer width of 20 pixels around all four sides.
Does being in the middle of an SKAction make node selection via tapping inaccurate?
Moreover, any suggestions on how to select nodes that are constantly moving via a touchesBegan method?
How can I drag and move sprites with touch in Sprite Kit, but in a grid manner? I can already move them on touch, but I need to simulate a tile map...
Override the method in your SKScene:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
Then, whenever it is called, get the touch position with:
UITouch *touch = [touches anyObject];
if (touch) {
    CGPoint location = [touch locationInNode:self];
    // ... moving code here
}
Once you have your position, just move it based upon which grid item the touch is currently within:
//Grid of 20x20 pixels, with entire grid starting at 0,0
static const CGFloat GridSize = 20;
int gridColumn = (int)(location.x/GridSize); //we can lose the precision here
int gridRow = (int)(location.y/GridSize); //we can lose the precision here
node.position = CGPointMake(gridColumn*GridSize, gridRow*GridSize);
Of course you could animate the move with an SKAction, but if you're tracking a touch, you probably want it to happen in real time.
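Putting those pieces together, a minimal sketch of the whole handler, assuming the node being dragged is kept in a hypothetical _selectedNode ivar that you set in touchesBegan:

// Grid of 20x20 points, with the grid origin at (0,0).
static const CGFloat GridSize = 20;

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (!touch || !_selectedNode) return;   // _selectedNode is an assumed ivar set in touchesBegan

    CGPoint location = [touch locationInNode:self];

    // Integer division deliberately drops the fractional part, snapping to the grid.
    int gridColumn = (int)(location.x / GridSize);
    int gridRow = (int)(location.y / GridSize);

    _selectedNode.position = CGPointMake(gridColumn * GridSize, gridRow * GridSize);
}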
I'm wondering how to do this. How can I make it so that when a user taps and holds my sprite, they can drag it along the x axis only, maintaining the same y value?
Follow this tutorial
http://www.raywenderlich.com/2343/how-to-drag-and-drop-sprites-with-cocos2d
When it sets the position of the sprite, instead of setting the y position to that of the touch, just set it to a fixed value where you want it.
Wherever you handle your touches:
if (CGRectContainsPoint([sprite boundingBox], touchLocation)) {
    sprite.position = ccp(touchLocation.x, sprite.position.y);
}
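For completeness, a small sketch of how that check might sit in a cocos2d touch handler (assuming a cocos2d 2.x layer with touch handling enabled and a CCSprite ivar named sprite):

-(void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:[touch view]];
    touchLocation = [[CCDirector sharedDirector] convertToGL:touchLocation];

    if (CGRectContainsPoint([sprite boundingBox], touchLocation)) {
        // Follow the finger on x only; y stays pinned to the sprite's current y.
        sprite.position = ccp(touchLocation.x, sprite.position.y);
    }
}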
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *drag = [[event allTouches] anyObject];
    CGPoint currentPosition = [drag locationInView:self.view];
    // Keep the sprite's current y so it only slides along the x axis.
    currentPosition.y = Sprite.center.y;
    Sprite.center = currentPosition;
}
I created a slider like this: https://github.com/Antem/Custom-Slider-xCode
I implemented a UIPanGestureRecognizer since I wish to use one finger to rotate a UIView around its axis. A button within the UIView begins the gesture, at which point the UIView rotates. The problem is that it only rotates correctly if the button is in the first quadrant (top left). In any other quadrant it rotates erratically. Can someone tell me what is wrong with my math? By the way, ang is calculated using the superview's coordinates since the user's finger might be outside the rotating view's bounds, but that might not be necessary.
thank you
- (void)rotateItem:(UIPanGestureRecognizer *)recognizer
{
    NSLog(@"Rotate Item");
    float ang = atan2([recognizer locationInView:self.superview].y - self.center.y,
                      [recognizer locationInView:self.superview].x - self.center.x);
    float angleDiff = deltaAngle - ang;
    self.transform = CGAffineTransformRotate(startTransform, -angleDiff);
    CGFloat radians = atan2f(self.transform.b, self.transform.a);
    NSLog(@"rad is %f", radians);
}
#pragma mark - Touch Methods
- (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer == rotateGesture) {
        NSLog(@"rotate gesture started");
        deltaAngle = atan2([recognizer locationInView:self].y - self.center.y,
                           [recognizer locationInView:self].x - self.center.x);
        startTransform = self.transform;
    }
    return YES;
}
I did some logging and it seems that the center of my UIView was changing during the touch drag event. Hence, I stored the center of the UIView in the touchesBegan method and used that instead.
- (void)rotateItem:(UIPanGestureRecognizer *)recognizer
{
    NSLog(@"Rotate Item");
    // itemCenter was captured in touchesBegan, so it stays stable while the view rotates.
    CGPoint superPoint = [self convertPoint:itemCenter toView:self.superview];
    float ang = atan2([recognizer locationInView:self.superview].y - superPoint.y,
                      [recognizer locationInView:self.superview].x - superPoint.x);
    float angleDiff = deltaAngle - ang;
    self.transform = CGAffineTransformRotate(startTransform, -angleDiff);
}
#pragma mark - Touch Methods
- (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer == rotateGesture) {
        NSLog(@"rotate gesture started");
        deltaAngle = atan2([recognizer locationInView:self.superview].y - self.center.y,
                           [recognizer locationInView:self.superview].x - self.center.x);
        startTransform = self.transform;
    }
    return YES;
}
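For reference, one plausible way to capture itemCenter, assuming it is an ivar of this view subclass expressed in the view's own coordinate space (which is what the convertPoint:toView: call above expects):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    // The bounds midpoint is the view's own center and is unaffected by the
    // transform, so converting it to the superview later gives a stable pivot.
    itemCenter = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
}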
I am fairly new to touch events and I have a problem. I use the code below to drag two images on screen. The code works; however, when the second finger touches the screen the first movement stops. So the problem is related to multi-touch. I also do not know how to calculate the second touch's coordinates. I enabled multi-touch in the view and in both images. I would be grateful if somebody could help me move each image with each finger.
Thanks in advance!
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if ([touch view] == image) {
        image.center = location;
    } else if ([touch view] == image2) {
        image2.center = location;
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}
I would recommend using a custom UIGestureRecognizer for that. It will give you a nice, encapsulated way to manage what you want. Dragging two images at the same time is, in essence, a gesture.
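If you want to stay with plain touch handling rather than writing a custom recognizer, a minimal sketch that tracks each finger separately (assuming the same image and image2 ivars from the question, with multipleTouchEnabled set on the container view):

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Each UITouch remembers the view it began on, so iterate over every touch
    // that moved instead of pulling a single touch with -anyObject.
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:self.view];
        if (touch.view == image) {
            image.center = location;
        } else if (touch.view == image2) {
            image2.center = location;
        }
    }
}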