How do I get multiple coordinates for multi-touch from a UIGestureRecognizer?

I have three gestures: 2-finger tap, 3-finger tap, and 4-finger tap. I need to get coordinates accordingly.
I have tried the following to get the coordinates of a 2-finger tap, but the app keeps crashing:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSArray *twoTouch = [touches allObjects];
    UITouch *tOne = [twoTouch objectAtIndex:0];
    UITouch *tTwo = [twoTouch objectAtIndex:1];
    CGPoint firstTouch = [tOne locationInView:[tOne view]];
    CGPoint secondTouch = [tTwo locationInView:[tTwo view]];
    NSLog(@"point one: %@", firstTouch);
    NSLog(@"point two: %@", secondTouch);
    [twoTouch release];
}

First of all, your code is not checking whether there actually are two touches.
If you tap the screen with one finger, you will get only one touch in touches.
Try something like this:
if (touches.count == 2)
{
    // Your code for two touches.
}
Otherwise, the part where your program crashes is [twoTouch objectAtIndex:1], because when only one finger is down there is no object at index 1.
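For completeness, here is a minimal sketch of a guarded touchesBegan: (assuming the view has multipleTouchEnabled set to YES). It also drops the [twoTouch release], since -allObjects returns an array you don't own, and logs the points with NSStringFromCGPoint, since %@ cannot format a CGPoint struct:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (touches.count == 2) {
        NSArray *twoTouch = [touches allObjects];
        UITouch *tOne = [twoTouch objectAtIndex:0];
        UITouch *tTwo = [twoTouch objectAtIndex:1];
        CGPoint firstTouch = [tOne locationInView:[tOne view]];
        CGPoint secondTouch = [tTwo locationInView:[tTwo view]];
        // CGPoint is a struct, not an object, so convert it before logging with %@.
        NSLog(@"point one: %@", NSStringFromCGPoint(firstTouch));
        NSLog(@"point two: %@", NSStringFromCGPoint(secondTouch));
        // No release here: -allObjects returns an autoreleased array we do not own.
    }
}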
(I know this is a really old question but I answered it anyways.)

Related

Audio on TouchesEnded

On TouchesEnded I want to do two things:
1) return the view to its original position - this works fine.
2) I want to play an audio sound if the touch ended on the YES views and a different sound for the NO views.
How can I do this?
At the moment my NO sound plays for every touch that ends, which is not what I want.
I am new to this, so please explain the basics :) My current code is below.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    yesletter1.frame = keepYesLetter1Place;
    yesletter2.frame = keepYesLetter2Place;
    yesletter3.frame = keepYesLetter3Place;
    yesletter4.frame = keepYesLetter4Place;
    notletter1.frame = keepNoLetter1Place;
    notletter2.frame = keepNoLetter2Place;
    notletter3.frame = keepNoLetter3Place;
    notletter4.frame = keepNoLetter4Place;
    NSString *path = [[NSBundle mainBundle] pathForResource:@"no" ofType:@"wav"];
    AVAudioPlayer *theAudio = [[AVAudioPlayer alloc]
        initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
    [theAudio play];
}
I solved this issue, not the way I wanted, but things work now.
I have two different sounds: some objects play one sound and the rest play the other.
What I did was add a flag for each object and check if (flag1 == 0 || flag2 == 0, etc.) to play sound 1, otherwise play sound 2.
Another method, which does something else, sets the flags.
I am sure there is a better solution for this :) but things work now.
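A more direct approach, sketched below under a few assumptions (yesletter1–4 and notletter1–4 are subviews of self.view, and a "yes.wav" exists alongside "no.wav" in the bundle), is to test which group of views the touch ended inside and pick the sound from that:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // ... reset the letter frames as in the original code ...

    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];

    // Did the touch end inside one of the YES views?
    NSArray *yesViews = [NSArray arrayWithObjects:yesletter1, yesletter2,
                                                  yesletter3, yesletter4, nil];
    BOOL endedOnYes = NO;
    for (UIView *letter in yesViews) {
        if (CGRectContainsPoint(letter.frame, point)) {
            endedOnYes = YES;
            break;
        }
    }

    NSString *name = endedOnYes ? @"yes" : @"no"; // "yes.wav" is an assumed file name
    NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"wav"];
    AVAudioPlayer *theAudio = [[AVAudioPlayer alloc]
        initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
    [theAudio play]; // keep a strong reference (e.g. an ivar) if playback gets cut off
}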

Flipped NSView mouse coordinates

I have a subclass of NSView that re-implements a number of the mouse event functions. For instance in mouseDown to get the point from the NSEvent I use:
NSEvent *theEvent; // <- argument to function
NSPoint p = [theEvent locationInWindow];
p = [self convertPoint:p fromView:nil];
However, the coordinates seem to be flipped: (0, 0) is in the bottom left of the window?
EDIT: I have already overridden the isFlipped method to return TRUE, but it has only affected drawing. Sorry, can't believe I forgot to put that straight away.
What do you mean by flipped? The Mac uses an LLO (lower-left-origin) coordinate system for everything.
EDIT I can't reproduce this with a simple project. I created a single NSView implemented like this:
@implementation FlipView
- (BOOL)isFlipped {
    return YES;
}
- (void)mouseDown:(NSEvent *)theEvent {
    NSPoint p = [theEvent locationInWindow];
    p = [self convertPoint:p fromView:nil];
    NSLog(@"%@", NSStringFromPoint(p));
}
@end
I received the coordinates I would expect. Removing isFlipped switched the orientation, as expected. Do you have a simple project that demonstrates your problem?
I found this so obnoxious, until one day I just sat down and refused to get up until I had something that worked perfectly. Here it is, called via...
-(void)mouseDown:(NSEvent *)click {
    NSPoint mD = [NSScreen wtfIsTheMouse:click
                          relativeToView:self];
}
invokes a Category on NSScreen....
@implementation NSScreen (FlippingOut)
+ (NSPoint)wtfIsTheMouse:(NSEvent *)event
          relativeToView:(NSView *)view {
    // currentScreenForMouseLocation is another category helper (not standard AppKit)
    // that returns the screen the mouse is currently on.
    NSScreen *now = [NSScreen currentScreenForMouseLocation];
    return [now flipPoint:[now convertToScreenFromLocalPoint:event.locationInWindow
                                              relativeToView:view]];
}
- (NSPoint)flipPoint:(NSPoint)aPoint {
    return (NSPoint) { aPoint.x,
                       self.frame.size.height - aPoint.y };
}
- (NSPoint)convertToScreenFromLocalPoint:(NSPoint)point
                          relativeToView:(NSView *)view {
    NSPoint winP, scrnP, flipScrnP;
    if (self) {
        winP = [view convertPoint:point toView:nil];
        scrnP = [[view window] convertBaseToScreen:winP];
        flipScrnP = [self flipPoint:scrnP];
        flipScrnP.y += [self frame].origin.y;
        return flipScrnP;
    }
    return NSZeroPoint;
}
@end
Hope this can prevent just one minor freakout.. somewhere, someday. For the children, damnit. I beg of you.. for the children.
This code worked for me:
NSPoint location = [self convertPoint:theEvent.locationInWindow fromView:nil];
location.y = self.frame.size.height - location.y;
This isn't "flipped", necessarily, that's just how Quartz does coordinates. An excerpt from the documentation on Quartz 2D:
A point in user space is represented by a coordinate pair (x,y), where x represents the location along the horizontal axis (left and right) and y represents the vertical axis (up and down). The origin of the user coordinate space is the point (0,0). The origin is located at the lower-left corner of the page, as shown in Figure 1-4. In the default coordinate system for Quartz, the x-axis increases as it moves from the left toward the right of the page. The y-axis increases in value as it moves from the bottom toward the top of the page.
I'm not sure what your question is, though. Are you looking for a way to get the "flipped" coordinates? If so, you can subclass your NSView, overriding the -(BOOL)isFlipped method to return YES.

CAShapeLayer Slow User Interaction

I have a CAShapeLayer and it has to do a simple task of moving on the screen, guided by the user's finger.
The problem is that the movement is too slow. The layer does move, but there is a lag and it feels slow.
I have another test app where a UIImage is moved, and there is no lag at all; the image moves instantly.
What can I do to overcome this?
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    currentPoint = [[touches anyObject] locationInView:self];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint activePoint = [[touches anyObject] locationInView:self];
    CGPoint newPoint = CGPointMake(activePoint.x - currentPoint.x,
                                   activePoint.y - currentPoint.y);
    curLayer.position = CGPointMake(shapeLayer.position.x + newPoint.x,
                                    shapeLayer.position.y + newPoint.y);
    currentPoint = activePoint;
}
Thanks!
Keep in mind that when you set the position on a layer (assuming it's not the root layer of a UIView on which actions are disabled by default), it implicitly animates to the new position, which takes 0.25 seconds. If you want to make it snappier, temporarily disable actions on the layer like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint activePoint = [[touches anyObject] locationInView:self];
    CGPoint newPoint = CGPointMake(activePoint.x - currentPoint.x,
                                   activePoint.y - currentPoint.y);
    [CATransaction begin];
    [CATransaction setDisableActions:YES];
    curLayer.position = CGPointMake(shapeLayer.position.x + newPoint.x,
                                    shapeLayer.position.y + newPoint.y);
    [CATransaction commit];
    currentPoint = activePoint;
}
This should cause it to jump to the new position rather than animate. If that doesn't help, then let me take a look at your layer init code so I can see what properties and dimensions it has. Properties such as cornerRadius, for example, can affect performance.
Try setting shouldRasterize to YES on your CAShapeLayer, particularly if it is usually drawn at the same scale. If your app runs on high-DPI devices, you may also need to set rasterizationScale to match the layer’s contentsScale.
While rasterizing your shape can make it faster to move the shape around, you’ll probably want to temporarily disable rasterization while you’re animating the layer’s path or size.
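A minimal sketch of those two settings, assuming shapeLayer is the CAShapeLayer being dragged:
shapeLayer.shouldRasterize = YES;
// Match the screen scale so the cached bitmap stays sharp on Retina displays.
shapeLayer.rasterizationScale = [UIScreen mainScreen].scale;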

Enabling multi-touch in game?

I am creating a simple pong game in Xcode. I have a 2 player version and both paddles can be moved and function perfectly. The game is flawless except for the fact that you cannot move both paddles at the same time, rendering the 2 player mode useless.
How can I enable both paddles to be moved at the same time?
I've already tried selecting the "Multiple Touch" option in Interface Builder, but it does nothing, and I'm not quite sure it's even the correct route to enabling multi-touch the way I want.
Also, my game is a View-Based Application if that matters.
Thanks!
EDIT: I misread the question. Added a code snippet.
Here's the code I use to extract all touches when I get a touchesBegan event:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSArray *touchesArray = [touches allObjects];
    NSUInteger nNumTouches = [touchesArray count];
    UITouch *touch;
    CGPoint ptTouch;
    for (int nTouch = 0; nTouch < nNumTouches; nTouch++)
    {
        touch = [touchesArray objectAtIndex:nTouch];
        ptTouch = [touch locationInView:self.view];
        // Do stuff with the touch
    }
}
and similarly for touchesEnded and touchesMoved
From my experience (which isn't that much): you can create two UIViews for the two paddles; touching in one view will move one paddle, while touching in the other will move the other paddle. I hope this helps.
If you don't know how to split the views, you can simply make it identify two touches:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch1 = [[event touchesForView:zone1] anyObject];
    UITouch *touch2 = [[event touchesForView:zone2] anyObject];
    if (touch1)
        touchOffset1 = paddle1.center.x - [touch1 locationInView:touch1.view].x;
    if (touch2)
        touchOffset2 = paddle2.center.x - [touch2 locationInView:touch2.view].x;
}
You can use this; it probably isn't the most efficient, but it does work if you can't figure out how to split the touches.
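A hedged sketch of the matching touchesMoved:, assuming paddle1/paddle2 sit inside zone1/zone2 and touchOffset1/touchOffset2 were saved in touchesBegan: above:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch1 = [[event touchesForView:zone1] anyObject];
    UITouch *touch2 = [[event touchesForView:zone2] anyObject];
    if (touch1) {
        // Keep the finger's original offset so the paddle doesn't jump under the touch.
        CGFloat x = [touch1 locationInView:touch1.view].x + touchOffset1;
        paddle1.center = CGPointMake(x, paddle1.center.y);
    }
    if (touch2) {
        CGFloat x = [touch2 locationInView:touch2.view].x + touchOffset2;
        paddle2.center = CGPointMake(x, paddle2.center.y);
    }
}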
self.view.multipleTouchEnabled = YES;
By default it is NO.

Zoom in/out gesture UIView

I've been trying to capture a zoom in/out gesture in a UIView.
Code:
NSSet *allTouches = [event allTouches];
NSArray *twoTouches = [allTouches allObjects];
UITouch *first = [twoTouches objectAtIndex:0];
UITouch *second = [twoTouches objectAtIndex:1];
CGPoint firstPoint = [first locationInView:self];
CGPoint secondPoint = [second locationInView:self];
CGFloat initialDistance = distanceBetweenPoints(firstPoint, secondPoint);
I'm using the function distanceBetweenPoints. The problem is that firstPoint or secondPoint is always (0.00, 0.00), so the result ends up being just the value of the other one.
I need both points to be non-zero in order to obtain the real distance.
The frame of the view is (0, 0, 320, 417).
The functionality I'm developing is something like the zoom in/out of Google Maps.
Thanks,
The problem was that the UIView needed the attribute self.multipleTouchEnabled = YES;. Without it, multi-touch worked, but not very well.
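For reference, distanceBetweenPoints is not a UIKit function, so it has to be defined somewhere in the project; a minimal sketch:
static CGFloat distanceBetweenPoints(CGPoint a, CGPoint b) {
    // Plain Euclidean distance between the two touch locations.
    CGFloat dx = b.x - a.x;
    CGFloat dy = b.y - a.y;
    return sqrtf(dx * dx + dy * dy);
}
For a Google Maps-style zoom, a UIPinchGestureRecognizer attached to the view reports the pinch scale directly and avoids tracking the raw touches by hand.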
