Dragging two images using multi-touch - Xcode

I am fairly new to touch events and I have a problem. I use the code below to drag two images on screen. The code works, however when the second finger touches the screen the first movement stops, so the problem is related to multi-touch. I also do not know how to calculate the second touch's coordinates. I enabled multi-touch in the view and in both images. I would be grateful if somebody could help me move each image with its own finger.
Thanks in advance!
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if ([touch view] == image) {
        image.center = location;
    } else if ([touch view] == image2) {
        image2.center = location;
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}

I would recommend using a custom UIGestureRecognizer for that. It will give you a nice, encapsulated way to manage what you want. Dragging two images at the same time is, in essence, a gesture.
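If you would rather stay with the raw touch callbacks than write a gesture recognizer, a minimal sketch of the usual fix is to loop over every touch in the set instead of taking anyObject, so each finger only moves the view it started on:

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        // Each touch is associated with the view it landed on,
        // so finger 1 moves image and finger 2 moves image2 independently.
        CGPoint location = [touch locationInView:self.view];
        if (touch.view == image) {
            image.center = location;
        } else if (touch.view == image2) {
            image2.center = location;
        }
    }
}

Note that userInteractionEnabled and multipleTouchEnabled need to be on for both image views, which the question says is already the case.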

Related

Moving Sprites are not being selected via nodeAtPoint Method

So I have sprites which are generated randomly on the screen and are then given random movements via SKAction.
For example:
SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithImageNamed:@"Icicle.png"];
sprite.position = CGPointMake (randomX, 0); //randomX is a random integer between 0 and screen width
SKAction *action = [SKAction moveToY:-200 duration:2];
[sprite runAction:action];
[self addChild:sprite];
Now in my touches began method:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    SKNode *node = [self nodeAtPoint:location];
}
What I am seeing is that my nodes are not getting selected very accurately. The Icicle images have a buffer width of 20 pixels around all four sides.
Does being in the middle of an SKAction make node selection via tapping inaccurate?
Moreover, any suggestions on how to select nodes that are constantly moving via a touchesBegan method?
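One approach worth trying (just a sketch, not an answer from the original thread; the 20-point inset mirrors the buffer mentioned above) is to test the touch against each candidate node's accumulated frame, expanded a little, instead of relying on nodeAtPoint: alone:

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    for (SKNode *node in self.children) {
        // Grow the hit area so fast-moving sprites are easier to tap;
        // the 20-point inset is an assumption, tune it to taste.
        CGRect hitRect = CGRectInset([node calculateAccumulatedFrame], -20.0, -20.0);
        if (CGRectContainsPoint(hitRect, location)) {
            // handle the tapped sprite here
        }
    }
}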

How can I move multiple UIImage views around a view controller?

I have managed to use this code:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    printf("touch began --------- \n");
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    printf("touch moved --------- \n");
    UITouch *myTouch = [touches anyObject];
    startPoint = [myTouch locationInView:self.view];
    ball.center = CGPointMake(startPoint.x, startPoint.y);
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    printf("touch end --------- \n");
}
The code above lets me move one UIImage view around with touch; however, I want to be able to move hundreds around. Also, as the code stands, when you move your finger anywhere on the screen the image jumps to your finger.
Is this the best way to do it? Or is a pan gesture better?
Please help me make it so I can move multiple images around my view controller with touch, and if you use the code above, please stop the image from jumping!
Please help!
.H FILE FOR ANSWER REFERRAL:
@interface CMViewController : UIViewController {
    CGPoint startPoint;
}
@property CGPoint startPoint;
@property (strong, nonatomic) IBOutlet UIImageView *smyImageView;
@property (strong, nonatomic) IBOutlet UIImageView *smyImageView1;
@end
Thanks
You can use the code below. I used this for moving multiple images around the screen, and it works for me.
UIPanGestureRecognizer *span = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(onsPan:)];
[smyImageView addGestureRecognizer:span];
UIPanGestureRecognizer *span1 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(onsPan1:)];
[smyImageView1 addGestureRecognizer:span1];
Moving (Pan):
- (void)onsPan:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}

- (void)onsPan1:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
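Since the two handlers are identical, a single action method could serve both recognizers, because recognizer.view is always the image view the recognizer was added to. A minimal sketch (the selector name onAnyPan: is just an illustration):

- (void)onAnyPan:(UIPanGestureRecognizer *)recognizer {
    // recognizer.view is whichever image view this recognizer was attached to,
    // so the same handler works for any number of draggable images.
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}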
This will teach you everything you need to know for what you want to do.
http://www.raywenderlich.com/44270/sprite-kit-tutorial-how-to-drag-and-drop-sprites
This is another answer that may help you.

Dragging a sprite right and left?

I'm wondering how to do this. How can I make it so that when a user taps and HOLDS my sprite, they can drag it across the x-axis only, maintaining the same y value?
Follow this tutorial
http://www.raywenderlich.com/2343/how-to-drag-and-drop-sprites-with-cocos2d
Where it sets the position of the sprite, instead of setting the y position to that of the touch, just set it to a fixed value where you want it.
Wherever you handle your touches:
if (CGRectContainsPoint(sprite.boundingBox, touchLocation)) {
    sprite.position = ccp(touchLocation.x, sprite.position.y);
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *drag = [[event allTouches] anyObject];
    CGPoint currentPosition = [drag locationInView:self.view];
    // Keep the sprite's existing y so it only moves along the x-axis.
    currentPosition.y = Sprite.center.y;
    Sprite.center = currentPosition;
}
I created a slider like this: https://github.com/Antem/Custom-Slider-xCode

Can't figure out iPad landscape view coordinates

Being new to Objective-C but tinkering, I was trying out dragging on the iPad, and the following code from my viewcontroller.m works okay in portrait but not in landscape:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:mainView.superview];
    testViewToMove.center = location;
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}
mainView being the default view that was created with the app, and testViewToMove being a UIView I made for kicks.
In landscape, the coordinates don't seem to translate. I'm sure this is something obvious, but I don't get it. What's going on here?
I tried setting the view to landscape, and tinkering with some other settings to no avail... I also saw this, but to my own dismay I haven't been able to get it to work.
Seems I just needed to change mainView.superview to self.view in the first method, after applying the fix linked above.
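Based on that fix, the corrected first method would look something like this (sketch only):

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    // Ask for the location in the view controller's own view, whose
    // coordinate space matches the current interface orientation.
    CGPoint location = [touch locationInView:self.view];
    testViewToMove.center = location;
}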

Xcode: Removing Some Subviews from View

Greetings all,
I am a noob and I have been trying to work through this for a few days.
I am adding images to a view via UITouch. The view contains a background on top of which the new images are added. How do I clear the images I am adding from the view, without getting rid of the UIImageView that is the background? Any assistance is greatly appreciated. Thanks in advance.
Here is the code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSUInteger numTaps = [[touches anyObject] tapCount];
    if (numTaps == 2) {
        imageCounter.text = @"two taps registered";
        // remove images
        UIView *subview;
        while ((subview = [[self.view subviews] lastObject]) != nil)
            [subview removeFromSuperview];
        return;
    } else {
        UITouch *touch = [touches anyObject];
        CGPoint touchPoint = [touch locationInView:self.view];
        CGRect myImageRect = CGRectMake((touchPoint.x - 40), (touchPoint.y - 45), 80.0f, 90.0f);
        UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
        [myImage setImage:[UIImage imageNamed:@"pg6_dog_button.png"]];
        myImage.opaque = YES; // explicitly opaque for performance
        [self.view addSubview:myImage];
        [imagesArray addObject:myImage];
        [myImage release];
        NSUInteger arrayCount = [self.view.subviews count];
        viewArrayCount.text = [NSString stringWithFormat:@"%lu", (unsigned long)arrayCount];
        imageCount++;
        imageCounter.text = [NSString stringWithFormat:@"%d", imageCount];
    }
}
What you need is a way of distinguishing the added UIImageView objects from the background UIImageView. There are two ways I can think of to do this.
Approach 1: Assign added UIImageView objects a special tag value
Each UIView object has a tag property which is simply an integer value that can be used to identify that view. You could set the tag value of each added view to 7 like this:
myImage.tag = 7;
Then, to remove the added views, you could step through all of the subviews and only remove the ones with a tag value of 7:
for (UIView *subview in [self.view subviews]) {
    if (subview.tag == 7) {
        [subview removeFromSuperview];
    }
}
Approach 2: Remember the background view
Another approach is to keep a reference to the background view so you can distinguish it from the added views. Make an IBOutlet for the background UIImageView and assign it the usual way in Interface Builder. Then, before removing a subview, just make sure it's not the background view.
for (UIView *subview in [self.view subviews]) {
    if (subview != self.backgroundImageView) {
        [subview removeFromSuperview];
    }
}
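For reference, the backgroundImageView outlet used above could be declared in the view controller's interface like this:

@property (strong, nonatomic) IBOutlet UIImageView *backgroundImageView;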
A Swift version of approach #1 in a single functional line of code:
self.view.subviews.filter({$0.tag == 7}).forEach({$0.removeFromSuperview()})
