How can I move multiple UIImageViews around a view controller?
I have managed to get this working with the following code:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
    printf("touch began --------- \n");
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
    printf("touch moved --------- \n");
    UITouch *myTouch = [touches anyObject];
    startPoint = [myTouch locationInView:self.view];
    ball.center = CGPointMake(startPoint.x, startPoint.y);
}

-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
    printf("touch end --------- \n");
}
The code above lets me move one UIImageView around with touch, but I want to be able to move hundreds of them. Also, as the code stands, when you touch any area of the screen the image jumps to your finger.
Is this the best way to do it, or is a pan gesture better?
Please help me move multiple images around my view controller with touch, and if you build on the code above, please stop the image from jumping!
.h file for answer reference:
@interface CMViewController : UIViewController {
    CGPoint startPoint;
}
@property CGPoint startPoint;
@property (strong, nonatomic) IBOutlet UIImageView *smyImageView;
@property (strong, nonatomic) IBOutlet UIImageView *smyImageView1;
@end
Thanks
You can use the code below. I used this for moving multiple images around on screen, and it's working for me.
UIPanGestureRecognizer *span = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(onsPan:)];
[smyImageView addGestureRecognizer:span];

UIPanGestureRecognizer *span1 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(onsPan1:)];
[smyImageView1 addGestureRecognizer:span1];
Moving (Pan):
- (void)onsPan:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}

- (void)onsPan1:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
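If you need this for dozens or hundreds of image views, you don't need a separate action method for each one, because the recognizer hands you the view it is attached to. Here is a minimal sketch, assuming the image views are collected in a hypothetical imageViews array property:

// Somewhere in viewDidLoad (or wherever the views are created):
for (UIImageView *imageView in self.imageViews) {
    imageView.userInteractionEnabled = YES; // UIImageView ignores touches by default
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                          action:@selector(onPan:)];
    [imageView addGestureRecognizer:pan];
}

// One shared handler moves whichever view the finger landed on, with no jumping.
- (void)onPan:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointZero inView:self.view];
}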
This will teach you everything you need to know for what you want to do.
http://www.raywenderlich.com/44270/sprite-kit-tutorial-how-to-drag-and-drop-sprites
Here is another answer that may help you.
Related
I'm trying to rotate an image around in a circle using a simple single view application in Xcode. I have a circle image on the main storyboard and a button. I'm using the following code, but the circle image drifts down to the right and then back up to the left as it spins. I'm not sure what it's missing.
Thanks for your help.
ViewController.h
#import <UIKit/UIKit.h>
@interface ViewController : UIViewController
{
    IBOutlet UIImageView *theImageView;
    IBOutlet UIButton *theButton;
    NSTimer *theTimer;
    float angle;
    BOOL runStop;
}
@property (atomic, retain) IBOutlet UIImageView *theImageView;
@property (atomic, retain) IBOutlet UIButton *theButton;
-(IBAction)runRoulette:(id)sender;
@end
ViewController.m
#import "ViewController.h"
@interface ViewController ()
@end

@implementation ViewController
@synthesize theButton, theImageView;
- (void)viewDidLoad
{
    angle = 0;
    runStop = FALSE;
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

-(void)rotateRoulette
{
    theImageView.center = CGPointMake(self.theImageView.center.x, self.theImageView.center.y);
    theImageView.transform = CGAffineTransformMakeRotation(angle);
    angle += 0.001;
}

-(IBAction)runRoulette:(id)sender
{
    if (!runStop)
    {
        theTimer = [NSTimer scheduledTimerWithTimeInterval:1.0/600.0 target:self selector:@selector(rotateRoulette) userInfo:nil repeats:YES];
    }
    else
    {
        [theTimer invalidate];
        theTimer = nil;
    }
    runStop = !runStop;
}
@end
I found a very simple way to make my animation work how I wanted it to:
theImageView.center = CGPointMake(160.0, 364.0);
[UIView beginAnimations:nil context:NULL];
[UIView setAnimationDuration:5];
//theImageView.center=CGPointMake(self.theImageView.center.x, self.theImageView.center.y);
theImageView.transform = CGAffineTransformMakeRotation(M_PI/2.65);
[UIView commitAnimations];
It spins theImageView by M_PI/2.65 (the rotation amount) over a duration of 5 seconds.
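If you are on iOS 4 or later, the same animation can also be written with the block-based API, which is generally preferred over beginAnimations/commitAnimations. A minimal equivalent sketch:

theImageView.center = CGPointMake(160.0, 364.0);
// Same duration and rotation as above, just expressed as an animation block.
[UIView animateWithDuration:5.0 animations:^{
    theImageView.transform = CGAffineTransformMakeRotation(M_PI / 2.65);
}];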
I have a view controller and a UIView, and I would like to use KVO on a UIView property of type CGRect. For some reason it is not working as it should.
My code looks like this:
@interface MyView : UIView
@property (nonatomic, assign) CGRect myRect;
@end

@implementation MyView
@synthesize myRect;
...
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    ...
    // Changing the structure's members directly
    myRect.origin.x += moveVector.x;
    myRect.origin.y += moveVector.y;
    // Assigning a whole new structure
    [self setMyRect:CGRectMake(20, 20, 50, 50)];
    // Neither of these results in a call to the observer
}
I want to observe the CGRect from my view controller. I'm not sure how strict I should be with MVC here. By the way, the CGRect in MyView is just a visible square that I can move around, so I would say it's not a model and should stay in MyView. Please correct me if I am wrong.
Here is my ViewController:
@class MyView;

@interface MyViewController : UIViewController
@property (nonatomic, retain) MyView *overlayView;
@end

@implementation MyViewController
@synthesize overlayView;
- (void)viewDidLoad
{
    [super viewDidLoad];
    overlayView = [[MyView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
    [overlayView addObserver:self
                  forKeyPath:@"myRect"
                     options:(NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld)
                     context:NULL];
    /* [self addObserver:self
            forKeyPath:@"overlayView.myRect"
               options:(NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld)
               context:NULL]; */
    self.view = overlayView;
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"myRect"]) {
        NSLog(@"observeValueForKeyPath");
        CGRect rect = [[change valueForKey:NSKeyValueChangeNewKey] CGRectValue];
    }
}
observeValueForKeyPath: is never called, regardless of how I change myRect in MyView.
setMyRect: crashes at runtime and I don't really know why. I thought synthesize would take care of the setters and getters as well as the key-value change notifications.
I am also not sure which way of calling addObserver: is better; I commented out my second attempt to register an observer.
What am I doing wrong? Isn't it possible to observe structures, or only with hand-written setters and getters?
Could it be that UIView is not KVC-compliant? If so, why does this work:
Callback by KVO on a UIView frame property
Thanks for any help :-)
My solution is to use a delegate instead.
The controller is the delegate for the view, which I think is the way it should be;
otherwise UIView would have to be KVC-compliant.
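For reference, here is roughly what that delegate setup could look like. This is just a sketch; the protocol and method names (MyViewDelegate, myView:didChangeRect:) are mine, not from the original code:

@class MyView;

// A hypothetical delegate protocol that the view controller adopts.
@protocol MyViewDelegate <NSObject>
- (void)myView:(MyView *)view didChangeRect:(CGRect)rect;
@end

@interface MyView : UIView
@property (nonatomic, assign) id<MyViewDelegate> delegate; // non-retaining to avoid a cycle
@property (nonatomic, assign) CGRect myRect;
@end

// In MyView.m, notify the delegate whenever the rect changes:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // ... update myRect as before ...
    [self.delegate myView:self didChangeRect:self.myRect];
}

// MyViewController then adopts <MyViewDelegate>, sets overlayView.delegate = self,
// and implements myView:didChangeRect: instead of observeValueForKeyPath:.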
I am fairly new to touch events and I have a problem. I am using the code above to drag two images on screen. The code works, but when a second finger touches the screen the first movement stops. So the problem is related to multi-touch. I also don't know how to calculate the second touch's coordinates. I have enabled multi-touch in the view and in both images. I would be grateful if somebody could help me move each image with each finger.
Thanks in advance!
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if ([touch view] == image) {
        image.center = location;
    } else if ([touch view] == image2) {
        image2.center = location;
    }
}

-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}
I would recommend using a custom UIGestureRecognizer for that. It will give you a nice, encapsulated way to manage what you want. Dragging two images at the same time is, in essence, a gesture.
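If you would rather stay with raw touch handling than move to gesture recognizers, the usual fix (just a sketch, assuming the image/image2 outlets from the question) is to loop over every touch in the set instead of taking anyObject, so each finger keeps driving its own view:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Handle every active touch, not just one, so a second finger
    // doesn't stop the first image from moving.
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:self.view];
        if (touch.view == image) {
            image.center = location;
        } else if (touch.view == image2) {
            image2.center = location;
        }
    }
}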
Imagine that I drag a UIImageView in touchesMoved; how can I tell from code which direction it was dragged in?
You can store a temporary point from touchesBegan. Add an ivar in the "YourImageView" interface.
@interface YourImageView : UIImageView
{
    CGPoint previousPt;
    //Other iVars.
}
@end

@implementation YourImageView

- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Note: this is a UIView subclass, so there is no self.view;
    // use the (stationary) superview's coordinate space instead.
    previousPt = [[touches anyObject] locationInView:self.superview];
}

- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    const CGPoint p = [[touches anyObject] locationInView:self.superview];
    if (previousPt.x > p.x)
    {
        //Dragged right to left
    }
    else if (previousPt.x < p.x)
    {
        //Dragged left to right
    }
    else
    {
        //No horizontal movement
    }
    //Do the same thing for the y-direction.
    //Optionally set previousPt = p here to compare against the last event
    //instead of the original touch-down point.
}

@end
I would extract a helper method out of this, but I hope you get the idea.
If you subtract the old position from the new position of the ImageView, then you will get a direction vector. If you want a unit vector, just normalize it.
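As a rough sketch of that idea (oldPosition and newPosition are placeholder names for the two points you already have):

// Direction from the previous position to the new one.
CGPoint delta = CGPointMake(newPosition.x - oldPosition.x,
                            newPosition.y - oldPosition.y);
CGFloat length = sqrtf(delta.x * delta.x + delta.y * delta.y); // sqrtf comes from <math.h>
if (length > 0) {
    // Unit vector pointing in the drag direction.
    CGPoint direction = CGPointMake(delta.x / length, delta.y / length);
    // direction.x > 0 means a drag to the right; direction.y > 0 means downward (UIKit coordinates).
}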
Greetings all,
I am a noob and I have been trying to work through this for a few days.
I am adding images to a view via UITouch. The view contains a background on top of which the new images are added. How do I clear the images I am adding from the subviews, without getting rid of the UIImageView that is the background? Any assistance is greatly appreciated. Thanks in advance.
here is the code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSUInteger numTaps = [[touches anyObject] tapCount];
    if (numTaps == 2) {
        imageCounter.text = @"two taps registered";
        //__ remove images
        UIView *subview;
        while ((subview = [[self.view subviews] lastObject]) != nil)
            [subview removeFromSuperview];
        return;
    } else {
        UITouch *touch = [touches anyObject];
        CGPoint touchPoint = [touch locationInView:self.view];
        CGRect myImageRect = CGRectMake((touchPoint.x - 40), (touchPoint.y - 45), 80.0f, 90.0f);
        UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
        [myImage setImage:[UIImage imageNamed:@"pg6_dog_button.png"]];
        myImage.opaque = YES; // explicitly opaque for performance
        [self.view addSubview:myImage];
        [imagesArray addObject:myImage];
        [myImage release];
        NSUInteger arrayCount = [self.view.subviews count];
        viewArrayCount.text = [NSString stringWithFormat:@"%lu", (unsigned long)arrayCount];
        imageCount++;
        imageCounter.text = [NSString stringWithFormat:@"%d", imageCount];
    }
}
What you need is a way of distinguishing the added UIImageView objects from the background UIImageView. There are two ways I can think of to do this.
Approach 1: Assign added UIImageView objects a special tag value
Each UIView object has a tag property which is simply an integer value that can be used to identify that view. You could set the tag value of each added view to 7 like this:
myImage.tag = 7;
Then, to remove the added views, you could step through all of the subviews and only remove the ones with a tag value of 7:
for (UIView *subview in [self.view subviews]) {
    if (subview.tag == 7) {
        [subview removeFromSuperview];
    }
}
Approach 2: Remember the background view
Another approach is to keep a reference to the background view so you can distinguish it from the added views. Make an IBOutlet for the background UIImageView and assign it the usual way in Interface Builder. Then, before removing a subview, just make sure it's not the background view.
for (UIView *subview in [self.view subviews]) {
    if (subview != self.backgroundImageView) {
        [subview removeFromSuperview];
    }
}
A Swift version of approach #1, in a single functional line of code:
self.view.subviews.filter({$0.tag == 7}).forEach({$0.removeFromSuperview()})