UIScrollView loading an image - cocoa

I wanted to find an "easy" way to add pinch/zoom to my app, so I decided to use a UIScrollView. So far so good.
I load my image from an SQLite DB like so:
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    imageView.image = entity.Aattribute;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    self.title = @"Title";
    imageView = [[UIImageView alloc] initWithFrame:[UIScreen mainScreen].applicationFrame];
    imageView.autoresizingMask = UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth;
    imageView.contentMode = UIViewContentModeScaleAspectFit;
    imageView.backgroundColor = [UIColor blackColor];
    myScrollView.contentSize = CGSizeMake(imageView.frame.size.width, imageView.frame.size.height);
    myScrollView.maximumZoomScale = 4.0;
    myScrollView.minimumZoomScale = 0.75;
    myScrollView.clipsToBounds = YES;
    myScrollView.delegate = self;
    [myScrollView addSubview:imageView];
    self.view = myScrollView;
}

- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return imageView;
}
Any help would be appreciated! Thank you for your time!
EDIT:
I'm just going to answer my own question here! (I'm getting better at this! LOL)
Up above is the working code; I've edited it so anyone who needs this can refer to it.
Thanks for the answers!

This looks mostly correct. A couple of notes:
you've misspelled "viewDidLoad". This could just be a side effect of typing it into the browser window, but if it's wrong in your actual code, it's certainly not helping.
viewForZoomingInScrollView: is expecting you to return a UIView. myImage sounds suspiciously unlike a UIView. You should be returning the imageView instance variable. That's what's getting moved around.
Edit
From the documentation:
The UIScrollView class can have a delegate that must adopt the UIScrollViewDelegate protocol. For zooming and panning to work, the delegate must implement both viewForZoomingInScrollView: and scrollViewDidEndZooming:withView:atScale:; in addition, the maximum (maximumZoomScale) and minimum (minimumZoomScale) zoom scale must be different.
So it looks like you still need to implement another delegate method.
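On SDKs that enforce this, even an empty implementation alongside viewForZoomingInScrollView: is enough. A minimal sketch:
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView
                       withView:(UIView *)view
                        atScale:(float)scale {
    // No post-zoom work needed here; the method just has to exist so the
    // scroll view knows the delegate fully supports zooming.
}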

xcode UITapGestureRecognizer on scrollview not calling until second tap

I have the following code to dismiss the keyboard if the user taps the background. It works fine if the scrollview is at the CGPointZero position, but if the user scrolls the view and then selects the textview, it doesn't call the "dismissKeyboard" method until the second background tap.
The first tap (for some reason) moves the scrollview offset so that the bottom of the scrollview frame aligns with the bottom of the screen. The second tap dismisses the keyboard and runs the code below. I know it has to do with the scrollview. Any help would be appreciated.
Thanks
- (void)viewDidLoad {
    [super viewDidLoad];
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(dismissKeyboard)];
    tapGesture.cancelsTouchesInView = NO;
    [_scrollView addGestureRecognizer:tapGesture];
}

- (void)dismissKeyboard {
    [self.view endEditing:YES];
}

- (void)keyboardWasShown:(NSNotification *)notification {
    scrollViewRect = _scrollView.contentOffset.y;
    NSDictionary *info = [notification userInfo];
    CGSize keyboardSize = [[info objectForKey:UIKeyboardFrameBeginUserInfoKey] CGRectValue].size;
    keyboardSize.height += 10;
    CGFloat viewBottom = CGRectGetMaxY(self.scrollView.frame);
    if ([_itemNotes isFirstResponder]) {
        CGFloat notesBottom = CGRectGetMaxY(_itemNotes.frame);
        viewBottom -= notesBottom;
        if (viewBottom < keyboardSize.height) {
            keyboardSize.height -= viewBottom;
            CGPoint scrollPoint = CGPointMake(0.0, keyboardSize.height);
            [self.scrollView setContentOffset:scrollPoint animated:YES];
        }
        else {
            [self.scrollView setContentOffset:CGPointZero animated:YES];
        }
    }
    else {
        [self.scrollView setContentOffset:CGPointZero animated:YES];
    }
}

- (void)keyboardWillBeHidden:(NSNotification *)notification {
    CGPoint scrollPoint = CGPointMake(0.0, scrollViewRect);
    [self.scrollView setContentOffset:scrollPoint animated:YES];
}
EDIT:
So I figured out a solution, but it seems like there must be a better way to handle this. The problem was that I was setting the contentOffset of the scrollView so that the content extended beyond the screen boundaries. Thus the first tap was moving the scrollView contentOffset back within the screen boundaries, and the second was performing the tap gesture. I will post my solution below, hoping that someone has a better answer.
I would recommend setting
_scrollView.layer.borderColor = [UIColor redColor].CGColor;
_scrollView.layer.borderWidth = 1;
This will show you exactly where your scrollview boundaries are, which may not be where you think they are, or may be covered by something else. Also, when I open the keyboard, I generally set the bottom of the scrollview frame to the top of the keyboard; otherwise, you may have content below the keyboard that you can't get to. Not sure if this is exactly related to your issues.
I am assuming there must be a better solution, but I was able to solve the issue by extending the contentSize when the keyboard is displayed and then shrinking it back down when the keyboard is hidden.
Set a float (scrollViewHeight) to hold the original content size for the reset.
//add this right before setting the content offset
scrollViewHeight = _scrollView.contentSize.height;
_scrollView.contentSize = CGSizeMake(_scrollView.frame.size.width, scrollViewHeight + keyboardSize.height);
//add this right before resetting the content offset
_scrollView.contentSize = CGSizeMake(_scrollView.frame.size.width, scrollViewHeight);
It really seems like there must be a better way that I'm not aware of. I will have to go back through the documentation to see if there is another way.
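For what it's worth, one common alternative (a sketch along the lines of Apple's keyboard-management sample code, assuming the same notification registrations as above) is to adjust contentInset instead of contentSize, which leaves the content itself untouched:
- (void)keyboardWasShown:(NSNotification *)notification {
    // Use the keyboard's end frame and pad the bottom inset by its height.
    CGSize kbSize = [[[notification userInfo] objectForKey:UIKeyboardFrameEndUserInfoKey] CGRectValue].size;
    UIEdgeInsets insets = UIEdgeInsetsMake(0.0, 0.0, kbSize.height, 0.0);
    _scrollView.contentInset = insets;
    _scrollView.scrollIndicatorInsets = insets;
}

- (void)keyboardWillBeHidden:(NSNotification *)notification {
    _scrollView.contentInset = UIEdgeInsetsZero;
    _scrollView.scrollIndicatorInsets = UIEdgeInsetsZero;
}
Because the scroll view can then scroll its existing content above the keyboard on its own, there is no contentSize bookkeeping to undo later.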

UIImage Animation Reverting Back To Original Position

I'm pretty new to coding, but I'm starting to get the hang of the basics.
How do I make an image stay in its new position after an animation?
Example:
I'm giving an animating object a random position; however, the object doesn't animate at the random position, but instead at the position it was given in the view controller. This also happens when I animate a completely different object.
Code I used:
int Random1x;
int Random1y;
IBOutlet UIButton *Start;
IBOutlet UIImageView *Object2;

- (void)ObjectMoving;
- (void)Object2Animate;

- (IBAction)Start:(id)sender {
    [self ObjectMoving];
    [self Object2Animate];
}

- (void)Object2Animate {
    Object2.animationImages = [NSArray arrayWithObjects:
                               [UIImage imageNamed:@"2.png"],
                               [UIImage imageNamed:@"3.png"],
                               [UIImage imageNamed:@"4.png"],
                               [UIImage imageNamed:@"1.png"], nil];
    Object2.animationDuration = .5;
    [Object2 setAnimationRepeatCount:0];
    [Object2 startAnimating];
}

- (void)ObjectMoving {
    Random1y = arc4random() % 466;
    Random1y = Random1y + 60;
    Random1x = arc4random() % 288;
    Object2.center = CGPointMake(Random1x, Random1y);
}
I'd greatly appreciate help, thank you!
If you go to your storyboard file, click on the View Controller, and then open the file inspector, you will see a checkbox for Auto Layout; try unchecking it, since Auto Layout will snap the view back to its constrained position on the next layout pass.
Post back if that worked.
If you do need to use Auto Layout, then you would have to figure out a different way of moving the image.
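If Auto Layout has to stay on, the usual approach is to position the view with constraints and move it by changing their constants. A sketch, assuming two hypothetical constraint outlets (objectTop, objectLeading) wired to the image view's top and leading edges:
// IBOutlet NSLayoutConstraint *objectTop;     // image view's top constraint
// IBOutlet NSLayoutConstraint *objectLeading; // image view's leading constraint
- (void)ObjectMoving {
    // Constants position the view's top-left corner rather than its center.
    objectTop.constant = arc4random() % 466 + 60;
    objectLeading.constant = arc4random() % 288;
    [self.view layoutIfNeeded]; // apply the new constraints immediately
}
Because the position now lives in the constraints, the layout engine no longer fights the move.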

thumbnailImageAtTime: now deprecated - What's the alternative?

Until the iOS 7 update I was using...
UIImage *image = [moviePlayer thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
...with great success, so that my app could show a still of the video that the user had just taken.
I understand this method is deprecated as of iOS 7, and I need an alternative. I see there's a method
- (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option
but how do I return the image from it so I can place it within the videoReview button image?
Thanks in advance, Jim.
EDIT: after trying the notification centre method.
I used the following code:
[moviePlayer requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionNearestKeyFrame];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
I made the NSArray times of two NSNumber objects, 1 and 2.
I then tried to capture the notification in the following method:
-(void)MPMoviePlayerThumbnailImageRequestDidFinishNotification:(NSDictionary *)info {
    UIImage *image = [info objectForKey:MPMoviePlayerThumbnailImageKey];
}
I then proceeded to use this thumbnail image as the button image as a preview... but it didn't work.
If you can see from my code where I've gone wrong, your help would be appreciated again. Cheers.
I managed to find a great way using AVAssetImageGenerator; please see the code below...
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:partOneUrl options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];
[_firstImage setImage:one];
_firstImage.contentMode = UIViewContentModeScaleAspectFit;
In the header file, please import:
#import <AVFoundation/AVFoundation.h>
It works perfectly, and I've been able to call it from viewDidLoad, which was quicker than calling the deprecated thumbnailImageAtTime: from viewDidAppear.
Hope this helps anyone else who had the same problem.
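If you need several thumbnails, or want the decoding off the main thread, AVAssetImageGenerator also offers an asynchronous API. A sketch reusing generate1 and _firstImage from above:
[generate1 generateCGImagesAsynchronouslyForTimes:@[[NSValue valueWithCMTime:CMTimeMake(1, 2)]]
                                completionHandler:^(CMTime requestedTime, CGImageRef imageRef,
                                                    CMTime actualTime, AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *thumb = [UIImage imageWithCGImage:imageRef];
        dispatch_async(dispatch_get_main_queue(), ^{
            [_firstImage setImage:thumb]; // update UI on the main thread
        });
    }
}];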
Update for Swift 5.1
Useful function...
func createThumbnailOfVideoUrl(url: URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let assetImgGenerate = AVAssetImageGenerator(asset: asset)
    assetImgGenerate.appliesPreferredTrackTransform = true
    let time = CMTimeMakeWithSeconds(1.0, preferredTimescale: 600)
    do {
        let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
        let thumbnail = UIImage(cgImage: img)
        return thumbnail
    } catch {
        print(error.localizedDescription)
        return nil
    }
}
The requestThumbnailImagesAtTimes:timeOption: method will post a MPMoviePlayerThumbnailImageRequestDidFinishNotification notification when an image request completes. Your code that needs the thumbnail image should subscribe to this notification using NSNotificationCenter, and use the image when it receives the notification.
The problem is that you have to specify float values in requestThumbnailImagesAtTimes.
For example, this will work:
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14.f] timeOption:MPMovieTimeOptionNearestKeyFrame];
but this won't work:
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14] timeOption:MPMovieTimeOptionNearestKeyFrame];
The way to do it, at least in iOS 7, is to use floats for your times:
NSNumber *timeStamp = @1.f;
[moviePlayer requestThumbnailImagesAtTimes:@[timeStamp] timeOption:MPMovieTimeOptionNearestKeyFrame];
Hope this helps
Jeely provides a good workaround, but it requires an additional library that isn't necessary when MPMoviePlayer already provides functions for this task. I also noticed a syntax error in the original poster's code: the thumbnail notification handler receives an object of type NSNotification, not a dictionary. Here's a corrected example:
-(void)MPMoviePlayerThumbnailImageRequestDidFinishNotification:(NSNotification *)note
{
    NSDictionary *userInfo = [note userInfo];
    UIImage *image = (UIImage *)[userInfo objectForKey:MPMoviePlayerThumbnailImageKey];
    if (image != nil)
        [thumbView setImage:image];
}
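One small addition: since the observer is registered with addObserver:selector:name:object:, it should also be removed when the controller goes away, e.g. in dealloc or viewWillDisappear::
[[NSNotificationCenter defaultCenter] removeObserver:self
                                                name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
                                              object:moviePlayer];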
I've just looked for a solution to this problem myself and got good help from your question.
I got your code above to work with one small change: I removed a colon.
Change
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
to
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
This got it working nicely. Also, I've found that you can't call a method that relies on NotificationCenter if you're already inside a notification selector. Something I tried at first was calling requestThumbnailImagesAtTimes inside the notification selector for MPMoviePlayerPlaybackDidFinishNotification, which won't work; I think it's because the second notification won't fire.
The code in Swift 2.1 would look like this:
do {
    let asset1 = AVURLAsset(URL: url)
    let generate1: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset1)
    generate1.appliesPreferredTrackTransform = true
    let time: CMTime = CMTimeMake(3, 1) // to catch the third second of the video
    let oneRef: CGImageRef = try generate1.copyCGImageAtTime(time, actualTime: nil)
    let resultImage = UIImage(CGImage: oneRef)
}
catch let error as NSError {
    print(error)
}

image cropping an AVCaptureSession Image

So I have been at it all day with no luck, and it has been, needless to say, quite frustrating. I have looked up many examples and downloadable categories, all of which tout being able to crop images flawlessly, and they do. However, the minute I try to do it with an image generated via AVCaptureSession, it doesn't work as well. I consulted both these sources:
http://codefuel.wordpress.com/2011/04/22/image-cropping-from-a-uiscrollview/
http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
The project from the first link seems to work exactly as advertised, but as soon as I hack it to do the same magic on an AV capture image... nope.
Does anyone have insight into this? Also, here is my code for reference:
- (IBAction)TakePhotoPressed:(id)sender
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    //NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            //NSLog(@"attachments: %@", exifAttachments);
        }
        else
            NSLog(@"no attachments");
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        NSLog(@"%f", image.size.width);
        NSLog(@"%f", image.size.height);
        float scale = 1.0f / _scrollView.zoomScale;
        CGRect visibleRect;
        visibleRect.origin.x = _scrollView.contentOffset.x * scale;
        visibleRect.origin.y = _scrollView.contentOffset.y * scale;
        visibleRect.size.width = _scrollView.bounds.size.width * scale;
        visibleRect.size.height = _scrollView.bounds.size.height * scale;
        UIImage *cropped = [self cropImage:image withRect:visibleRect];
        [croppedImage setImage:cropped];
        [image release];
    }];
    [croppedImage setHidden:NO];
}
The cropImage function used above:
- (UIImage *)cropImage:(UIImage *)originalImage withRect:(CGRect)rect
{
    CGRect transformedRect = rect;
    // Only the UIImageOrientationRight case is handled here; camera images
    // in other orientations would need their own coordinate transforms.
    if (originalImage.imageOrientation == UIImageOrientationRight)
    {
        transformedRect.origin.x = rect.origin.y;
        transformedRect.origin.y = originalImage.size.width - (rect.origin.x + rect.size.width);
        transformedRect.size.width = rect.size.height;
        transformedRect.size.height = rect.size.width;
    }
    CGImageRef cr = CGImageCreateWithImageInRect(originalImage.CGImage, transformedRect);
    UIImage *cropped = [UIImage imageWithCGImage:cr scale:originalImage.scale orientation:originalImage.imageOrientation];
    [croppedImage setFrame:CGRectMake(croppedImage.frame.origin.x,
                                      croppedImage.frame.origin.y,
                                      cropped.size.width,
                                      cropped.size.height)];
    CGImageRelease(cr);
    return cropped;
}
I am also tempted, in the interest of arming whoever might help me with as much information as possible, to post the init code for my scrollView and AVCapture session. That may be a bit much, though, so if you want to see it, just ask.
Now, as for what the code actually does:
[Screenshot: what it looks like before I take the picture]
[Screenshot: and after]
EDIT:
Well, I have a few views now and no comments, so either no one has figured it out or it's so simple they assumed I would figure it out on my own. In any case, I have not made any progress. So for anyone interested, here is a small sample app with the code all set up so you can see what I am doing:
https://docs.google.com/open?id=0Bxr4V3a9QFM_NnoxMkhzZTVNVEE
It seems that this little conundrum didn't only have me stumped: after nearly a week, the scant few who viewed my question had no suggestions either. I must say that for this particular problem I could not get the scrollview approach to work; I pondered, tinkered, and mused for a while to no avail. Until I did this:
[self HideElements];
UIGraphicsBeginImageContext(chosenPhotoView.frame.size);
[chosenPhotoView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self ShowElements];
And that's it. Less code, and it worked pretty much instantly. So instead of trying to crop the image via the scrollview, I take a screenshot of the screen at that moment, then crop it using the scrollview's frame variables. The hide/show element functions hide any elements overlapping the picture I want.
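For reference, cropping the snapshot down to the scroll view's visible region could look something like this (a sketch; it assumes _scrollView sits directly inside chosenPhotoView, so the scroll view's frame is in the snapshot's coordinate space):
// With plain UIGraphicsBeginImageContext the snapshot is at scale 1.0,
// so view coordinates map straight onto image pixels.
CGImageRef croppedRef = CGImageCreateWithImageInRect(viewImage.CGImage, _scrollView.frame);
UIImage *croppedShot = [UIImage imageWithCGImage:croppedRef];
[croppedImage setImage:croppedShot];
CGImageRelease(croppedRef);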

CGGradient isn't visible (not using interface builder) and UIButtons can't be triggered

I have created a view that contains a CGGradient:
// Bar ContextRef
CGRect bar = CGRectMake(0, screenHeight-staffAlignment, screenWidth, barWidth);
CGContextRef barContext = UIGraphicsGetCurrentContext();
CGContextSaveGState(barContext);
CGContextClipToRect(barContext,bar);
// Bar GradientRef
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGFloat components[16] = { 1.0,1.0,1.0,0.0, 0.0,0.0,0.0,1.0, 0.0,0.0,0.0,1.0, 1.0,1.0,1.0,0.0};
CGFloat locations[4] = {0.95,0.85,0.15,0.05};
size_t count = 4;
CGGradientRef gradientRef = CGGradientCreateWithColorComponents(colorSpace, components, locations, count);
// Draw Bar
CGPoint startPoint = {0.0,0.0};
CGPoint endPoint = {screenWidth,0.0};
CGContextDrawLinearGradient(barContext, gradientRef, startPoint, endPoint, 0);
CGContextRestoreGState(barContext);
This code is called in the drawRect method of the UIView. I then use a UIViewController to access the created view.
- (void)loadView {
MainPageView *mpView = [[MainPageView alloc] initWithFrame:[window bounds]];
[self setView:mpView];
[mpView release];
}
and displayed on the screen through the appDelegate:
mpViewController = [[MainPageViewController alloc] init];
[window addSubview:[mpViewController view]];
[window makeKeyAndVisible];
The UIView contains more objects, such as UIButtons, that are visible. I am assuming that's because they are added as subviews. But I can't work out how to add the CGGradient as a subview. Does it need to be? Is there another reason the CGGradient is not visible?
I also don't get any functionality on the UIButtons. I guess that's because of where I've added the UIButtons to the view. Do the buttons need to be added in the UIViewController or the appDelegate to have functionality? Sorry to ask what seem like simple questions, but I am trying to accomplish the programming without Interface Builder, and material on that is scarce. If anyone could point me in the right direction on both these problems, I would really appreciate it.
Thanks!
The functionality on the buttons was lost because the frame was too large; the buttons were still visible because the background was clearColor.
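For anyone else working without Interface Builder, wiring a button up entirely in code looks like this (a sketch; buttonTapped: is a hypothetical action method on the view controller). Note that touches are only delivered to a button that lies within its superview's bounds, and an invisible oversized view laid over the buttons will swallow taps, which is why a too-large clearColor frame silently breaks them:
UIButton *button = [UIButton buttonWithType:UIButtonTypeRoundedRect];
button.frame = CGRectMake(20.0, 40.0, 120.0, 44.0); // keep the frame inside the parent view
[button setTitle:@"Tap me" forState:UIControlStateNormal];
[button addTarget:self
           action:@selector(buttonTapped:)  // hypothetical action on the view controller
 forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:button];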
