How to update an image from a URL in an OS X app? - macos

I have a problem. I have code that updates a song name and a picture from a PHP script. The song name works and updates, but the picture does not; the PHP file works fine, but in my project it does not. How can I make the picture update from the URL, for example every 10 seconds? Thanks.
-(void)viewWillDraw {
    NSURL *artistImageURL = [NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?image"];
    NSImage *artistImage = [[NSImage alloc] initWithContentsOfURL:artistImageURL];
    [dj setImage:artistImage];

    dispatch_queue_t queue = dispatch_get_global_queue(0, 0);
    dispatch_async(queue, ^{
        NSError *error = nil;
        NSString *text = [NSString stringWithContentsOfURL:[NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?artist"]
                                                  encoding:NSASCIIStringEncoding
                                                     error:&error];
        dispatch_async(dispatch_get_main_queue(), ^{
            [labelName setStringValue:text];
        });
    });
}

You should really consider placing this code someplace other than -viewWillDraw. This routine can be called multiple times for the same NSView under some circumstances and, more importantly, you need to call [super viewWillDraw] to make sure that things will actually draw correctly (if anything is drawn in the view itself).
For periodic updates (such as every 10 seconds), you should consider using NSTimer to trigger the retrieval of the next object.
As for the general question of why your image isn't being drawn correctly, you should probably consider putting the image retrieval and drawing code into the same structure as your label retrieval and drawing code. This will get the [dj setImage:artistImage] call outside of the viewWillDraw chain, which is likely causing some difficulty here.
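A minimal sketch of that approach, assuming the code moves out of -viewWillDraw and into something like -awakeFromNib on the object that owns the dj image view; the updateTimer ivar and the refreshArtwork method name are placeholders, and the URL and 10-second interval are taken from the question:
- (void)awakeFromNib {
    [super awakeFromNib];
    // Fire every 10 seconds; keep a reference so the timer can be invalidated later.
    updateTimer = [NSTimer scheduledTimerWithTimeInterval:10.0
                                                   target:self
                                                 selector:@selector(refreshArtwork)
                                                 userInfo:nil
                                                  repeats:YES];
    [self refreshArtwork]; // fetch once right away
}

- (void)refreshArtwork {
    NSURL *artistImageURL = [NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?image"];
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        // Do the network fetch off the main thread.
        NSImage *artistImage = [[NSImage alloc] initWithContentsOfURL:artistImageURL];
        dispatch_async(dispatch_get_main_queue(), ^{
            // AppKit views should only be touched on the main thread.
            if (artistImage) {
                [dj setImage:artistImage];
            }
        });
    });
}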

Related

Render a CVPixelBuffer to an NSView (macOS)

I have a CVPixelBuffer that I'm trying to efficiently draw on screen.
The inefficient way of turning it into an NSImage works but is very slow, dropping about 40% of my frames.
Therefore, I've tried rendering it on screen using CIContext's drawImage:inRect:fromRect:. The CIContext was initialized with an NSOpenGLContext whose view was set to my VC's view. When I have a new image, I call the drawImage method, which doesn't spit out any errors... but doesn't display anything on screen either (it did log errors when my contexts were not correctly set up).
I've tried to find an example of how this is done on macOS, but everything seems to be for iOS nowadays.
EDIT:
Here's some of the code I am using; I've left out irrelevant sections.
In viewDidLoad I initialize the GL and CI contexts:
NSOpenGLPixelFormatAttribute pixelFormatAttr[] = {
    kCGLPFAAllRenderers, 0
};
NSOpenGLPixelFormat *glPixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:pixelFormatAttr];
NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:glPixelFormat shareContext:nil];
glContext.view = self.view;
self.ciContext = [CIContext contextWithCGLContext:glContext.CGLContextObj pixelFormat:glPixelFormat.CGLPixelFormatObj colorSpace:nil options:nil];
Then, when a new frame is ready, I do:
dispatch_async(dispatch_get_main_queue(), ^{
    [vc.ciContext drawImage:ciImage inRect:vc.view.bounds fromRect:ciImage.extent];
    vc.isRendering = NO;
});
I am not sure I'm calling draw in the right place, but I can't seem to find out where this is supposed to go.
If the CVPixelBuffer has the kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey attribute, the backing IOSurface (retrieved via CVPixelBufferGetIOSurface) can be passed directly to the contents property of a CALayer.
This is probably the most efficient way to display a CVPixelBuffer.
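A minimal sketch of that path, assuming the pixel buffers were created with the compatibility attribute and that videoLayer is a placeholder for a layer you already host in a layer-backed view:
#import <QuartzCore/QuartzCore.h>
#import <CoreVideo/CoreVideo.h>

// When creating the buffers, request:
// @{ (id)kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey : @YES }

- (void)displayPixelBuffer:(CVPixelBufferRef)pixelBuffer inLayer:(CALayer *)videoLayer {
    IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
    if (surface == NULL) {
        return; // the buffer is not IOSurface-backed
    }
    [CATransaction begin];
    [CATransaction setDisableActions:YES]; // avoid implicit fade animations on every frame
    videoLayer.contents = (__bridge id)surface;
    [CATransaction commit];
}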

AVAudioEngine incorrect time management and callback for AVAudioPlayerNode

I have a serious issue with the new audio engine in iOS 8. I have an application built with AVAudioPlayer, and I am trying to figure out a way to migrate to the new architecture; however, I bumped into the following problem (which, I'm sure you would agree, is a serious and basic obstacle):
My header file:
AVAudioEngine *engine;
AVAudioMixerNode *mainMixer;
AVAudioPlayerNode *player;
My .m file (inside viewDidLoad):
engine = [[AVAudioEngine alloc] init];
player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];

NSURL *fileUrl = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp3"]];
AVAudioFile *file = [[AVAudioFile alloc] initForReading:fileUrl error:nil];
NSLog(@"duration: %.2f", file.length / file.fileFormat.sampleRate);

[player scheduleFile:file atTime:nil completionHandler:^{
    AVAudioTime *nodeTime = player.lastRenderTime;
    AVAudioTime *playerTime = [player playerTimeForNodeTime:nodeTime];
    float secs = (float)playerTime.sampleTime / file.fileFormat.sampleRate;
    NSLog(@"finished at: %.2f", secs);
}];

mainMixer = [engine mainMixerNode];
[engine connect:player to:mainMixer format:file.processingFormat];
[engine startAndReturnError:nil];
[player play];
The above code initializes the engine and a node, then starts playing back whatever file I'm using. First it prints out the duration of the music file; then, after playback finishes, the callback function prints the current time of the player. These two should be the same, or in the worst case very, very close to each other, but this is not the case; the difference between the two values is very big, e.g.
duration: 148.51
finished at: 147.61
Am I doing something wrong? This should be fairly straightforward; I've tried different file formats, file lengths, and tens of music files, but the difference is always around or just under 1 second.
Update:
As of iOS 11 you can specify the completionCallbackType (AVAudioPlayerNodeCompletionCallbackType):
dataConsumed:
A completion handler indicating that the buffer or file data has been consumed by the player.
dataRendered:
A completion handler indicating that the buffer or file data has been rendered by the player.
dataPlayedBack:
A completion handler indicating that the buffer or file has finished playing.
More info: Documentation
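A minimal sketch of how the question's scheduling call might look with that iOS 11+ variant, assuming you want the handler to fire only once the audio has actually been played out rather than merely rendered:
[player scheduleFile:file
              atTime:nil
completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
   completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // Called after playback has finished, not just after the data was consumed or rendered.
    AVAudioTime *playerTime = [player playerTimeForNodeTime:player.lastRenderTime];
    NSLog(@"finished at: %.2f", (float)playerTime.sampleTime / file.fileFormat.sampleRate);
}];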
Original:
According to Apple's documentation for scheduleFile:atTime:completionHandler:
It is possible for the completionHandler to be called before rendering
begins or before the file is played completely.

AFNetworking and multiple UIImageViews pulling from same URL

I have an issue where I'm loading 3, sometimes 4, of the same image using
[imageFile setImageWithURL:[NSURL URLWithString:friendAvatar] placeholderImage:[UIImage imageNamed:@"defaultProfileImage.png"]];
I'm trying to see if there's a way to load this into some kind of NSData and use it later on, kind of like what I'm doing below, but using AFNetworking.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void) {
    NSURL *url3 = [NSURL URLWithString:friendAvatar];
    NSData *data = [NSData dataWithContentsOfURL:url3];
    UIImage *img = [[UIImage alloc] initWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^(void) {
        imageFile.image = img;
        bgImageFile.image = img;
    });
});
Also, I'm not calling all the image loads in the same method: two are called in cellForRowAtIndexPath: once the user's friends list has been populated, the third is loaded when I swipe over a cell to show its hidden (under) layer, and the fourth repeat image is loaded when pressing a button that appears once the cell has been swiped, which leads to a chat room view between me and that friend.
Hopefully that makes clear what I'm trying to achieve; any help pointing me in the right direction is very much appreciated.
Update:
This is my current code; this is what I mean by pulling the same image several times.
Inside cellForRowAtIndexPath:
[imageFile setImageWithURLRequest:request placeholderImage:[UIImage imageNamed:@"defaultProfileImage.png"]];
[bgImageFile setImageWithURL:[NSURL URLWithString:friendAvatar] placeholderImage:[UIImage imageNamed:@"defaultProfileImage.png"]];
The bottomDrawerWillAppear method that is called contains:
UIImageView *drawerBGImg = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 75)];
NSString *friendAvatar = [NSString stringWithFormat:@"%@%@%@", @"http://v9a2a7.com/user_photos/", [MyClass friendID], @".jpg"];
[drawerBGImg setImageWithURL:[NSURL URLWithString:friendAvatar]];
And in a separate class and separate view, viewMessageViewController:
NSString *friendAvatar = [NSString stringWithFormat:@"%@%@%@", @"http://v9a2a7.com/user_photos/", email, @".jpg"];
[bgImage setImageWithURL:[NSURL URLWithString:friendAvatar]];
I have not confirmed that viewMessageViewController forces the image to be pulled from the server, but I know for a fact that cellForRowAtIndexPath: makes 3 requests for the same image, which results in 3x the data usage.
Hope this clears things up.
It sounds like you want to cache the image so you don't have to keep loading it. Assuming that's what you mean...
The AFNetworking method [UIImageView -setImageWithURL:placeholderImage:] already caches this image for you. The second time you call it, the image will be loaded from the cache.
The only reason it would get reloaded from the server a second time is if your app receives a low memory warning since the last download. (AFImageCache, an NSCache subclass, will automatically invalidate some or all of the cache.)
It uses the URL as the key, so as long as the URL is identical, the image will only get loaded from the server once.
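For the case where several views need the same image at the same time, a minimal sketch (assuming AFNetworking 2.x's UIImageView+AFNetworking category and the question's own imageFile and bgImageFile views) is to make one request and hand the decoded UIImage to the other views in the success block:
NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:friendAvatar]];
UIImage *placeholder = [UIImage imageNamed:@"defaultProfileImage.png"];
[imageFile setImageWithURLRequest:request
                 placeholderImage:placeholder
                          success:^(NSURLRequest *request, NSHTTPURLResponse *response, UIImage *image) {
                              // When a custom success block is supplied, the image is not assigned automatically.
                              imageFile.image = image;
                              bgImageFile.image = image; // reuse the same decoded image, no second request
                          }
                          failure:nil];
Any later setImageWithURL: call with the same URL (the drawer or the chat view, for example) should then be served from AFImageCache rather than the network.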

Animating an NSStatusItemView

Struggling with this one. I have a custom NSStatusItemView that I'm trying to animate. I've added the following code to my status item view to kick off the animation:
- (void)setAnimated
{
    CABasicAnimation *anim = [CABasicAnimation animationWithKeyPath:@"opacity"];
    anim.duration = 1.0;
    anim.repeatCount = HUGE_VALF;
    anim.autoreverses = YES;
    anim.fromValue = [NSNumber numberWithFloat:1.0];
    anim.toValue = [NSNumber numberWithFloat:0.0];
    [self.layer addAnimation:anim forKey:@"animateOpacity"];
    [self setWantsLayer:YES];
    [self setNeedsDisplay:YES];
}
When I call this method, nothing happens. Yet if I move this code to my drawRect: method, the view animates properly at launch. I'm not entirely sure what I need to do to tell it to start animating after the fact, but the above method is not doing it and I have no idea why. Any ideas?
OK, answering myself so Google has a record of the answer!
The problem was, kind of, a lack of understanding of drawRect:. When my setAnimated method calls setNeedsDisplay:, drawRect: runs again, effectively undoing what was done in the setAnimated method.
There were two things I did to properly fix this. First, I modified the setAnimated method to accept a BOOL argument and set an isAnimated property on the view to that value. Then, in drawRect:, I check this BOOL value and apply the animation if it is set to YES.
Secondly, it seems you need to call [self setWantsLayer:YES] the first time the view is drawn. So I call this in drawRect: the very first time it runs, so that later animation will work.
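Putting that together, a minimal sketch of the fix; isAnimated is assumed to be a declared BOOL property as described above, and the animation key matches the original code:
- (void)setAnimated:(BOOL)animated
{
    self.isAnimated = animated;
    [self setNeedsDisplay:YES]; // triggers drawRect:, which applies or removes the animation
}

- (void)drawRect:(NSRect)dirtyRect
{
    [self setWantsLayer:YES]; // the view needs a backing layer before layer animations can run

    // ... normal status item drawing here ...

    if (self.isAnimated && [self.layer animationForKey:@"animateOpacity"] == nil) {
        CABasicAnimation *anim = [CABasicAnimation animationWithKeyPath:@"opacity"];
        anim.duration = 1.0;
        anim.repeatCount = HUGE_VALF;
        anim.autoreverses = YES;
        anim.fromValue = [NSNumber numberWithFloat:1.0];
        anim.toValue = [NSNumber numberWithFloat:0.0];
        [self.layer addAnimation:anim forKey:@"animateOpacity"];
    } else if (!self.isAnimated) {
        [self.layer removeAnimationForKey:@"animateOpacity"];
    }
}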

UIProgressView not displaying progress accurately

I am using a UIProgressView to display downloading progress and a label for the percentage of content downloaded. The function executes every time, and when I log the progress value it shows properly in the console, but it is not displayed on screen: it only shows 0% and 100%. I am also using the ASIHTTP framework.
Please help with that.
CODE from comment:
for (float i = 0; i < [topicNew count]; i++)
{
    NSDictionary *new = [topicNew objectAtIndex:i];
    NSString *imageName = [[[NSString alloc] initWithFormat:@"%@.%@.%@.png", appDelegate.subject, topicNamed, [new objectForKey:kWordTitleKey]] autorelease];
    NSString *imagePath = [[self applicationDocumentsDirectory] stringByAppendingPathComponent:imageName];
    NSData *imageData = [self ParsingImagePath:[new objectForKey:kWordImagePathKey]];
    [progressView setProgress:i / [topicNew count]];
    [lblpercent setText:[[NSString stringWithFormat:@"%.0f", i / [topicNew count]] stringByAppendingString:@"%"]];
    ... More code here ...
It looks like you are blocking your main thread. Meaning, you are updating the progressView, but the main thread never gets out of your for loop to display the updated change until the loop is finished, by which point your progress looks like it jumped straight to 100%.
You need to either use a timer (such as NSTimer) or some kind of background thread that processes your image files. Then update your UIProgressView instance from within that background work when you want to report progress (at the end of each loop iteration, for example):
float p = i / [topicNew count];
[progressView performSelectorOnMainThread:@selector(setProgress:)
                               withObject:[NSNumber numberWithFloat:p]
                            waitUntilDone:NO];
and pass your updated progress to the progressView. Make sure you are not calling the progressView from the background thread directly, as UIKit is not thread safe; only call it from your main thread, or via the performSelectorOnMainThread:withObject:waitUntilDone: method.
Hopefully that points you in the right direction.
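A minimal GCD-based sketch of the same idea, reusing the question's topicNew array and progressView/lblpercent outlets; dispatching a block to the main queue also avoids having to box the float for performSelectorOnMainThread::
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSUInteger total = [topicNew count];
    for (NSUInteger i = 0; i < total; i++) {
        // ... download / process the image for item i here, off the main thread ...

        float progress = (float)(i + 1) / total;
        dispatch_async(dispatch_get_main_queue(), ^{
            // UIKit objects are only touched on the main thread.
            [progressView setProgress:progress];
            [lblpercent setText:[NSString stringWithFormat:@"%.0f%%", progress * 100]];
        });
    }
});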
