How do I check whether an NSData object contains a sub-NSData? - cocoa

I have an NSData object which contains some data I need. What I want to do is find the position of the bytes "FF D8" (the start of JPEG data).
How can I achieve something like this?

First get the range, then get the data:
// The magic start-of-JPEG marker is only created once, safely, and then reused each time
static NSData *magicStartData = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    static const uint8_t magic[] = { 0xFF, 0xD8 };
    magicStartData = [NSData dataWithBytesNoCopy:(void *)magic length:2 freeWhenDone:NO];
});
// assume data is the NSData with the embedded JPEG
NSRange range = [data rangeOfData:magicStartData options:0 range:NSMakeRange(0, [data length])];
if (range.location != NSNotFound) {
    // This assumes the subdata runs from the magic marker to the end of the buffer;
    // adjust the range if you need a tighter bound
    NSData *subdata = [data subdataWithRange:NSMakeRange(range.location, [data length] - range.location)];
}
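If you also need to know where the JPEG ends, a similar search for the end-of-image marker (FF D9) can bound the subdata. A minimal sketch, assuming the blob contains exactly one complete JPEG:
// Sketch: FF D8 marks start-of-image, FF D9 marks end-of-image.
static const uint8_t soi[] = { 0xFF, 0xD8 };
static const uint8_t eoi[] = { 0xFF, 0xD9 };
NSData *soiData = [NSData dataWithBytes:soi length:sizeof(soi)];
NSData *eoiData = [NSData dataWithBytes:eoi length:sizeof(eoi)];
NSRange startRange = [data rangeOfData:soiData options:0 range:NSMakeRange(0, [data length])];
if (startRange.location != NSNotFound) {
    // Search for the end marker only from the start marker onward.
    NSRange searchRange = NSMakeRange(startRange.location, [data length] - startRange.location);
    NSRange endRange = [data rangeOfData:eoiData options:0 range:searchRange];
    if (endRange.location != NSNotFound) {
        NSUInteger length = NSMaxRange(endRange) - startRange.location;
        // jpegData now spans FF D8 ... FF D9 inclusive.
        NSData *jpegData = [data subdataWithRange:NSMakeRange(startRange.location, length)];
    }
}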

Try -[NSData rangeOfData:options:range:]:
NSData *data = /* Your data here */;
UInt8 bytes_to_find[] = { 0xFF, 0xD8 };
NSData *dataToFind = [NSData dataWithBytes:bytes_to_find
                                    length:sizeof(bytes_to_find)];
NSRange range = [data rangeOfData:dataToFind
                          options:kNilOptions
                            range:NSMakeRange(0u, [data length])];
if (range.location == NSNotFound) {
    NSLog(@"Bytes not found");
}
else {
    NSLog(@"Bytes found at position %lu", (unsigned long)range.location);
}

Related

AVFoundation image captured is dark

On OS X I use AVFoundation to capture an image from a USB camera. Everything works fine, but the image I get is darker compared to the live video.
Device capture configuration
-(BOOL)prepareCapture{
captureSession = [[AVCaptureSession alloc] init];
NSError *error;
imageOutput=[[AVCaptureStillImageOutput alloc] init];
NSNumber * pixelFormat = [NSNumber numberWithInt:k32BGRAPixelFormat];
[imageOutput setOutputSettings:[NSDictionary dictionaryWithObject:pixelFormat forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
videoOutput=[[AVCaptureMovieFileOutput alloc] init];
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:MyVideoDevice error:&error];
if (videoInput) {
[captureSession beginConfiguration];
[captureSession addInput:videoInput];
[captureSession setSessionPreset:AVCaptureSessionPresetHigh];
//[captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
[captureSession addOutput:imageOutput];
[captureSession addOutput:videoOutput];
[captureSession commitConfiguration];
}
else {
// Handle the failure.
return NO;
}
return YES;
}
Add view for live preview
-(void)settingPreview:(NSView*)View{
// Attach preview to session
previewView = View;
CALayer *previewViewLayer = [previewView layer];
[previewViewLayer setBackgroundColor:CGColorGetConstantColor(kCGColorBlack)];
AVCaptureVideoPreviewLayer *newPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
[newPreviewLayer setFrame:[previewViewLayer bounds]];
[newPreviewLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
[previewViewLayer addSublayer:newPreviewLayer];
//[self setPreviewLayer:newPreviewLayer];
[captureSession startRunning];
}
Code to capture the image
-(void)captureImage{
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in imageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
[imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:
^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
CFDictionaryRef exifAttachments =
CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments) {
// Do something with the attachments.
}
// Continue as appropriate.
//IMG is a global NSImage
IMG = [self imageFromSampleBuffer:imageSampleBuffer];
[[self delegate] imageReady:IMG];
}];
}
Create an NSImage from the sample buffer data (I think the problem is here):
- (NSImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
// Get a CMSampleBuffer's Core Video image buffer for the media data
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// Lock the base address of the pixel buffer
CVPixelBufferLockBaseAddress(imageBuffer, 0);
// Get the base address of the pixel buffer
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
// Get the number of bytes per row for the pixel buffer
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
// Get the pixel buffer width and height
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
// Create a device-dependent RGB color space
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
// Create a bitmap graphics context with the sample buffer data
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
// Create a Quartz image from the pixel data in the bitmap graphics context
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
// Unlock the pixel buffer
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
// Free up the context and color space
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
// Create an image object from the Quartz image
//UIImage *image = [UIImage imageWithCGImage:quartzImage];
NSImage * image = [[NSImage alloc] initWithCGImage:quartzImage size:NSZeroSize];
// Release the Quartz image
CGImageRelease(quartzImage);
return (image);
}
Solution found
The problem was in imageFromSampleBuffer. I used this code instead and the picture is perfect:
// Continue as appropriate.
//IMG = [self imageFromSampleBuffer:imageSampleBuffer];
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
if (imageBuffer) {
CVBufferRetain(imageBuffer);
NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];
IMG = [[NSImage alloc] initWithSize: [imageRep size]];
[IMG addRepresentation: imageRep];
CVBufferRelease(imageBuffer);
}
Code found in this answer
In my case, I still needed to call captureStillImageAsynchronouslyFromConnection: several times to force the built-in camera to expose properly.
int primeCount = 8; //YMMV
for (int i = 0; i < primeCount; i++) {
[imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {}];
}
[imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
if (imageBuffer) {
CVBufferRetain(imageBuffer);
NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];
IMG = [[NSImage alloc] initWithSize: [imageRep size]];
[IMG addRepresentation: imageRep];
}
}];

Capturing blank stills from an AVCaptureScreenInput?

I'm working on sampling the screen using AVCaptureScreenInput and outputting it through an AVCaptureVideoDataOutput, and it's not working: the images it outputs are blank, even though it appears I'm doing everything right according to all the documentation I've read.
I've made sure I set the AVCaptureVideoDataOutput's pixel format to something a CGImage can read (kCVPixelFormatType_32BGRA). When I run this same code and have it output to an AVCaptureMovieFileOutput, the movie renders fine and everything looks good, but what I really want is a series of images.
#import "ScreenRecorder.h"
#import <QuartzCore/QuartzCore.h>
@interface ScreenRecorder () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate> {
BOOL _isRecording;
@private
AVCaptureSession *_session;
AVCaptureOutput *_movieFileOutput;
AVCaptureStillImageOutput *_imageFileOutput;
NSUInteger _frameIndex;
NSTimer *_timer;
NSString *_outputDirectory;
}
@end
@implementation ScreenRecorder
- (BOOL)recordDisplayImages:(CGDirectDisplayID)displayId toURL:(NSURL *)fileURL windowBounds:(CGRect)windowBounds duration:(NSTimeInterval)duration {
if (_isRecording) {
return NO;
}
_frameIndex = 0;
// Create a capture session
_session = [[AVCaptureSession alloc] init];
// Set the session preset as you wish
_session.sessionPreset = AVCaptureSessionPresetHigh;
// Create a ScreenInput with the display and add it to the session
AVCaptureScreenInput *input = [[[AVCaptureScreenInput alloc] initWithDisplayID:displayId] autorelease];
if (!input) {
[_session release];
_session = nil;
return NO;
}
if ([_session canAddInput:input]) {
[_session addInput:input];
}
input.cropRect = windowBounds;
// Create a MovieFileOutput and add it to the session
_movieFileOutput = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
[((AVCaptureVideoDataOutput *)_movieFileOutput) setVideoSettings:[NSDictionary dictionaryWithObjectsAndKeys:@(kCVPixelFormatType_32BGRA), kCVPixelBufferPixelFormatTypeKey, nil]];
// ((AVCaptureVideoDataOutput *)_movieFileOutput).alwaysDiscardsLateVideoFrames = YES;
if ([_session canAddOutput:_movieFileOutput])
[_session addOutput:_movieFileOutput];
// Start running the session
[_session startRunning];
// Delete any existing movie file first
if ([[NSFileManager defaultManager] fileExistsAtPath:[fileURL path]])
{
NSError *err;
if (![[NSFileManager defaultManager] removeItemAtPath:[fileURL path] error:&err])
{
NSLog(@"Error deleting existing movie %@", [err localizedDescription]);
}
}
_outputDirectory = [[fileURL path] retain];
[[NSFileManager defaultManager] createDirectoryAtPath:_outputDirectory withIntermediateDirectories:YES attributes:nil error:nil];
// Set the recording delegate to self
dispatch_queue_t queue = dispatch_queue_create("com.schaefer.lolz", 0);
[(AVCaptureVideoDataOutput *)_movieFileOutput setSampleBufferDelegate:self queue:queue];
//dispatch_release(queue);
if (0 != duration) {
_timer = [[NSTimer scheduledTimerWithTimeInterval:duration target:self selector:@selector(finishRecord:) userInfo:nil repeats:NO] retain];
}
_isRecording = YES;
return _isRecording;
}
- (void)dealloc
{
if (nil != _session) {
[_session stopRunning];
[_session release];
}
[_outputDirectory release];
_outputDirectory = nil;
[super dealloc];
}
- (void)stopRecording {
if (!_isRecording) {
return;
}
_isRecording = NO;
// Stop recording to the destination movie file
if ([_movieFileOutput isKindOfClass:[AVCaptureFileOutput class]]) {
[_movieFileOutput performSelector:@selector(stopRecording)];
}
[_session stopRunning];
[_session release];
_session = nil;
[_timer release];
_timer = nil;
}
-(void)finishRecord:(NSTimer *)timer
{
[self stopRecording];
}
//AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer,0); // Lock the image buffer
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0); // Get information of the image
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef image = CGBitmapContextCreateImage(newContext);
CGContextRelease(newContext);
CGColorSpaceRelease(colorSpace);
_frameIndex++;
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
dispatch_async(dispatch_get_main_queue(), ^{
NSURL *URL = [NSURL fileURLWithPath:[_outputDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%d.jpg", (int)_frameIndex]]];
CGImageDestinationRef destination = CGImageDestinationCreateWithURL((CFURLRef)URL, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(destination, image, nil);
if (!CGImageDestinationFinalize(destination)) {
NSLog(@"Failed to write image to %@", URL);
}
CFRelease(destination);
CFRelease(image);
});
}
@end
Your data isn't planar, so there is no base address for plane 0--there's no plane 0. (To be sure, you can check with CVPixelBufferIsPlanar.) You'll need CVPixelBufferGetBaseAddress to get a pointer to the first pixel. All the data will be interleaved.
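A minimal sketch of that change inside captureOutput:didOutputSampleBuffer:fromConnection: (assuming the 32BGRA settings above, so the buffer is interleaved):
// Sketch: pick the right base-address call depending on whether the buffer is planar.
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
uint8_t *baseAddress;
if (CVPixelBufferIsPlanar(imageBuffer)) {
    // Planar formats (e.g. 420v) expose per-plane base addresses.
    baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
} else {
    // 32BGRA is interleaved: one plane, so use the buffer-wide base address.
    baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
}
// ... build the CGBitmapContext from baseAddress as before ...
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);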

Objective C: Search in a tableview with NSASCIIStringEncoding

I am searching into a UITableView using this:
titles = [NSArray arrayWithArray:[datamanager titlesForEntriesBetween:(NSInteger)[slider minSelectedValue] and:(NSInteger)[slider maxSelectedValue] containing:searchText]];
How can I encode the array values with NSASCIIStringEncoding during the search process?
(The array contains "tête" for example, and when I search for "tete" nothing matches, so I want to encode the array values just for my search.)
I would change the third parameter of your datamanager function:
- (NSArray *)titlesForEntriesBetween:(NSInteger)startIndex
                                 and:(NSInteger)stopIndex
                        withFunction:(BOOL (^)(NSString *))block {
    NSMutableArray *retVal = [NSMutableArray array];
    for (NSInteger i = startIndex; i <= stopIndex; ++i) {
        NSString *string = [array_ objectAtIndex:i];
        if (block(string)) {
            [retVal addObject:string];
        }
    }
    return retVal;
}
And then I would call the function like this:
titles = [datamanager titlesForEntriesBetween:(NSInteger)[slider minSelectedValue]
                                          and:(NSInteger)[slider maxSelectedValue]
                                 withFunction:^BOOL(NSString *str) {
    NSData *data = [str dataUsingEncoding:NSASCIIStringEncoding];
    NSString *simpleString = [[[NSString alloc] initWithData:data
                                                    encoding:NSASCIIStringEncoding] autorelease];
    return [simpleString isEqualToString:str];
}];
Note: I just typed this in, I haven't tried to compile/run this.
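As a side note, if the real goal is accent-insensitive matching rather than a literal ASCII re-encoding, NSString's search options can do it directly. A minimal sketch of a block you could pass instead (assuming searchText is in scope, as in the question):
// Hedged alternative: NSDiacriticInsensitiveSearch makes "tete" match "tête"
// without converting the stored strings at all.
BOOL (^matchesSearch)(NSString *) = ^BOOL(NSString *title) {
    NSRange r = [title rangeOfString:searchText
                             options:NSCaseInsensitiveSearch | NSDiacriticInsensitiveSearch];
    return r.location != NSNotFound;
};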

Is it possible to store an image as a string in cocoa?

I want to hard code the binary data for an image in a class file. When the class gets initialized, create an NSImage from that data. Storing the image in the resources folder is not an option.
Is this possible and how?
Use NSData rather than NSString.
NSImage is NSCoding compliant - it knows how to archive itself, and how to create/read image representations of other file formats.
If you want to work with another image representation, you can use the CGImage APIs to create a CGImage, which can then be used to create an NSImage.
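For example, a minimal sketch of the NSCoding route (the file path here is just a placeholder):
// Because NSImage conforms to NSCoding, it can be archived straight to NSData
// and restored later; that data could then be base64-encoded for embedding.
NSImage *original = [[NSImage alloc] initWithContentsOfFile:@"/tmp/example.png"]; // placeholder path
NSData *archived = [NSKeyedArchiver archivedDataWithRootObject:original];
NSImage *restored = [NSKeyedUnarchiver unarchiveObjectWithData:archived];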
//get the image
NSImage *newImage = [[NSImage alloc] initWithContentsOfFile:[@"~/Desktop/testImg.png" stringByExpandingTildeInPath]];
//convert to BitmapImageRep
NSBitmapImageRep *bitmap = [[newImage representations] objectAtIndex:0];
//convert to NSData
NSData *data = [bitmap representationUsingType: NSPNGFileType properties: nil];
//base64 encode and now I have the string.
NSString *imageString = [data encodeBase64WithNewlines:NO];
NSLog(@"image %@", imageString);
//Now that I have the string, I can hard-code it into my source code (paste it in).
//When I want to create an image out of it I just get the imageString and convert it to an image
NSData *revData = [imageString decodeBase64WithNewlines:NO];
newImage = [[NSImage alloc] initWithData:revData];
I have two NSData categories I use here (encodeBase64WithNewlines: and decodeBase64WithNewlines:). You will have to link against libcrypto.dylib for them to work. I think I copied them from CocoaDev.
- (NSString *) encodeBase64WithNewlines: (BOOL) encodeWithNewlines
{
// Create a memory buffer which will contain the Base64 encoded string
BIO * mem = BIO_new(BIO_s_mem());
// Push on a Base64 filter so that writing to the buffer encodes the data
BIO * b64 = BIO_new(BIO_f_base64());
if (!encodeWithNewlines)
BIO_set_flags(b64, BIO_FLAGS_BASE64_NO_NL);
mem = BIO_push(b64, mem);
// Encode all the data
BIO_write(mem, [self bytes], [self length]);
int flushResult = BIO_flush(mem);
if (flushResult != 1) { // BIO_flush returns 1 on success
//throw some warning?
}
// Create a new string from the data in the memory buffer
char * base64Pointer;
long base64Length = BIO_get_mem_data(mem, &base64Pointer);
NSData * base64data = [NSData dataWithBytesNoCopy:base64Pointer length:base64Length freeWhenDone:NO];
NSString * base64String = [[NSString alloc] initWithData:base64data encoding:NSUTF8StringEncoding];
// Clean up and go home
BIO_free_all(mem);
return [base64String autorelease];
}
- (NSData *)decodeBase64WithNewlines:(BOOL)encodedWithNewlines
{
// Create a memory buffer containing Base64 encoded string data
BIO * mem = BIO_new_mem_buf((void *) [self bytes], [self length]);
// Push a Base64 filter so that reading from the buffer decodes it
BIO * b64 = BIO_new(BIO_f_base64());
if (!encodedWithNewlines)
BIO_set_flags(b64, BIO_FLAGS_BASE64_NO_NL);
mem = BIO_push(b64, mem);
// Decode into an NSMutableData
NSMutableData * data = [NSMutableData data];
char inbuf[512];
int inlen;
while ((inlen = BIO_read(mem, inbuf, sizeof(inbuf))) > 0)
[data appendBytes: inbuf length: inlen];
// Clean up and go home
BIO_free_all(mem);
return data;
}
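As a side note, on OS X 10.9 / iOS 7 and later these categories are no longer needed, since NSData has built-in Base64 support. A minimal sketch:
// Built-in Base64 (OS X 10.9+ / iOS 7+), no libcrypto required.
NSString *imageString = [data base64EncodedStringWithOptions:0];
NSData *revData = [[NSData alloc] initWithBase64EncodedString:imageString options:0];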

converting CMSampleBufferRef to UIImage

I always get: CGImageCreate: invalid image size: 0 x 0.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
// Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
// Within the group enumeration block, filter to enumerate just videos.
[group setAssetsFilter:[ALAssetsFilter allVideos]];
// For this example, we're only interested in the first item.
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:0]
options:0
usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
// The end of the enumeration is signaled by asset == nil.
if (alAsset) {
ALAssetRepresentation *representation = [[alAsset defaultRepresentation] retain];
NSURL *url = [representation url];
AVURLAsset *avAsset = [[AVURLAsset URLAssetWithURL:url options:nil] retain];
AVAssetReader *assetReader = [[AVAssetReader assetReaderWithAsset:avAsset error:nil] retain];
NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
AVAssetReaderTrackOutput *assetReaderOutput = [[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil] retain];
if (![assetReader canAddOutput:assetReaderOutput]) {printf("could not read reader output\n");}
[assetReader addOutput:assetReaderOutput];
[assetReader startReading];
CMSampleBufferRef nextBuffer = [assetReaderOutput copyNextSampleBuffer];
UIImage* image = imageFromSampleBuffer(nextBuffer);
}
}];
}
failureBlock: ^(NSError *error) {NSLog(@"No groups");}];
The imageFromSampleBuffer function comes directly from Apple:
UIImage* imageFromSampleBuffer(CMSampleBufferRef nextBuffer) {
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(nextBuffer);
printf("total size:%zu\n", CMSampleBufferGetTotalSampleSize(nextBuffer));
// Lock the base address of the pixel buffer.
//CVPixelBufferLockBaseAddress(imageBuffer,0);
// Get the number of bytes per row for the pixel buffer.
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
// Get the pixel buffer width and height.
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
printf("b:%zu w:%zu h:%zu\n", bytesPerRow, width, height);
// Create a device-dependent RGB color space.
static CGColorSpaceRef colorSpace = NULL;
if (colorSpace == NULL) {
colorSpace = CGColorSpaceCreateDeviceRGB();
if (colorSpace == NULL) {
// Handle the error appropriately.
return nil;
}
}
// Get the base address of the pixel buffer.
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
// Get the data size for contiguous planes of the pixel buffer.
size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);
// Create a Quartz direct-access data provider that uses data we supply.
CGDataProviderRef dataProvider =
CGDataProviderCreateWithData(NULL, baseAddress, bufferSize, NULL);
// Create a bitmap image from data supplied by the data provider.
CGImageRef cgImage =
CGImageCreate(width, height, 8, 32, bytesPerRow,
colorSpace, kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
dataProvider, NULL, true, kCGRenderingIntentDefault);
CGDataProviderRelease(dataProvider);
// Create and return an image object to represent the Quartz image.
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
return image;
}
I try to get the width and height; it does print out the size of the sample buffer, so the buffer itself is not nonexistent, but I get no UIImage.
For AVAssetReaderTrackOutput *assetReaderOutput, pass explicit output settings instead of nil:
NSMutableDictionary *outputSettings = [NSMutableDictionary dictionary];
[outputSettings setObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
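A sketch of how those settings would be passed when creating the track output (variable names taken from the question's code):
// With explicit BGRA output settings the reader decodes frames into
// CVPixelBuffers, so CMSampleBufferGetImageBuffer no longer yields a 0 x 0 image.
AVAssetReaderTrackOutput *assetReaderOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                               outputSettings:outputSettings];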
I understand you want to read the first image from each of your local videos?
You can use a simpler way to do all of this.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
// Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
// Within the group enumeration block, filter to enumerate just videos.
[group setAssetsFilter:[ALAssetsFilter allVideos]];
// For this example, we're only interested in the first item.
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:0]
options:0
usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
// The end of the enumeration is signaled by asset == nil.
if (alAsset) {
ALAssetRepresentation *representation = [[alAsset defaultRepresentation] retain];
NSURL *url = [representation url];
AVURLAsset *avAsset = [[AVURLAsset URLAssetWithURL:url options:nil] retain];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:avAsset];
CMTime thumbTime = CMTimeMakeWithSeconds(1, 30);
NSError *error;
CMTime actualTime;
[imageGenerator setMaximumSize:MAXSIZE];
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:thumbTime actualTime:&actualTime error:&error];
}
}];
}
failureBlock: ^(NSError *error) {NSLog(@"No groups");}];
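From there the returned CGImageRef can be wrapped in a UIImage and released; a brief sketch continuing from the code above:
// copyCGImageAtTime: follows the Create rule, so release the CGImageRef
// once the UIImage has been made from it.
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);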
