How to identify an NSData's image format?

I have an NSData that I know contains an image's data, but I don't know what format it is. How can I identify which image format it is, JPEG or PNG?
PS: This is on iOS.

I used Mat's answer to build a simple category on NSData which tells me whether its content is JPEG or PNG based on its first 4 bytes:
@interface NSData (yourCategory)
- (BOOL)isJPG;
- (BOOL)isPNG;
@end

@implementation NSData (yourCategory)
- (BOOL)isJPG
{
    if (self.length >= 4)
    {
        unsigned char buffer[4];
        [self getBytes:&buffer length:4];
        // JPEG/JFIF files start with FF D8 FF E0
        return buffer[0] == 0xff &&
               buffer[1] == 0xd8 &&
               buffer[2] == 0xff &&
               buffer[3] == 0xe0;
    }
    return NO;
}

- (BOOL)isPNG
{
    if (self.length >= 4)
    {
        unsigned char buffer[4];
        [self getBytes:&buffer length:4];
        // PNG files start with 89 50 4E 47 ("\x89PNG")
        return buffer[0] == 0x89 &&
               buffer[1] == 0x50 &&
               buffer[2] == 0x4e &&
               buffer[3] == 0x47;
    }
    return NO;
}
@end
And then simply do:
CGDataProviderRef imgDataProvider = CGDataProviderCreateWithCFData((CFDataRef)imgData);
CGImageRef imgRef = nil;

if ([imgData isJPG])
    imgRef = CGImageCreateWithJPEGDataProvider(imgDataProvider, NULL, true, kCGRenderingIntentDefault);
else if ([imgData isPNG])
    imgRef = CGImageCreateWithPNGDataProvider(imgDataProvider, NULL, true, kCGRenderingIntentDefault);

UIImage *image = [UIImage imageWithCGImage:imgRef];
CGImageRelease(imgRef);
CGDataProviderRelease(imgDataProvider);

You could look at the first bytes and make a guess. There are many lists of magic numbers available on the internet, e.g. http://www.astro.keele.ac.uk/oldusers/rno/Computing/File_magic.html.

Here's a Swift version of @apouche's answer:
extension NSData {
    func firstBytes(length: Int) -> [UInt8] {
        var bytes: [UInt8] = [UInt8](count: length, repeatedValue: 0)
        self.getBytes(&bytes, length: length)
        return bytes
    }

    var isJPEG: Bool {
        let signature: [UInt8] = [0xff, 0xd8, 0xff, 0xe0]
        return firstBytes(4) == signature
    }

    var isPNG: Bool {
        let signature: [UInt8] = [0x89, 0x50, 0x4e, 0x47]
        return firstBytes(4) == signature
    }
}

Can you create an image from that and then just ask the NSImage what format it is?
You can use -initWithData: to create the NSImage. For more, see http://developer.apple.com/library/mac/#documentation/Cocoa/Reference/ApplicationKit/Classes/NSImage_Class/Reference/Reference.html

You can create a CGImageSourceRef and then ask it for the image type:
CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
if (imageSource)
{
    // this is the type of image (e.g. public.jpeg - kUTTypeJPEG),
    // see <MobileCoreServices/UTCoreTypes.h>
    CFStringRef UTI = CGImageSourceGetType(imageSource);
    CFRelease(imageSource);
}
imageSource = nil;
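To act on the returned UTI you can compare it against the constants from MobileCoreServices. A minimal sketch (the helper name MyImageFormatForData and the returned strings are just illustrative, not part of the original answer):
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Sketch: map the UTI reported by CGImageSourceGetType to a simple string.
// CGImageSourceGetType follows the Get rule, so the UTI must not be released.
static NSString *MyImageFormatForData(NSData *imageData)
{
    NSString *format = @"unknown";
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    if (source) {
        CFStringRef UTI = CGImageSourceGetType(source);
        if (UTI) {
            if (UTTypeConformsTo(UTI, kUTTypeJPEG)) {
                format = @"jpeg";
            } else if (UTTypeConformsTo(UTI, kUTTypePNG)) {
                format = @"png";
            }
        }
        CFRelease(source);
    }
    return format;
}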

If you use +[UIImage imageWithData:], you don't need to know whether the data is PNG or JPEG.
// read from file
NSData * thumbnailData = [decoder decodeObjectForKey:kThumbnailKey];
[UIImage imageWithData:thumbnailData];

Related

CMSampleBufferGetImageBuffer returns nil for captured JPEG stillImage

I'm capturing the Mac screen in JPEG format and then trying to get the pixelBuffer and imageBuffer of the captured JPEG sample buffer.
The pixelBuffer is always nil, while when I convert the JPEG buffer to an NSImage, the image can be obtained and displayed successfully.
- (void)createSession
{
    if (self.session == nil)
    {
        self.session = [[AVCaptureSession alloc] init];
        self.session.sessionPreset = AVCaptureSessionPresetPhoto;
        CGDirectDisplayID displayId = [self getDisplayID];
        self.input = [[AVCaptureScreenInput alloc] initWithDisplayID:displayId];
        [self.session addInput:self.input];
        self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
        NSDictionary *outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
        [self.imageOutput setOutputSettings:outputSettings];
        [self.session addOutput:self.imageOutput];
        [self.session startRunning];
    }
}

- (void)processSampleBuffer
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.imageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [self.imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != nil)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            self.image = [[NSImage alloc] initWithData:imageData];

            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
            CVPixelBufferLockBaseAddress(imageBuffer, 0);
            size_t width = CVPixelBufferGetWidth(imageBuffer);
            size_t height = CVPixelBufferGetHeight(imageBuffer);
            NSLog(@"width %zu height %zu", width, height);

            CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
            float width1 = CVPixelBufferGetWidth(pixelBuffer);
            float height1 = CVPixelBufferGetHeight(pixelBuffer);
            NSLog(@"Pixelbuffer width %f height %f", width1, height1);
        }
        else
        {
            NSLog(@"error");
        }
    }];
}
In processSampleBuffer, self.image gets an NSImage that is displayed in an NSImageView successfully,
but imageBuffer and pixelBuffer are both nil.
This confuses me a lot; could someone have a look?
OK, I finally found the answer here; hope it can help others.
https://developer.apple.com/documentation/avfoundation/avcapturephoto/2873914-pixelbuffer?language=objc
Discussion
If you requested photo capture in a RAW format, or in a processed format without compression such as TIFF, you can use this property to access the underlying sample buffer.
If you requested capture in a compressed format such as JPEG or HEVC/HEIF, this property's value is nil. Use the fileDataRepresentation or CGImageRepresentation method to obtain compressed image data.
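So with a JPEG still-image capture the sample buffer simply never carries a pixel buffer. If raw pixels are needed, one workaround is to ask the output for an uncompressed pixel format instead of JPEG. A sketch, reusing the question's self.imageOutput property (the choice of 32BGRA here is just an example):
// Sketch: request uncompressed BGRA frames instead of JPEG so that
// CMSampleBufferGetImageBuffer() returns a real CVPixelBufferRef.
NSDictionary *outputSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
[self.imageOutput setOutputSettings:outputSettings];

// In the capture completion handler, imageBuffer is then non-nil:
// CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);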

How do you get the image data from NSAttributedString

I have an NSTextView. I paste an image into it and see it. When I get the NSTextAttachment for the NSAttributedString of the text view, its file wrapper is nil. How do I get the image data that was pasted into the text view?
I'm using a category on NSAttributedString to get the text attachments. I would prefer not to write to disk if it's possible.
- (NSArray *)allAttachments
{
    NSError *error = NULL;
    NSMutableArray *theAttachments = [NSMutableArray array];
    NSRange theStringRange = NSMakeRange(0, [self length]);
    if (theStringRange.length > 0)
    {
        NSUInteger N = 0;
        do
        {
            NSRange theEffectiveRange;
            NSDictionary *theAttributes = [self attributesAtIndex:N longestEffectiveRange:&theEffectiveRange inRange:theStringRange];
            NSTextAttachment *theAttachment = [theAttributes objectForKey:NSAttachmentAttributeName];
            if (theAttachment != NULL) {
                NSLog(@"filewrapper: %@", theAttachment.fileWrapper);
                [theAttachments addObject:theAttachment];
            }
            N = theEffectiveRange.location + theEffectiveRange.length;
        }
        while (N < theStringRange.length);
    }
    return theAttachments;
}
1. Enumerate the attachments with [NSTextStorage enumerateAttribute:...].
2. Get each attachment's fileWrapper.
3. Write it to a URL.
[textStorage enumerateAttribute:NSAttachmentAttributeName
                        inRange:NSMakeRange(0, textStorage.length)
                        options:0
                     usingBlock:^(id value, NSRange range, BOOL *stop)
{
    NSTextAttachment *attachment = (NSTextAttachment *)value;
    NSFileWrapper *attachmentWrapper = attachment.fileWrapper;
    [attachmentWrapper writeToURL:outputURL options:NSFileWrapperWritingAtomic originalContentsURL:nil error:nil];

    (*stop) = YES; // stop so we only write the first attachment
}];
This sample code will only write the first attachment to outputURL.
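If you want to stay in memory (the question specifically asks to avoid writing to disk), you can also read the bytes straight out of the file wrapper, assuming the wrapper is actually populated for your attachment. A minimal sketch using the attachment variable from the block above:
// Sketch: pull the raw image bytes out of the attachment's file wrapper
// without touching the filesystem.
NSFileWrapper *wrapper = attachment.fileWrapper;
if (wrapper.regularFile) {
    NSData *imageData = wrapper.regularFileContents; // the pasted image's data
    NSImage *image = [[NSImage alloc] initWithData:imageData];
    NSLog(@"got image %@ from %lu bytes", image, (unsigned long)imageData.length);
}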
You can get the contained NSImage from the attachment cell.
Minimalistic example:
// assuming we have an NSTextStorage* textStorage object ready to go,
// and that we know it contains an attachment at some_index
// (in real code we would probably enumerate attachments).
NSRange range;
NSDictionary *textStorageAttrDict = [textStorage attributesAtIndex:some_index
                                             longestEffectiveRange:&range
                                                           inRange:NSMakeRange(0, textStorage.length)];
NSTextAttachment *textAttachment = [textStorageAttrDict objectForKey:NSAttachmentAttributeName];
NSTextAttachmentCell *textAttachmentCell = (NSTextAttachmentCell *)textAttachment.attachmentCell;
NSImage *attachmentImage = textAttachmentCell.image;
EDIT: OS X only (AppKit version).
@EmeraldWeapon's answer is good for Objective-C, but falls down in Swift: there the attachmentCell is not an NSTextAttachmentCell but an NSTextAttachmentCellProtocol? (which does not provide .image), so you need to cast it to a concrete instance before accessing the .image:
func firstImage(textStorage: NSTextStorage) -> NSImage? {
    // Attribute indexes are in UTF-16 units, so iterate over .length
    // rather than the character count of the string.
    for idx in 0 ..< textStorage.length {
        if
            let attr = textStorage.attribute(NSAttributedString.Key.attachment, at: idx, effectiveRange: nil),
            let attachment = attr as? NSTextAttachment,
            let cell = attachment.attachmentCell as? NSTextAttachmentCell,
            let image = cell.image {
            return image
        }
    }
    return nil
}

How do I check whether an NSData object contains a sub-NSData?

I have an NSData object which contains some data I need. What I want to do is find the position of the bytes "FF D8" (the start of JPEG data) inside it.
How can I achieve something like this?
First get the range, then get the data:
// The magic start data object is only created once safely and
// then reused each time
static NSData *magicStartData = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    static const uint8_t magic[] = { 0xff, 0xd8 };
    magicStartData = [NSData dataWithBytesNoCopy:(void *)magic length:2 freeWhenDone:NO];
});

// assume data is the NSData with embedded data
NSRange range = [data rangeOfData:magicStartData options:0 range:NSMakeRange(0, [data length])];
if (range.location != NSNotFound) {
    // This assumes the subdata doesn't have a specific range and is just everything
    // after the magic, otherwise adjust
    NSData *subdata = [data subdataWithRange:NSMakeRange(range.location, [data length] - range.location)];
}
Try -[NSData rangeOfData:options:range:]:
NSData *data = /* Your data here */;
UInt8 bytes_to_find[] = { 0xFF, 0xD8 };
NSData *dataToFind = [NSData dataWithBytes:bytes_to_find
                                    length:sizeof(bytes_to_find)];
NSRange range = [data rangeOfData:dataToFind
                          options:kNilOptions
                            range:NSMakeRange(0u, [data length])];

if (range.location == NSNotFound) {
    NSLog(@"Bytes not found");
}
else {
    NSLog(@"Bytes found at position %lu", (unsigned long)range.location);
}

CGImageSourceRef memory leak

NSDictionary *result = nil;
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)[self TIFFRepresentation], NULL);
if (NULL == source)
{
}
else
{
    CFDictionaryRef metadataRef = CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    if (metadataRef)
    {
        NSDictionary *immutableMetadata = (__bridge NSDictionary *)metadataRef;
        if (immutableMetadata)
        {
            result = [NSDictionary dictionaryWithDictionary:(__bridge NSDictionary *)metadataRef];
        }
        CFRelease(metadataRef);
        metadataRef = nil;
    }
    CFRelease(source);
    source = nil;
}
return result;
I am using Xcode with ARC.
This code causes my app to leak memory when I run it on many images in a loop.
Does anybody know what I did wrong?
Wrapping the code in @autoreleasepool solved the problem. The images were about 1.2 MB.
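For reference, this is roughly what that looks like; the loop variable and the metadata method name here are illustrative, not from the original code:
// Sketch: drain autoreleased objects (TIFF representations, metadata
// dictionaries) on every iteration instead of at the end of the loop.
for (NSImage *image in images) {
    @autoreleasepool {
        NSDictionary *metadata = [image my_metadataDictionary]; // hypothetical category method wrapping the code above
        // ... use metadata ...
    }
}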

Removing URL fragment from NSURL

I'm writing a Cocoa application which uses NSURLs, and I need to remove the fragment portion of the URL (the #blah part).
Example: http://example.com/#blah should end up as http://example.com/
I found some code in WebCore that seems to do it using CFURL functionality, but it never finds the fragment portion of the URL. I've encapsulated it in a category:
- (NSURL *)urlByRemovingComponent:(CFURLComponentType)component {
    CFRange fragRg = CFURLGetByteRangeForComponent((CFURLRef)self, component, NULL);
    // Check to see if a fragment exists before decomposing the URL.
    if (fragRg.location == kCFNotFound)
        return self;

    UInt8 *urlBytes, buffer[2048];
    CFIndex numBytes = CFURLGetBytes((CFURLRef)self, buffer, 2048);
    if (numBytes == -1) {
        numBytes = CFURLGetBytes((CFURLRef)self, NULL, 0);
        urlBytes = (UInt8 *)malloc(numBytes);
        CFURLGetBytes((CFURLRef)self, urlBytes, numBytes);
    } else
        urlBytes = buffer;

    NSURL *result = (NSURL *)CFMakeCollectable(CFURLCreateWithBytes(NULL, urlBytes, fragRg.location - 1, kCFStringEncodingUTF8, NULL));
    if (!result)
        result = (NSURL *)CFMakeCollectable(CFURLCreateWithBytes(NULL, urlBytes, fragRg.location - 1, kCFStringEncodingISOLatin1, NULL));

    if (urlBytes != buffer) free(urlBytes);
    return result ? [result autorelease] : self;
}

- (NSURL *)urlByRemovingFragment {
    return [self urlByRemovingComponent:kCFURLComponentFragment];
}
This is used as such:
NSURL *newUrl = [[NSURL URLWithString:@"http://example.com/#blah"] urlByRemovingFragment];
Unfortunately, newUrl ends up being "http://example.com/#blah" because the first line in urlByRemovingComponent always returns kCFNotFound.
I'm stumped. Is there a better way of going about this?
Working Code, thanks to nall
- (NSURL *)urlByRemovingFragment {
    NSString *urlString = [self absoluteString];
    // Find the last "#" in the string, searching from the end
    NSRange fragmentRange = [urlString rangeOfString:@"#" options:NSBackwardsSearch];
    if (fragmentRange.location != NSNotFound) {
        // Chop the fragment.
        NSString *newURLString = [urlString substringToIndex:fragmentRange.location];
        return [NSURL URLWithString:newURLString];
    } else {
        return self;
    }
}
How about this:
NSString *s = @"http://www.somewhere.org/foo/bar.html/#label";
NSURL *u = [NSURL URLWithString:s];

// Get the last path component from the URL. This doesn't include
// any fragment.
NSString *lastComponent = [u lastPathComponent];

// Find that last component in the string from the end to make sure
// to get the last one
NSRange fragmentRange = [s rangeOfString:lastComponent
                                 options:NSBackwardsSearch];

// Chop the fragment.
NSString *newURLString = [s substringToIndex:fragmentRange.location + fragmentRange.length];

NSLog(@"%@", s);
NSLog(@"%@", newURLString);
This is quite an old question, and it has already been answered, but for another simple option this is how I did it:
NSString *urlAsString = [myURL absoluteString];
NSArray *components = [urlAsString componentsSeparatedByString:@"#"];
NSURL *myURLminusFragment = [NSURL URLWithString:components[0]];
If there is no fragment, myURLminusFragment will be the same as myURL.
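Another option worth noting (not from the original answers) is NSURLComponents, available since OS X 10.9 / iOS 7, which strips the fragment without any string manipulation. A sketch:
// Sketch: drop the fragment with NSURLComponents (OS X 10.9+ / iOS 7+).
NSURL *url = [NSURL URLWithString:@"http://example.com/#blah"];
NSURLComponents *components = [NSURLComponents componentsWithURL:url resolvingAgainstBaseURL:NO];
components.fragment = nil;                   // remove the #blah part
NSURL *urlWithoutFragment = components.URL;  // http://example.com/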
Swift 3.0
This will remove the fragment:
if let fragment = url.fragment {
    url = URL(string: url.absoluteString.replacingOccurrences(of: "#\(fragment)", with: ""))!
}
