NSImage from byte array - Cocoa

I'm trying to display an image in an NSImageView, with the image contained in a byte array. How can I do this? From what I understand, I need to convert my byte[] to an NSData variable and feed that to an NSImage. Is this correct? How do I do it? I've tried casting and that doesn't work, and there doesn't seem to be any conversion built in...
I have tried the following:
Casting:
NSData bytesAsMacVariable = (NSData) imageAsBytes;
Also tried
NSData bytesAsMacVariable = imageAsBytes as NSData;
Finally, tried to pass a byte[] as if it was a NSData.
NSImage imageToShow = new NSImage(imageAsBytes);
None of these will work, and as far as I can see, neither NSImage nor NSData has a member function that accepts byte[] for conversion...

You're casting to the object type, but you should cast to pointer-to-object type.
Try something more like
NSData *imageData = [NSData dataWithBytes:byteArray length:arrayLength];
NSImage *image = [[NSImage alloc] initWithData:imageData];
[imageView setImage:image];
[image release];
The pointers are very important.

You can use an NSMutableData, like this:
new NSImage (new NSMutableData (imageAsBytes));

Related

How to convert a JPEG image to a PICT image on Mac using Cocoa

How do I convert a JPEG image to a PICT image using Cocoa? Some code is given below.
NSData *imgData = [NSData datawithContentsOfFile:@"/var/root/Desktop/1.jpeg"];
NSPICTImageRep *imagerep = [NSPICTImageRep imageRepWithData:imgData];
NSData *data = [imageRep PICTRepresentation];
[data writeTofile:@"/var/root/Desktop/save.pict" atomically:No];
This code does not work. Is there any alternate method to convert a JPEG image to a PICT image without AppleScript?
There are a couple of problems with your code.
#1) Are you certain of the location of that "1.jpeg" file?
#2) You're not looking at the error result of your "writeToFile:". On my machine, I cannot write to anything inside the "/var/root" directory.
Once you fix up the source and destination paths, you should change your code to something like this:
NSData *imgData = [NSData dataWithContentsOfFile:@"/Users/anuj/Desktop/1.jpeg"];
NSPICTImageRep *imageRep = [NSPICTImageRep imageRepWithData:imgData];
NSData *data = [imageRep PICTRepresentation];
NSLog(@"my image data size is %lu", (unsigned long)[data length]);
if ([data length] > 0)
{
    BOOL success = [data writeToFile:@"/Users/anuj/Desktop/save.pict" atomically:NO];
    if (success)
        NSLog(@"successfully wrote the file");
    else
        NSLog(@"did not write the file");
}
else
{
    NSLog(@"didn't convert the image to a PICT");
}

Getting bitmap data from JPEG image using Cocoa

I need to extract the raw RGB bitmap data from a JPEG or PNG file, with all the bits in the file, not a window or color converted version.
I'm new to Cocoa, but it looks like I open an image using NSImage like this:
NSString* imageName=[[NSBundle mainBundle] pathForResource:@"/Users/me/Temp/oxberry.jpg" ofType:@"JPG"];
NSImage* tempImage=[[NSImage alloc] initWithContentsOfFile:imageName];
NSBitmapImageRep* imageRep=[[[NSBitmapImageRep alloc] initWithData:[tempImage TIFFRepresentation]] autorelease];
unsigned char* bytes=[imageRep bitmapData];
int bits=[imageRep bitsPerPixel];
Then, to get at the bitmap data, there seem to be lots of options: NSBitmapImageRep, CGImage, etc.
What is the simplest approach and if there was a code snippet, that would be great.
Thanks!
You're on the right track. As you noticed, there are a lot of ways to do this.
Once you have an NSImage, you can create a bitmap representation and access its bytes directly. An easy way to get an NSBitmapImageRep is to do this:
NSBitmapImageRep* imageRep = [[[NSBitmapImageRep alloc] initWithData:[tempImage TIFFRepresentation]] autorelease];
unsigned char* bytes = [imageRep bitmapData];
int bitsPerPixel = [imageRep bitsPerPixel];
// etc
Going through the TIFFRepresentation step is safer than accessing the NSImage's representations directly.
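For reference, a rough sketch of walking those bytes (my own addition, not from the original answer) might look like the following; it assumes a meshed (non-planar) rep with 8-bit samples, so check samplesPerPixel and bytesPerRow instead of hard-coding a layout:
// Continuing from the snippet above; assumes a non-planar, 8-bit-per-sample rep.
NSInteger bytesPerRow = [imageRep bytesPerRow];
NSInteger samplesPerPixel = [imageRep samplesPerPixel]; // 3 for RGB, 4 for RGBA
for (NSInteger y = 0; y < [imageRep pixelsHigh]; y++) {
    for (NSInteger x = 0; x < [imageRep pixelsWide]; x++) {
        unsigned char *pixel = bytes + y * bytesPerRow + x * samplesPerPixel;
        // pixel[0], pixel[1], pixel[2] hold the R, G, B samples for this pixel
    }
}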

Loading a GIF file into CGImage

Obviously both of the creation functions for PNG and JPEG don't help with GIF files.
How is it possible (and easy?) to load a GIF file into a CGImage?
For a Mac, the easiest way is:
NSData *data = [NSData dataWithContentsOfFile:@"mygif.gif"];
CGImageRef myCGImage = [[NSBitmapImageRep imageRepWithData: data] CGImage];
Building on JG's answer, for reference.
A solid way to get the "name" of the file is:
NSString *fileName = [[NSBundle mainBundle] pathForResource:@"imageName" ofType:@"jpg"];
And then continue with JG's solution:
CGDataProviderRef dataProvider = CGDataProviderCreateWithFilename([fileName cStringUsingEncoding:NSUTF8StringEncoding]);
self.maskedImage = CGImageCreateWithJPEGDataProvider(dataProvider, NULL, NO, kCGRenderingIntentDefault);

Converting a pointer to unsigned char to an NSImage

I have a structure which includes a pointer to a data set, which in this case is a 16-bit grayscale image. I want to convert this data to an NSImage so that I can display it, and then save it as a .TIF file. The route from the manuals appears to be something like:
(Create *myNSImData from frame->image, which is a pointer)
NSImage *TestImage = [[NSImage alloc] initWithData:myNSImData];
(display TestImage, save it, whatever else)
[TestImage release];
I am lost as to how to create the NSData object and assure it contains the array of 16-bit data. Attempts to recast the pointer give warnings and no data. I could simply increment the pointers, transferring one byte at a time from frame->image to the data object, but I don't understand how to communicate the array structure to the data object. Any ideas?
Thanks
MORE ATTEMPTS USING YOUR SUGGESTION
I can convert this data to a .TIF file in the following manner:
for (uint32 row = 0; row < MaxHeight; row++)
{
    for (uint32 column = 0; column < MaxWidth; column++)
    {
        tempData = (uint8_t)*frame->image;   // first byte
        *frame->image++;
        buf[2 * column + 1] = (unsigned char)tempData;
        tempData = (uint8_t)*frame->image;   // second byte
        *frame->image++;
        buf[2 * column] = (unsigned char)tempData;
    }
    TIFFWriteScanline(tiffile, buf, row, 0);
}
With the .TIF file thus generated, I can create an NSImage and display it:
NSImage *TestImage = [[[NSImage alloc] initWithContentsOfFile:inFilePath] autorelease];
[viewWindow setImage: TestImage];
My question now becomes - can I create an NSData object that I can display in the same way? I have tried the following (product is the height*width of the image):
NSData *ReadImage = [[[NSData alloc] initWithBytes: frame->image length:2*product] autorelease] ;
NSImage *NewImage = [[[NSImage alloc] initWithData:ReadImage] autorelease];
NSSize newSize;
newSize.height = MaxHeight; //height of the image
newSize.width = MaxWidth; //width of the image
[NewImage setSize:newSize];
[viewWindow setImage: NewImage];
When I try this, nothing displays. I have also tried creating an array of uint16_t which holds the data, and serving up a pointer to that - again, nothing displays. Any ideas? For example, do I have to tell the NSData that I am using 2 bytes per pixel, or something like that? Thanks, Monty Wood
To create an NSData object containing a block to which you have a pointer, you should use one of the three methods that start with initWithBytes:, or, to create an autoreleased NSData object, use one of the class methods that start with dataWithBytes:
UPDATE: I think that if you want to create an NSImage directly from an NSData, the data needs to include the appropriate headers/magic numbers so that NSImage can figure out what the representation is. You should look at NSBitmapImageRep and the Images chapter of the Cocoa Drawing Guide for raw image data.
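To make that concrete, here is a rough, untested sketch (mine, not from the original answer) that wraps the raw 16-bit grayscale pixels in an NSBitmapImageRep and hands it to an NSImage; MaxWidth, MaxHeight, frame->image and viewWindow are the asker's names, and the bytesPerRow value assumes the rows are tightly packed:
// Sketch: wrap raw 16-bit grayscale pixels in a bitmap rep (watch the byte order of your samples).
NSBitmapImageRep *rep = [[[NSBitmapImageRep alloc]
    initWithBitmapDataPlanes:NULL        // let the rep allocate its own buffer
                  pixelsWide:MaxWidth
                  pixelsHigh:MaxHeight
               bitsPerSample:16
             samplesPerPixel:1
                    hasAlpha:NO
                    isPlanar:NO
              colorSpaceName:NSCalibratedWhiteColorSpace
                 bytesPerRow:2 * MaxWidth
                bitsPerPixel:16] autorelease];
memcpy([rep bitmapData], frame->image, 2 * MaxWidth * MaxHeight);
NSImage *newImage = [[[NSImage alloc] initWithSize:NSMakeSize(MaxWidth, MaxHeight)] autorelease];
[newImage addRepresentation:rep];
[viewWindow setImage:newImage];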

How to: NSAttributedString to CGImageRef

I'm writing a QuickLook plugin. Well, everything works. I just want to try to make it better ;).
Thus the question.
Here is the function that returns the thumbnail image, which I'm using now:
QLThumbnailRequestSetImageWithData(
    QLThumbnailRequestRef thumbnail,
    CFDataRef data,
    CFDictionaryRef properties
);
http://developer.apple.com/mac/library/documentation/UserExperience/Reference/QLThumbnailRequest_Ref/Reference/reference.html#//apple_ref/c/func/QLThumbnailRequestSetImageWithData
Right now I'm creating a TIFF and encapsulating it in an NSData. An example:
// Setting CFDataRef
CGSize thumbnailMaxSize = QLThumbnailRequestGetMaximumSize(thumbnail);
NSMutableAttributedString *attributedString = [[[NSMutableAttributedString alloc]
    initWithString:@"dummy"
    attributes:[NSDictionary dictionaryWithObjectsAndKeys:
        [NSFont fontWithName:@"Monaco" size:10], NSFontAttributeName,
        [NSColor colorWithCalibratedRed:0.0 green:0.0 blue:0.0 alpha:1.0], NSForegroundColorAttributeName,
        nil]
    ] autorelease];
NSImage *thumbnailImage = [[[NSImage alloc] initWithSize:NSMakeSize(thumbnailMaxSize.width, thumbnailMaxSize.height)] autorelease];
[thumbnailImage lockFocus];
[[NSColor whiteColor] set];
NSRectFill(NSMakeRect(0, 0, thumbnailMaxSize.width, thumbnailMaxSize.height));
[attributedString drawInRect:NSMakeRect(0, 0, thumbnailMaxSize.width, thumbnailMaxSize.height)];
[thumbnailImage unlockFocus];
(CFDataRef)[thumbnailImage TIFFRepresentation]; // This is data
// Setting CFDictionaryRef
(CFDictionaryRef)[NSDictionary dictionaryWithObjectsAndKeys:@"kUTTypeTIFF", (NSString *)kCGImageSourceTypeIdentifierHint, nil]; // this is properties
However, QuickLook provides another function to return the thumbnail image, namely:
QLThumbnailRequestSetImage(
    QLThumbnailRequestRef thumbnail,
    CGImageRef image,
    CFDictionaryRef properties
);
http://developer.apple.com/mac/library/documentation/UserExperience/Reference/QLThumbnailRequest_Ref/Reference/reference.html#//apple_ref/c/func/QLThumbnailRequestSetImage
I have a feeling that passing a CGImage to QuickLook instead of TIFF data would help speed things up.
However, I have never worked with a CG context before. I know, the documentation is there :), but anyway, could anyone give an example of how to turn that NSAttributedString into a CGImageRef? An example is worth ten readings of the documentation ;)
Any help appreciated. Thanks in advance!
could anyone give an example of how to turn that NSAttributedString into a CGImageRef?
You can't turn a string into an image; they're two completely different kinds of data, and one is two-dimensional (characters over time) while the other is at least three-dimensional (color over x and y).
What you need to do is draw the string and produce an image of the drawing. That's what you're doing now with NSImage: Creating an image and drawing the string into it.
You're asking about creating a CGImage. Creating a bitmap context, using Core Text to draw the string into it, and creating an image of the contents of the bitmap context is one way to do that.
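A rough sketch of that route (my own, not part of the answer; it reuses thumbnailMaxSize and attributedString from the code above and needs CoreText.framework) might be:
// Draw the attributed string into a bitmap context and grab a CGImage from it.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL,   // let CG allocate the backing store
                                             thumbnailMaxSize.width, thumbnailMaxSize.height,
                                             8,       // bits per component
                                             0,       // bytes per row (0 = automatic)
                                             colorSpace, kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);
CGContextSetRGBFillColor(context, 1, 1, 1, 1);        // white background
CGContextFillRect(context, CGRectMake(0, 0, thumbnailMaxSize.width, thumbnailMaxSize.height));
CTLineRef line = CTLineCreateWithAttributedString((CFAttributedStringRef)attributedString);
CGContextSetTextPosition(context, 10, 10);            // arbitrary baseline origin
CTLineDraw(line, context);
CFRelease(line);
CGImageRef cgImage = CGBitmapContextCreateImage(context);
CGContextRelease(context);
// ... pass cgImage to QLThumbnailRequestSetImage, then CGImageRelease(cgImage).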
However, you're already much closer to another solution, assuming you can require Snow Leopard. Instead of asking the NSImage for a TIFF representation, ask it for a CGImage.
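A minimal sketch of that Snow Leopard route, reusing thumbnailImage, thumbnailMaxSize and thumbnail from the question's code:
// 10.6+: ask the NSImage for a CGImage instead of a TIFF representation.
NSRect rect = NSMakeRect(0, 0, thumbnailMaxSize.width, thumbnailMaxSize.height);
CGImageRef cgImage = [thumbnailImage CGImageForProposedRect:&rect context:nil hints:nil];
QLThumbnailRequestSetImage(thumbnail, cgImage, NULL);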
