How to implement antialiasing in a Mac app using Metal? - macos

I'm making an app for macOS, but I'm having problems implementing antialiasing in Metal.
When I run the app, I get the following assertion failure:
validateAttachmentOnDevice:540: failed assertion
MTLRenderPassDescriptor render targets have inconsistent sample
counts.
This is the code I use to create the textures:
- (void)makeTextures {
    if ([depthTexture width] != drawableSize.width || [depthTexture height] != drawableSize.height) {
        MTLTextureDescriptor *colorDescriptor = [MTLTextureDescriptor new];
        colorDescriptor.textureType = MTLTextureType2DMultisample;
        colorDescriptor.sampleCount = 4;
        colorDescriptor.pixelFormat = vista.colorPixelFormat;
        colorDescriptor.width = drawableSize.width;
        colorDescriptor.height = drawableSize.height;
        colorDescriptor.usage = MTLTextureUsageRenderTarget;
        colorDescriptor.storageMode = MTLStorageModePrivate;
        msaaColorTexture = [view.device newTextureWithDescriptor:colorDescriptor];

        MTLTextureDescriptor *resolveDescriptor = [MTLTextureDescriptor new];
        resolveDescriptor.textureType = MTLTextureType2D;
        resolveDescriptor.pixelFormat = vista.colorPixelFormat;
        resolveDescriptor.width = drawableSize.width;
        resolveDescriptor.height = drawableSize.height;
        resolveDescriptor.storageMode = MTLStorageModePrivate;
        msaaResolveColorTexture = [view.device newTextureWithDescriptor:resolveDescriptor];

        MTLTextureDescriptor *desc = [MTLTextureDescriptor new];
        desc.textureType = MTLTextureType2DMultisample;
        desc.sampleCount = 4;
        desc.pixelFormat = MTLPixelFormatDepth32Float;
        desc.width = drawableSize.width;
        desc.height = drawableSize.height;
        desc.usage = MTLTextureUsageRenderTarget;
        desc.storageMode = MTLStorageModePrivate;
        depthTexture = [view.device newTextureWithDescriptor:desc];
    }
}
If I add sampleCount = 4 when creating the MTLRenderPipelineDescriptor, the error disappears, but nothing is drawn in the view.
This is the pipeline descriptor creation:
MTLRenderPipelineDescriptor *pipelineStateDescriptor = [MTLRenderPipelineDescriptor new];
pipelineStateDescriptor.label = @"Simple Pipeline";
pipelineStateDescriptor.vertexFunction = vertexFunction;
pipelineStateDescriptor.fragmentFunction = fragmentFunction;
pipelineStateDescriptor.colorAttachments[0].pixelFormat = vista.colorPixelFormat;
pipelineStateDescriptor.depthAttachmentPixelFormat = MTLPixelFormatDepth32Float;
// No error but black view.
//pipelineStateDescriptor.sampleCount = 4;
NSError *error;
pipelineState = [view.device newRenderPipelineStateWithDescriptor:pipelineStateDescriptor error:&error];

// Depth stencil.
MTLDepthStencilDescriptor *depthStencilDescriptor = [MTLDepthStencilDescriptor new];
depthStencilDescriptor.depthCompareFunction = MTLCompareFunctionLess;
depthStencilDescriptor.depthWriteEnabled = YES;
depthStencilState = [view.device newDepthStencilStateWithDescriptor:depthStencilDescriptor];
And the pass descriptor:
if (passDescriptor != nil) {
    passDescriptor.colorAttachments[0].texture = msaaColorTexture;
    passDescriptor.colorAttachments[0].resolveTexture = msaaResolveColorTexture;
    passDescriptor.colorAttachments[0].clearColor = MTLClearColorMake(1.0f, 1.0f, 1.0f, 1.0f);
    passDescriptor.colorAttachments[0].storeAction = MTLStoreActionMultisampleResolve;
    passDescriptor.colorAttachments[0].loadAction = MTLLoadActionClear;
    passDescriptor.depthAttachment.texture = depthTexture;
    passDescriptor.depthAttachment.clearDepth = 1.0;
    passDescriptor.depthAttachment.loadAction = MTLLoadActionClear;
    passDescriptor.depthAttachment.storeAction = MTLStoreActionStore;
}

Related

iOS 8 SpriteKit rendering issues

Upon testing my app in iOS 8, I ran into some serious rendering issues. Instead of wasting text trying to write a comprehensible description, I'll just show you a short video:
This is what it's supposed to work like. iOS 7.
And this is how it breaks in iOS 8.
A slight breakdown of what you saw: The tiles zooming in is an SKAction running on each tile individually. The tiles spawn with an alpha of 0 with the SKAction fading them in, one at a time. If I spawn the tiles with an alpha of 1.0, they show up, but don't perform any action. Other actions randomly don't run either. So there at least appears to be some issue with SKAction. I cannot figure out what's making the rest of the scene disappear when a new round occurs, though. There are no SKActions relating to those nodes or textures at all.
Does anyone know if there are new limits or issues with SpriteKit in iOS 8? Especially relating to SKAction it seems, but anything that might relate to this at all?
Edit: I just want to clarify that even though the video for iOS 8 is the simulator, I've had a report from a beta tester of a problem of this exact description on a real iOS 8 device. The iOS 7 video is a screen capture from my own iOS 7 device.
Edit: here's some code relating to this:
-(void)startNewRoundWithGridSize:(CGFloat)newGridSize andWormCount:(NSInteger)wormCount {
    gameData.currentWormScore = (int)currentRound;
    [gameData checkAchievements];
    previousRandomShake = sharedUtilities.currentTime + 10;
    [self randomizeShakeType];
    if (gameData.isComputer || gameData.isPhone5) {
        directionImage.position = wormDirectionPhone5GamePos;
    } else if (gameData.isPhone4) {
        directionImage.position = wormDirectionPhone4GamePos;
    } else if (gameData.isPad) {
        directionImage.position = wormDirectionPadGamePos;
    }
    directionImage.hidden = VISIBLE;
    [directionImage setScale:0.33];
    roundCounter.hidden = VISIBLE;
    timerBar.hidden = VISIBLE;
    int additionalTime = [settingsDict[@"subsequentRoundTimerPoints"] integerValue] + 0.5 * timerValue;
    if (currentRound == 1) {
        timerValue = [settingsDict[@"firstRoundTimerPoints"] integerValue];
    } else {
        timerValue = additionalTime;
    }
    if (gameData.godMode) {
        timerValue = 600000000;
    }
    gridSize = newGridSize;
    if (gameData.isPhone4 || gameData.isPhone5 || gameData.isComputer) {
        tileBaseSize = 80;
    } else if (gameData.isPad) {
        tileBaseSize = 192;
    }
    roundCounter.intValue = currentRound;
    SKNode* oldWorld = [self childNodeWithName:@"wormWorld"];
    if (oldWorld.parent) {
        [oldWorld removeFromParent];
    }
    [startButton disableButton];
    [backButton disableButton];
    [instructionsButton disableButton];
    startButton.hidden = HIDDEN;
    backButton.hidden = HIDDEN;
    instructionsButton.hidden = HIDDEN;
    wormArray = nil;
    wormArray = [[NSMutableArray alloc] init];
    tilesArray = nil;
    tilesArray = [[NSMutableArray alloc] init];
    SKSpriteNode* alphaNode = [SKSpriteNode spriteNodeWithColor:[SKColor whiteColor] size:CGSizeMake(self.size.width, self.size.width)];
    alphaNode.anchorPoint = CGPointZero;
    alphaNode.position = CGPointMake(0, 0);
    CGFloat worldYPos = (self.size.height - self.size.width) / 2;
    SKCropNode* wormWorld = [SKCropNode node];
    wormWorld.position = CGPointMake(0, worldYPos);
    wormWorld.name = @"wormWorld";
    wormWorld.maskNode = alphaNode;
    wormWorld.zPosition = zGameLayer;
    [self addChild:wormWorld];
    SKSpriteNode* wormAreaBackground = [SKSpriteNode spriteNodeWithColor:[SKColor colorWithRed:0 green:0 blue:0 alpha:0.45] size:CGSizeMake(self.size.width, self.size.width)];
    wormAreaBackground.anchorPoint = CGPointZero;
    wormAreaBackground.name = @"wormAreaBackground";
    wormAreaBackground.zPosition = 0;
    [wormWorld addChild:wormAreaBackground];
    SKNode* wormScaler = [SKNode node];
    wormScaler.name = @"wormScaler";
    wormScaler.xScale = 4 / gridSize;
    wormScaler.yScale = 4 / gridSize;
    wormScaler.zPosition = 1;
    [wormWorld addChild:wormScaler];
    //tiles
    CGFloat totalDuration = 1;
    CGFloat duration = 0.15;
    SKAction* tileScaleIn = [SKAction sequence:@[
        [SKAction scaleTo:5 duration:0],
        [SKAction scaleXTo:1.0f y:1.0f duration:0.1]
    ]];
    SKAction* tileFadeIn;
    if (gameData.debugMode) {
        tileFadeIn = [SKAction fadeAlphaTo:0.5f duration:duration];
    } else {
        tileFadeIn = [SKAction fadeAlphaTo:1.0f duration:duration];
    }
    SKAction* tileAnimation = [SKAction group:@[
        tileScaleIn,
        tileFadeIn
    ]];
    for (int i = 0; i < gridSize; i++) {
        int y = i * tileBaseSize + 0.5 * tileBaseSize; //start new row of tiles and set y pos
        for (int h = 0; h < gridSize; h++) {
            int x = h * tileBaseSize + 0.5 * tileBaseSize; //set x of current tile
            SGG_TileNode* tile = [SGG_TileNode node];
            if (gameData.debugMode) {
                tile.debugMode = YES;
            }
            tile.position = CGPointMake(x, y);
            tile.gridSize = CGSizeMake(gridSize, gridSize);
            tile.tileIndex = i * gridSize + h;
            tile.name = @"tile";
            tile.zPosition = zGameLayer + 15;
            tile.alpha = 0.0;
            [wormScaler addChild:tile];
            [tilesArray addObject:tile];
            SKAction* waitABit = [SKAction waitForDuration:((totalDuration - 0.2) / (gridSize * gridSize)) * tile.tileIndex];
            SKAction* totalAction = [SKAction sequence:@[
                waitABit,
                tileAnimation
            ]];
            [tile runAction:totalAction withKey:@"startingZoom"];
        }
    }
    gameStarting = YES;
}
I know it's rather long. :(
Edit: Okay, upon some further testing, (in iOS 8 and 8.1) I've added an SKAction to the scene before the game gets started. It works fine until the game starts, but once it starts, the action just stops. Once again, this only happens in iOS 8+ and not iOS 7. Is there some limitation on the number of SKActions that can run simultaneously or something? Do SKActions stop when reaching some sort of memory limit (that would include art assets and other objects in memory)? And if it is, why would it only affect iOS 8+? And no, I'm not pausing the scene by accident. To be absolutely sure, I added
self.paused = NO;
To the update method to be certain the game isn't paused.

SKPhysicsJoint is joining at what appears to be incorrect coordinates

In my simple archery game, I have defined an arrow node and a target node with associated physics bodies.
When I attempt to join them together using an SKPhysicsJointFixed (I have also tried other types), the behaviour is inaccurate with the joint seemingly created at random points before actually hitting the target node.
I have played with friction and restitution values and also SKShapeNode (with a CGPath) and SKSpriteNode (with a rectangle around the sprite) to define the target but the problem occurs with both.
When just using collisions, the arrows bounce off the correct locations of the target, which seems OK. It is only when the join occurs that the results become random on-screen, usually 10-20 points away from the target node "surface".
static const uint32_t arrowCategory = 0x1 << 1;
static const uint32_t targetCategory = 0x1 << 2;

- (SKSpriteNode *)createArrowNode
{
    SKSpriteNode *arrow = [[SKSpriteNode alloc] initWithImageNamed:@"Arrow.png"];
    arrow.position = CGPointMake(165, 110);
    arrow.name = @"arrowNode";
    arrow.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:arrow.frame.size];
    arrow.physicsBody.angularVelocity = -0.25;
    arrow.physicsBody.usesPreciseCollisionDetection = YES;
    arrow.physicsBody.restitution = 0.0;
    arrow.physicsBody.friction = 0.0;
    arrow.physicsBody.categoryBitMask = arrowCategory;
    arrow.physicsBody.collisionBitMask = targetCategory;
    arrow.physicsBody.contactTestBitMask = /*arrowCategory | */targetCategory | bullseyeCategory;
    return arrow;
}

- (void)createTargetNode
{
    SKSpriteNode *sprite = [[SKSpriteNode alloc] initWithImageNamed:@"TargetOutline.png"];
    sprite.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:sprite.size];
    sprite.position = CGPointMake(610, 100);
    sprite.name = @"targetNode";
    sprite.physicsBody.usesPreciseCollisionDetection = NO;
    // sprite.physicsBody.affectedByGravity = NO;
    // sprite.physicsBody.mass = 20000;
    sprite.physicsBody.dynamic = NO;
    sprite.physicsBody.friction = 0.0;
    sprite.physicsBody.restitution = 0.0;
    sprite.physicsBody.categoryBitMask = targetCategory;
    sprite.physicsBody.contactTestBitMask = targetCategory | arrowCategory;
    [self addChild:sprite];
}
- (void)didBeginContact:(SKPhysicsContact *)contact
{
    SKPhysicsBody *firstBody, *secondBody;
    if (contact.bodyA.categoryBitMask < contact.bodyB.categoryBitMask)
    {
        firstBody = contact.bodyA;
        secondBody = contact.bodyB;
    }
    else
    {
        firstBody = contact.bodyB;
        secondBody = contact.bodyA;
    }
    if ((firstBody.categoryBitMask & arrowCategory) != 0 &&
        (secondBody.categoryBitMask & targetCategory) != 0)
    {
        CGPoint contactPoint = contact.contactPoint;
        NSLog(@"contactPoint is %@", NSStringFromCGPoint(contactPoint));
        float contact_x = contactPoint.x;
        float contact_y = contactPoint.y;
        SKPhysicsJoint *joint = [SKPhysicsJointFixed jointWithBodyA:firstBody bodyB:secondBody anchor:contactPoint];
        [self.physicsWorld addJoint:joint];
        CGPoint bullseye = CGPointMake(590, 102.5);
        NSLog(@"Center is %@", NSStringFromCGPoint(bullseye));
        CGFloat distance = SDistanceBetweenPoints(contactPoint, bullseye);
        NSLog(@"Distance to bullseye is %f", distance);
    }
}

iOS 8 UIImage Metadata

This is my first question.
I have a little problem:
when I read UIImage metadata on iOS 7, I use this code and it works great:
#pragma mark - Image Picker Controller delegate methods
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage* image = [info objectForKey:UIImagePickerControllerOriginalImage];
    [self metaDataFromAssetLibrary:info];
    [picker dismissViewControllerAnimated:YES completion:NULL];
}
From imagePickerController I choose the image and call the metaDataFromAssetLibrary method:
- (void)metaDataFromAssetLibrary:(NSDictionary*)info
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 NSMutableDictionary *imageMetadata = nil;
                 NSDictionary *metadata = asset.defaultRepresentation.metadata;
                 imageMetadata = [[NSMutableDictionary alloc] initWithDictionary:metadata];
                 NSLog(@"imageMetaData from AssetLibrary %@", imageMetadata);
             }
            failureBlock:^(NSError *error) {
                 NSLog(@"error %@", error);
             }];
}
On Xcode 5 and iOS 7 the console returns something like this:
imageMetaData from AssetLibrary {
ColorModel = RGB;
DPIHeight = 72;
DPIWidth = 72;
Depth = 8;
Orientation = 1;
PixelHeight = 2448;
PixelWidth = 3264;
"{Exif}" = {
ApertureValue = "2.52606882168926";
BrightnessValue = "2.211389961389961";
ColorSpace = 1;
ComponentsConfiguration = (
1,
2,
3,
0
);
DateTimeDigitized = "2014:06:05 08:54:09";
DateTimeOriginal = "2014:06:05 08:54:09";
ExifVersion = (
2,
2,
1
);
ExposureMode = 0;
ExposureProgram = 2;
ExposureTime = "0.05";
FNumber = "2.4";
Flash = 24;
FlashPixVersion = (
1,
0
);
FocalLenIn35mmFilm = 35;
FocalLength = "4.28";
ISOSpeedRatings = (
125
);
LensMake = Apple;
LensModel = "iPhone 4S back camera 4.28mm f/2.4";
LensSpecification = (
"4.28",
"4.28",
"2.4",
"2.4"
);
MeteringMode = 3;
PixelXDimension = 3264;
PixelYDimension = 2448;
SceneCaptureType = 0;
SceneType = 1;
SensingMethod = 2;
ShutterSpeedValue = "4.321928460342146";
SubjectArea = (
1643,
1079,
610,
612
);
SubsecTimeDigitized = 347;
SubsecTimeOriginal = 347;
WhiteBalance = 0;
};
"{GPS}" = {
Altitude = 26;
AltitudeRef = 1;
DateStamp = "2014:06:05";
DestBearing = "177.086387434555";
DestBearingRef = M;
ImgDirection = "357.0864197530864";
ImgDirectionRef = M;
Latitude = "43.80268";
LatitudeRef = N;
Longitude = "11.0635195";
LongitudeRef = E;
Speed = 0;
SpeedRef = K;
TimeStamp = "06:54:08";
};
"{MakerApple}" = {
1 = 0;
3 = {
epoch = 0;
flags = 1;
timescale = 1000000000;
value = 27688938393500;
};
4 = 1;
5 = 186;
6 = 195;
7 = 1;
8 = (
"-0.6805536",
"0.02519802",
"-0.755379"
);
};
"{TIFF}" = {
DateTime = "2014:06:05 08:54:09";
Make = Apple;
Model = "iPhone 4S";
Orientation = 1;
ResolutionUnit = 2;
Software = "8.0";
XResolution = 72;
YResolution = 72;
};
}
But on Xcode 6 and iOS 8 the console returns only this:
imageMetaData from AssetLibrary {
ColorModel = RGB;
DPIHeight = 72;
DPIWidth = 72;
Depth = 8;
Orientation = 1;
PixelHeight = 768;
PixelWidth = 1020;
"{Exif}" = {
ColorSpace = 1;
ComponentsConfiguration = (
1,
2,
3,
0
);
ExifVersion = (
2,
2,
1
);
FlashPixVersion = (
1,
0
);
PixelXDimension = 1020;
PixelYDimension = 768;
SceneCaptureType = 0;
};
"{TIFF}" = {
Orientation = 1;
ResolutionUnit = 2;
XResolution = 72;
YResolution = 72;
};
}
Does anyone know about this problem?
Any solution or suggestion?
Thank you so much.
P.S.: Excuse my English. ;-)
After upgrading to Xcode 6, you have to get the image in dismissViewController's completion block, or use a more complex method: get the image from the asset.
Please try the following code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissViewControllerAnimated:YES completion:^{
        UIImage* image = [info objectForKey:UIImagePickerControllerOriginalImage];
        [self metaDataFromAssetLibrary:info];
    }];
}
You can also reference john.k.doe's answer in:
didFinishPickingMediaWithInfo return nil photo

UInt8 EXC_BAD_ACCESS

I have a method that adds a filter to an image. This worked fine until a couple of months ago; now, when I try to use this method, the application crashes on the image's buffer. I create the buffer and set it to the image's data, and accessing a specific index later causes a bad-access crash. I have looked for the past hour or two, and I am now convinced there is something I'm overlooking. I think something is being released that should not be. I am using the iOS DP 4 preview of Xcode, and I think this problem started with the update to the beta, but I am really not sure.
This is the line it crashes on, located near the middle of the first for loop:
m_PixelBuf[index+2] = m_PixelBuf[index+2]/*aRed*/;
Normally it is set to aRed, which I have checked, and it should not go out of the buffer's boundaries.
- (void)contrastWithContrast:(float)contrast colorWithColor:(float)color {
    drawImage.image = original;
    UIImage *unfilteredImage2 = [[[UIImage alloc] initWithCGImage:drawImage.image.CGImage] autorelease];
    CGImageRef inImage = unfilteredImage2.CGImage;
    CGContextRef ctx;
    CFDataRef m_DataRef;
    m_DataRef = CGDataProviderCopyData(CGImageGetDataProvider(inImage));
    UInt8 *m_PixelBuf = (UInt8 *)CFDataGetBytePtr(m_DataRef);
    int length = CFDataGetLength(m_DataRef);
    NSLog(@"Photo Length: %i", length);
    //////Contrast/////////////
    //NSLog(@"Contrast:%f",contrast);
    int aRed;
    int aGreen;
    int aBlue;
    for (int index = 0; index < length; index += 4) {
        aRed = m_PixelBuf[index+2];
        aGreen = m_PixelBuf[index+1];
        aBlue = m_PixelBuf[index];
        aRed = (((aRed - 128) * (contrast + 100)) / 100) + 128;
        if (aRed < 0) aRed = 0;
        if (aRed > 255) aRed = 255;
        m_PixelBuf[index+2] = m_PixelBuf[index+2]/*aRed*/; // Always crashes here
        aGreen = (((aGreen - 128) * (contrast + 100)) / 100) + 128;
        if (aGreen < 0) aGreen = 0;
        if (aGreen > 255) aGreen = 255;
        m_PixelBuf[index+1] = aGreen;
        aBlue = (((aBlue - 128) * (contrast + 100)) / 100) + 128;
        if (aBlue < 0) aBlue = 0;
        if (aBlue > 255) aBlue = 255;
        m_PixelBuf[index] = aBlue;
    }
    ctx = CGBitmapContextCreate(m_PixelBuf,
                                CGImageGetWidth(inImage),
                                CGImageGetHeight(inImage),
                                CGImageGetBitsPerComponent(inImage),
                                CGImageGetBytesPerRow(inImage),
                                CGImageGetColorSpace(inImage),
                                CGImageGetBitmapInfo(inImage));
    CGImageRef imageRef = CGBitmapContextCreateImage(ctx);
    UIImage *rawImage = [[UIImage alloc] initWithCGImage:imageRef];
    drawImage.image = rawImage;
    [rawImage release];
    CGContextRelease(ctx);
    CFRelease(imageRef);
    CFRelease(m_DataRef);
    unfilteredImage2 = [[[UIImage alloc] initWithCGImage:drawImage.image.CGImage] autorelease];
    inImage = unfilteredImage2.CGImage;
    m_DataRef = CGDataProviderCopyData(CGImageGetDataProvider(inImage));
    m_PixelBuf = (UInt8 *)CFDataGetBytePtr(m_DataRef);
    length = CFDataGetLength(m_DataRef);
    ///////Color////////////////
    for (int index = 0; index < length; index += 4) {
        //Blue
        if ((m_PixelBuf[index] + ((int)color * 2)) > 255) {
            m_PixelBuf[index] = 255;
        } else if ((m_PixelBuf[index] + ((int)color * 2)) < 0) {
            m_PixelBuf[index] = 0;
        } else {
            m_PixelBuf[index] = m_PixelBuf[index] + ((int)color * 2);
        }
        //Green
        if ((m_PixelBuf[index+1] + ((int)color * 2)) > 255) {
            m_PixelBuf[index+1] = 255;
        } else if ((m_PixelBuf[index+1] + ((int)color * 2)) < 0) {
            m_PixelBuf[index+1] = 0;
        } else {
            m_PixelBuf[index+1] = m_PixelBuf[index+1] + ((int)color * 2);
        }
        //Red
        if ((m_PixelBuf[index+2] + ((int)color * 2)) > 255) {
            m_PixelBuf[index+2] = 255;
        } else if ((m_PixelBuf[index+2] + ((int)color * 2)) < 0) {
            m_PixelBuf[index+2] = 0;
        } else {
            m_PixelBuf[index+2] = m_PixelBuf[index+2] + ((int)color * 2);
        }
        //m_PixelBuf[index+3]=255;//Alpha
    }
    ctx = CGBitmapContextCreate(m_PixelBuf,
                                CGImageGetWidth(inImage),
                                CGImageGetHeight(inImage),
                                CGImageGetBitsPerComponent(inImage),
                                CGImageGetBytesPerRow(inImage),
                                CGImageGetColorSpace(inImage),
                                CGImageGetBitmapInfo(inImage));
    imageRef = CGBitmapContextCreateImage(ctx);
    rawImage = [[UIImage alloc] initWithCGImage:imageRef];
    drawImage.image = rawImage;
    [rawImage release];
    CGContextRelease(ctx);
    CFRelease(imageRef);
    CFRelease(m_DataRef);
    //drawImage.image = unfilteredImage2;
    willUpdate = YES;
}
Sorry for any extra comments/info; I just copied the whole method in.
Thanks,
Storealutes
I had the same problem.
You should use the code below to get a pointer to the pixel buffer instead of CFDataGetBytePtr():
CGImageRef cgImage = originalImage.CGImage;
size_t width = CGImageGetWidth(cgImage);
size_t height = CGImageGetHeight(cgImage);
char *buffer = (char*)malloc(sizeof(char) * width * height * 4);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef cgContext = CGBitmapContextCreate(buffer, width, height, 8, width * 4, colorSpace, kCGImageAlphaPremultipliedLast);
CGContextSetBlendMode(cgContext, kCGBlendModeCopy);
CGContextDrawImage(cgContext, CGRectMake(0.0f, 0.0f, width, height), cgImage);
free(buffer);
CGContextRelease(cgContext);
CGColorSpaceRelease(colorSpace);

UIImagePickerController returning incorrect image orientation

I'm using UIImagePickerController to capture an image and then store it. However, when I try to rescale it, the orientation value I get out of this image is incorrect. When I take a snap holding the phone up, it gives me an orientation of Left. Has anyone experienced this issue?
The UIImagePickerController dictionary shows following information:
{
UIImagePickerControllerMediaMetadata = {
DPIHeight = 72;
DPIWidth = 72;
Orientation = 3;
"{Exif}" = {
ApertureValue = "2.970853654340484";
ColorSpace = 1;
DateTimeDigitized = "2011:02:14 10:26:17";
DateTimeOriginal = "2011:02:14 10:26:17";
ExposureMode = 0;
ExposureProgram = 2;
ExposureTime = "0.06666666666666667";
FNumber = "2.8";
Flash = 32;
FocalLength = "3.85";
ISOSpeedRatings = (
125
);
MeteringMode = 1;
PixelXDimension = 2048;
PixelYDimension = 1536;
SceneType = 1;
SensingMethod = 2;
Sharpness = 1;
ShutterSpeedValue = "3.910431673351467";
SubjectArea = (
1023,
767,
614,
614
);
WhiteBalance = 0;
};
"{TIFF}" = {
DateTime = "2011:02:14 10:26:17";
Make = Apple;
Model = "iPhone 3GS";
Software = "4.2.1";
XResolution = 72;
YResolution = 72;
};
};
UIImagePickerControllerMediaType = "public.image";
UIImagePickerControllerOriginalImage = "<UIImage: 0x40efb50>";
}
However, picture returns imageOrientation == 1:
UIImage *picture = [info objectForKey:UIImagePickerControllerOriginalImage];
I just started working on this issue in my own app.
I used the UIImage category that Trevor Harmon crafted for resizing an image and fixing its orientation, UIImage+Resize.
Then you can do something like this in -imagePickerController:didFinishPickingMediaWithInfo:
UIImage *pickedImage = [info objectForKey:UIImagePickerControllerEditedImage];
UIImage *resized = [pickedImage resizedImageWithContentMode:UIViewContentModeScaleAspectFit bounds:pickedImage.size interpolationQuality:kCGInterpolationHigh];
This fixed the problem for me. The resized image is oriented correctly visually and the imageOrientation property reports UIImageOrientationUp.
There are several versions of this scale/resize/crop code out there; I used Trevor's because it seems pretty clean and includes some other UIImage manipulators that I want to use later.
This is what I have found for fixing the orientation issue; it works for me:
UIImage *initialImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
NSData *data = UIImagePNGRepresentation(self.initialImage);
UIImage *tempImage = [UIImage imageWithData:data];
UIImage *fixedOrientationImage = [UIImage imageWithCGImage:tempImage.CGImage
                                                     scale:initialImage.scale
                                               orientation:self.initialImage.imageOrientation];
initialImage = fixedOrientationImage;
Here's a Swift snippet that fixes the problem efficiently:
let orientedImage = UIImage(CGImage: initialImage.CGImage, scale: 1, orientation: initialImage.imageOrientation)!
I use the following code that I have put in a separate image utility object file that has a bunch of other editing methods for UIImages:
+ (UIImage *)imageWithImage:(UIImage *)sourceImage scaledToSizeWithSameAspectRatio:(CGSize)targetSize
{
    CGSize imageSize = sourceImage.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;
    CGFloat targetWidth = targetSize.width;
    CGFloat targetHeight = targetSize.height;
    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetWidth;
    CGFloat scaledHeight = targetHeight;
    CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);
    if (CGSizeEqualToSize(imageSize, targetSize) == NO) {
        CGFloat widthFactor = targetWidth / width;
        CGFloat heightFactor = targetHeight / height;
        if (widthFactor > heightFactor) {
            scaleFactor = widthFactor; // scale to fit height
        } else {
            scaleFactor = heightFactor; // scale to fit width
        }
        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;
        // center the image
        if (widthFactor > heightFactor) {
            thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
        } else if (widthFactor < heightFactor) {
            thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
        }
    }
    CGImageRef imageRef = [sourceImage CGImage];
    CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);
    CGColorSpaceRef colorSpaceInfo = CGImageGetColorSpace(imageRef);
    if (bitmapInfo == kCGImageAlphaNone) {
        bitmapInfo = kCGImageAlphaNoneSkipLast;
    }
    CGContextRef bitmap;
    if (sourceImage.imageOrientation == UIImageOrientationUp || sourceImage.imageOrientation == UIImageOrientationDown) {
        bitmap = CGBitmapContextCreate(NULL, targetWidth, targetHeight, CGImageGetBitsPerComponent(imageRef), CGImageGetBytesPerRow(imageRef), colorSpaceInfo, bitmapInfo);
    } else {
        bitmap = CGBitmapContextCreate(NULL, targetHeight, targetWidth, CGImageGetBitsPerComponent(imageRef), CGImageGetBytesPerRow(imageRef), colorSpaceInfo, bitmapInfo);
    }
    // In the right or left cases, we need to switch scaledWidth and scaledHeight,
    // and also the thumbnail point
    if (sourceImage.imageOrientation == UIImageOrientationLeft) {
        thumbnailPoint = CGPointMake(thumbnailPoint.y, thumbnailPoint.x);
        CGFloat oldScaledWidth = scaledWidth;
        scaledWidth = scaledHeight;
        scaledHeight = oldScaledWidth;
        CGContextRotateCTM(bitmap, M_PI_2); // + 90 degrees
        CGContextTranslateCTM(bitmap, 0, -targetHeight);
    } else if (sourceImage.imageOrientation == UIImageOrientationRight) {
        thumbnailPoint = CGPointMake(thumbnailPoint.y, thumbnailPoint.x);
        CGFloat oldScaledWidth = scaledWidth;
        scaledWidth = scaledHeight;
        scaledHeight = oldScaledWidth;
        CGContextRotateCTM(bitmap, -M_PI_2); // - 90 degrees
        CGContextTranslateCTM(bitmap, -targetWidth, 0);
    } else if (sourceImage.imageOrientation == UIImageOrientationUp) {
        // NOTHING
    } else if (sourceImage.imageOrientation == UIImageOrientationDown) {
        CGContextTranslateCTM(bitmap, targetWidth, targetHeight);
        CGContextRotateCTM(bitmap, -M_PI); // - 180 degrees
    }
    CGContextDrawImage(bitmap, CGRectMake(thumbnailPoint.x, thumbnailPoint.y, scaledWidth, scaledHeight), imageRef);
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap);
    CGImageRelease(ref);
    return newImage;
}
And then I call
UIImage *pickedImage = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImage *fixedOriginal = [ImageUtil imageWithImage:[mediaInfoDict objectForKey:UIImagePickerControllerOriginalImage] scaledToSizeWithSameAspectRatio:pickedImage.size];
In iOS 7, I needed code dependent on UIImage.imageOrientation to correct for the different orientations. Now, in iOS 8.2, when I pick my old test images from the album via UIImagePickerController, the orientation will be UIImageOrientationUp for ALL images. When I take a photo (UIImagePickerControllerSourceTypeCamera), those images will also always be upwards, regardless of the device orientation.
So between those iOS versions, there obviously has been a fix where UIImagePickerController already rotates the images if neccessary.
You can even notice that when the album images are displayed: for a split second, they will be displayed in the original orientation, before they appear in the new upward orientation.
The only thing that worked for me was to re-render the image again which forces the correct orientation.
if photo.imageOrientation != .up {
    UIGraphicsBeginImageContextWithOptions(photo.size, false, 1.0)
    let rect = CGRect(x: 0, y: 0, width: photo.size.width, height: photo.size.height)
    photo.draw(in: rect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    photo = newImage!
}
