UInt8 EXC_BAD_ACCESS - image

I have a method that adds a filter to an image. This worked fine until a couple of months ago; now the application crashes on the image's pixel buffer when I use this method. I create the buffer and point it at the image's data, and accessing a specific index later causes a bad-access crash. I have looked for the past hour or two and am now convinced I'm overlooking something; I think something is being released that should not be. I am using the iOS DP 4 preview of Xcode, and I think this problem started with the update to the beta, but I am really not sure.
This is the line it crashes on, located near the middle of the first for loop:
m_PixelBuf[index+2] = m_PixelBuf[index+2]/*aRed*/;
Normally the right-hand side is aRed, which I have checked; it should not go outside the buffer's bounds.
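One thing worth checking here: CGDataProviderCopyData returns an immutable CFDataRef, and writing through CFDataGetBytePtr into that storage is undefined behavior that an OS update can turn into exactly this kind of crash. A minimal sketch of a mutable-copy workaround (an aside, not part of the original method; same buffer layout, but writable bytes):
// Sketch: take a mutable copy of the pixel data so that in-place writes are legal.
CFDataRef copiedData = CGDataProviderCopyData(CGImageGetDataProvider(inImage));
CFMutableDataRef m_DataRef = CFDataCreateMutableCopy(kCFAllocatorDefault, 0, copiedData);
CFRelease(copiedData);
UInt8 *m_PixelBuf = CFDataGetMutableBytePtr(m_DataRef);
// ... run the filter loops, then CFRelease(m_DataRef) as before ...
The full method follows: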
-(void)contrastWithContrast:(float)contrast colorWithColor:(float)color {
    drawImage.image = original;
    UIImage *unfilteredImage2 = [[[UIImage alloc] initWithCGImage:drawImage.image.CGImage] autorelease];
    CGImageRef inImage = unfilteredImage2.CGImage;
    CGContextRef ctx;
    CFDataRef m_DataRef;
    m_DataRef = CGDataProviderCopyData(CGImageGetDataProvider(inImage));
    UInt8 *m_PixelBuf = (UInt8 *)CFDataGetBytePtr(m_DataRef);
    int length = CFDataGetLength(m_DataRef);
    NSLog(@"Photo Length: %i", length);

    //////Contrast/////////////
    //NSLog(@"Contrast:%f", contrast);
    int aRed;
    int aGreen;
    int aBlue;
    for (int index = 0; index < length; index += 4) {
        aRed = m_PixelBuf[index+2];
        aGreen = m_PixelBuf[index+1];
        aBlue = m_PixelBuf[index];
        aRed = (((aRed-128)*(contrast+100))/100) + 128;
        if (aRed < 0) aRed = 0;
        if (aRed > 255) aRed = 255;
        m_PixelBuf[index+2] = m_PixelBuf[index+2]/*aRed*/; //Always crashes here
        aGreen = (((aGreen-128)*(contrast+100))/100) + 128;
        if (aGreen < 0) aGreen = 0;
        if (aGreen > 255) aGreen = 255;
        m_PixelBuf[index+1] = aGreen;
        aBlue = (((aBlue-128)*(contrast+100))/100) + 128;
        if (aBlue < 0) aBlue = 0;
        if (aBlue > 255) aBlue = 255;
        m_PixelBuf[index] = aBlue;
    }
    ctx = CGBitmapContextCreate(m_PixelBuf,
                                CGImageGetWidth(inImage),
                                CGImageGetHeight(inImage),
                                CGImageGetBitsPerComponent(inImage),
                                CGImageGetBytesPerRow(inImage),
                                CGImageGetColorSpace(inImage),
                                CGImageGetBitmapInfo(inImage));
    CGImageRef imageRef = CGBitmapContextCreateImage(ctx);
    UIImage *rawImage = [[UIImage alloc] initWithCGImage:imageRef];
    drawImage.image = rawImage;
    [rawImage release];
    CGContextRelease(ctx);
    CFRelease(imageRef);
    CFRelease(m_DataRef);
    unfilteredImage2 = [[[UIImage alloc] initWithCGImage:drawImage.image.CGImage] autorelease];
    inImage = unfilteredImage2.CGImage;
    m_DataRef = CGDataProviderCopyData(CGImageGetDataProvider(inImage));
    m_PixelBuf = (UInt8 *)CFDataGetBytePtr(m_DataRef);
    length = CFDataGetLength(m_DataRef);

    ///////Color////////////////
    for (int index = 0; index < length; index += 4) {
        //Blue
        if ((m_PixelBuf[index] + ((int)color * 2)) > 255) {
            m_PixelBuf[index] = 255;
        } else if ((m_PixelBuf[index] + ((int)color * 2)) < 0) {
            m_PixelBuf[index] = 0;
        } else {
            m_PixelBuf[index] = m_PixelBuf[index] + ((int)color * 2);
        }
        //Green
        if ((m_PixelBuf[index+1] + ((int)color * 2)) > 255) {
            m_PixelBuf[index+1] = 255;
        } else if ((m_PixelBuf[index+1] + ((int)color * 2)) < 0) {
            m_PixelBuf[index+1] = 0;
        } else {
            m_PixelBuf[index+1] = m_PixelBuf[index+1] + ((int)color * 2);
        }
        //Red
        if ((m_PixelBuf[index+2] + ((int)color * 2)) > 255) {
            m_PixelBuf[index+2] = 255;
        } else if ((m_PixelBuf[index+2] + ((int)color * 2)) < 0) {
            m_PixelBuf[index+2] = 0;
        } else {
            m_PixelBuf[index+2] = m_PixelBuf[index+2] + ((int)color * 2);
        }
        //m_PixelBuf[index+3] = 255; //Alpha
    }
    ctx = CGBitmapContextCreate(m_PixelBuf,
                                CGImageGetWidth(inImage),
                                CGImageGetHeight(inImage),
                                CGImageGetBitsPerComponent(inImage),
                                CGImageGetBytesPerRow(inImage),
                                CGImageGetColorSpace(inImage),
                                CGImageGetBitmapInfo(inImage));
    imageRef = CGBitmapContextCreateImage(ctx);
    rawImage = [[UIImage alloc] initWithCGImage:imageRef];
    drawImage.image = rawImage;
    [rawImage release];
    CGContextRelease(ctx);
    CFRelease(imageRef);
    CFRelease(m_DataRef);
    //drawImage.image = unfilteredImage2;
    willUpdate = YES;
}
Sorry for any extra comments/info; I just copied the whole method in.
Thanks,
Storealutes

I had the same problem.
You should use the code below to get a pointer to the pixel buffer, instead of CFDataGetBytePtr():
CGImageRef cgImage = originalImage.CGImage;
size_t width = CGImageGetWidth(cgImage);
size_t height = CGImageGetHeight(cgImage);
char *buffer = (char *)malloc(sizeof(char) * width * height * 4);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef cgContext = CGBitmapContextCreate(buffer, width, height, 8, width * 4, colorSpace, kCGImageAlphaPremultipliedLast);
CGContextSetBlendMode(cgContext, kCGBlendModeCopy);
CGContextDrawImage(cgContext, CGRectMake(0.0f, 0.0f, width, height), cgImage);
// ... read or modify the RGBA bytes in buffer here, before freeing it ...
free(buffer);
CGContextRelease(cgContext);
CGColorSpaceRelease(colorSpace);
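To finish the round trip with the filter from the question (a sketch, assuming the same names and RGBA layout as the snippet above): run the per-pixel loops over buffer, then build the filtered image from the bitmap context before releasing anything:
// Sketch: after the per-pixel loops have modified buffer, create a UIImage
// from the same bitmap context (CGBitmapContextCreateImage copies the pixels).
CGImageRef filteredRef = CGBitmapContextCreateImage(cgContext);
UIImage *filtered = [UIImage imageWithCGImage:filteredRef];
CGImageRelease(filteredRef);
// now hand filtered to your image view, then free(buffer) and release the context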

Related

How do I return a CGImageRef from CGWindowListCreateImage to C#?

I'm currently attempting to write a macOS plugin for Unity. I am taking a screenshot of the desktop with CGWindowListCreateImage, and I'm trying to figure out how to return the byte[] data to C# so I can create a Texture2D from it. Any help would be greatly appreciated, thank you.
I can't return an NSArray* across the interop boundary. The .h file is at the bottom.
NSArray* getScreenshot()
{
    CGImageRef screenShot = CGWindowListCreateImage(CGRectInfinite, kCGWindowListOptionOnScreenOnly, kCGNullWindowID, kCGWindowImageDefault);
    return getRGBAsFromImage(screenShot);
}
NSArray* getRGBAsFromImage(CGImageRef imageRef)
{
    // First get the image into your data buffer
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    NSUInteger bytesPerPixel = 4;
    NSUInteger pixelCount = width * height;
    NSMutableArray *result = [NSMutableArray arrayWithCapacity:pixelCount];
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // malloc, not alloca: a full-screen buffer would overflow the stack,
    // and the buffer is freed below.
    unsigned char *rawData = (unsigned char *)malloc(height * width * bytesPerPixel);
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);
    // Now rawData contains the image data in the RGBA8888 pixel format.
    NSUInteger byteIndex = 0;
    for (NSUInteger i = 0; i < pixelCount; ++i) // one iteration per pixel, not per byte
    {
        CGFloat alpha = ((CGFloat)rawData[byteIndex + 3]) / 255.0f;
        CGFloat red   = ((CGFloat)rawData[byteIndex])     / alpha;
        CGFloat green = ((CGFloat)rawData[byteIndex + 1]) / alpha;
        CGFloat blue  = ((CGFloat)rawData[byteIndex + 2]) / alpha;
        byteIndex += bytesPerPixel;
        NSColor *acolor = [NSColor colorWithRed:red green:green blue:blue alpha:alpha];
        [result addObject:acolor]; // addObject, not insertObject:atIndex:
    }
    free(rawData);
    return result;
}
#ifndef TestMethods_hpp
#define TestMethods_hpp
#import <Foundation/Foundation.h>
#include <Carbon/Carbon.h>
#include <stdio.h>
#include <AppKit/AppKit.h>
typedef void (*Unity_Callback1)(char * message);
extern "C" {
    NSArray* getScreenshot();
}
#endif /* TestMethods_hpp */
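One approach that sidesteps NSArray entirely (a sketch under assumptions: getScreenshotRGBA and freeScreenshot are hypothetical names, not an existing Unity or macOS API) is to hand raw RGBA bytes and the dimensions across the boundary as plain C, then copy them on the C# side with Marshal.Copy into a byte[] and load that into a Texture2D via LoadRawTextureData:
// Sketch: plain C interface instead of NSArray. The C# side declares these
// with [DllImport] and must call freeScreenshot() once it has copied the bytes.
extern "C" {
unsigned char *getScreenshotRGBA(int *outWidth, int *outHeight) // hypothetical helper
{
    CGImageRef shot = CGWindowListCreateImage(CGRectInfinite, kCGWindowListOptionOnScreenOnly,
                                              kCGNullWindowID, kCGWindowImageDefault);
    size_t width = CGImageGetWidth(shot);
    size_t height = CGImageGetHeight(shot);
    unsigned char *pixels = (unsigned char *)malloc(width * height * 4);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, width * 4, colorSpace,
                                             kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), shot);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    CGImageRelease(shot);
    *outWidth = (int)width;
    *outHeight = (int)height;
    return pixels; // ownership passes to the caller
}
void freeScreenshot(unsigned char *pixels) // hypothetical helper
{
    free(pixels);
}
}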

Xamarin iOS: How to change the color of a UIImage pixel by pixel

Sorry if this has already been answered somewhere, but I could not find it.
Basically, I am receiving a QR code where the code itself is black and the background is white (this is a UIImage). I would like to change the color of the white background to transparent or a custom color, and change the QR code color from black to white (in Xamarin.iOS).
I already know how to get the color of a specific pixel using the following code:
static UIColor GetPixelColor(CGBitmapContext context, byte[] rawData,
                             UIImage barcode, CGPoint pt)
{
    var handle = GCHandle.Alloc(rawData);
    UIColor resultColor = null;
    using (context)
    {
        context.DrawImage(new CGRect(-pt.X, pt.Y - barcode.Size.Height,
                                     barcode.Size.Width, barcode.Size.Height), barcode.CGImage);
        float red   = (rawData[0]) / 255.0f;
        float green = (rawData[1]) / 255.0f;
        float blue  = (rawData[2]) / 255.0f;
        float alpha = (rawData[3]) / 255.0f;
        resultColor = UIColor.FromRGBA(red, green, blue, alpha);
    }
    return resultColor;
}
This is currently my function:
static UIImage GetRealQRCode(UIImage barcode)
{
    int width = (int)barcode.Size.Width;
    int height = (int)barcode.Size.Height;
    var bytesPerPixel = 4;
    var bytesPerRow = bytesPerPixel * width;
    var bitsPerComponent = 8;
    var colorSpace = CGColorSpace.CreateDeviceRGB();
    var rawData = new byte[bytesPerRow * height];
    CGBitmapContext context = new CGBitmapContext(rawData, width,
        height, bitsPerComponent, bytesPerRow, colorSpace,
        CGImageAlphaInfo.PremultipliedLast);
    for (int i = 0; i < rawData.Length; i++)
    {
        for (int j = 0; j < bytesPerRow; j++)
        {
            CGPoint pt = new CGPoint(i, j);
            UIColor currentColor = GetPixelColor(context, rawData,
                barcode, pt);
        }
    }
}
Does anyone know how to do this?
Thanks in advance!
Assuming your UIImage is backed by a CGImage (and not a CIImage):
var cgImage = ImageView1.Image.CGImage; // Your UIImage with a CGImage backing image
var bytesPerPixel = 4;
var bitsPerComponent = 8;
var bytesPerUInt32 = sizeof(UInt32) / sizeof(byte);
var width = cgImage.Width;
var height = cgImage.Height;
var bytesPerRow = bytesPerPixel * cgImage.Width;
var numOfBytes = cgImage.Height * cgImage.Width * bytesPerUInt32;
IntPtr pixelPtr = IntPtr.Zero;
try
{
    pixelPtr = Marshal.AllocHGlobal((int)numOfBytes);
    using (var colorSpace = CGColorSpace.CreateDeviceRGB())
    {
        CGImage newCGImage;
        using (var context = new CGBitmapContext(pixelPtr, width, height, bitsPerComponent, bytesPerRow, colorSpace, CGImageAlphaInfo.PremultipliedLast))
        {
            context.DrawImage(new CGRect(0, 0, width, height), cgImage);
            unsafe
            {
                var currentPixel = (byte*)pixelPtr.ToPointer();
                for (int i = 0; i < height; i++)
                {
                    for (int j = 0; j < width; j++)
                    {
                        // RGBA8888 pixel format
                        if (*currentPixel == byte.MinValue)
                        {
                            // black source pixel -> white
                            *currentPixel = byte.MaxValue;
                            *(currentPixel + 1) = byte.MaxValue;
                            *(currentPixel + 2) = byte.MaxValue;
                        }
                        else
                        {
                            // white source pixel -> transparent
                            *currentPixel = byte.MinValue;
                            *(currentPixel + 1) = byte.MinValue;
                            *(currentPixel + 2) = byte.MinValue;
                            *(currentPixel + 3) = byte.MinValue;
                        }
                        currentPixel += 4;
                    }
                }
            }
            newCGImage = context.ToImage();
        }
        var uiimage = new UIImage(newCGImage);
        imageView2.Image = uiimage; // Do something with your new UIImage
    }
}
finally
{
    if (pixelPtr != IntPtr.Zero)
        Marshal.FreeHGlobal(pixelPtr);
}
If you do not actually need access to the individual pixels but only the end result, you can use pre-existing CoreImage filters: first invert the colors, then use the black pixels as an alpha mask. Otherwise, see my other answer using Marshal.AllocHGlobal and pointers.
using (var coreImage = new CIImage(ImageView1.Image))
using (var invertFilter = CIFilter.FromName("CIColorInvert"))
{
    invertFilter.Image = coreImage;
    using (var alphaMaskFiter = CIFilter.FromName("CIMaskToAlpha"))
    {
        alphaMaskFiter.Image = invertFilter.OutputImage;
        var newCoreImage = alphaMaskFiter.OutputImage;
        var uiimage = new UIImage(newCoreImage);
        imageView2.Image = uiimage; // Do something with your new UIImage
    }
}
The plus side is that this is blazing fast ;-) and the results are the same.
If you need even faster processing, assuming you are batch converting a number of these images, you can write a custom CIKernel that incorporates these two filters into one kernel and thus only processes the image once.
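For reference, a sketch of such a combined kernel (Core Image kernel language, shown with the Objective-C CIColorKernel API; inputImage is an assumed CIImage, and the Xamarin CIColorKernel binding is analogous):
// Sketch: one kernel that inverts and converts luminance to alpha in a single pass.
// The weights are standard Rec. 709 luma coefficients.
CIColorKernel *kernel = [CIColorKernel kernelWithString:
    @"kernel vec4 invertToAlphaMask(__sample s) {"
     "    float a = dot(vec3(1.0) - s.rgb, vec3(0.2126, 0.7152, 0.0722));"
     "    return vec4(a, a, a, a);" // white, premultiplied by the new alpha
     "}"];
CIImage *masked = [kernel applyWithExtent:inputImage.extent arguments:@[inputImage]];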
In Xamarin.iOS, with this method you can convert all white color to transparent. For me it only works with ".jpg" files; with .png it doesn't work, but you can convert the files to jpg and then call this method.
public static UIImage ProcessImage(UIImage image)
{
    CGImage rawImageRef = image.CGImage;
    nfloat[] colorMasking = new nfloat[6] { 222, 255, 222, 255, 222, 255 };
    CGImage imageRef = rawImageRef.WithMaskingColors(colorMasking);
    UIImage imageB = UIImage.FromImage(imageRef);
    return imageB;
}
Regards
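A note on the .png limitation above: CGImageCreateWithMaskingColors (which WithMaskingColors wraps) is documented to require an image without an alpha channel, which is why JPEGs work and PNGs do not. Flattening the PNG into an opaque bitmap first also works, without a .jpg round trip. A sketch of that step, in Objective-C terms (the equivalent UIGraphics calls exist in Xamarin):
// Sketch: redraw the PNG into an opaque context to strip its alpha channel,
// then apply the masking colors to the flattened image.
UIGraphicsBeginImageContextWithOptions(image.size, YES /* opaque */, image.scale);
[image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
UIImage *flattened = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// flattened.CGImage now has no alpha and can be passed to WithMaskingColors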

offline rendering with a lowpass filter causes aliasing and clipping

I have a buffer of 8 kHz samples, and I am trying to simply apply a lowpass filter to it. That is, I start with a buffer of 8 kHz samples, and I want to end up with a buffer of 8 kHz LOWPASSED samples. If I hook up a lowpass unit, connect it to the default output unit, and supply my buffer, it sounds perfect and properly lowpassed. However, as soon as I remove the output and call AudioUnitRender on the lowpass audio unit directly, the resulting samples are aliased and clipped.
#import "EffectMachine.h"
#import <AudioToolbox/AudioToolbox.h>
#import "AudioHelpers.h"
#import "Buffer.h"

@interface EffectMachine ()
@property (nonatomic, strong) Buffer *buffer;
@end
typedef struct EffectPlayer {
    NSUInteger index;
    AudioUnit lowPassUnit;
    __unsafe_unretained Buffer *buffer;
} EffectPlayer;

OSStatus EffectMachineCallbackRenderProc(void *inRefCon,
                                         AudioUnitRenderActionFlags *ioActionFlags,
                                         const AudioTimeStamp *inTimeStamp,
                                         UInt32 inBusNumber,
                                         UInt32 inNumberFrames,
                                         AudioBufferList *ioData);

OSStatus EffectMachineCallbackRenderProc(void *inRefCon,
                                         AudioUnitRenderActionFlags *ioActionFlags,
                                         const AudioTimeStamp *inTimeStamp,
                                         UInt32 inBusNumber,
                                         UInt32 inNumberFrames,
                                         AudioBufferList *ioData) {
    struct EffectPlayer *player = (struct EffectPlayer *)inRefCon;
    for (int i = 0; i < inNumberFrames; i++) {
        float sample;
        if (player->index < player->buffer.size) {
            sample = (float)player->buffer.samples[player->index];
            player->index += 1;
        } else {
            sample = 0;
        }
        ((float *)ioData->mBuffers[0].mData)[i] = sample;
        ((float *)ioData->mBuffers[1].mData)[i] = sample;
    }
    return noErr;
}
@implementation EffectMachine {
    EffectPlayer player;
}

-(instancetype)initWithBuffer:(Buffer *)buffer {
    if (self = [super init]) {
        self.buffer = buffer;
    }
    return self;
}

-(Buffer *)process {
    struct EffectPlayer initialized = {0};
    player = initialized;
    player.buffer = self.buffer;
    [self setupAudioUnits];
    Buffer *buffer = [self processedBuffer];
    [self cleanup];
    return buffer;
}
-(void)setupAudioUnits {
    AudioComponentDescription lowpasscd = {0};
    lowpasscd.componentType = kAudioUnitType_Effect;
    lowpasscd.componentSubType = kAudioUnitSubType_LowPassFilter;
    lowpasscd.componentManufacturer = kAudioUnitManufacturer_Apple;
    AudioComponent comp = AudioComponentFindNext(NULL, &lowpasscd);
    if (comp == NULL) NSLog(@"can't get lowpass unit");
    AudioComponentInstanceNew(comp, &player.lowPassUnit);
    AURenderCallbackStruct input;
    input.inputProc = EffectMachineCallbackRenderProc;
    input.inputProcRefCon = &player;
    CheckError(AudioUnitSetProperty(player.lowPassUnit,
                                    kAudioUnitProperty_SetRenderCallback,
                                    kAudioUnitScope_Input,
                                    0,
                                    &input,
                                    sizeof(input)),
               "AudioUnitSetProperty for callback failed");
    CheckError(AudioUnitSetParameter(player.lowPassUnit,
                                     kLowPassParam_CutoffFrequency,
                                     kAudioUnitScope_Global,
                                     0,
                                     1500,
                                     0), "AudioUnitSetParameter cutoff for lowpass failed");
    CheckError(AudioUnitSetParameter(player.lowPassUnit,
                                     kLowPassParam_Resonance,
                                     kAudioUnitScope_Global,
                                     0,
                                     0,
                                     0), "AudioUnitSetParameter resonance for lowpass failed");
    CheckError(AudioUnitInitialize(player.lowPassUnit),
               "Couldn't initialize lowpass unit");
}
-(Buffer *)processedBuffer {
    // Note: the list holds two AudioBuffers, so allocate room for the second one
    // (sizeof(AudioBufferList) only covers mBuffers[0]).
    AudioBufferList *bufferlist = malloc(sizeof(AudioBufferList) + sizeof(AudioBuffer));
    UInt32 blockSize = 1024;
    float *left = malloc(sizeof(float) * blockSize);
    float *right = malloc(sizeof(float) * blockSize);
    bufferlist->mBuffers[0].mData = left;
    bufferlist->mBuffers[1].mData = right;
    UInt32 size = sizeof(float) * blockSize;
    AudioTimeStamp inTimeStamp;
    memset(&inTimeStamp, 0, sizeof(AudioTimeStamp));
    inTimeStamp.mSampleTime = 0;
    AudioUnitRenderActionFlags flag = 0;
    NSUInteger length = ceil(self.buffer.size / (float)blockSize);
    double *processed = malloc(sizeof(double) * blockSize * length);
    for (int i = 0; i < length; i++) {
        bufferlist->mBuffers[0].mDataByteSize = size;
        bufferlist->mBuffers[1].mDataByteSize = size;
        bufferlist->mNumberBuffers = 2;
        inTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
        AudioUnitRender(player.lowPassUnit, &flag, &inTimeStamp, 0, blockSize, bufferlist);
        for (NSUInteger j = 0; j < blockSize; j++) {
            processed[j + (blockSize * i)] = left[j];
        }
        inTimeStamp.mSampleTime += blockSize;
    }
    Buffer *buffer = [[Buffer alloc] initWithSamples:processed size:self.buffer.size sampleRate:self.buffer.sampleRate];
    free(bufferlist);
    free(left);
    free(right);
    free(processed);
    return buffer;
}
-(void)cleanup {
    AudioOutputUnitStop(player.lowPassUnit);
    AudioUnitUninitialize(player.lowPassUnit);
    AudioComponentInstanceDispose(player.lowPassUnit);
}
@end
If I add a generic output and try to set an 8 kHz ASBD on its input, then I just get garbage noise for output. It looks like: 0, 0, 0, 0, 0, 17438231945853048031929171968.000000, 0, 0, 0, -2548199532257382185315640279040.000000... Yikes!
I tried adding ASBDs to the input and output of the lowpass unit, giving it an 8 kHz sample rate property, and it did nothing. I tried adding converter units (with ASBDs set to 8 kHz) before, then after, then both before AND after the lowpass filter (in a chain); this also did not work.
As a side question: my buffer is mono 8 kHz samples, and if I make my buffer list have mNumberBuffers set to 1, then my lowpass input render proc is never called. Is there a way to avoid having to use stereo channels?
I am using converters at both ends, with the ASBD set to 8000-sample-rate mono floats for the input of the input converter and for the output of the output converter, while using 44100.0 stereo for the input and output of the low pass unit, and I call AudioUnitRender on the end converter with no IO unit for the offline render. For the online render I put a converter unit before the IO unit, so the render callback pulls from the buffers at 8 kHz for playback too. It appears that the lower sample rate on the output ASBD requires a higher maximum frames per slice and a smaller slice (AudioUnitRender inNumberFrames), and that's why it wouldn't render.
#import "ViewController.h"
#import <AudioToolbox/AudioToolbox.h>

@implementation ViewController {
    int sampleCount;
    int renderBufferHead;
    float *renderBuffer;
}
- (void)viewDidLoad {
    [super viewDidLoad];
    float sampleRate = 8000;
    int bufferSeconds = 3;
    sampleCount = sampleRate * bufferSeconds; //seconds
    float *originalSaw = generateSawWaveBuffer(440, sampleRate, sampleCount);
    renderBuffer = originalSaw;
    renderBufferHead = 0;
    AURenderCallbackStruct cbStruct = {renderCallback, (__bridge void *)self};
    //this will do offline render using the render callback, callback just reads from renderBuffer at samplerate
    float *processedBuffer = offlineRender(sampleCount, sampleRate, &cbStruct);
    renderBufferHead = 0; //rewind render buffer after processing
    //set up audio units to do live render using the render callback at sample rate then self destruct after delay
    //it will play originalSaw for bufferSeconds, then after delay will switch renderBuffer to point at processedBuffer
    float secondsToPlayAudio = (bufferSeconds + 1) * 2;
    onlineRender(sampleRate, &cbStruct, secondsToPlayAudio);
    //wait for original to finish playing, then change render callback source buffer to processed buffer
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)((secondsToPlayAudio / 2) * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        renderBuffer = processedBuffer;
        renderBufferHead = 0; //rewind render buffer
    });
    //destroy after all rendering done
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(secondsToPlayAudio * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        free(originalSaw);
        free(processedBuffer);
    });
}
float *offlineRender(int count, double sampleRate, AURenderCallbackStruct *cbStruct) {
    AudioComponentInstance inConverter = getComponentInstance(kAudioUnitType_FormatConverter, kAudioUnitSubType_AUConverter);
    AudioComponentInstance lowPass = getComponentInstance(kAudioUnitType_Effect, kAudioUnitSubType_LowPassFilter);
    AudioComponentInstance outConverter = getComponentInstance(kAudioUnitType_FormatConverter, kAudioUnitSubType_AUConverter);
    AudioStreamBasicDescription asbd = getMonoFloatASBD(sampleRate);
    AudioUnitSetProperty(inConverter, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &asbd, sizeof(AudioStreamBasicDescription));
    AudioUnitSetProperty(outConverter, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &asbd, sizeof(AudioStreamBasicDescription));
    AudioUnitSetProperty(inConverter, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Input, 0, cbStruct, sizeof(AURenderCallbackStruct));
    formatAndConnect(inConverter, lowPass);
    formatAndConnect(lowPass, outConverter);
    UInt32 maxFramesPerSlice = 4096;
    AudioUnitSetProperty(inConverter, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maxFramesPerSlice, sizeof(UInt32));
    AudioUnitSetProperty(lowPass, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maxFramesPerSlice, sizeof(UInt32));
    AudioUnitSetProperty(outConverter, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maxFramesPerSlice, sizeof(UInt32));
    AudioUnitInitialize(inConverter);
    AudioUnitInitialize(lowPass);
    AudioUnitInitialize(outConverter);
    AudioUnitSetParameter(lowPass, kLowPassParam_CutoffFrequency, kAudioUnitScope_Global, 0, 500, 0);
    AudioBufferList *bufferlist = malloc(sizeof(AudioBufferList) + sizeof(AudioBuffer)); //mono here, but leave room for a second AudioBuffer
    float *left = malloc(sizeof(float) * 4096);
    bufferlist->mBuffers[0].mData = left;
    bufferlist->mNumberBuffers = 1;
    AudioTimeStamp inTimeStamp;
    memset(&inTimeStamp, 0, sizeof(AudioTimeStamp));
    inTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
    inTimeStamp.mSampleTime = 0;
    float *buffer = malloc(sizeof(float) * count);
    int inNumberframes = 512;
    AudioUnitRenderActionFlags flag = 0;
    int framesRead = 0;
    while (count) {
        inNumberframes = MIN(inNumberframes, count);
        bufferlist->mBuffers[0].mDataByteSize = sizeof(float) * inNumberframes;
        printf("Offline Render %i frames\n", inNumberframes);
        AudioUnitRender(outConverter, &flag, &inTimeStamp, 0, inNumberframes, bufferlist);
        memcpy(buffer + framesRead, left, sizeof(float) * inNumberframes);
        inTimeStamp.mSampleTime += inNumberframes;
        count -= inNumberframes;
        framesRead += inNumberframes;
    }
    free(left);
    free(bufferlist);
    AudioUnitUninitialize(inConverter);
    AudioUnitUninitialize(lowPass);
    AudioUnitUninitialize(outConverter);
    return buffer;
}
OSStatus renderCallback(void *inRefCon,
                        AudioUnitRenderActionFlags *ioActionFlags,
                        const AudioTimeStamp *inTimeStamp,
                        UInt32 inBusNumber,
                        UInt32 inNumberFrames,
                        AudioBufferList *ioData) {
    ViewController *self = (__bridge ViewController *)inRefCon;
    float *left = ioData->mBuffers[0].mData;
    for (int i = 0; i < inNumberFrames; i++) {
        if (self->renderBufferHead >= self->sampleCount) {
            left[i] = 0;
        } else {
            left[i] = self->renderBuffer[self->renderBufferHead++];
        }
    }
    if (ioData->mNumberBuffers == 2) {
        memcpy(ioData->mBuffers[1].mData, left, sizeof(float) * inNumberFrames);
    }
    printf("render %f to %f\n", inTimeStamp->mSampleTime, inTimeStamp->mSampleTime + inNumberFrames);
    return noErr;
}
void onlineRender(double sampleRate, AURenderCallbackStruct *cbStruct, float duration) {
    AudioComponentInstance converter = getComponentInstance(kAudioUnitType_FormatConverter, kAudioUnitSubType_AUConverter);
    AudioComponentInstance ioUnit = getComponentInstance(kAudioUnitType_Output, kAudioUnitSubType_DefaultOutput);
    AudioStreamBasicDescription asbd = getMonoFloatASBD(sampleRate);
    AudioUnitSetProperty(converter, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &asbd, sizeof(AudioStreamBasicDescription));
    AudioUnitSetProperty(converter, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Input, 0, cbStruct, sizeof(AURenderCallbackStruct));
    formatAndConnect(converter, ioUnit);
    AudioUnitInitialize(converter);
    AudioUnitInitialize(ioUnit);
    AudioOutputUnitStart(ioUnit);
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(duration * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        AudioOutputUnitStop(ioUnit);
        AudioUnitUninitialize(ioUnit);
        AudioUnitUninitialize(converter);
    });
}
float *generateSawWaveBuffer(float frequency, float sampleRate, int sampleCount) {
    float *buffer = malloc(sizeof(float) * sampleCount);
    float increment = (frequency / sampleRate) * 2;
    int increasing = 1;
    float sample = 0;
    for (int i = 0; i < sampleCount; i++) {
        if (increasing) {
            sample += increment;
            if (sample >= 1) {
                increasing = 0;
            }
        } else {
            sample -= increment;
            if (sample < -1) {
                increasing = 1;
            }
        }
        buffer[i] = sample;
    }
    return buffer;
}
AudioComponentInstance getComponentInstance(OSType type, OSType subType) {
    AudioComponentDescription desc = {0};
    desc.componentFlags = 0;
    desc.componentFlagsMask = 0;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;
    desc.componentSubType = subType;
    desc.componentType = type;
    AudioComponent ioComponent = AudioComponentFindNext(NULL, &desc);
    AudioComponentInstance unit;
    AudioComponentInstanceNew(ioComponent, &unit);
    return unit;
}
AudioStreamBasicDescription getMonoFloatASBD(double sampleRate) {
    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate = sampleRate;
    asbd.mFormatID = kAudioFormatLinearPCM;
    asbd.mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsNonInterleaved | kAudioFormatFlagIsPacked;
    asbd.mFramesPerPacket = 1;
    asbd.mChannelsPerFrame = 1;
    asbd.mBitsPerChannel = 32;
    asbd.mBytesPerPacket = 4;
    asbd.mBytesPerFrame = 4;
    return asbd;
}
void formatAndConnect(AudioComponentInstance src, AudioComponentInstance dst) {
    AudioStreamBasicDescription asbd;
    UInt32 propsize = sizeof(AudioStreamBasicDescription);
    AudioUnitGetProperty(dst, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &asbd, &propsize);
    AudioUnitSetProperty(src, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &asbd, sizeof(AudioStreamBasicDescription));
    AudioUnitConnection connection = {0};
    connection.destInputNumber = 0;
    connection.sourceAudioUnit = src;
    connection.sourceOutputNumber = 0;
    AudioUnitSetProperty(dst, kAudioUnitProperty_MakeConnection, kAudioUnitScope_Input, 0, &connection, sizeof(AudioUnitConnection));
}
@end

iOS 8 UIImage Metadata

This is my first question.
I have a "little" problem:
when I read UIImage metadata on iOS 7, I use this code and it works great.
#pragma mark - Image Picker Controller delegate methods
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    [self metaDataFromAssetLibrary:info];
    [picker dismissViewControllerAnimated:YES completion:NULL];
}
From imagePickerController I choose the image and call the metaDataFromAssetLibrary method:
- (void)metaDataFromAssetLibrary:(NSDictionary *)info
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 NSMutableDictionary *imageMetadata = nil;
                 NSDictionary *metadata = asset.defaultRepresentation.metadata;
                 imageMetadata = [[NSMutableDictionary alloc] initWithDictionary:metadata];
                 NSLog(@"imageMetaData from AssetLibrary %@", imageMetadata);
             }
            failureBlock:^(NSError *error) {
                NSLog(@"error %@", error);
            }];
}
On Xcode 5 and iOS 7, the console returns something like this:
imageMetaData from AssetLibrary {
ColorModel = RGB;
DPIHeight = 72;
DPIWidth = 72;
Depth = 8;
Orientation = 1;
PixelHeight = 2448;
PixelWidth = 3264;
"{Exif}" = {
ApertureValue = "2.52606882168926";
BrightnessValue = "2.211389961389961";
ColorSpace = 1;
ComponentsConfiguration = (
1,
2,
3,
0
);
DateTimeDigitized = "2014:06:05 08:54:09";
DateTimeOriginal = "2014:06:05 08:54:09";
ExifVersion = (
2,
2,
1
);
ExposureMode = 0;
ExposureProgram = 2;
ExposureTime = "0.05";
FNumber = "2.4";
Flash = 24;
FlashPixVersion = (
1,
0
);
FocalLenIn35mmFilm = 35;
FocalLength = "4.28";
ISOSpeedRatings = (
125
);
LensMake = Apple;
LensModel = "iPhone 4S back camera 4.28mm f/2.4";
LensSpecification = (
"4.28",
"4.28",
"2.4",
"2.4"
);
MeteringMode = 3;
PixelXDimension = 3264;
PixelYDimension = 2448;
SceneCaptureType = 0;
SceneType = 1;
SensingMethod = 2;
ShutterSpeedValue = "4.321928460342146";
SubjectArea = (
1643,
1079,
610,
612
);
SubsecTimeDigitized = 347;
SubsecTimeOriginal = 347;
WhiteBalance = 0;
};
"{GPS}" = {
Altitude = 26;
AltitudeRef = 1;
DateStamp = "2014:06:05";
DestBearing = "177.086387434555";
DestBearingRef = M;
ImgDirection = "357.0864197530864";
ImgDirectionRef = M;
Latitude = "43.80268";
LatitudeRef = N;
Longitude = "11.0635195";
LongitudeRef = E;
Speed = 0;
SpeedRef = K;
TimeStamp = "06:54:08";
};
"{MakerApple}" = {
1 = 0;
3 = {
epoch = 0;
flags = 1;
timescale = 1000000000;
value = 27688938393500;
};
4 = 1;
5 = 186;
6 = 195;
7 = 1;
8 = (
"-0.6805536",
"0.02519802",
"-0.755379"
);
};
"{TIFF}" = {
DateTime = "2014:06:05 08:54:09";
Make = Apple;
Model = "iPhone 4S";
Orientation = 1;
ResolutionUnit = 2;
Software = "8.0";
XResolution = 72;
YResolution = 72;
};
}
But on Xcode 6 and iOS 8, the console returns only this:
imageMetaData from AssetLibrary {
ColorModel = RGB;
DPIHeight = 72;
DPIWidth = 72;
Depth = 8;
Orientation = 1;
PixelHeight = 768;
PixelWidth = 1020;
"{Exif}" = {
ColorSpace = 1;
ComponentsConfiguration = (
1,
2,
3,
0
);
ExifVersion = (
2,
2,
1
);
FlashPixVersion = (
1,
0
);
PixelXDimension = 1020;
PixelYDimension = 768;
SceneCaptureType = 0;
};
"{TIFF}" = {
Orientation = 1;
ResolutionUnit = 2;
XResolution = 72;
YResolution = 72;
};
}
Does anyone know about this problem?
Any solution or suggestion?
Thank you so much.
P.S.: excuse me for my terrible English ;-)
After upgrading to Xcode 6, you have to get the image in dismissViewController's completion block, or use a more complex method: get the image from the asset.
Please try the following code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissViewControllerAnimated:YES completion:^{
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        [self metaDataFromAssetLibrary:info];
    }];
}
You can also reference john.k.doe's answer in:
didFinishPickingMediaWithInfo return nil photo

UIImagePickerController returning incorrect image orientation

I'm using UIImagePickerController to capture an image and then store it. However, when I try to rescale it, the orientation value I get out of this image is incorrect. When I take a snap while holding the phone upright, it gives me an orientation of Left. Has anyone experienced this issue?
The UIImagePickerController dictionary shows the following information:
{
UIImagePickerControllerMediaMetadata = {
DPIHeight = 72;
DPIWidth = 72;
Orientation = 3;
"{Exif}" = {
ApertureValue = "2.970853654340484";
ColorSpace = 1;
DateTimeDigitized = "2011:02:14 10:26:17";
DateTimeOriginal = "2011:02:14 10:26:17";
ExposureMode = 0;
ExposureProgram = 2;
ExposureTime = "0.06666666666666667";
FNumber = "2.8";
Flash = 32;
FocalLength = "3.85";
ISOSpeedRatings = (
125
);
MeteringMode = 1;
PixelXDimension = 2048;
PixelYDimension = 1536;
SceneType = 1;
SensingMethod = 2;
Sharpness = 1;
ShutterSpeedValue = "3.910431673351467";
SubjectArea = (
1023,
767,
614,
614
);
WhiteBalance = 0;
};
"{TIFF}" = {
DateTime = "2011:02:14 10:26:17";
Make = Apple;
Model = "iPhone 3GS";
Software = "4.2.1";
XResolution = 72;
YResolution = 72;
};
};
UIImagePickerControllerMediaType = "public.image";
UIImagePickerControllerOriginalImage = "<UIImage: 0x40efb50>";
}
However, the picture returns imageOrientation == 1:
UIImage *picture = [info objectForKey:UIImagePickerControllerOriginalImage];
I just started working on this issue in my own app.
I used the UIImage category that Trevor Harmon crafted for resizing an image and fixing its orientation, UIImage+Resize.
Then you can do something like this in -imagePickerController:didFinishPickingMediaWithInfo:
UIImage *pickedImage = [info objectForKey:UIImagePickerControllerEditedImage];
UIImage *resized = [pickedImage resizedImageWithContentMode:UIViewContentModeScaleAspectFit bounds:pickedImage.size interpolationQuality:kCGInterpolationHigh];
This fixed the problem for me. The resized image is oriented correctly visually and the imageOrientation property reports UIImageOrientationUp.
There are several versions of this scale/resize/crop code out there; I used Trevor's because it seems pretty clean and includes some other UIImage manipulators that I want to use later.
This is what I have found for fixing the orientation issue; it works for me:
UIImage *initialImage = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *data = UIImagePNGRepresentation(initialImage);
UIImage *tempImage = [UIImage imageWithData:data];
UIImage *fixedOrientationImage = [UIImage imageWithCGImage:tempImage.CGImage
                                                     scale:initialImage.scale
                                               orientation:initialImage.imageOrientation];
initialImage = fixedOrientationImage;
Here's a Swift snippet that fixes the problem efficiently:
let orientedImage = UIImage(CGImage: initialImage.CGImage, scale: 1, orientation: initialImage.imageOrientation)!
I use the following code that I have put in a separate image utility object file that has a bunch of other editing methods for UIImages:
+ (UIImage *)imageWithImage:(UIImage *)sourceImage scaledToSizeWithSameAspectRatio:(CGSize)targetSize
{
    CGSize imageSize = sourceImage.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;
    CGFloat targetWidth = targetSize.width;
    CGFloat targetHeight = targetSize.height;
    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetWidth;
    CGFloat scaledHeight = targetHeight;
    CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);
    if (CGSizeEqualToSize(imageSize, targetSize) == NO) {
        CGFloat widthFactor = targetWidth / width;
        CGFloat heightFactor = targetHeight / height;
        if (widthFactor > heightFactor) {
            scaleFactor = widthFactor; // scale to fit height
        } else {
            scaleFactor = heightFactor; // scale to fit width
        }
        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;
        // center the image
        if (widthFactor > heightFactor) {
            thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
        } else if (widthFactor < heightFactor) {
            thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
        }
    }
    CGImageRef imageRef = [sourceImage CGImage];
    CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);
    CGColorSpaceRef colorSpaceInfo = CGImageGetColorSpace(imageRef);
    if (bitmapInfo == kCGImageAlphaNone) {
        bitmapInfo = kCGImageAlphaNoneSkipLast;
    }
    CGContextRef bitmap;
    if (sourceImage.imageOrientation == UIImageOrientationUp || sourceImage.imageOrientation == UIImageOrientationDown) {
        bitmap = CGBitmapContextCreate(NULL, targetWidth, targetHeight, CGImageGetBitsPerComponent(imageRef), CGImageGetBytesPerRow(imageRef), colorSpaceInfo, bitmapInfo);
    } else {
        bitmap = CGBitmapContextCreate(NULL, targetHeight, targetWidth, CGImageGetBitsPerComponent(imageRef), CGImageGetBytesPerRow(imageRef), colorSpaceInfo, bitmapInfo);
    }
    // In the right or left cases, we need to switch scaledWidth and scaledHeight,
    // and also the thumbnail point
    if (sourceImage.imageOrientation == UIImageOrientationLeft) {
        thumbnailPoint = CGPointMake(thumbnailPoint.y, thumbnailPoint.x);
        CGFloat oldScaledWidth = scaledWidth;
        scaledWidth = scaledHeight;
        scaledHeight = oldScaledWidth;
        CGContextRotateCTM(bitmap, M_PI_2); // + 90 degrees
        CGContextTranslateCTM(bitmap, 0, -targetHeight);
    } else if (sourceImage.imageOrientation == UIImageOrientationRight) {
        thumbnailPoint = CGPointMake(thumbnailPoint.y, thumbnailPoint.x);
        CGFloat oldScaledWidth = scaledWidth;
        scaledWidth = scaledHeight;
        scaledHeight = oldScaledWidth;
        CGContextRotateCTM(bitmap, -M_PI_2); // - 90 degrees
        CGContextTranslateCTM(bitmap, -targetWidth, 0);
    } else if (sourceImage.imageOrientation == UIImageOrientationUp) {
        // NOTHING
    } else if (sourceImage.imageOrientation == UIImageOrientationDown) {
        CGContextTranslateCTM(bitmap, targetWidth, targetHeight);
        CGContextRotateCTM(bitmap, -M_PI); // - 180 degrees
    }
    CGContextDrawImage(bitmap, CGRectMake(thumbnailPoint.x, thumbnailPoint.y, scaledWidth, scaledHeight), imageRef);
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap);
    CGImageRelease(ref);
    return newImage;
}
And then I call:
UIImage *pickedImage = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImage *fixedOriginal = [ImageUtil imageWithImage:pickedImage scaledToSizeWithSameAspectRatio:pickedImage.size];
In iOS 7, I needed code dependent on UIImage.imageOrientation to correct for the different orientations. Now, in iOS 8.2, when I pick my old test images from the album via UIImagePickerController, the orientation will be UIImageOrientationUp for ALL images. When I take a photo (UIImagePickerControllerSourceTypeCamera), those images will also always be upward, regardless of the device orientation.
So between those iOS versions there has obviously been a fix whereby UIImagePickerController already rotates the images if necessary.
You can even notice it when the album images are displayed: for a split second they are shown in the original orientation, before they appear in the new upward orientation.
The only thing that worked for me was to re-render the image again, which forces the correct orientation:
if photo.imageOrientation != .up {
    UIGraphicsBeginImageContextWithOptions(photo.size, false, 1.0)
    let rect = CGRect(x: 0, y: 0, width: photo.size.width, height: photo.size.height)
    photo.draw(in: rect)
    if let newImage = UIGraphicsGetImageFromCurrentImageContext() {
        photo = newImage
    }
    UIGraphicsEndImageContext()
}
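For Objective-C callers, the same re-render can be written as a small helper (a sketch of the identical idea, not taken from the answers above; NormalizedImage is a hypothetical name):
// Sketch: redraw the photo so the pixel data itself is rotated and the
// result reports UIImageOrientationUp.
static UIImage *NormalizedImage(UIImage *photo) {
    if (photo.imageOrientation == UIImageOrientationUp) return photo;
    UIGraphicsBeginImageContextWithOptions(photo.size, NO, photo.scale);
    [photo drawInRect:CGRectMake(0, 0, photo.size.width, photo.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized ?: photo;
}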
