Twilio iOS unrecognized selector sent to Class - xcode

I'm new to using Twilio and I'm hoping someone can help me debug my app.
I am making a call to get a capability token and it is returned just fine; I print it in the console to check. Then, when I make the call to initWithCapabilityToken, something breaks and I can't figure out what.
Here is my code...
[NSURLConnection sendAsynchronousRequest:request queue:queue completionHandler:
 ^(NSURLResponse *response, NSData *data, NSError *error) {
    // Log any reply
    if ([data length] > 0 && error == nil) {
        NSData *jsonData = data;
        // Deserialize JSON into a dictionary
        error = nil;
        id jsonObject = [NSJSONSerialization JSONObjectWithData:jsonData
                                                        options:NSJSONReadingAllowFragments
                                                          error:&error];
        if (jsonObject != nil && error == nil) {
            NSLog(@"Successfully deserialized JSON response...");
            if ([jsonObject isKindOfClass:[NSDictionary class]]) {
                NSDictionary *deserializedDictionary = (NSDictionary *)jsonObject;
                NSLog(@"Deserialized JSON Dictionary = %@", deserializedDictionary);
                NSString *token = [deserializedDictionary objectForKey:@"token"];
                if (token == nil) {
                    NSLog(@"Error retrieving token");
                } else {
                    NSLog(@"Token: %@", token);
                    // Setup TCDevice
                    _phone = [[TCDevice alloc] initWithCapabilityToken:token delegate:self];
                }
            }
        } else if (error != nil) {
            NSLog(@"An error happened while de-serializing the JSON data.");
        }
    } else if ([data length] == 0 && error == nil) {
        NSLog(@"Nothing was downloaded.");
    } else if (error != nil) {
        NSLog(@"Error happened = %@", error);
    }
}];
Here is what I get right after my token is logged...
2015-01-29 16:32:14.637 AppName[4649:701822] +[NSString stringWithPJStr:]: unrecognized selector sent to class 0x3377da98
2015-01-29 16:32:14.639 AppName[4649:701822] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '+[NSString stringWithPJStr:]: unrecognized selector sent to class 0x3377da98'
*** First throw call stack:
(0x254da49f 0x32c90c8b 0x254df7d5 0x254dd7d7 0x2540f058 0x10ee7d 0x10e32b 0x10e2b7 0x105959 0x10568d 0x7b3ab 0x24fa228d 0x261d52b1 0x2614034d 0x26132b07 0x261d7c1b 0x487e29 0x4822c9 0x489567 0x48a891 0x33351e31 0x33351b84)
libc++abi.dylib: terminating with uncaught exception of type NSException
Thanks

After reading all of Twilio's documentation I discovered I was missing one thing. I had to add -ObjC to Other Linker Flags and that solved my problem.

Add these lines to Other Linker Flags:
-ObjC
-lTwilioClient
-lcrypto
-lssl
It worked for me only with an uppercase C at the end (-ObjC).

Somewhere you are calling a stringWithPJStr method (selector) that doesn't exist at runtime.


Handling cocoa errors from (NS)URLSession in Swift 3

How to handle errors from cocoa frameworks in Swift 3 now that NSError is gone?
Swift 3 improved NSError Bridging - https://github.com/apple/swift-evolution/blob/master/proposals/0112-nserror-bridging.md
What I do not understand from that document is how to use the improved error bridging.
Let's say, I have the following code, written in Swift 2.3, where I'm trying to find out what the actual error is:
NSURLSession.sharedSession().dataTaskWithURL(url) { data, response, error in
    guard let error = error else { return }
    if error.domain == NSURLErrorDomain && error.code == NSURLErrorCancelled {
        print("cancelled.")
    } else {
        print("error occurred: \(error)")
    }
}
The corresponding Swift 3 method provides plain Error in its completion handler according to the documentation:
func dataTask(with url: URL, completionHandler: @escaping (Data?, URLResponse?, Error?) -> Void) -> URLSessionDataTask
How do I migrate the mentioned code to Swift 3? I guess casting to NSError is not the correct answer.
Error is a protocol, and the bridged error objects can also be NSError instances.
Casting to NSError actually is the right answer.
Practically, just conditionally cast the error to NSError:
URLSession.shared.dataTask(with: url) { data, response, error in
    guard let nserror = error as? NSError else { return }
    if nserror.domain == NSURLErrorDomain && nserror.code == NSURLErrorCancelled {
        print("cancelled.")
    } else {
        print("error occurred: \(nserror)")
    }
}
If you have more different errors use a switch statement and pattern matching.
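For example, a minimal sketch of that approach (assuming Swift 3+, where URLError is the bridged value type for NSURLErrorDomain errors):

```swift
URLSession.shared.dataTask(with: url) { data, response, error in
    guard let error = error else { return }
    switch error {
    case let urlError as URLError where urlError.code == .cancelled:
        print("cancelled.")
    case let urlError as URLError:
        // Any other URL-loading error; .code gives the typed error code.
        print("URL error: \(urlError.code)")
    default:
        print("other error: \(error)")
    }
}
```

This avoids touching NSError at all while still distinguishing cancellation from other failures.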

OSX CryptoTokenKit SmartCard returned error 6d00

I'm trying to read the master file of a smart card on OSX using CryptoTokenKit, but I always get status word 6d00 as the response. I also tried to run the trivial example with some modifications and get the same error. My reader is a Gemalto PC Twin Reader.
Please let me know if you have any suggestions to fix it.
I'm using the following code:
TKSmartCardSlot *slot = [self.smartCardManager slotWithName:slotName];
TKSmartCard *card = [slot makeSmartCard];
card.sensitive = YES;
[card beginSessionWithReply:^(BOOL success, NSError *error) {
    NSLog(@"%@", error);
    NSLog(@"Proto: %ld", card.currentProtocol);
    NSData *data = [CommonUtil dataFromHexString:@"3F00"]; // <3f00>
    NSLog(@"%@", data);
    [card sendIns:0xA4 p1:0x00 p2:0x00 data:data le:@0
            reply:^(NSData *replyData, UInt16 sw, NSError *error)
    {
        NSLog(@"Response: %@", replyData);
        if (error) {
            if (error.code == TKErrorCodeCommunicationError) {
                // set response error code.
            }
            NSLog(@"%@", error);
        }
    }];
}];
It is silly, but in an APDU where no response data is expected beyond the success code 90 00, le should be nil.
[card sendIns:0xA4 p1:0x00 p2:0x00 data:nil le:nil
        reply:^(NSData *replyData, UInt16 sw, NSError *error)
{
}];
Status word 6D00 means "Instruction code not supported or invalid":
http://www.cardwerk.com/smartcards/smartcard_standard_ISO7816-4_5_basic_organizations.aspx
Not all cards allow selecting the Master File (0x3F00).
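As a sketch, the reply block can branch on the status word before using the reply data (0x9000 is the ISO 7816 success status; 0x6D00 means the instruction is not supported; the log messages are illustrative):

```objc
[card sendIns:0xA4 p1:0x00 p2:0x00 data:nil le:nil
        reply:^(NSData *replyData, UInt16 sw, NSError *error)
{
    if (error) {
        NSLog(@"Transmission error: %@", error);
    } else if (sw == 0x9000) {
        NSLog(@"SELECT succeeded: %@", replyData);
    } else if (sw == 0x6D00) {
        NSLog(@"Instruction not supported by this card");
    } else {
        NSLog(@"Card returned status word 0x%04X", sw);
    }
}];
```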

`PFFile getDataInBackgroundWithBlock` block not getting called Obj-C

I have the following code, to retrieve an image from Parse. This code block is in a recursive loop that gets called for a list of images in an array.
PFFile *remoteImageFile = object[@"Image"];
[remoteImageFile getDataInBackgroundWithBlock:^(NSData *data, NSError *error) {
    DDLogInfo(@"%s:%d %@", __PRETTY_FUNCTION__, __LINE__, [error debugDescription]);
    // other logic
}];
The code fetches most of the images successfully, the sizes being 2 MB to 3 MB. But it randomly fails for certain files: the block never fires and my whole fetching operation freezes. I couldn't find the reason.
Any help please
I doubt that there is a bug in the Parse framework here. The most probable cause is that your remoteImageFile is nil which means that getDataInBackgroundWithBlock: is not even called.
Can you check by using the following code :
PFFile *remoteImageFile = object[@"Image"];
if (remoteImageFile == nil || ![remoteImageFile isKindOfClass:[PFFile class]]) {
    NSLog(@"Error : remoteImageFile = %@", remoteImageFile);
} else {
    [remoteImageFile getDataInBackgroundWithBlock:^(NSData *data, NSError *error) {
        DDLogInfo(@"%s:%d %@", __PRETTY_FUNCTION__, __LINE__, [error debugDescription]);
        // other logic
    }];
}

Mac OS X Simple Voice Recorder

Does anyone have some sample code for a SIMPLE voice recorder for Mac OS X? I would just like to record my voice coming from the internal microphone on my MacBook Pro and save it to a file. That is all.
I have been searching for hours and yes, there are some examples that will record voice and save it to a file such as http://developer.apple.com/library/mac/#samplecode/MYRecorder/Introduction/Intro.html . The sample code for Mac OS X seems to be about 10 times more complicated than similar sample code for the iPhone.
For iOS the commands are as simple as:
soundFile = [NSURL fileURLWithPath:[tempDir stringByAppendingString:@"mysound.cap"]];
soundSetting = [NSDictionary dictionaryWithObjectsAndKeys: /* dictionary setting code left out */ nil];
soundRecorder = [[AVAudioRecorder alloc] initWithURL:soundFile settings:soundSetting error:nil];
[soundRecorder record];
[soundRecorder stop];
I think there should be code to do this for Mac OS X that is as simple as the iPhone version. Thank you for your help.
Here is the code (currently the player will not work)
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

@interface MyAVFoundationClass : NSObject <AVAudioPlayerDelegate>
{
    AVAudioRecorder *soundRecorder;
}
@property (retain) AVAudioRecorder *soundRecorder;
- (IBAction)stopAudio:(id)sender;
- (IBAction)recordAudio:(id)sender;
- (IBAction)playAudio:(id)sender;
@end

#import "MyAVFoundationClass.h"

@implementation MyAVFoundationClass
@synthesize soundRecorder;

- (void)awakeFromNib
{
    NSLog(@"awakeFromNib visited");
    NSString *tempDir;
    NSURL *soundFile;
    NSDictionary *soundSetting;
    tempDir = @"/Users/broncotrojan/Documents/testvoices/";
    soundFile = [NSURL fileURLWithPath:[tempDir stringByAppendingString:@"test1.caf"]];
    NSLog(@"soundFile: %@", soundFile);
    soundSetting = [NSDictionary dictionaryWithObjectsAndKeys:
                    [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                    [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                    [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                    [NSNumber numberWithInt:AVAudioQualityHigh], AVEncoderAudioQualityKey, nil];
    soundRecorder = [[AVAudioRecorder alloc] initWithURL:soundFile settings:soundSetting error:nil];
}

- (IBAction)stopAudio:(id)sender
{
    NSLog(@"stopAudio visited");
    [soundRecorder stop];
}

- (IBAction)recordAudio:(id)sender
{
    NSLog(@"recordAudio visited");
    [soundRecorder record];
}

- (IBAction)playAudio:(id)sender
{
    NSLog(@"playAudio visited");
    NSURL *soundFile;
    NSString *tempDir;
    AVAudioPlayer *audioPlayer;
    tempDir = @"/Users/broncotrojan/Documents/testvoices/";
    soundFile = [NSURL fileURLWithPath:[tempDir stringByAppendingString:@"test1.caf"]];
    NSLog(@"soundFile: %@", soundFile);
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFile error:nil];
    [audioPlayer setDelegate:self];
    [audioPlayer play];
}
@end
Here is the code that is working for me on macOS 10.14 with Xcode 10.2.1, Swift 5.0.1.
First of all you have to set up NSMicrophoneUsageDescription aka Privacy - Microphone Usage Description in your Info.plist file as described in the Apple docs: Requesting Authorization for Media Capture on macOS.
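For reference, the corresponding Info.plist entry looks like this (the usage string below is just a placeholder; use your own wording):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app records your voice from the microphone.</string>
```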
Then you have to request a permission from a user to use a microphone:
switch AVCaptureDevice.authorizationStatus(for: .audio) {
case .authorized: // The user has previously granted access to the microphone.
    () // proceed with recording
case .notDetermined: // The user has not yet been asked for microphone access.
    AVCaptureDevice.requestAccess(for: .audio) { granted in
        if granted {
            // proceed with recording
        }
    }
case .denied: // The user has previously denied access.
    ()
case .restricted: // The user can't grant access due to restrictions.
    ()
@unknown default:
    fatalError()
}
Then you can use the following methods to start and stop audio recording:
import AVFoundation
open class SpeechRecorder: NSObject {
private var destinationUrl: URL!
var recorder: AVAudioRecorder?
let player = AVQueuePlayer()
open func start() {
destinationUrl = createUniqueOutputURL()
do {
let format = AVAudioFormat(settings: [
AVFormatIDKey: kAudioFormatMPEG4AAC,
AVEncoderAudioQualityKey: AVAudioQuality.high,
AVSampleRateKey: 44100.0,
AVNumberOfChannelsKey: 1,
AVLinearPCMBitDepthKey: 16,
])!
let recorder = try AVAudioRecorder(url: destinationUrl, format: format)
// workaround against Swift, AVAudioRecorder: Error 317: ca_debug_string: inPropertyData == NULL issue
// https://stackoverflow.com/a/57670740/598057
let firstSuccess = recorder.record()
if firstSuccess == false || recorder.isRecording == false {
recorder.record()
}
assert(recorder.isRecording)
self.recorder = recorder
} catch let error {
let code = (error as NSError).code
NSLog("SpeechRecorder: \(error)")
NSLog("SpeechRecorder: \(code)")
let osCode = OSStatus(code)
NSLog("SpeechRecorder: \(String(describing: osCode.detailedErrorMessage()))")
}
}
open func stop() {
NSLog("SpeechRecorder: stop()")
if let recorder = recorder {
recorder.stop()
NSLog("SpeechRecorder: final file \(destinationUrl.absoluteString)")
player.removeAllItems()
player.insert(AVPlayerItem(url: destinationUrl), after: nil)
player.play()
}
}
func createUniqueOutputURL() -> URL {
let paths = FileManager.default.urls(for: .musicDirectory,
in: .userDomainMask)
let documentsDirectory = URL(fileURLWithPath: NSTemporaryDirectory())
let currentTime = Int(Date().timeIntervalSince1970 * 1000)
let outputURL = URL(fileURLWithPath: "SpeechRecorder-\(currentTime).m4a",
relativeTo: documentsDirectory)
destinationUrl = outputURL
return outputURL
}
}
extension OSStatus {
//**************************
func asString() -> String? {
let n = UInt32(bitPattern: self.littleEndian)
guard let n1 = UnicodeScalar((n >> 24) & 255), n1.isASCII else { return nil }
guard let n2 = UnicodeScalar((n >> 16) & 255), n2.isASCII else { return nil }
guard let n3 = UnicodeScalar((n >> 8) & 255), n3.isASCII else { return nil }
guard let n4 = UnicodeScalar( n & 255), n4.isASCII else { return nil }
return String(n1) + String(n2) + String(n3) + String(n4)
} // asString
//**************************
func detailedErrorMessage() -> String {
switch(self) {
case 0:
return "Success"
// AVAudioRecorder errors
case kAudioFileUnspecifiedError:
return "kAudioFileUnspecifiedError"
case kAudioFileUnsupportedFileTypeError:
return "kAudioFileUnsupportedFileTypeError"
case kAudioFileUnsupportedDataFormatError:
return "kAudioFileUnsupportedDataFormatError"
case kAudioFileUnsupportedPropertyError:
return "kAudioFileUnsupportedPropertyError"
case kAudioFileBadPropertySizeError:
return "kAudioFileBadPropertySizeError"
case kAudioFilePermissionsError:
return "kAudioFilePermissionsError"
case kAudioFileNotOptimizedError:
return "kAudioFileNotOptimizedError"
case kAudioFileInvalidChunkError:
return "kAudioFileInvalidChunkError"
case kAudioFileDoesNotAllow64BitDataSizeError:
return "kAudioFileDoesNotAllow64BitDataSizeError"
case kAudioFileInvalidPacketOffsetError:
return "kAudioFileInvalidPacketOffsetError"
case kAudioFileInvalidFileError:
return "kAudioFileInvalidFileError"
case kAudioFileOperationNotSupportedError:
return "kAudioFileOperationNotSupportedError"
case kAudioFileNotOpenError:
return "kAudioFileNotOpenError"
case kAudioFileEndOfFileError:
return "kAudioFileEndOfFileError"
case kAudioFilePositionError:
return "kAudioFilePositionError"
case kAudioFileFileNotFoundError:
return "kAudioFileFileNotFoundError"
//***** AUGraph errors
case kAUGraphErr_NodeNotFound: return "AUGraph Node Not Found"
case kAUGraphErr_InvalidConnection: return "AUGraph Invalid Connection"
case kAUGraphErr_OutputNodeErr: return "AUGraph Output Node Error"
case kAUGraphErr_CannotDoInCurrentContext: return "AUGraph Cannot Do In Current Context"
case kAUGraphErr_InvalidAudioUnit: return "AUGraph Invalid Audio Unit"
//***** MIDI errors
case kMIDIInvalidClient: return "MIDI Invalid Client"
case kMIDIInvalidPort: return "MIDI Invalid Port"
case kMIDIWrongEndpointType: return "MIDI Wrong Endpoint Type"
case kMIDINoConnection: return "MIDI No Connection"
case kMIDIUnknownEndpoint: return "MIDI Unknown Endpoint"
case kMIDIUnknownProperty: return "MIDI Unknown Property"
case kMIDIWrongPropertyType: return "MIDI Wrong Property Type"
case kMIDINoCurrentSetup: return "MIDI No Current Setup"
case kMIDIMessageSendErr: return "MIDI Message Send Error"
case kMIDIServerStartErr: return "MIDI Server Start Error"
case kMIDISetupFormatErr: return "MIDI Setup Format Error"
case kMIDIWrongThread: return "MIDI Wrong Thread"
case kMIDIObjectNotFound: return "MIDI Object Not Found"
case kMIDIIDNotUnique: return "MIDI ID Not Unique"
case kMIDINotPermitted: return "MIDI Not Permitted"
//***** AudioToolbox errors
case kAudioToolboxErr_CannotDoInCurrentContext: return "AudioToolbox Cannot Do In Current Context"
case kAudioToolboxErr_EndOfTrack: return "AudioToolbox End Of Track"
case kAudioToolboxErr_IllegalTrackDestination: return "AudioToolbox Illegal Track Destination"
case kAudioToolboxErr_InvalidEventType: return "AudioToolbox Invalid Event Type"
case kAudioToolboxErr_InvalidPlayerState: return "AudioToolbox Invalid Player State"
case kAudioToolboxErr_InvalidSequenceType: return "AudioToolbox Invalid Sequence Type"
case kAudioToolboxErr_NoSequence: return "AudioToolbox No Sequence"
case kAudioToolboxErr_StartOfTrack: return "AudioToolbox Start Of Track"
case kAudioToolboxErr_TrackIndexError: return "AudioToolbox Track Index Error"
case kAudioToolboxErr_TrackNotFound: return "AudioToolbox Track Not Found"
case kAudioToolboxError_NoTrackDestination: return "AudioToolbox No Track Destination"
//***** AudioUnit errors
case kAudioUnitErr_CannotDoInCurrentContext: return "AudioUnit Cannot Do In Current Context"
case kAudioUnitErr_FailedInitialization: return "AudioUnit Failed Initialization"
case kAudioUnitErr_FileNotSpecified: return "AudioUnit File Not Specified"
case kAudioUnitErr_FormatNotSupported: return "AudioUnit Format Not Supported"
case kAudioUnitErr_IllegalInstrument: return "AudioUnit Illegal Instrument"
case kAudioUnitErr_Initialized: return "AudioUnit Initialized"
case kAudioUnitErr_InvalidElement: return "AudioUnit Invalid Element"
case kAudioUnitErr_InvalidFile: return "AudioUnit Invalid File"
case kAudioUnitErr_InvalidOfflineRender: return "AudioUnit Invalid Offline Render"
case kAudioUnitErr_InvalidParameter: return "AudioUnit Invalid Parameter"
case kAudioUnitErr_InvalidProperty: return "AudioUnit Invalid Property"
case kAudioUnitErr_InvalidPropertyValue: return "AudioUnit Invalid Property Value"
case kAudioUnitErr_InvalidScope: return "AudioUnit Invalid Scope"
case kAudioUnitErr_InstrumentTypeNotFound: return "AudioUnit Instrument Type Not Found"
case kAudioUnitErr_NoConnection: return "AudioUnit No Connection"
case kAudioUnitErr_PropertyNotInUse: return "AudioUnit Property Not In Use"
case kAudioUnitErr_PropertyNotWritable: return "AudioUnit Property Not Writable"
case kAudioUnitErr_TooManyFramesToProcess: return "AudioUnit Too Many Frames To Process"
case kAudioUnitErr_Unauthorized: return "AudioUnit Unauthorized"
case kAudioUnitErr_Uninitialized: return "AudioUnit Uninitialized"
case kAudioUnitErr_UnknownFileType: return "AudioUnit Unknown File Type"
case kAudioUnitErr_RenderTimeout: return "AudioUnit Render Timeout"
//***** Audio errors
case kAudio_BadFilePathError: return "Audio Bad File Path Error"
case kAudio_FileNotFoundError: return "Audio File Not Found Error"
case kAudio_FilePermissionError: return "Audio File Permission Error"
case kAudio_MemFullError: return "Audio Mem Full Error"
case kAudio_ParamError: return "Audio Param Error"
case kAudio_TooManyFilesOpenError: return "Audio Too Many Files Open Error"
case kAudio_UnimplementedError: return "Audio Unimplemented Error"
default: return "Unknown error (no description)"
}
}
}
The workaround for the inPropertyData == NULL issue is adapted from Swift, AVAudioRecorder: Error 317: ca_debug_string: inPropertyData == NULL.
The code that provides string messages for the OSStatus codes is adapted from here: How do you convert an iPhone OSStatus code to something useful?.
The AVFoundation framework is new in Lion and is very similar to the iOS version. That includes AVAudioRecorder. You can use the code from iOS with little or no modification.
Docs are here.
The reason your code does not play the audio is that the audioPlayer variable is released as soon as execution reaches the end of the method.
So move the following variable declaration outside of the method (e.g. make it an instance variable), and it will play the audio correctly.
AVAudioPlayer *audioPlayer;
By the way, your code snippet was very helpful for me! :D
Here is the snippet for Mac:
NSDictionary *soundSetting = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                              [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt:AVAudioQualityHigh], AVEncoderAudioQualityKey, nil];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSURL *audioFileURL = [NSURL fileURLWithPath:[documentsDirectory stringByAppendingString:@"/test.wav"]];
NSError *error = nil;
AVAudioRecorder *soundRecorder = [[AVAudioRecorder alloc] initWithURL:audioFileURL settings:soundSetting error:&error];
if (error)
{
    NSLog(@"Error! soundRecorder initialization failed...");
}
// start recording
[soundRecorder record];

Clang Error on "Potential null dereference."

I keep getting Clang errors on the following type of code and I can't figure out why they're erroneous or how to resolve them to Clang's satisfaction:
+ (NSString *)checkForLength:(NSString *)theString error:(NSError **)error {
    BOOL hasLength = ([theString length] > 0);
    if (hasLength) return theString;
    else {
        *error = [NSError errorWithDomain:@"ErrorDomain" code:hasLength userInfo:nil];
        return nil;
    }
}
Leaving aside the utterly-contrived nature of the example (which Clang did object to so it's illustrative enough), Clang balks at the error assignment line with the following objection:
Potential null dereference. According to coding standards in 'Creating and Returning NSError Objects' the parameter 'error' may be null.
I like having a pristine Clang report. I've read the cited document and I can't see a way to do what's expected; I checked some open-source Cocoa libraries and this seems to be a common idiom. Any ideas?
The way to do what's expected is shown in listing 3-5 in that document. With your example code:
+ (NSString *)checkForLength:(NSString *)theString error:(NSError **)error {
    BOOL hasLength = ([theString length] > 0);
    if (hasLength) return theString;
    else {
        if (error != NULL) *error = [NSError errorWithDomain:@"ErrorDomain" code:hasLength userInfo:nil];
        return nil;
    }
}
The Cocoa convention is that the return value should indicate success or failure (in this case, you return nil for failure) and the error is filled in with additional information, but only when the caller requests it.
In other words
NSError *error = nil;
NSString *result = [self checkForLength: aString error: &error];
and
NSString *result = [self checkForLength: aString error: NULL];
are both valid ways to invoke the method. So the method body should always check for a NULL error param:
if (error != NULL)
*error = ...;
