EXC_BAD_INSTRUCTION (code=1, address=0xe) with MusicSequenceBarBeatTimeToBeats in Swift 2

I have a problem using Apple's MusicSequence C API from Swift 2. I can't figure out how to get rid of EXC_BAD_INSTRUCTION when calling MusicSequenceBarBeatTimeToBeats.
I have tried a lot of different solutions found on the internet, but nothing seems to work.
My first attempt that didn't produce a compile error was:
import Cocoa
import AudioToolbox
var musicSequence = MusicSequence()
NewMusicSequence(&musicSequence)
var barBeatTime = CABarBeatTime(bar: 1, beat: 1, subbeat: 0, subbeatDivisor: 960, reserved: 0)
var musicTimeStamp: MusicTimeStamp = 0
MusicSequenceBarBeatTimeToBeats(musicSequence, &barBeatTime, &musicTimeStamp)
The result is EXC_BAD_INSTRUCTION (code=1, address=0xe) on the MusicSequenceBarBeatTimeToBeats line.
Then I tried this, with the same result:
var barBeatTime = CABarBeatTime(bar: 1, beat: 1, subbeat: 0, subbeatDivisor: 960, reserved: 0)
var musicTimeStamp = UnsafeMutablePointer<MusicTimeStamp>.alloc(sizeof(MusicTimeStamp))
MusicSequenceBarBeatTimeToBeats(musicSequence, &barBeatTime, musicTimeStamp)
musicTimeStamp.dealloc(sizeof(MusicTimeStamp))
I have also tried the withUnsafeMutablePointer function, but got the same runtime error:
var barBeatTime = CABarBeatTime(bar: 1, beat: 1, subbeat: 0, subbeatDivisor: 960, reserved: 0)
var musicTimeStamp: MusicTimeStamp = 0
withUnsafeMutablePointer(&musicTimeStamp,
{
MusicSequenceBarBeatTimeToBeats(musicSequence, &barBeatTime, UnsafeMutablePointer($0))
})
If I pass nil as the third parameter it doesn't crash, but that is useless because the output is missing. So it seems to be the UnsafeMutablePointer<MusicTimeStamp> output parameter that is the problem.
MusicSequenceBarBeatTimeToBeats(musicSequence, &barBeatTime, nil)
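For completeness, here is a consolidated version of the first attempt with the OSStatus return values captured, so that a failed sequence creation can at least be ruled out:
import AudioToolbox
var musicSequence = MusicSequence()
let createStatus = NewMusicSequence(&musicSequence) // 0 (noErr) would mean the sequence was created successfully
var barBeatTime = CABarBeatTime(bar: 1, beat: 1, subbeat: 0, subbeatDivisor: 960, reserved: 0)
var beats: MusicTimeStamp = 0
let convertStatus = MusicSequenceBarBeatTimeToBeats(musicSequence, &barBeatTime, &beats) // still crashes here
print("create status: \(createStatus), convert status: \(convertStatus), beats: \(beats)")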
Any suggestions?

Related

PonyLang Windows CreateProcess FFI

I've been trying to call Windows' CreateProcessA from Pony Language's FFI.
I created both a C and a PonyLang example. The C example works great:
#include <windows.h>
#include <stdio.h>
#include <tchar.h>
void wmain(void) {
    STARTUPINFO info = {0};
    PROCESS_INFORMATION processInfo = {0};
    BOOL status = CreateProcessA("calc.exe", 0, 0, 0, 0, 0, 0, 0, &info, &processInfo);
    if (status == 0)
        printf("%d", GetLastError()); // never hits
}
I put calc.exe in the current directory, and this works flawlessly on Windows.
However, my PonyLang implementation keeps returning a non-zero GetLastError:
use "lib:kernel32"
primitive _ProcessAttributes
primitive _ThreadAttributes
primitive _Inherit
primitive _Creation
primitive _Environment
primitive _CurrentDir
primitive _StartupInfo
primitive _ProcessInfo
primitive _HandleIn
primitive _HandleOut
primitive _HandleErr
primitive _Thread
primitive _Process
struct StartupInfo
var cb:I32 = 0
var lpReserved:Pointer[U8] tag= "".cstring()
var lpDesktop:Pointer[U8] tag= "".cstring()
var lpTitle:Pointer[U8] tag= "".cstring()
var dwX:I32 = 0
var dwY:I32 = 0
var dwXSize:I32=0
var dwYSize:I32=0
var dwXCountChars:I32=0
var dwYCountChars:I32=0
var dwFillAttribute:I32=0
var dwFlags:I32=0
var wShowWindow:I16=0
var cbReserved2:I16=0
var lpReserved2:Pointer[U8] tag="".cstring()
var hStdInput:Pointer[_HandleIn] = Pointer[_HandleIn]
var hStdOutput:Pointer[_HandleOut]= Pointer[_HandleOut]
var hStdError:Pointer[_HandleErr]= Pointer[_HandleErr]
struct ProcessInfo
var hProcess:Pointer[_Process] = Pointer[_Process]
var hThread:Pointer[_Thread] = Pointer[_Thread]
var dwProcessId:I32 = 0
var dwThreadId:I32 = 0
//var si:StartupInfo = StartupInfo
actor Main
new create(env: Env) =>
var si: StartupInfo = StartupInfo
var pi: ProcessInfo = ProcessInfo
var inherit:I8 = 0
var creation:I32 = 0
var one:I32 = 0
var two:I32 = 0
var three:I32 = 0
var four:I32 = 0
var z:I32 = 0
var p = @CreateProcessA[I8]("calc.exe",
z,
one,
two,
inherit,
creation,
three,
four,
addressof si,
addressof pi)
if p == 0 then
var err = @GetLastError[I32]() // hits this every time.
env.out.print("Last Error: " + err.string())
end
The above code compiles for PonyLang, but GetLastError most of the time returns 2. Sometimes it returns 123, and other times 998.
It seems odd that the error code differs between runs. Do those codes all mean there is some issue with file access?
calc.exe is in the current directory (the same directory as the C example).
Also, not only is the error code different, but calc.exe is actually launched (and runs fine) in the C version and not in the PonyLang version. This leads me to believe something is off with my PonyLang FFI setup.
Does anyone know what may be wrong?
The problem is with your use of addressof. When you create a struct object, e.g. with var si = StartupInfo, the underlying type is already a pointer to the structure (i.e. structs in Pony don't have value semantics). So when you call CreateProcessA with addressof, you're actually passing a pointer to a pointer to the function.
If your C function expects a pointer to a structure, you can simply pass the Pony object without addressof when doing the FFI call.

Minor Syntax issue regarding Swift 3.0 and a for loop

I recently converted my code to the Swift 3.0 syntax that came with the Xcode 8 beta. I had to change several lines of code to work with the new syntax, and I was able to fix all of them except for an error in the for loop that I use to make my background image loop continuously.
The exact error message that I get is: Ambiguous reference to member '..<'
for i:CGFloat in 0 ..< 3 {
let background = SKSpriteNode(texture: backgroundTexture)
background.position = CGPoint(x: backgroundTexture.size().width/2 + (backgroundTexture.size().width * i), y: self.frame.midY)
background.size.height = self.frame.height
background.run(movingAndReplacingBackground)
self.addChild(background)
}
Don't use a floating-point type as the loop index:
for i in 0 ..< 3 {
let background = SKSpriteNode(texture: backgroundTexture)
background.position = CGPoint(x: backgroundTexture.size().width/2 + (backgroundTexture.size().width * CGFloat(i)), y: self.frame.midY)
background.size.height = self.frame.height
background.run(movingAndReplacingBackground)
self.addChild(background)
}
Try this:
for i in 0 ..< 3 {
let index = CGFloat(i)
//Your Code
let background = SKSpriteNode(texture: backgroundTexture)
background.position = CGPoint(x: backgroundTexture.size().width/2 + (backgroundTexture.size().width * index), y: self.frame.midY)
background.size.height = self.frame.height
background.run(movingAndReplacingBackground)
self.addChild(background)
}
The problem is that you are iterating over an Int range while declaring the loop variable as a CGFloat, so the two types conflict and the compiler cannot resolve the ..< operator; only integer (countable) ranges can be iterated over directly.
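If you really do want a floating-point loop variable, a stride-based loop also avoids the ambiguity; here is a minimal Swift 3 sketch reusing the names from the question:
for i in stride(from: CGFloat(0), to: 3, by: 1) {
    // i is already a CGFloat here, so no conversion is needed
    let background = SKSpriteNode(texture: backgroundTexture)
    background.position = CGPoint(x: backgroundTexture.size().width/2 + (backgroundTexture.size().width * i), y: self.frame.midY)
    background.size.height = self.frame.height
    background.run(movingAndReplacingBackground)
    self.addChild(background)
}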

Why don't I get the result of this Metal kernel

I am trying to understand how Metal compute shaders work, so I wrote this code:
class AppDelegate: NSObject, NSApplicationDelegate {
var number:Float!
var buffer:MTLBuffer!
func applicationDidFinishLaunching(aNotification: NSNotification) {
// Insert code here to initialize your application
let metalDevice = MTLCreateSystemDefaultDevice()!
let library = metalDevice.newDefaultLibrary()!
let commandQueue = metalDevice.newCommandQueue()
let commandBuffer = commandQueue.commandBuffer()
let commandEncoder = commandBuffer.computeCommandEncoder()
let pointlessFunction = library.newFunctionWithName("pointless")!
let pipelineState = try! metalDevice.newComputePipelineStateWithFunction(pointlessFunction)
commandEncoder.setComputePipelineState(pipelineState)
number = 12
buffer = metalDevice.newBufferWithBytes(&number, length: sizeof(Float), options: MTLResourceOptions.StorageModeShared)
commandEncoder.setBuffer(buffer, offset: 0, atIndex: 0)
commandEncoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
let data = NSData(bytesNoCopy: buffer.contents(), length: sizeof(Float), freeWhenDone: false)
var newResult:Float = 0
data.getBytes(&newResult, length: sizeof(Float))
print(newResult)
}
}
By making the buffer with StorageModeShared, I expect changes made to the Metal buffer to be reflected in my Swift code, but when I read back into my newResult variable, the buffer still holds the same value as at the beginning (12), while it should be 125:
#include <metal_stdlib>
using namespace metal;
kernel void pointless (device float* outData [[ buffer(0) ]]) {
*outData = 125.0;
}
What am I doing wrong?
A kernel function doesn't run unless you dispatch it. You seem to be assuming that if you encode a function, Metal will run it once unless you say otherwise, but that won't happen; it simply won't run at all. Add this before endEncoding and you're good to go:
let size = MTLSize(width: 1, height: 1, depth: 1)
commandEncoder.dispatchThreadgroups(size, threadsPerThreadgroup: size) // a single threadgroup with a single thread is enough to write one float
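Putting it together, the encoding part of the code in the question then looks like this (same objects as above, with only the dispatch added):
commandEncoder.setComputePipelineState(pipelineState)
commandEncoder.setBuffer(buffer, offset: 0, atIndex: 0)
let size = MTLSize(width: 1, height: 1, depth: 1)
commandEncoder.dispatchThreadgroups(size, threadsPerThreadgroup: size)
commandEncoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()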

Differences between Playground and Project

In the course of answering another question, I came across a weird bug in Playground. I have the following code to test whether an object is an Array, a Dictionary, or a Set:
import Foundation
func isCollectionType(value : AnyObject) -> Bool {
let object = value as! NSObject
return object.isKindOfClass(NSArray)
|| object.isKindOfClass(NSDictionary)
|| object.isKindOfClass(NSSet)
}
var arrayOfInt = [1, 2, 3]
var dictionary = ["name": "john", "age": "30"]
var anInt = 42
var aString = "Hello world"
println(isCollectionType(arrayOfInt)) // true
println(isCollectionType(dictionary)) // true
println(isCollectionType(anInt)) // false
println(isCollectionType(aString)) // false
The code worked as expected when I put it into a Swift project or ran it from the command line. However, the Playground would fail with the following error on the forced downcast to NSObject:
Playground execution failed: Execution was interrupted, reason: EXC_BAD_ACCESS (code=2, address=0x7fb1d0f77fe8).
* thread #1: tid = 0x298023, 0x00007fb1d0f77fe8, queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=2, address=0x7fb1d0f77fe8)
* frame #0: 0x00007fb1d0f77fe8
frame #1: 0x000000010ba46e12 libswiftCore.dylib`Swift._EmptyArrayStorage._withVerbatimBridgedUnsafeBuffer (Swift._EmptyArrayStorage)<A>((Swift.UnsafeBufferPointer<Swift.AnyObject>) -> A) -> Swift.Optional<A> + 50
The build platform was OS X in all three cases. Does anyone know how to get Playground to play along?
Xcode 6.3.2. Swift 1.2. OS X 10.10.3 Yosemite
Not really the cause of that bug (it does look weird), but...
You should use a conditional cast (as?) rather than a forced downcast, since value is an AnyObject and may not be an NSObject:
import Foundation
func isCollectionType(value : AnyObject) -> Bool {
if let object = value as? NSObject {
return object.isKindOfClass(NSArray)
|| object.isKindOfClass(NSDictionary)
|| object.isKindOfClass(NSSet)
}
return false
}
var arrayOfInt = [1, 2, 3]
var dictionary = ["name": "john", "age": "30"]
var anInt = 42
var aString = "Hello world"
isCollectionType(arrayOfInt)
isCollectionType(dictionary)
isCollectionType(anInt)
isCollectionType(aString)
Also worth noting, NSArray and Array are different things:
NSArray is an immutable class:
@interface NSArray : NSObject <NSCopying, NSMutableCopying, NSSecureCoding, NSFastEnumeration>
whilst Array is a struct:
struct Array<T> : MutableCollectionType, Sliceable, _DestructorSafeContainer
With this in mind, it might be surprising that isCollectionType(arrayOfInt) returns true, but there is a bridging conversion happening: the Swift Array is bridged to an NSArray when it is cast to an Objective-C object type.
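To make that conversion visible, here is a quick sketch (Swift 1.2, reusing the arrayOfInt from above):
let bridged = arrayOfInt as NSArray // the Swift Array<Int> is bridged to an NSArray of NSNumbers
println(bridged.isKindOfClass(NSArray)) // true, which is why isCollectionType(arrayOfInt) returns true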

AuthorizationCreate in Swift (Xcode 6)

I've been looking for some help in creating authorization for my app so it can run a few shell scripts as root. I've looked through the Apple documentation (which is of course written in Objective-C and quite vague) and I'm trying to adapt the code examples to Swift.
Immediately I'm running into an error with the AuthorizationCreate function:
var authRef: AuthorizationRef
let osStatus = AuthorizationCreate(nil, nil, kAuthorizationFlagDefaults, &authRef)
'Int' is not convertible to 'AuthorizationFlags'
I'm just trying to follow along with the code snippets in the docs from: https://developer.apple.com/library/mac/documentation/Security/Conceptual/authorization_concepts/03authtasks/authtasks.html#//apple_ref/doc/uid/TP30000995-CH206-TP9
And I found the constant for kAuthorizationFlagDefaults from here: https://developer.apple.com/library/mac/documentation/Security/Reference/authorization_ref/#//apple_ref/doc/constant_group/Authorization_Options
I'm running 10.10.1, if that matters.
I've seen the solution using AppleScript, but I really want to avoid that if possible.
kAuthorizationFlagDefaults is an Int and has to be converted to
AuthorizationFlags (which is a type alias for UInt32). Also authRef has to be initialized:
var authRef: AuthorizationRef = nil
let authFlags = AuthorizationFlags(kAuthorizationFlagDefaults)
let osStatus = AuthorizationCreate(nil, nil, authFlags, &authRef)
Extended example (untested!):
var myItems = [
AuthorizationItem(name: "com.myOrganization.myProduct.myRight1",
valueLength: 0, value: nil, flags: 0),
AuthorizationItem(name: "com.myOrganization.myProduct.myRight2",
valueLength: 0, value: nil, flags: 0)
]
var myRights = AuthorizationRights(count: UInt32(myItems.count), items: &myItems)
let myFlags = AuthorizationFlags(kAuthorizationFlagDefaults |
kAuthorizationFlagInteractionAllowed |
kAuthorizationFlagExtendRights)
var authRef: AuthorizationRef = nil
let osStatus = AuthorizationCreate(&myRights, nil, myFlags, &authRef)
Edit: Swift 3
var myItems = [
AuthorizationItem(name: "com.myOrganization.myProduct.myRight1",
valueLength: 0, value: nil, flags: 0),
AuthorizationItem(name: "com.myOrganization.myProduct.myRight2",
valueLength: 0, value: nil, flags: 0)
]
var myRights = AuthorizationRights(count: UInt32(myItems.count), items: &myItems)
let myFlags : AuthorizationFlags = [.interactionAllowed, .extendRights]
var authRef: AuthorizationRef?
let osStatus = AuthorizationCreate(&myRights, nil, myFlags, &authRef)
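If osStatus comes back as errAuthorizationSuccess, the reference should eventually be released with AuthorizationFree once you are done with it; a minimal Swift 3 sketch:
if osStatus == errAuthorizationSuccess, let authRef = authRef {
    // ... use the authorization reference here ...
    AuthorizationFree(authRef, [])
}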
