Playing Sound With OpenAL - Xcode

What's the best way to play a sound using OpenAL in Xcode 3.2.2 on the 3.1.2 SDK?
I'm pulling my hair out at the moment. I've followed Ben Brittel and Mike Daley's tutorials on OpenAL and I've implemented everything needed to play sound. Basically, I created a sound manager class with the help of their fantastic tutorials.
The only problem is that I get a SIGABRT error: the app doesn't even load when I try to initialise the sounds.
I'm making a drum application. The app works fine until I try to play a sound, so I've decided I may need to start fresh again. (Before, I was playing sound using PlaySystemSound, but that is very slow and not ideal for games programming.)
Can someone please tell me the best way to play sound using OpenAL?
Thanks.
I need OpenAL so I can use features like pitch control.

You can use Finch, a simple iOS SFX engine written atop OpenAL. It has very low latency, so it should be a good fit for a drumming application, and it also offers pitch control.
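If you want to stay with raw OpenAL, the canonical play path is: open a device, create a context, load PCM data into a buffer, attach the buffer to a source, set `AL_PITCH`, and call `alSourcePlay`. Below is a minimal C sketch of that sequence with error handling trimmed; the generated sine tone is just a stand-in for your real drum samples, and `AL_PITCH` is a frequency ratio (2.0 = one octave up):

```c
#include <stdio.h>
#include <math.h>
#ifdef __APPLE__
#include <OpenAL/al.h>
#include <OpenAL/alc.h>
#else
#include <AL/al.h>
#include <AL/alc.h>
#endif

int main(void) {
    /* 1. Open the default device and make a context current. */
    ALCdevice *dev = alcOpenDevice(NULL);
    if (!dev) { fprintf(stderr, "no OpenAL device\n"); return 1; }
    ALCcontext *ctx = alcCreateContext(dev, NULL);
    alcMakeContextCurrent(ctx);

    /* 2. Fill a buffer with PCM data -- a generated sine here; in a
       real app this would be your decoded drum sample. */
    enum { RATE = 44100, LEN = 44100 };
    short pcm[LEN];
    for (int i = 0; i < LEN; i++)
        pcm[i] = (short)(16383 * sin(2.0 * 3.14159265358979 * 440.0 * i / RATE));
    ALuint buf;
    alGenBuffers(1, &buf);
    alBufferData(buf, AL_FORMAT_MONO16, pcm, sizeof pcm, RATE);

    /* 3. Attach the buffer to a source, pitch it up a fifth, play. */
    ALuint src;
    alGenSources(1, &src);
    alSourcei(src, AL_BUFFER, buf);
    alSourcef(src, AL_PITCH, 1.5f);   /* AL_PITCH is a frequency ratio */
    alSourcePlay(src);

    /* ...run your app loop; then tear down in reverse order: */
    alDeleteSources(1, &src);
    alDeleteBuffers(1, &buf);
    alcMakeContextCurrent(NULL);
    alcDestroyContext(ctx);
    alcCloseDevice(dev);
    return 0;
}
```

A common cause of a SIGABRT at init time is calling `alBufferData` before a context is current, so check the return of `alcOpenDevice`/`alcCreateContext` first.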

Related

Where can I find an example of creating a FaceTime-comparable camera in OSX

Many of us are working from home more, and I envy the Windows folks who have a virtual webcam plugin in OBS (Open Broadcast Studio). OBS and the Windows plugin are open-source projects. As a competent software engineer I should be able to create a plugin that works on OSX, but I am not a hardened OSX dev.
I am sure I am not googling for the correct APIs and subsystems. If someone could help me with the Apple concept map to this obscure topic, I would be grateful for a set of crumbs that leads to the OSX API call(s) to create a camera. I know it can be done, as SnapCam does it, but that is a closed-source app.
I am aware of the workaround for OBS that:
1) uses code injection and requires disabling security features;
2) doesn't even work in current versions of OSX;
3) requires yet another app running, with video previews etc.
I like the challenge of creating this plugin. I am also wise enough to ask for a road map if one is available.
Someone beat me to it. https://github.com/johnboiles/obs-mac-virtualcam
I thought I would search just github.com directly with the query "virtual camera macos site:github.com". Constraining the search to just GitHub was quite useful.

How do I play an MP3 file using Lazarus on macOS

I'd like to be able to play an MP3 file programmatically, using Lazarus on macOS.
Lazarus 2.0 (FPC 3.0.4) on macOS is working great for me, but one thing I cannot manage to do is play an MP3 file programmatically.
I managed to compile and run the OALSoundManager demo project, but only WAV files can be played that way.
I spent several hours following various leads from the Free Pascal forum, but I still could not manage the basic play operations:
Load an MP3 file.
Start playing it.
Get the current playing position (e.g. during OnTimer).
Be notified when it stops.
I'm OK with using any common library. Of course, the fewer dependencies the better.
Once I can play the file I can figure out the rest, but it would be great if the example also showed:
Start playing from a given time position
Pause/Restart
Thank you very much for any help!
You might be able to use Castle Engine and OpenAL.
You can install Castle Engine from within Lazarus: in the main menu, under "Package" -> "Online Package Manager", you can filter for and install "castle".
Then you should be able to open the example project:
https://github.com/castle-engine/castle-engine/blob/master/examples/audio/alplay.lpr
Good luck!

Cinder or pure OpenGL for iOS development

As I'm quite new to Objective-C but a regular user of C++ and ANSI C, I'm a bit out of my comfort zone working with Objective-C.
People might ask: why don't I just learn Objective-C? I'd love to, but unfortunately I'm on a very tight schedule due to my uni project, and I would like to get as much working as possible.
I have worked a little with Objective-C and run through tutorials, but I don't see myself being capable of producing large amounts of solid code the way I would with C++.
Yesterday I got familiar with the Cinder framework and tried out some examples, which produced quite rapid results, especially with the OpenGL and math libraries.
What bothers me right now is working with the Xcode Interface Builder and binding the storyboard or xib files to the project.
Is there a creative way to combine the great GUI with Cinder in Xcode, or am I "forced" to go back to pure Objective-C and my libraries?
Thanks.
Have you looked into any of the frameworks built with other languages you may be more comfortable with, specifically designed for iOS (and Android) game and app development?
ImpactJS - http://impactjs.com/
Corona SDK - http://www.anscamobile.com/corona/
A key feature of these is that you write your code once and it works across multiple platforms; they may also be easier and quicker for you if you know JavaScript or Lua (and the learning curve on those languages is typically smaller anyway).

Capturing Audio buffer in Cocoa

Hello all,
In my application I need to enable voice communication over IP, i.e. capture the audio buffer and send it over the Internet through a secure socket.
In the lower layers everything is ready; I need the entry point to start the voice communication, but I'm not finding any pointers in the Apple documentation. So far I have done the following:
1 -- In the Apple documentation I am going through the Core Audio programming guide. Is this the right place to start?
2 -- If yes: somewhere it says I need to download the Core Audio SDK. Does it not come along with standard Xcode and the Cocoa frameworks?
Kind regards,
Rohan
1: The Core Audio programming guide is the right place to start.
There are also very good samples in the Mac Dev Center, and some iPhone examples may be portable/useful as well (since Core Audio on the iPhone is basically a subset of Core Audio for the Mac).
2: You don't need to download anything else. If you have Xcode, just add CoreAudio.framework to your project and add #import <CoreAudio/CoreAudio.h>.
Nacho4d is right. You don't need to download anything; the CoreAudio.framework is included in Xcode.
As for learning how to work with Core Audio in general, Apple's documentation may be painful to use if you are not a seasoned programmer. Only one book has been written on the subject so far: it's called Core Audio (Rough Cuts), written by Kevin Avila and Chris Adamson for Addison-Wesley Professional. It should help you with the basics.
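As a concrete entry point for the capture side: the simplest Core Audio path for grabbing microphone buffers is Audio Queue Services (in AudioToolbox.framework). Below is a rough sketch, assuming 16-bit mono LPCM at 16 kHz; `send_over_socket()` is a hypothetical stand-in for the asker's existing secure-socket layer, and error-code checking is omitted for brevity:

```c
#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

/* Hypothetical hook into the existing network layer. */
static void send_over_socket(const void *data, UInt32 len);

/* Called by the audio queue each time a buffer of captured
   samples is ready. */
static void input_cb(void *user, AudioQueueRef q,
                     AudioQueueBufferRef buf,
                     const AudioTimeStamp *ts, UInt32 npackets,
                     const AudioStreamPacketDescription *descs) {
    send_over_socket(buf->mAudioData, buf->mAudioDataByteSize);
    /* Hand the buffer back to the queue for re-use. */
    AudioQueueEnqueueBuffer(q, buf, 0, NULL);
}

static AudioQueueRef start_capture(void) {
    AudioStreamBasicDescription fmt;
    memset(&fmt, 0, sizeof fmt);
    fmt.mSampleRate       = 16000.0;
    fmt.mFormatID         = kAudioFormatLinearPCM;
    fmt.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger
                          | kLinearPCMFormatFlagIsPacked;
    fmt.mChannelsPerFrame = 1;
    fmt.mBitsPerChannel   = 16;
    fmt.mBytesPerFrame    = 2;   /* 1 channel * 16-bit samples */
    fmt.mFramesPerPacket  = 1;
    fmt.mBytesPerPacket   = 2;

    AudioQueueRef q = NULL;
    AudioQueueNewInput(&fmt, input_cb, NULL, NULL, NULL, 0, &q);

    /* Three ~100 ms buffers keeps latency reasonable for VoIP:
       0.1 s * 16000 frames/s * 2 bytes/frame = 3200 bytes each. */
    for (int i = 0; i < 3; i++) {
        AudioQueueBufferRef buf;
        AudioQueueAllocateBuffer(q, 3200, &buf);
        AudioQueueEnqueueBuffer(q, buf, 0, NULL);
    }
    AudioQueueStart(q, NULL);
    return q;
}
```

The callback fires on an internal thread, so anything it hands to the socket layer should be thread-safe.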

Stream audio on Mac

I have been using AVAudioPlayer on the iPhone and iPad, but I can't find anything that comes close on the Mac. Does anyone know where to find a library like that?
What I want is a library that can stream songs from the Internet, so the user does not need to download the whole song first.
NSSound can play songs from the Internet, but it does not stream: it just downloads the whole song and then starts to play.
I know ffmpeg works on the Mac and supports streaming. As an added bonus, it works on Linux and Windows too.
Matt Gallagher has a series of articles on his website about exactly this: streaming audio on the iPhone and on the Mac. He has the project posted on GitHub as well.
It's pretty popular; I've personally used it on an iPhone project and it worked well. It also comes with an included Mac sample project, which works well too.
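For reference, Gallagher's streamer is built on two plain-C Core Audio APIs: AudioFileStream parses audio bytes as they arrive from the network, and an output Audio Queue plays the parsed packets. A heavily trimmed sketch of that skeleton is below; the networking and buffering glue (most of the real work) is omitted, the function names like `open_stream`/`feed` are illustrative, and only the `AudioFileStream*`/`AudioQueue*` calls are the real API:

```c
#include <AudioToolbox/AudioToolbox.h>

/* Fired when the parser has figured out the stream's format. */
static void props_cb(void *user, AudioFileStreamID stream,
                     AudioFileStreamPropertyID prop,
                     AudioFileStreamPropertyFlags *flags) {
    if (prop == kAudioFileStreamProperty_DataFormat) {
        AudioStreamBasicDescription fmt;
        UInt32 size = sizeof fmt;
        AudioFileStreamGetProperty(stream, prop, &size, &fmt);
        /* ...create the output queue with AudioQueueNewOutput(&fmt, ...) */
    }
}

/* Fired whenever complete audio packets have been parsed out of the
   raw network bytes; enqueue them on the audio queue here. */
static void packets_cb(void *user, UInt32 nbytes, UInt32 npackets,
                       const void *data,
                       AudioStreamPacketDescription *descs) {
    /* ...copy into an AudioQueueBuffer and AudioQueueEnqueueBuffer() */
}

/* Call this with each chunk read from the HTTP connection. */
void feed(AudioFileStreamID stream, const void *bytes, UInt32 len) {
    AudioFileStreamParseBytes(stream, len, bytes, 0);
}

AudioFileStreamID open_stream(void) {
    AudioFileStreamID stream = NULL;
    AudioFileStreamOpen(NULL, props_cb, packets_cb,
                        kAudioFileMP3Type, &stream);
    return stream;
}
```

This is why the approach streams rather than downloads: playback can start as soon as the first parsed packets are enqueued, long before the file is complete.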
