Is it possible to trap the MIDI signals being sent by my keyboard, connected via MIDI-USB, in Ruby? If not Ruby, how would I do it in C so that I can make a Ruby extension?
Use PortMidi, which is part of the PortMedia project. A little Googling turned up several references to existing Ruby bindings for PortMidi, so you may not need to do much (or any) work to get things running.
What is PortMedia?
PortMedia is a set of APIs and library implementations for music and other media. PortMedia is open-source and runs on Windows, Macintosh, and Linux. Currently, libraries support audio I/O and MIDI I/O.
I'm currently writing a small application that handles I/O on the serial port, standard input (keyboard), and standard output (screen). It allows interaction with software embedded on an external board linked to the PC over serial-over-USB.
I mainly use read_nonblock, and it works well on Linux:

    $c  = $usb_ios.read_nonblock(500)  # read up to 500 bytes from the serial port without blocking
    $in = STDIN.read_nonblock(500)     # same for the keyboard
I tried the same code on Windows, but read_nonblock raised an error; I then learned that read_nonblock is not supported on Windows.
After reading a lot of posts, FAQs, and blogs, I cannot find a simple way to get some kind of non-blocking read on the serial port. (There might be a dirty way for keyboard input.)
Please note that I don't want to use the 'serialport' gem, for several reasons; in fact, I want this to work with a regular, basic Ruby installation.
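One portable workaround needs nothing beyond the standard library: keep the reads blocking, but move them into a background thread and drain a Queue from the main loop. A minimal sketch, where $usb_ios is the already-opened serial port from the snippet above and handle_serial is a hypothetical callback standing in for your own processing:

    require 'thread'  # needed on older Rubies for Queue

    serial_queue = Queue.new

    Thread.new do
      loop do
        serial_queue << $usb_ios.readpartial(500)  # blocks, but only this thread
      end
    end

    loop do
      until serial_queue.empty?
        handle_serial(serial_queue.pop)  # hypothetical callback
      end
      # ... poll keyboard input, refresh the screen, etc. ...
      sleep 0.01
    end

Since read_nonblock is never called, this runs the same way on Linux and Windows.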
Can Ruby manipulate and work with peripherals like webcams? I would like to create a system that uses a webcam. Is this possible with Ruby?
You should be able to control a webcam with Ruby. At the very least, you can interface with a Java or native library for webcam control -- Ruby can easily talk to Java, C, C++, Objective-C...
Ruby is generally used on the server side. As such, if you're looking for a way to interface with a client's webcam from Ruby running on a web server, the answer is no.
On the other hand, if you'd like to interact with a webcam connected to the machine executing the Ruby code (or just to execute Ruby code locally), the answer is potentially yes. I'm not a Ruby programmer, but as far as I know, while Ruby most probably doesn't have direct support for talking to a webcam, it does support bindings to C-style DLLs, and you should be able to craft a binding that provides an interface for interacting with webcams.
I don't know whether such bindings already exist, but in case they don't, you should be able to build one yourself, assuming you know C/C++ or some other language that can export bindings for Ruby.
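To sketch what the Ruby side of such a binding might look like with the ffi gem: everything named webcam_* below is hypothetical, standing in for whatever native capture library (V4L2-, DirectShow-, or QTKit-based) you end up wrapping; only the ffi calls themselves are real API.

    require 'ffi'

    module Webcam
      extend FFI::Library
      ffi_lib 'webcam'  # hypothetical native capture library

      attach_function :webcam_open, [:int], :pointer            # device index -> handle
      attach_function :webcam_grab_frame,
                      [:pointer, :pointer, :size_t], :int       # handle, buffer, size -> bytes written
      attach_function :webcam_close, [:pointer], :void
    end

    handle = Webcam.webcam_open(0)
    buffer = FFI::MemoryPointer.new(:uchar, 640 * 480 * 3)  # room for one RGB frame
    bytes  = Webcam.webcam_grab_frame(handle, buffer, buffer.size)
    File.binwrite('frame.rgb', buffer.read_bytes(bytes)) if bytes > 0
    Webcam.webcam_close(handle)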
I'm working on a desktop application for OS X using Ruby-Tk, and I would like to provide an Apple Events interface for the application. This means that the application would define a dictionary of AppleScript commands that it would respond to (corresponding to Apple Events being sent to the application), and users/other applications could script the Ruby-Tk application with AppleScript commands. Other scripting languages support such functionality--Python through the py-aemreceive library at http://appscript.svn.sourceforge.net/viewvc/appscript/py-aemreceive/ and Tcl through the tclAE library at http://tclae.sourceforge.net/. I've been looking for similar functionality in Ruby and have come up empty.
One possible mechanism is the rubyobjc bridge, which provides a low-level interface between Ruby and Objective-C, but this gem appears to be little-used and is sorely lacking in examples and documentation, so I am not sure if this would be a fruitful path to pursue.
NB: MacRuby might otherwise work, but it is not compatible with Tk, which rules it out here. Also, RubyOSA and rb-appscript are not what I am looking for: they allow Ruby to send Apple Events to other applications, not receive them.
In the absence of other alternatives, it appears I am going to have to write my own Ruby wrapper for the portions of the AppleEvent C API that I need: mainly AEInstallEventHandler and related functions. Fortunately, Apple still supports this API even though it has been relegated to "legacy" status in Apple's developer docs (though, interestingly, it is not deprecated). I'll either integrate these functions via Ruby's ffi gem or, more likely, via Ruby's C API (which I still have to dig into); using the C API directly would reduce dependencies on other gems/modules. If this goes well, I will release it as a gem.
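For what it's worth, here is a rough sketch of the ffi route, wrapping just AEInstallEventHandler. Assumptions worth flagging: that the Carbon framework re-exports the AE functions (they actually live under CoreServices), that FourCharCodes map cleanly to :uint32, and that the AppleEvent descriptors can be treated as opaque pointers for a first pass.

    require 'ffi'

    module AppleEvents
      extend FFI::Library
      ffi_lib '/System/Library/Frameworks/Carbon.framework/Carbon'

      # OSErr (*AEEventHandlerProcPtr)(const AppleEvent *, AppleEvent *, SRefCon)
      callback :ae_handler, [:pointer, :pointer, :long], :short

      # OSErr AEInstallEventHandler(AEEventClass, AEEventID,
      #                             AEEventHandlerUPP, SRefCon, Boolean)
      attach_function :AEInstallEventHandler,
                      [:uint32, :uint32, :ae_handler, :long, :bool], :short
    end

    # Pack a four-character code such as 'aevt' into a big-endian UInt32.
    def four_char(code)
      code.unpack('N').first
    end

    HANDLER = FFI::Function.new(:short, [:pointer, :pointer, :long]) do |_event, _reply, _refcon|
      puts 'received an Apple Event'
      0  # noErr
    end

    err = AppleEvents.AEInstallEventHandler(four_char('aevt'), four_char('odoc'),
                                            HANDLER, 0, false)
    raise "AEInstallEventHandler failed: #{err}" unless err.zero?

Note that the handler must stay referenced from Ruby (hence the constant) so it isn't garbage-collected, and events will only be delivered while an event loop (in this case Tk's) is running.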
Regarding Donal Fellows' comment, my need is for custom AppleEvents--the ones supported via Tk in the docs he recommends can likely be accessed through calls to Tk from Ruby's Tk interface.
You might want to check the appscript library (note: it seems to only work with the OS X-provided Ruby), or try MacRuby, which wraps pretty much all of the APIs available in OS X, even the C ones.
I've recently bought a copy of EZDrummer, a VST plugin that acts as a virtual drum kit. I'd really like to hook into it from Ruby code so that I can create loops and drum patterns programmatically. To be honest, I am not even sure where to start. Presumably I have to create a VST host that can load the plugin and then hook into it somehow. I am a Ruby developer, so that's the language I'd be looking to implement this in. Any pointers in the right direction?
Since you bought a VST plugin, I assume you have some sort of DAW as well. Before you start trying to host a VST from within Ruby, try the following smaller projects:
Generate a MIDI file from Ruby (see the sketch below), load it into your DAW, and play it.
Stream live MIDI data from your Ruby process to your DAW. On Windows, you can do this with ReWire; on OS X, you can create an IAC bus in the Audio MIDI Setup app.
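For the first project, here is a sketch using the midilib gem (pure Ruby, no native code). It follows midilib's documented usage, though the event class names have shifted slightly across versions:

    require 'midilib/sequence'
    require 'midilib/consts'

    seq = MIDI::Sequence.new

    # Meta track: tempo and sequence name.
    track = MIDI::Track.new(seq)
    seq.tracks << track
    track.events << MIDI::Tempo.new(MIDI::Tempo.bpm_to_mpq(120))
    track.events << MIDI::MetaEvent.new(MIDI::META_SEQ_NAME, 'Drum Pattern')

    # Note track: four quarter-note kicks (GM note 36) on channel 10 (index 9).
    track = MIDI::Track.new(seq)
    seq.tracks << track
    quarter = seq.note_to_delta('quarter')
    4.times do
      track.events << MIDI::NoteOn.new(9, 36, 100, 0)
      track.events << MIDI::NoteOff.new(9, 36, 100, quarter)
    end

    File.open('pattern.mid', 'wb') { |file| seq.write(file) }

Load the resulting pattern.mid into your DAW and point the track at EZDrummer.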
If you need more direct control of EZDrummer than this allows, then go down the path of trying to host the VST from within Ruby.
Take, for example, the VSTi Triforce, by Tweakbench. When loaded up in any VST host on the market, it allows the host to send a (presumably MIDI) signal to the VSTi. The VSTi will then process that signal and output synthesized audio as created by a software instrument within the VSTi.
For example, sending an A4 (a MIDI note, I believe) to the VSTi will cause it to synthesize the A above middle C. It sends the audio data back to the VST host, which can then either play it through my speakers or save it to .wav or some other audio file format.
Let's say I have Triforce and am trying to write a program, in my language of choice, that interacts with the VSTi by sending in an A4 note to be synthesized and automatically saves the result to a file on the system.
Eventually, I'd like to be able to parse an entire one-track MIDI file (using established, stable libraries already available for this purpose) and send it to the VSTi to "render"/synthesize it into an audio file.
How would I go about this, and in what language should I look to build the core framework?
Ultimately, it will be used in a Ruby-based project, so any pointers to specific Ruby resources would be nice as well.
However, I'm just trying to understand basically how the API of a VSTi works. (I realize this question is closely related to building a VST host in the first place, albeit one that can only save VST output to a file rather than playing it back, and with considerably smaller scope.)
Well, since you asked, the ideal language for a project like this is C++. Although there are wrappers for the VST SDK for higher-level languages such as Java and .NET, I couldn't find one for Ruby (though I did find this rather cool project, which lets you program VST plugins in Ruby). So you will be stuck doing some degree of C/C++ integration on your own.
That said, you have basically two options here:
Write a VST Host in C++, and launch it as a separate process from within Ruby.
Integrate your Ruby code directly with the VST SDK, and load the plugin DLLs/bundles directly from your code. This is probably the cleaner but harder way to accomplish your goal.
I wrote up a VST host programming tutorial on my blog a while back, which you may find useful in either case. It details how to open and communicate with VST plugins on both Mac OS X and Windows. Once you have gotten your host to load up the plugin, you need to send MIDI events to it, either by reading them from a file or via some form of communication between your Ruby code and the VST host (i.e., a named pipe, socket, file, etc.; a sketch of the Ruby side of that follows the links below). If you are unfamiliar with the MIDI protocol, check out these links:
The MIDI technical fanatic's brainwashing center (silly name, serious resource)
The Sonic Spot's MIDI file specification (in case you need to read MIDI files)
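Back to the communication point: for the first option (a separate host process), the Ruby side can stay very simple. A sketch, where my_vst_host and its --plugin flag are hypothetical placeholders for whatever host binary you build; it is assumed to read raw MIDI bytes on stdin:

    require 'open3'

    Open3.popen2('my_vst_host', '--plugin', 'EZDrummer.dll') do |stdin, stdout, _wait_thr|
      note_on  = [0x90, 69, 100].pack('C*')  # note-on, channel 1, A4 (69), velocity 100
      note_off = [0x80, 69, 0].pack('C*')    # matching note-off
      stdin.write(note_on)
      sleep 0.5                              # hold the note for half a second
      stdin.write(note_off)
      stdin.close
      puts stdout.read                       # e.g. the path of the rendered file
    end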
As you might have already figured out, VST is fundamentally a block-based protocol. You request small blocks of audio data from the plugin, and you send along any MIDI events to the plugin right before it processes the respective block. Be sure not to ignore the MIDI delta field; it ensures that the plugin starts processing the MIDI event on exactly the desired sample. Otherwise the plugin will sound a bit off-tempo, especially in the case of instruments.
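Here is a conceptual sketch of that scheduling, in plain Ruby rather than real VST SDK calls: events are bucketed per block, and each event's delta is its sample offset from the start of its block.

    BLOCK_SIZE = 512

    # events: array of hashes like { sample: 123_456, bytes: [0x90, 69, 100] },
    # where :sample is the event's absolute position in samples.
    def each_block(events, total_samples)
      (0...total_samples).step(BLOCK_SIZE) do |block_start|
        block_events = events
          .select { |e| (block_start...block_start + BLOCK_SIZE).cover?(e[:sample]) }
          .map    { |e| e.merge(delta: e[:sample] - block_start) }
        # A real host would now hand block_events to the plugin, then ask it
        # to render BLOCK_SIZE samples of audio for this block.
        yield block_start, block_events
      end
    end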
The VST SDK is also based around floating-point blocks, so any data you get back will contain individual samples in the range [-1.0, 1.0]. Depending on your desired output format, you may need to convert these to some other representation. Fortunately, there seems to be a Ruby binding for the audiofile library, so you may be able to feed your output into that to generate a proper AIFF/WAV file.
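As an alternative, the pure-Ruby wavefile gem can do the float-to-PCM conversion on write; per its documented API, you hand it float samples in [-1.0, 1.0] and ask for a 16-bit PCM file:

    require 'wavefile'
    include WaveFile

    float_format = Format.new(:stereo, :float, 44_100)
    pcm_format   = Format.new(:stereo, :pcm_16, 44_100)

    # plugin_output would be the interleaved [left, right] float pairs
    # collected from the plugin; a second of silence stands in here.
    plugin_output = Array.new(44_100) { [0.0, 0.0] }

    Writer.new('render.wav', pcm_format) do |writer|
      writer.write(Buffer.new(plugin_output, float_format))
    end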
In all, it'll be a fair amount of work to get to your desired end goal, but it's not impossible by any means. Good luck!