Can Ruby manipulate and work with peripherals like webcams? I would like to create a system that uses a webcam. Is it possible to do this with Ruby?
You should be able to control a webcam with Ruby. At the very least, you can interface with a Java or native library for the webcam control -- Ruby can easily talk to Java, C, C++, Objective-C...
Ruby is generally used on the server side. As such, if you're looking for a way to interface with a client's webcam from Ruby running on a web server, then the answer is no.
On the other hand, if you'd like to interact with a webcam connected to the machine executing the Ruby code (or just to execute Ruby code locally), then the answer is potentially yes. I'm not a Ruby programmer, but as far as I know Ruby most probably doesn't have direct support for talking to a webcam; it does, however, support bindings to C-style DLLs, so you should be able to craft a binding that provides an interface for interacting with webcams.
I don't know if such bindings already exist, but if they don't, you should be able to build one yourself, assuming you know C/C++ or some other language that can export bindings for Ruby.
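For example, on Linux you could attach to a native capture library with the ffi gem. The following is a heavily simplified sketch under several assumptions: libv4l2 is installed, the camera is at /dev/video0, the driver supports plain read() capture, and the buffer size is a guess; a real program would also need format negotiation via ioctl.

```ruby
require 'ffi'

# Sketch only: binds a few libv4l2 calls (Video4Linux2 userspace wrapper).
module V4L2
  extend FFI::Library
  ffi_lib 'v4l2'

  attach_function :v4l2_open,  [:string, :int], :int
  attach_function :v4l2_read,  [:int, :pointer, :size_t], :long
  attach_function :v4l2_close, [:int], :int
end

fd = V4L2.v4l2_open('/dev/video0', 0)              # 0 == O_RDONLY; device path is an assumption
raise 'could not open webcam' if fd < 0

buf   = FFI::MemoryPointer.new(:uint8, 640 * 480 * 3)  # buffer size is a guess
bytes = V4L2.v4l2_read(fd, buf, buf.size)              # raw frame bytes, format depends on the driver
puts "read #{bytes} bytes from the webcam"
V4L2.v4l2_close(fd)
```

On Windows or OS X you would bind a different native API the same way, or wrap an existing C/C++ capture library.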
My team is currently creating an API that will interact with our core Ruby API.
The new API is for the public, while the Ruby API is our private API. We want to be able to compile this new API into PHP, Java, Python, etc., client libraries when we are ready to release.
Are there any gems, or other ways to write this new API so we can compile it into different client libraries?
There are several ways to think about exposing an API these days. If we're talking about creating a library in the sense of something compiled into other applications, that takes us down one path. If we're talking about providing, effectively, command-line functionality callable from other contexts as system calls, that's another story. And more broadly, if the API is really a service, like REST, that's different again. I'll assume one of the first two.
There are several tools that will create a binary package for the various platforms. Look at ruby-toolbox.com for some examples. As far as I know, none of these compile to true executable code; they mainly bundle an executable Ruby interpreter together with your code and dependencies. Perhaps the API could be made to appear callable as a system library (a DLL on Windows, an SO on UNIX-like systems).
Either way, I would think you're dealing with Ruby and your code loading and running as a separate process on each call. There's a library like this (not in Ruby) called ImageMagick, with a Ruby wrapper called MiniMagick, that might be a pointer to the kind of pattern you're looking for.
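To illustrate that pattern, here is a minimal sketch of exposing the private Ruby API as a small command-line executable that PHP/Java/Python client libraries could spawn and parse. The gem name my_private_api, the MyPrivateAPI module, and its methods are hypothetical placeholders, not your actual API.

```ruby
#!/usr/bin/env ruby
# Sketch: a CLI facade over the private API; clients shell out to this
# executable and parse the JSON it prints (the MiniMagick-style pattern).
require 'json'
require 'my_private_api'   # hypothetical internal gem

command, *args = ARGV
result =
  case command
  when 'create_user' then MyPrivateAPI.create_user(*args)
  when 'get_user'    then MyPrivateAPI.get_user(args.first)
  else { 'error' => "unknown command #{command.inspect}" }
  end

puts JSON.generate(result)   # clients read stdout and parse the JSON
```

Each client library then only needs to know how to spawn a process and parse JSON, which every target language can do.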
If you want to run Ruby and your app as a service, there are several tools for this -- that helps address the overhead of loading a process on each call, and Ruby 1.9's Process class can daemonize Ruby (Process.daemon), although presumably only on Unix. Check this SO answer.
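As a rough sketch of that service approach (Ruby 1.9+, Unix only): the socket path and the one-line text protocol here are assumptions, purely for illustration.

```ruby
require 'socket'

SOCKET_PATH = '/tmp/my_api.sock'          # assumed path
File.delete(SOCKET_PATH) if File.exist?(SOCKET_PATH)

Process.daemon                            # detach from the terminal (Ruby 1.9+, Unix)
server = UNIXServer.new(SOCKET_PATH)

loop do
  client  = server.accept
  request = client.gets                   # one line per request, for simplicity
  client.puts("echo: #{request}")         # replace with a real API dispatch
  client.close
end
```

Client libraries would then connect to the socket instead of paying the interpreter start-up cost on every call.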
The best answer is probably that Ruby, like other similar languages (e.g. Python), is really not designed to be a low-level system library. There are likely many ways of accomplishing what you want in a given environment (notably Linux), but as far as I know nothing that really exposes callable entry points to all languages.
So I'm looking to build an application that will be able to record the user's screen and stream it at the same time. I would like the application to run on both Windows and OS X. I don't have a high level of programming experience in any language, just a basic understanding of C, C++, and JS (funny how each class you take in college wants a different language). I'm also pretty well versed in HTML and CSS, but that is kind of irrelevant for this topic.
I've been looking around, and it looks like the best solution is going to be writing the core of the program in one language and then developing the interface side for each platform separately, using the appropriate languages and bindings (Objective-C and Cocoa for OS X, and so forth).
I'm open to all suggestions, this project doesn't have a deadline or anything, I'm really just intending it as a learning experience. I've never done anything with video capture and streaming before, so I'm looking for suggestions as to which road to go down language-wise for this project.
Thanks in advance :)
The simplest solution that comes to my mind is to use VLC.
This is obviously not a "language" but an application, but it supports screen capture and streaming on all of your target platforms (and more).
If this is not an option (e.g. because you don't want a separate application), you could use VLC's C API (libvlc) for acquiring the screen capture and use whatever you like for streaming.
If you want to rely only on native functionality, I would use C/C++ for the application core and write the OS X part in ObjC/ObjC++ and Cocoa.
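If you do want to script VLC rather than run it by hand, here is a rough sketch (written in Ruby only to keep the example short; the same arguments work from any language or a plain shell). The exact option names and the --sout chain vary between VLC versions and platforms, so treat every option below as an assumption to verify against the VLC documentation.

```ruby
# Sketch: launch VLC's screen-capture input and stream it over HTTP.
vlc  = 'vlc'   # or a full path, e.g. /Applications/VLC.app/Contents/MacOS/VLC on OS X
args = [
  'screen://',                                              # built-in screen-capture input
  '--screen-fps=15',
  '--sout', '#transcode{vcodec=h264,vb=1200}:http{mux=ts,dst=:8080/stream}',
  '--no-sout-audio'
]

pid = Process.spawn(vlc, *args)   # run VLC as a child process
Process.wait(pid)
```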
I'm working on a desktop application for OS X using Ruby-Tk, and I would like to provide an Apple Events interface for the application. This means that the application would define a dictionary of AppleScript commands that it would respond to (corresponding to Apple Events being sent to the application), and users/other applications could script the Ruby-Tk application with AppleScript commands. Other scripting languages support such functionality--Python through the py-aemreceive library at http://appscript.svn.sourceforge.net/viewvc/appscript/py-aemreceive/ and Tcl through the tclAE library at http://tclae.sourceforge.net/. I've been looking for similar functionality in Ruby and have come up empty.
One possible mechanism is the rubyobjc bridge, which provides a low-level interface between Ruby and Objective-C, but this gem appears to be little-used and is sorely lacking in examples and documentation, so I am not sure if this would be a fruitful path to pursue.
NB: MacRuby might work, but it is not compatible with Tk, which rules it out. Also, RubyOSA and rb-appscript are not what I am looking for--they allow Ruby to send Apple Events to other applications, not receive them.
In the absence of other alternatives, it appears I am going to have to write my own Ruby wrapper for the portions of the AppleEvent C API that I need: mainly AEInstallEventHandler and related functions. Fortunately Apple still supports this API even though it has been relegated to "legacy" status in Apple's developer docs (though, interestingly, it is not deprecated). I'll either integrate these functions via Ruby's ffi gem or, more likely, via Ruby's C API (which I still have to dig into); using the C API directly would reduce the need for dependencies on other gems/modules. If this goes well I will release it as a gem.
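For the ffi route, a very rough sketch of what binding AEInstallEventHandler might look like is below. The framework path, the type mappings (OSErr as :short, SRefCon as :long, Boolean as :uchar), and the four-char-code handling are all assumptions that would need checking against the AppleEvent headers.

```ruby
require 'ffi'

module AE
  extend FFI::Library
  ffi_lib '/System/Library/Frameworks/ApplicationServices.framework/ApplicationServices'

  # OSErr handler(const AppleEvent *event, AppleEvent *reply, SRefCon refcon)
  callback :ae_handler, [:pointer, :pointer, :long], :short

  # OSErr AEInstallEventHandler(AEEventClass, AEEventID, AEEventHandlerUPP, SRefCon, Boolean)
  attach_function :AEInstallEventHandler,
                  [:uint32, :uint32, :ae_handler, :long, :uchar], :short

  def self.four_char(code)          # e.g. 'aevt' -> UInt32
    code.unpack('N').first
  end
end

# Keep a reference to the callback so it is not garbage-collected.
HANDLER = FFI::Function.new(:short, [:pointer, :pointer, :long]) do |_event, _reply, _refcon|
  puts 'received an Apple Event'
  0                                 # noErr
end

err = AE.AEInstallEventHandler(AE.four_char('aevt'), AE.four_char('odoc'),
                               HANDLER, 0, 0)
raise "AEInstallEventHandler failed: #{err}" unless err == 0
```

Even with a handler installed, events are presumably only delivered while the process services a native event loop, so wiring this into Tk's event loop is the part that would need the most investigation.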
Regarding Donal Fellows' comment, my need is for custom AppleEvents--the ones supported via Tk in the docs he recommends can likely be accessed through calls to Tk from Ruby's Tk interface.
You might want to check the appscript library (note: it seems to work only with the OS X-provided Ruby), or try using MacRuby, which wraps pretty much all of the APIs available in OS X, even the C ones.
Is it possible to trap the MIDI signals being sent by my keyboard connected via MIDI-USB in Ruby? If not in Ruby, how would I do it in C so I can make a Ruby extension?
Use PortMidi, which is part of the PortMedia project. A little Googling showed several references to existing Ruby bindings to PortMidi, so you may not need to do much/any work to get things running.
What is PortMedia?
PortMedia is a set of APIs and library implementations for music and other media. PortMedia is open-source and runs on Windows, Macintosh, and Linux. Currently, libraries support Audio I/O and MIDI I/O.
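In case none of the existing gems fit, here is a minimal sketch of binding PortMidi directly with the ffi gem. The shared-library name ('portmidi'), device index 0, and buffer sizes are assumptions for your system.

```ruby
require 'ffi'

module PortMidi
  extend FFI::Library
  ffi_lib 'portmidi'

  class PmEvent < FFI::Struct
    layout :message,   :int32,    # status | data1 << 8 | data2 << 16
           :timestamp, :int32
  end

  attach_function :Pm_Initialize,   [], :int
  attach_function :Pm_CountDevices, [], :int
  attach_function :Pm_OpenInput,    [:pointer, :int, :pointer, :int32, :pointer, :pointer], :int
  attach_function :Pm_Read,         [:pointer, :pointer, :int32], :int
end

PortMidi.Pm_Initialize()
stream_ptr = FFI::MemoryPointer.new(:pointer)
PortMidi.Pm_OpenInput(stream_ptr, 0, nil, 64, nil, nil)   # device 0: hopefully your keyboard
stream = stream_ptr.read_pointer

events = FFI::MemoryPointer.new(PortMidi::PmEvent, 64)
loop do
  n = PortMidi.Pm_Read(stream, events, 64)                # number of events read
  n.times do |i|
    msg = PortMidi::PmEvent.new(events + i * PortMidi::PmEvent.size)[:message]
    puts format('status=%02x data1=%d data2=%d', msg & 0xff, (msg >> 8) & 0xff, (msg >> 16) & 0xff)
  end
  sleep 0.01
end
```

Use Pm_CountDevices (and Pm_GetDeviceInfo in the C API) to find the right device index instead of hard-coding 0.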
Can anyone tell me about using (Steel Bank) Common Lisp for writing GUIs via system calls? I know there are some libraries out there but this is a language learning exercise, so I want to do it myself.
I'm developing on Kubuntu 8.10 and SBCL 1.0.18.
Thanks.
You can take a closer look at the sb-posix package. But as the page says, "The functionality contained in the package SB-UNIX is for SBCL internal use only; its contents are likely to change from version to version."
Another possibility -- the one I would choose -- is to bind the C syscall(2) function via CFFI and then call it directly.
To create a windowed GUI, you must use X11, at least to give you a window to paint on. That means you must know the X11 specification to create a window, and implement the syscalls for accessing Unix domain sockets, setting up shm devices, etc.
If you just want graphics on a console, you could think about using the framebuffer device /dev/fb*. You need ioctl(2) and the read(2)/write(2) syscalls to use it, but I still think this would be a lot easier than using X11 (even though it is still a lot of work). Maybe you should look at the source code of libFB or something similar to see how to initialize it, etc.
Is this really what you want to do? It is a lot of work, and you will learn a lot, but more about the Linux system infrastructure than about SBCL, I think. If you want to use syscalls under SBCL, maybe it's better to start by opening TCP sockets with nothing but Linux syscalls -- that alone can take hours of frustration.
By writing a GUI via system calls, do you mean not using any GUI toolkit such as Gtk+ or Qt? In that case, you would have to talk directly to the X server via a socket, implement the X11 protocol (or use CLX), and build the GUI on top of it. But that is not the easiest task, because X11 is complex.
In case you decide to dig into X11, here are some links to the specifications: http://www.xfree86.org/current/specindex.html
http://www.freedesktop.org/wiki/Specifications?action=show&redirect=Standards