How can I call a built-in compiler function in Ruby?

I'm working in Ruby, and realized that it would be incredibly beneficial to be able to use some of the built-in gcc functions (and x86 architecture built-ins, for that matter). It seems like having to write an extension to use these is impractical, so I was wondering if there was a way I could call the built-ins directly. For example, if I wanted to call int __builtin_popcount(unsigned int) on a number in Ruby, is there a way I could somehow do
a = rand(1..10000)
__builtin_popcount(a)
I know that I obviously can't do something that basic, but is there a way that I could include gcc and x86 architecture built-ins in Ruby?

It is not quite clear what you want to do.
If you want to call into GCC, you could wrap libgcc in a C extension and design a Ruby API for it.
If you want to generate native code using GCC dynamically, that is currently not possible AFAIK. There is a project for a JIT compiler library based on GCC, but I don't know what its status is. You could wrap that library into a C extension and design a Ruby API for it. At any rate, you will also have to modify the Ruby implementation you are using to be able to link dynamically generated native code with your Ruby code. (And on some implementations that is simply impossible, e.g. on Opal, which is a pure static compiler.)
And of course, not all Ruby implementations actually support C extensions; they are a non-standard feature of YARV and are not guaranteed to work or even exist on other implementations.
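For the concrete __builtin_popcount case, the C-extension route is actually quite small. Here is a minimal sketch (the module name, file name, and build setup are my own choices, nothing standard) that exposes the built-in to Ruby through the C API:
/* popcount.c -- build with an extconf.rb that calls create_makefile("popcount") */
#include <ruby.h>

/* Ruby-callable wrapper around the GCC built-in */
static VALUE rb_popcount(VALUE self, VALUE num)
{
    unsigned int n = NUM2UINT(num);         /* Ruby Integer -> C unsigned int */
    return INT2NUM(__builtin_popcount(n));  /* C int -> Ruby Integer */
}

/* Called by Ruby when the extension is required */
void Init_popcount(void)
{
    VALUE mod = rb_define_module("Popcount");
    rb_define_module_function(mod, "popcount", rb_popcount, 1);
}
After building it (ruby extconf.rb && make), you can require "popcount" and call Popcount.popcount(a) from Ruby.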

Related

Are there any good extensible language cross compilers?

I am working on a project right now, and I would greatly enjoy being able to extend a cross compiler to convert some code into other languages. For example, I might have an AST of some code, and I would like to pass that off to a cross compiler with the intended language and receive some code in the language specified in return.
So to sum it up: is there any extensible cross compiler that I can just give an AST or equivalent and receive code in return?
(I know about Haxe, but the compiler is not very extensible and I would prefer to not transpile)
I have made the decision to use LLVM as the native compiler, and will write my own custom transpilers to other languages, as I could find no other decent option. If you would like to follow my project, head over to Provalang.
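As a rough sketch of what driving LLVM programmatically looks like (the toy add function and all names below are purely illustrative), building a module and printing its IR through the LLVM C API takes only a handful of calls:
/* demo.c -- compile with: cc demo.c `llvm-config --cflags --ldflags --libs core` */
#include <stdio.h>
#include <llvm-c/Core.h>

int main(void)
{
    LLVMModuleRef module = LLVMModuleCreateWithName("demo");
    LLVMTypeRef   i32    = LLVMInt32Type();

    /* declare: i32 add(i32, i32) */
    LLVMTypeRef  params[] = { i32, i32 };
    LLVMTypeRef  fn_type  = LLVMFunctionType(i32, params, 2, 0);
    LLVMValueRef fn       = LLVMAddFunction(module, "add", fn_type);

    /* emit: return a + b */
    LLVMBasicBlockRef entry   = LLVMAppendBasicBlock(fn, "entry");
    LLVMBuilderRef    builder = LLVMCreateBuilder();
    LLVMPositionBuilderAtEnd(builder, entry);
    LLVMValueRef sum = LLVMBuildAdd(builder, LLVMGetParam(fn, 0),
                                             LLVMGetParam(fn, 1), "sum");
    LLVMBuildRet(builder, sum);

    /* print the textual IR for the whole module */
    char *ir = LLVMPrintModuleToString(module);
    printf("%s", ir);
    LLVMDisposeMessage(ir);
    LLVMDisposeBuilder(builder);
    LLVMDisposeModule(module);
    return 0;
}
A transpiler to another language would walk the same kind of in-memory structure (or your own AST) and emit that language's source text instead of IR.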

How do I use a cgo-based package on Windows?

The regexp package in Go's standard library is quite poor, so I need a more powerful engine, like regex in Python (pip install regex), supporting recursion, backreferences, look-ahead/behind, etc.
I found:
https://godoc.org/github.com/dlclark/regexp2
.NET compatible, which was quite fine; however, recursion is not working properly.
and several bindings to PCRE, for example:
https://godoc.org/github.com/glenn-brown/golang-pkg-pcre/src/pkg/pcre
So, how can I use this binding on Win64?
You may consider using the C++ standard library's std::regex (no third-party library needed). Wrap the logic in a try block, use catch (...) { return ERROR; } to catch any error, and declare the C wrapper function extern "C" so you can call it with cgo.
From https://github.com/golang/go/wiki/cgo (there is a part about Windows):
In order to use cgo on Windows, you'll also need to first install a gcc compiler (for instance, mingw-w64) and have gcc.exe (etc.) in your PATH environment variable before compiling with cgo will work.
That being said, I still think you should consider sticking with the regexp package and try to keep your regular expressions as simple as possible, because complicated regular expressions are likely to hurt the readability of your code. Another problem is that they sometimes introduce subtle bugs which are difficult to spot and fix. So writing more code in Go instead of regex may actually make life easier.

How to use AspectC++ with C++11?

I want to use the AspectC++ compiler for a C++11 project. I have read in the manual that C++11 support will come with version 2. I thought that aspect weaving happens only on the code level, so why does it depend on the C++ version used? Why does AspectC++ care about the source code when it just has to weave the aspects to generate a composed piece of code? Is there a way to use AspectC++ for C++11 source code? Or is there an alternative which can handle it?
This post is already a bit older, I know.
Nevertheless, I'd like to answer the question of why AspectC++ depends on the C++ version:
AspectC++ internally parses the code (amongst other things, to identify the locations where to weave the code). Not all of this can be delegated to external parsers, so it basically needs to understand the syntax itself.
Some new constructs from C++11, like attributes ([[...]]), could not be handled by AspectC++ compiler versions < 2.0.
To compile as C++11, just pass -std=c++11.

GCC technical details

I don't know if this is the right place for things like this, but I am curious about a few aspects of the GCC front-end/back-end architecture:
I know I can compile .o files from C code and link them to C++ code, and I think I can do it the other way round, too. Does this work because the two languages are similar, or because the GCC back-end is really language-independent? Would this work with Ada code too? (I don't even know if that makes sense, since I don't know Ada or whether it even has "functions", but hopefully the question is clear. If it makes no sense, think "Pascal" or even "my own custom language front-end".)
Where would garbage collection be implemented? Take a Java front-end, for example. The way I understand it, if compiling to a JVM back-end, the "platform" will take care of the GC, so the front-end need not do anything about it; but if compiling to native code, would the front-end emit garbage-collecting GENERIC code for the back-end, or does it turn on some flag telling the back-end to produce garbage-collecting code? The first makes more sense to me, but that would mean the front-end produces different output based on the target, which seems to miss the point of GCC's front-end/back-end architecture.
Where would language-specific libraries go? For instance, the standard Java classes or standard C headers. If they are linked in at the end, then could a C program theoretically call functions from the Java library or something like that, since it is just another linked library?
Yes, the backend is at least reasonably language independent. Yes, it works with Ada.
GCJ generates native code which uses a runtime library. The garbage collector is part of the runtime library.
GCJ implements the CNI, which allows you to write code in C++ that can be used as native methods by Java code -- but being able to do this is a consequence of them having designed it in, not just an accidental byproduct of using the same back-end.
It is possible because the calling conventions are compatible, but the name mangling is different (there is no mangling in C). To call a C function from C++ you should declare it with extern "C". To call a C++ function from C you should declare it under its mangled name (and possibly with additional or different argument types). Calling Fortran code is possible in some cases too, but the argument-passing convention is different (pass-by-reference in Fortran).
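As a tiny illustration (file and function names are my own), the usual pattern is a shared header with extern "C" guards, so the same declaration works from both languages:
/* mylib.h -- header usable from both C and C++ */
#ifdef __cplusplus
extern "C" {              /* tell the C++ compiler not to mangle these names */
#endif

int add(int a, int b);    /* implemented in plain C (add.c) */

#ifdef __cplusplus
}
#endif
A C++ translation unit that includes this header can call add() directly, because the symbol is emitted and looked up unmangled.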
There were actually converters from C++ to C (cfront) and from Fortran to C (f2c), and some solutions from them are still in use.
Garbage collection is implemented in the runtime library, e.g. the Boehm GC. The back-end should generate objects compatible with the selected GC library.
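To make that concrete, here is a minimal sketch of what code linked against the Boehm collector does, using libgc's public C API (allocate through the collector, never call free):
/* example.c -- compile with: cc example.c -lgc   (requires the Boehm GC) */
#include <stdio.h>
#include <gc.h>

int main(void)
{
    GC_INIT();                                   /* initialise the collector */
    for (int i = 0; i < 1000000; i++) {
        int *p = GC_MALLOC(100 * sizeof(int));   /* collected automatically, never free()d */
        p[0] = i;
    }
    printf("heap size: %zu bytes\n", GC_get_heap_size());
    return 0;
}
A compiler back-end targeting such a library simply emits calls to the library's allocation entry points instead of malloc.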
The compiler driver (g++, gfortran, ...) will add the language-specific libraries at the linking step.

How is writing a C interface easier in Ruby than Perl?

According to the official Ruby About page, it's easier to extend Ruby with C than Perl. I'm not a (Perl) XS guy, but I find it dirt simple to write something quick and simple with Inline::C, so why is it easier in Ruby?
Writing C extensions in Ruby is easier than in Perl or Python, with a very elegant API for calling Ruby from C. This includes calls for embedding Ruby in software, for use as a scripting language. A SWIG interface is also available.
Any further explanation from those that do more C extensions would be useful.
(Full disclosure, I am a Perl programmer)
The Ruby C API certainly looks much nicer than Perl's. It looks like a regular C library with functions that correspond to Ruby code. Perl's API is a mess of macros within macros within macros and magic threading flags. Using the Perl API outside of the Perl core is clearly a secondary concern. Ruby definitely wins on not being bowel-clenchingly terrifying.
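For instance (a sketch of the correspondence, not a quote from either project's docs), the Ruby snippet ary = []; ary.push(1); ary.push(2) maps almost one-to-one onto C API calls:
#include <ruby.h>

/* Roughly the C API spelling of: ary = []; ary.push(1); ary.push(2) */
static VALUE build_array(void)
{
    VALUE ary = rb_ary_new();         /* ary = [] */
    rb_ary_push(ary, INT2NUM(1));     /* ary.push(1) */
    rb_ary_push(ary, INT2NUM(2));     /* ary.push(2) */
    return ary;
}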
While Ruby has a better C API, Perl has the better tutorials on how to do anything with it. The generated Ruby documentation lacks any sort of cohesive tutorial, or often any descriptive text at all. It's possible I'm looking in the wrong place, but that's all that was offered. In contrast, the Perl API documentation is hand-written prose with useful information about what each function does. In addition, there are over a dozen documents in the core docs about using Perl and C. I'd say Perl wins on docs.
FFI looks quite impressive. The closest thing Perl has to FFI is Inline::C, which is a wrapper around the mess of XS. Its primary use is to inline C code into your Perl program, but you can also use it to access C library functions.
Here's a trivial example similar to nash's getpid example.
use Inline
    C => Config =>
    ENABLE => "AUTOWRAP";
# Declare the C prototype; AUTOWRAP generates the binding for us.
use Inline C => q{ int getpid(); };
print getpid();
Now, I am cheating because technically getpid returns pid_t on my system, but that's just an integer. FFI seems to have an awful lot of special-cased code for getpid, so I suspect its ease of use will correspond directly to whether FFI has already taken care of it. Trivial examples are trivial. It would be interesting to see what happens when typical complications arise, such as functions that return pre-allocated memory and have odd types and throw around structs.
While FFI and Inline::C can be used to do the same thing, how they do it looks very, very different. Inline::C is actually compiling and caching C code. FFI is somehow not doing any compiling. I'm not sure if that's really for real, or if the compilation is done for you at install time for common libraries.
In addition, FFI smooths the portability problems across the various Ruby implementations and their different ways of calling native APIs. This is something Inline::C doesn't have to do, and quite frankly it's amazing if it really works. One benefit is the FFI interface is much smoother than Inline::C. With Inline::C, it's very clear that you're writing a wrapper around a C compiler.
With FFI it's very easy to extend Ruby with C. This is an example from GitHub:
require 'rubygems'
require 'ffi'

module Foo
  extend FFI::Library
  ffi_lib FFI::Library::LIBC
  # Bind the C library's getpid(): no arguments, returns an int.
  attach_function :getpid, [], :int
end

puts "My pid=#{Foo.getpid}"
You don’t need a compiler installed on your system to be able to run FFI extensions. On Linux, you also do not need to install the development versions of libraries, just the runtime versions. Of course, the libraries you link against will need to have been compiled at some point, but odds are you won’t have had to do it.
https://github.com/ffi/ffi/wiki/why-use-ffi

Resources