Related
I was reading about how people were having trouble finding programmers to work with COBOL on government systems that still use it. I was also reading about how Fortran, a language created two years before COBOL, is interoperable with C, C++, R, and Python given the right libraries.
This lets Fortran code work with modern programming languages to some degree, and even lets you write code in modern languages that works alongside Fortran, making it easier for Fortran novices to work with it. Are there any particular issues that prevent COBOL from having similar interoperability with other languages such as SQL (which, like COBOL, is used for working with databases), something that would make it easier for modern programmers who might not normally learn COBOL to work with it?
Q1: Does anything prevent interoperability between modern languages and COBOL?
A1: Short answer, similar to those above: no, it is actually often done.
But that may depend on what the reader takes a "modern language" to be.
Even with "real" COBOL (not some "shiny" [may be read as "blinding"] "managed COBOL") you are in most cases free to call any C function directly, so you can call more or less anything (at least with a C wrapper), and you can also invoke binaries just as you would on the operating system (`CALL 'SYSTEM' USING 'some-executable-or-script "param1" "param2"'` is a common extension).
For calling into any "native code" directly (like Win32 or POSIX) you obviously have to ensure you are using the correct parameter definitions, but COBOL 2002+ has features like USAGE SIGNED-LONG, USAGE POINTER and similar (the extension USAGE COMP-5 is also common in this area).
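For illustration, here is a minimal GnuCOBOL-style sketch of both ideas; the command string and the choice of `getpid` as the C function are only assumptions for the example, and the USAGE clauses have to match the actual C types on your platform:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. CALL-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *> binary item sized to match the C int/pid_t return value
       01  WS-PID     USAGE BINARY-LONG.
       PROCEDURE DIVISION.
      *> run an external command via the common 'SYSTEM' extension
           CALL 'SYSTEM' USING 'ls -l'
      *> call a C library function directly (POSIX getpid here);
      *> the USAGE of WS-PID must match the C return type
           CALL 'getpid' RETURNING WS-PID
           DISPLAY "pid: " WS-PID
           GOBACK.
```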
Additionally, there are often direct ways to interoperate with socket servers, HTTP(S), XML, JSON, and so on; many COBOL implementations also allow you to ASSIGN a (line-)sequential file to a pipe, letting you interact with other programs in this way, too.
Q2: Are there any particular issues that prevent COBOL from having [...] interoperability with [...] SQL?
A2: No; in fact, SQL is very commonly used directly in COBOL: EXEC SQL.
Many people will say that SQL is not a "programming language". It is a query language and may be used in different environments, including COBOL.
Depending on the environment used, EXEC SQL may be directly integrated into the COBOL environment, or handled by a pre-parser that rewrites the code into plain COBOL (normally CALLing some "native" code; see Q1).
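As a sketch of what that typically looks like in source form (the table, columns and host-variable definitions here are made up, and the exact host-variable types accepted depend on the precompiler in use):

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. SQL-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *> host variables shared between COBOL and SQL
       01  CUST-ID    PIC S9(9) COMP-5.
       01  CUST-NAME  PIC X(40).
      *> SQLCA provides SQLCODE and other status fields
           EXEC SQL INCLUDE SQLCA END-EXEC.
       PROCEDURE DIVISION.
           MOVE 12345 TO CUST-ID
      *> the precompiler (or the integrated runtime) turns this
      *> into plain COBOL, normally CALLs into the DB client library
           EXEC SQL
               SELECT NAME INTO :CUST-NAME
                 FROM CUSTOMER
                WHERE ID = :CUST-ID
           END-EXEC
           IF SQLCODE = 0
               DISPLAY "customer: " CUST-NAME
           END-IF
           GOBACK.
```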
Q3: [... stuff] that would make it easier for modern programmers who might not normally learn COBOL to work with it?
... this is a completely different question, whatever a "modern programmer" is.
For a programmer to get to know a programming language, it all depends on the programmer, the resources (like time, manuals, tutorials, mentors), and the will of the programmer. Many people simply don't want to learn COBOL (for reasons I've heard but don't understand or don't agree with); others lack some of the resources (a free compiler is available with GnuCOBOL, nearly all COBOL compilers have their manuals available online, and the ISO working group for COBOL publishes its draft standards online, too; you can often find mentors in COBOL discussion forums or mailing lists, along with many samples).
One thing that often is special with COBOL is not the language itself, but the environment it is used in (a mainframe with the job control language JCL instead of a GUI to click or a shell to use) and/or the software that is actually coded in COBOL. Every piece of software that is maintained over decades develops "special ways" here and there, and if you get into decades-old code that wasn't actually maintained for years, you are in for even more trouble/fun (this is not COBOL-specific, but with COBOL you may encounter such software more often).
No, there is nothing preventing interoperability.
The main reason (this is an opinion, not based on known facts) that Fortran seems to have more interop out of the box is that there was a free-software GNU Fortran for interested parties to work with. COBOL was very late in the game getting a viable free-software compiler. That is no longer an issue with GnuCOBOL, and people are finally starting to write the code needed to catch up.
Adding to Simon's answer: a proof of concept for direct embedding is in a branch of GnuCOBOL; intrinsic functions have been added to support FUNCTION TCL, FUNCTION PYTHON, FUNCTION REXX, FUNCTION LUA and FUNCTION JVM, so far. With FUNCTION JVM, tests for Scala, Groovy, Java and Frink all worked. This allows data transfer between COBOL working storage and the other language engine using simple COBOL syntax, including setups for callbacks to and from. Those functions are embedded into the compiler and the libcob run-time when using that branch.
For other interface trials, not built into the compiler but still allowing interop, the GnuCOBOL FAQ has dozens of examples. Shakespeare? Yep. Falcon? Yep. C? Well, GnuCOBOL emits intermediate C, so that's covered in spades. There is also a C++ edition of the compiler, so C++ is covered as well. JavaScript? Jsish, Duktape, SpiderMonkey and QuickJS, to name a few of the trials.
Ada, D, Vala, Genie, S-Lang, ROOT/CINT, J, Gambas, Forth, Perl, Postscript, Pure, Icon and Unicon, Nim, BaCon, SWIG (which opens up many multiples), PARI/GP, Gretl, R, Red, Ruby, Haxe/Neko, Pascal, Erlang, Elixir, SQLite, Rust, Go, more..., including a fair number of esolangs, and GNU Lightning for on the fly assembly modules. Trials documented in the GnuCOBOL FAQ.
Framework interfacing for AWT/Swing, GTK, Agar, and things like ZeroMQ, CGI and websockets also proved successful and are in productive use. Along with at least 7 EXEC SQL preprocessors successfully tested, and in use.
It comes down to someone caring to try, and writing some glue or properly aligning call frames. No attempts I've tried have failed to produce satisfactory results, although Perl 5 was a hair pull of unraveling macro layers. (OK, I just lied: while attempting to embed jq, which relies on C call-by-struct and return-by-struct features, I would have had to leave pure COBOL interface coding, and didn't bother with the C middleware that would have made it easy.) ;-) Will do that someday though, as jq is quite the powerful little JSON handler.
Use the search engine you mistrust the least and look for "gnu-cobol-builtin-script" and "GnuCOBOL FAQ", and visit the hits on SourceForge.
In my particular explorations I usually focus on languages with a C Application Binary Interface, but other ABIs would be along a similar vein. It only takes sitting down and writing some middleware or figuring out how to properly synch the call frames.
Are these current samples perfect? Not always, there are edge and corner cases with some datatypes and COBOL PICTURE data that would require more work, but that is all; a little bit of work and testing to smooth over the bumps. When exploring, I don't always go that far until an actual need arises. These seed work experiments are just to get some proof in the pudding, all done for the simple joy of it.
One of the lead developers of GnuCOBOL just added uni- and bi-directional piping using simple filenames, which provides access to whatever the base OS offers, using basic COBOL OPEN/READ/WRITE/CLOSE (and other file I/O) statements. The code was committed to trunk just a few hours before I started typing this response.
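As a rough sketch of the idea, reading a command's output through ordinary COBOL file I/O; note that the filename convention used to request a pipe (a leading "|" below) is an assumption here and differs between implementations and versions:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PIPE-DEMO.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
      *> assumed convention: a leading "|" asks for a pipe
           SELECT CMD-OUT ASSIGN TO "|ls -l"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  CMD-OUT.
       01  CMD-LINE   PIC X(256).
       WORKING-STORAGE SECTION.
       01  WS-EOF     PIC X VALUE "N".
       PROCEDURE DIVISION.
      *> standard OPEN/READ/CLOSE against the piped "file"
           OPEN INPUT CMD-OUT
           PERFORM UNTIL WS-EOF = "Y"
               READ CMD-OUT
                   AT END MOVE "Y" TO WS-EOF
                   NOT AT END DISPLAY CMD-LINE
               END-READ
           END-PERFORM
           CLOSE CMD-OUT
           GOBACK.
```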
Basically, the answer to the titular question is a resounding No.
The scenario involved in the governmental systems is most likely IBM mainframe hardware with a flavor of z/OS, z/VSE, or z/VM operating system.
It somewhat depends on what is meant by interoperability in the sense that most any modern mainframe supports TCP/IP and that pretty much opens up the whole networked computing ecosystem to networked interoperability.
My guess is when all is said and done, the reason there is a problem is that the state refuses to pay a market rate for experienced mainframe developers and has kicked the maintenance can down the road as cost-saving measures.
It most likely is not a matter of there being no mainframe COBOL professionals able to make the systems work; it's most likely the state won't pay the price.
But this is speculation on my part since all I know is that the governor blames inanimate objects for appropriations and management failures within the state IT administration.
As a 40-year mainframe veteran, I'm dying to know details as to how this perfectly good technology is at fault for problems dealing with (again, I assume) unprecedented volumes of processing demand.
We found an interoperability problem between C and GnuCOBOL.
Our problem was addressed, so this answer is just for educational purposes, to show what kind of problems you may run into.
The problem manifests when C calls COBOL(a, b) calls C(c) calls COBOL(a, b).
And specifically when the number of arguments varies.
A recent change to GnuCOBOL assumed that COBOL called COBOL, so it passed metadata about the arguments in a global area. The called COBOL program then cleared out the second argument because it falsely thought it was being called with one argument. That is, the intermediate C call was transparent to COBOL.
This was done under the guise of making it more compatible with IBM mainframe COBOL, but it caused me a lot of grief. It was quickly addressed with runtime changes. I would like to see it addressed with a compile-time option:
Make the .so file a stand-alone .so that can be called from any language, but the programmer has to be vigilant.
Make the .so file assume it will be called from COBOL, with the additional protections afforded by mainframe COBOL.
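To make the failure mode concrete, here is a sketch of what the inner COBOL callee looks like; the program and data names are invented. NUMBER-OF-CALL-PARAMETERS is a common extension exposing the per-call argument count the runtime tracks, and in the scenario described above that count still reflected the earlier COBOL-to-C call (one argument) rather than the actual C-to-COBOL call (two arguments), which is why the second parameter was treated as absent:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. INNER-COB.
       DATA DIVISION.
       LINKAGE SECTION.
       01  ARG-A      PIC X(10).
       01  ARG-B      PIC X(10).
       PROCEDURE DIVISION USING ARG-A ARG-B.
      *> guard the second argument using the runtime's argument
      *> count; a stale count (see above) makes ARG-B appear to be
      *> missing even though the intermediate C caller passed it
           IF NUMBER-OF-CALL-PARAMETERS >= 2
               DISPLAY "B: " ARG-B
           END-IF
           GOBACK.
```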
BTW: GnuCOBOL is great and has a great community behind it. If you experience problems, report them and you will get a better response than you would from commercial products.
What are the code tutorial shells called that are used on various websites such as codecademy, codeschool, mongo db, try ruby, etc.?
Is there a service for building and maintaining these things? Or is it something you build yourself?
I found ShellInABox, but it doesn't seem to be the only thing people are using.
These are colloquially known as "web REPLs" ("REPL" meaning "Read/Eval/Print Loop").
They can be distinguished from the unrestricted ones used for debugging by describing them as "sandboxed". There is no single, universal toolkit used to build these, though you may find repl.it (which uses a number of interpreters compiled to JavaScript to run client-side REPLs for numerous languages) interesting.
During a web search, I found the following comment: "Traditional Lisp debugging practices can still be used."
What are the traditional debugging practices?
Normally, what tools are used for debugging lisp (with/without emacs)?
I don't know what Bill meant specifically, but IME:
Typically your editor will have a running instance connected to it. You can compile functions immediately to insert them into the running image -- since Lisp has its own compiler, you're just telling the running image to read and compile a small section of text. Or you can run functions directly, to see what they do.
When an exception is thrown (or a condition is signaled, if you're lucky enough to be in a dialect with conditions), the debugger will show you the stack trace and let you decide how to continue.
The major difference between Lisp and other high-level compiled languages is that in Lisp you're basically always writing code with the debugger attached.
As clojure was tagged in the question, I'll give our perspective.
Class files generated by the clojure compiler include line- and method-based debugging info, so any java debugger will interoperate directly with clojure code, including breakpoints and object inspection.
If you use emacs/slime as your development environment, integration with slime's debugger has recently been included. As documentation is a little sparse, it's probably best to check out the scope of the support on github directly.
Run edebug-defun in emacs and you will see that lisp is magic.
Things that I would say approach a "traditional set of Lisp debugging techniques" are:
Debug printouts
Function tracing (each invocation of a traced function is printed with an indentation that corresponds to the call depth; on return, the return value is printed).
Explicit invocation of the in-image debugger
Ending up in the in-image debugger due to a bug (trying to add an integer and a symbol, for example)
Basically just things like adding code to print out values as it runs so you can see what's happening.
Will there be any incompatibilities with the code in SICP if I use IronScheme?
There shouldn't be any significant incompatibilities.
If you find any, why not post them here (as problems to solve), to document them?
You may want to use DrScheme (http://www.plt-scheme.org/) in 'Pretty Big' language mode and allow redefinition of initial bindings for working through code examples and exercises in SICP. I have been using it for trying out examples and exercises in SICP and have not run into any problems with the interpreter. It's also much more developed compared to IronScheme.
I am looking for a version of Scheme or even LISP that I can use to recover some lost Lisp development skills. Some web capabilities would be nice but not essential.
I've looked at PLT and MIT Scheme and, while both look pretty good, PLT seems to be more feature-rich. I've also looked at Lisp implementations, but all of them seem quite expensive.
I favor free/inexpensive implementations as this is truly likely to just be occasional hobby programming. What recommendations would you have?
I'd go with Racket. It may not be as fast as SBCL, but it does have excellent libraries and documentation, as well as an integrated environment that's designed to get you developing and running Scheme programs right out of the gate. What I really like about Racket's IDE, DrRacket, is what you don't have to do—you don't have to learn Emacs, you don't have to learn SLIME, you don't have to worry about hunting down third-party libraries, as virtually all libraries meant for Racket can be found in packages. All in all, it really cuts down on the learning curve and allows you to focus on the actual task at hand: writing great code.
Also, it comes with a web server if you want to make Racket-powered websites (which I'm currently looking into).
I did quite a bit of experimenting with this.
Clozure Common Lisp (née OpenMCL) is by far the fastest; 25-30 percent faster than the next competitor on my Intel Mac Mini.
MIT Scheme works quite nicely on a Mac. I think I eventually compiled it myself, but there are binaries at that site. PLT Scheme is also nice, and possibly a little better integrated into the Mac world. (PLT Scheme is now known as Racket, but I haven't experimented with it after the change.)
I'm a huge fan of Clojure, SBCL, and Clozure CL. They are all fantastic, but they are also overkill if all you want to do is refresh your Lisping chops. They all require absurd amounts of info hunting, mailing list searching, package installing, irc lurking, etc.
Dr Scheme just installs and runs. I finished the first 3 chapters of SICP four and a half years ago using Dr Scheme. Nothing was more profound than defining a Scheme evaluator in itself. Once you get your head around that, you'll have a lot more patience for its industrial-strength brethren.
For Scheme, DrRacket is awesome (included in Racket).
For Common Lisp, Ready Lisp is great. A single dmg with SBCL, Aquamacs and Slime working out of the box.
From the Web site:
Ready Lisp is a binding together of several popular Common Lisp packages especially for Mac OS X, including: Aquamacs, SBCL and SLIME. Once downloaded, you’ll have a single application bundle which you can double-click — and find yourself in a fully configured Common Lisp REPL.
It’s ideal for OS X users who want to try out the beauty of Common Lisp with a minimum of hassle. It could also be used by teachers to give their Mac students a free, complete Common Lisp environment to take home with them.
Requirements
The current version of Ready Lisp is 20090127 and requires Mac OS X 10.5 (Leopard).
It includes the following component software versions:
Aquamacs 1.6
SBCL 1.0.24
SLIME 2009-01-23
CL-FAD 0.6.2
CL-PPCRE 2.0.1
LOCAL-TIME 0.9.3
SERIES 2.2.10
CL HyperSpec 7.0
paredit.el 20
redshank.el 1
cldoc.el 1.16
I've just started playing with Clojure. It apparently has a nice web framework, and compiles to JVM bytecode.
I also use DrScheme quite a lot. It's a simple yet useful IDE.
Depending how you define "Lisp", Clojure may fit the bill. It runs on OS X fine (it runs anywhere the JVM runs). It has web capabilities and it's free.
It also has the benefit of being new and fresh and fun to use. Might be ideal for hobby programming. It's easy to write web apps or GUI apps (using Java's Swing or even Qt).
I haven't used it myself, but Steel Bank Common Lisp has received some favourable buzz over at reddit. It's open source and free so the price is right for some hobby programming.
In the past, I've had GNU Common Lisp running on my macbook pro.
If you are looking for Scheme, you can take a look at the just-released JazzScheme.
I do recommend Racket to newcomers, since it provides one of the nicest IDEs for Scheme beginners (or rather, programming beginners who happen to be using Scheme, or better still, working their way through HtDP).
http://racket-lang.org/
Another option, for people who are more interested in a small Scheme system in order to modify it themselves or read its source code, is Larceny Scheme, which is of interest largely because its JIT compiler, Twobit, is itself implemented entirely in Scheme.
http://www.larcenists.org/
Update: In addition, Chez Scheme has recently been open sourced:
https://github.com/cisco/ChezScheme
(It may not be as "small" as Larceny, but it has a very aggressive optimizing compiler.)
If you're just hobby programming, LispWorks has a free personal version which is quite powerful and sophisticated. Its biggest issue is a runtime limit of several hours. So you won't be writing any long-running servers in it, but that doesn't mean it's not a useful tool.
CLISP runs on most everything, and is quite nice actually, it just doesn't do threads. (Important if you want to write an actual server, but as PHP and Perl have shown us, Apache + [insert language] is a very viable platform.)
You might want to look at what's at the Association of Lisp Users or the Common Lisp Wiki to see what's there. I set myself up with Steel Bank Common Lisp and Emacs, but have done little with it so far.
Clozure CL is available for free from the Mac App Store!
http://itunes.apple.com/us/app/clozure-cl/id489900618
I have found that Chicken works well for Scheme and is available through homebrew.
brew install chicken
Most of the code from SICP works with minor modifications.
I've been asking myself the same question lately. Having used DrScheme on OS X it would be my first choice of Scheme distribution for any platform. Very nice IDE, debugging features and a good set of libraries/frameworks (including a very nice GUI toolkit that 'just works... even on Mac' ;-) )
However, I'm now looking for a similarly comfortable environment for Common Lisp. It came down to CCL (OpenMCL) versus SBCL. SBCL seems to be the popular choice, but I read that on OS X it doesn't support threading. (Is this really an issue?) Clozure CL, on the other hand, boasts good support for native threads, the objc-bridge, etc.
I'm finding CCL a little odd but I'm going to stick at it for a while - It still looks like the logical choice for integration.
I use Emacs 23 (built from source using --with-ns) and Slime as an environment and this works well for me. :-)
Go with Racket. I'm very happy with it!