First programming language to have an interactive shell? [closed]

Out of sheer curiosity and in pursuit of trivia, I couldn't quickly find an answer on Google.
Dear fellow programmers, what was the first programming language to provide an interactive shell?

I can't prove that other systems weren't earlier, but the LISP REPL (read-eval-print loop) is one common name given to this style of interpreter.
The LISP I Programmer's Manual from 1960 (PDF) includes a mention on page 2 that is apropos:
Enlargements of the basic system are available for various purposes. The compiler version of the LISP system can be used to compile S-expressions into machine code. Values of compiled functions are computed about 60 times faster than the S-expressions for the functions could be interpreted and evaluated. The LISP-compiler system uses about half of the 32,000 memory.
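For context, the read-eval-print loop itself is a simple pattern; here is a minimal sketch in Python (purely illustrative, and nothing like how the LISP I system was actually implemented):

    # Minimal read-eval-print loop: read a line, evaluate it, print the
    # result, repeat. Expressions only; errors are reported and skipped.
    def repl():
        env = {}
        while True:
            try:
                line = input("> ")          # read
            except EOFError:
                break
            try:
                print(eval(line, env))      # evaluate and print
            except Exception as err:
                print("error:", err)
            # ...then loop back for the next expression

    if __name__ == "__main__":
        repl()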

Related

Do any computer languages not use a stack? [closed]

Do any computer languages not use a stack data structure to keep track of execution progress?
Or is the use of this data structure an emergent requirement stemming from something inherent to most computer languages or Turing machines?
With a traditional "C-style" stack, certain language features are difficult or impossible to implement. For example, closures can't easily be implemented with a traditional stack: a closure needs a pointer to the activation record in which it was created, but in a C-style stack that memory is automatically reclaimed when the enclosing function returns. As another example, generators and coroutines need their own memory to store local variables and resume-point information, and therefore can't easily be implemented on a standard stack. The sketch below illustrates both cases.
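As a small Python illustration of my own (not tied to any particular runtime): the closure below outlives the call that created it, and the generator keeps its own suspended state between calls, so neither fits a strictly last-in-first-out activation-record discipline.

    # Closure: make_counter's frame must effectively survive its return,
    # because `count` is still referenced by the returned function.
    def make_counter():
        count = 0
        def bump():
            nonlocal count
            count += 1
            return count
        return bump

    counter = make_counter()      # make_counter has already returned...
    print(counter(), counter())   # ...yet this prints: 1 2

    # Generator: the local `i` lives between yields, not on the caller's stack.
    def squares(n):
        for i in range(n):
            yield i * i

    print(list(squares(4)))       # [0, 1, 4, 9]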
Hope this helps!

Advanced Rudimentary Computing? [closed]

Let's say that my definition of 'rudimentary programming' refers to the fundamental tools employed for a computer to perform a task.
Considering programming rudiments, the learning spectrum usually looks something like this:
Variables, data types and variable memory
Arrays/Lists and their manipulation
Looping and conditionals
Functions
Classes
Multi threading/processing
Streams (hard-disk and web)
My question is, have I missed any of the major rudiments? Is there a 'next' to the spectrum that still eludes me?
I think you missed the most important one: algorithms. Understanding their complexity, knowing when and why to use them, and, more importantly, how to implement them.
I'm pretty sure you already know a lot about algorithms, but if you think your tool knowledge (i.e., the programming languages) is good enough, you should start focusing more on the algorithms.
A great book to start with is Introduction to Algorithms by Thomas H. Cormen.
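As a toy illustration of what "understanding the complexity" buys you (my example, not one from the book): binary search does the same job as a linear scan, but in O(log n) comparisons instead of O(n), provided the input is sorted.

    # Binary search over a sorted list: O(log n) comparisons,
    # versus O(n) for a straight linear scan.
    def binary_search(items, target):
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid
            if items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1                 # not found

    print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
    print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -1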

Any benchmarks for parser generators? [closed]

Has anyone seen a good comparison of parser generators' performance?
I'm particularly interested in:
1) recursive ascent parser generators for LALR(1) grammars;
2) parser generators which produce C/C++ based parsers.
Are you interested in how fast the parser generators themselves run? That depends on the type of parsing engine the generator supports and on how carefully its author implemented it. See this answer for some numbers about LALR/GLR parser generators for real languages: https://stackoverflow.com/a/14151966/120163 IMHO this isn't very important; parser generators are mostly a lot faster than the person using them.
If the question is how fast the generated parsers are, you get different answers. LALR parsers can be implemented with a few machine instructions per GOTO transition (using directly indexed GOTO tables) and a few per reduction. That's pretty hard to beat; a table-driven sketch of that inner loop is below.
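To make the "few instructions per transition" point concrete, here is a toy table-driven LR parse loop in Python for the grammar S -> ( S ) | x, with hand-built tables (my own illustration, not the output of any particular generator). The inner loop is nothing but indexed lookups, a push, and a pop:

    # ACTION maps (state, lookahead) to shift, reduce, or accept;
    # GOTO maps (state, nonterminal) to the next state after a reduction.
    ACTION = {
        (0, '('): ('s', 2), (0, 'x'): ('s', 3),
        (1, '$'): ('acc',),
        (2, '('): ('s', 2), (2, 'x'): ('s', 3),
        (3, ')'): ('r', 1), (3, '$'): ('r', 1),
        (4, ')'): ('s', 5),
        (5, ')'): ('r', 0), (5, '$'): ('r', 0),
    }
    GOTO = {(0, 'S'): 1, (2, 'S'): 4}
    RULES = [('S', 3),    # rule 0: S -> ( S )
             ('S', 1)]    # rule 1: S -> x

    def parse(tokens):
        stack = [0]                               # stack of states
        pos = 0
        while True:
            act = ACTION.get((stack[-1], tokens[pos]))
            if act is None:
                return False                      # syntax error
            if act[0] == 'acc':
                return True
            if act[0] == 's':                     # shift: push state, advance
                stack.append(act[1])
                pos += 1
            else:                                 # reduce: pop RHS, take GOTO
                lhs, length = RULES[act[1]]
                del stack[-length:]
                stack.append(GOTO[(stack[-1], lhs)])

    print(parse(list('((x))') + ['$']))   # True
    print(parse(list('(x') + ['$']))      # False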

What language features does Ruby borrow from CLU? [closed]

I have read in Ola Bini's blog that Ruby was influenced by the CLU language, but besides multiple assignment I'm not sure what other influences exist. Any guidance and simple examples would be most appreciated.
The only time matz ever mentioned CLU was when talking about iterators. Everything else is more or less directly from Smalltalk, Lisp, Flavors and Perl. Singleton classes seem to be unique to Ruby, though they are related to Smalltalk's metaclasses.
Depends on how abstract you want to get: CLU had iterators, exceptions, memory management, and was sort-of OO.
It was actually kind of interesting.
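CLU's iterators are essentially what we now call generators; a quick Python sketch of the control-flow shape (CLU's own iter/yield syntax and Ruby's block syntax differ, of course):

    # A CLU-style iterator, sketched as a Python generator: it suspends at
    # each yield, hands a value to the loop body, and resumes afterwards.
    def up_to(n):
        i = 1
        while i <= n:
            yield i
            i += 1

    for x in up_to(3):
        print(x)     # 1, 2, 3 -- the loop body plays the role of Ruby's block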

What's the meaning of GW in GW-BASIC? [closed]

GW-BASIC was my first programming language, and I never found out what the 'GW' stands for.
From the Wikipedia article:
There are several theories on what the initials "GW" stand for. Greg Whitten, an early Microsoft employee who developed the standards in the company's BASIC compiler line, says Bill Gates picked the name GW-BASIC. Whitten refers to it as Gee-Whiz BASIC and is unsure if Gates named the program after him. The Microsoft User Manual from Microsoft Press also refers to it by this name.

It may have also been nicknamed Gee-Whiz because it had a large number of graphics commands.

Other common theories as to the initials' origins include "Graphics and Windows", "Gates, William" (Microsoft's president at the time), or "Gates-Whitten" (the two main designers of the program).
