Any benchmarks for parser generators? [closed] - performance

Has anyone seen a good comparison of parser generators' performance?
I'm particularly interested in:
1) recursive ascent parser generators for LALR(1) grammars;
2) parser generators that produce C/C++-based parsers.

Are you interested in how fast the parser generators themselves run? That depends on the type of parsing technology the engine supports and on the care taken by whoever implemented the parser generator. See this answer for some numbers on LALR/GLR parser generators for real languages: https://stackoverflow.com/a/14151966/120163 IMHO, this isn't very important; parser generators mostly run far faster than the person using them.
If the question is how fast the generated parsers are, you get a different answer. LALR parsers can be implemented with a few machine instructions per GOTO transition (using directly indexed GOTO tables) and a few per reduction. That's pretty hard to beat.
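To make the table-driven claim concrete, here is a minimal sketch of such a parse loop in C for the toy grammar E -> E '+' n | n. The table layout, encodings, and names (action_tab, goto_tab, rule_len, rule_lhs) are my own illustration, not the output of any particular generator; real generators emit much larger but structurally similar tables.

    /* Sketch of a table-driven LALR(1) parse loop for  E -> E '+' n | n .
     * Tables and encodings are hypothetical, for illustration only. */
    #include <stdio.h>

    enum { TOK_n, TOK_plus, TOK_end, N_TOKENS };   /* terminal indices  */
    enum { NT_E, N_NONTERMS };                     /* nonterminal index */

    #define SHIFT(s)  ((s) + 1)    /* positive: shift and go to state s */
    #define REDUCE(r) (-(r))       /* negative: reduce by rule r        */
    #define ACCEPT    99
    #define ERR       0

    /* ACTION[state][terminal] and GOTO[state][nonterminal] */
    static const int action_tab[5][N_TOKENS] = {
        /* n           +           $         */
        { SHIFT(2),   ERR,        ERR       },   /* 0 */
        { ERR,        SHIFT(3),   ACCEPT    },   /* 1 */
        { ERR,        REDUCE(2),  REDUCE(2) },   /* 2: E -> n .     */
        { SHIFT(4),   ERR,        ERR       },   /* 3 */
        { ERR,        REDUCE(1),  REDUCE(1) },   /* 4: E -> E + n . */
    };
    static const int goto_tab[5][N_NONTERMS] = { {1}, {0}, {0}, {0}, {0} };

    static const int rule_len[3] = { 0, 3, 1 };       /* 1: E->E+n, 2: E->n */
    static const int rule_lhs[3] = { 0, NT_E, NT_E };

    int parse(const int *input)          /* token codes, TOK_end last */
    {
        int stack[64], sp = 0;
        stack[sp] = 0;                   /* start state */
        for (;;) {
            int act = action_tab[stack[sp]][*input];
            if (act == ACCEPT) return 1;
            if (act == ERR)    return 0;
            if (act > 0) {               /* shift: push the new state */
                stack[++sp] = act - 1;
                input++;
            } else {                     /* reduce: pop RHS, one GOTO lookup */
                int rule = -act;
                sp -= rule_len[rule];
                stack[sp + 1] = goto_tab[stack[sp]][rule_lhs[rule]];
                sp++;
            }
        }
    }

    int main(void)
    {
        int tokens[] = { TOK_n, TOK_plus, TOK_n, TOK_plus, TOK_n, TOK_end };
        printf("%s\n", parse(tokens) ? "accepted" : "rejected");
        return 0;
    }

The key point is visible in the reduce branch: a GOTO transition is a single indexed load plus a stack adjustment, which is why table-driven LALR parsers are so fast.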

Related

Evaluate compression algorithm [closed]

I'm researching compression algorithms (Huffman coding and LZ77) and was wondering how I would evaluate their efficiency depending on the input image. I know how they work, but I can't find information on evaluating them mathematically. Thanks!
General-purpose (universal) compressors like LZ77 are usually compared by running them against a standard set of sources and comparing the results; see http://www.maximumcompression.com/ and http://www.maximumcompression.com/data/summary_mf.php, for example.
Compressors for specific purposes are tested against source sets that are chosen to be as representative as possible.
For some applications it is also useful to place mathematical bounds on compression efficiency in terms of the source entropy.
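As a concrete example of such a bound, here is a minimal sketch in C (the helper name entropy_bits_per_byte and the sample string are my own illustration) that computes the zeroth-order byte entropy H = -Σ p_i log2(p_i) of a buffer. For a memoryless source, no lossless code, Huffman included, can average fewer than H bits per symbol, so H times the input length gives a quick lower bound to compare a compressor's output against. For images this bound is loose, since neighbouring pixels are strongly correlated.

    /* Sketch: zeroth-order (byte-level) entropy, H = -sum p_i log2(p_i).
     * Illustrative only; compile with -lm. */
    #include <math.h>
    #include <stdio.h>
    #include <string.h>

    static double entropy_bits_per_byte(const unsigned char *buf, size_t n)
    {
        size_t hist[256] = {0};
        for (size_t i = 0; i < n; i++)
            hist[buf[i]]++;                     /* byte histogram */
        double h = 0.0;
        for (int s = 0; s < 256; s++) {
            if (hist[s] == 0) continue;
            double p = (double)hist[s] / (double)n;
            h -= p * log2(p);
        }
        return h;                               /* bits per byte */
    }

    int main(void)
    {
        const char *sample = "abracadabra abracadabra abracadabra";
        size_t n = strlen(sample);
        double h = entropy_bits_per_byte((const unsigned char *)sample, n);
        printf("H = %.3f bits/byte; lower bound ~ %.0f bits for %zu bytes\n",
               h, h * (double)n, n);
        return 0;
    }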

Advanced Rudimentary Computing? [closed]

Let's say that my definition of 'rudimentary programming' refers to the fundamental tools employed for a computer to perform a task.
Considering programming rudiments, the learning spectrum usually looks something like this:
Variables, data types and variable memory
Arrays/Lists and their manipulation
Looping and conditionals
Functions
Classes
Multithreading/multiprocessing
Streams (hard-disk and web)
My question is, have I missed any of the major rudiments? Is there a 'next' to the spectrum that still eludes me?
I think you missed the most important one: algorithms. Understanding their complexity, knowing when to use them and why, and, more importantly, how to implement them.
I'm pretty sure you already know a lot about algorithms, but if you think your tool knowledge (i.e. the programming languages) is good enough, you should start focusing more on the algorithms.
A great book to start with is Introduction to Algorithms by Thomas H. Cormen.

Functional programming style vs performance in Ruby [closed]

I love functional programming and I love Ruby as well. If I can code an algorithm in a functional style rather than an imperative style, I do it. I tend not to update or reuse variables wherever possible, I avoid "bang!" methods, and I use "map", "reduce", and similar functions instead of "each" or dangerous loops, etc. Basically I try to follow the rules of this article.
The problem is that the functional solution is usually much slower than the imperative one. That article has clear and scary examples of this, up to 15-20 times slower in some cases. After reading it and doing some benchmarks, I am afraid to keep using the functional style, at least in Ruby.
On the other hand, I feel more comfortable writing code in a functional style because it is smart and clean, it tends to produce fewer bugs, and I think it is more "correct", especially nowadays when we can use concurrency and parallelism for better performance.
So I am very confused about which style to use in Ruby. Any wise recommendation would be appreciated.

Algorithms under Plagiarism detection machines [closed]

I'm very impressed by how effectively plagiarism checkers (such as the Turnitin website) work. But how do they do it? I'm new to this area, so is there a word-matching algorithm, or something similar, that is used for detecting similar sentences?
Thank you very much.
I'm sure many real-world plagiarism detection systems use more sophisticated schemes, but the general class of problem, measuring how far apart two things are, is called edit distance. That link includes links to many common algorithms used for this purpose. The gist is answering the question "How many edits must I perform to turn one input into the other?" The challenge for real-world systems is doing this across a large corpus efficiently. A related problem is the longest common subsequence, which might also be useful in such schemes for identifying passages copied verbatim.
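As an illustration of the core idea only, here is a minimal sketch in C of the classic Levenshtein edit-distance dynamic program (the function name edit_distance and the sample strings are my own; real plagiarism systems work on sentences or n-grams over large corpora with heavy indexing, not raw character-by-character DP).

    /* Sketch: Levenshtein edit distance via the classic O(len_a * len_b)
     * dynamic program, using two rolling rows of memory. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    static size_t edit_distance(const char *a, const char *b)
    {
        size_t la = strlen(a), lb = strlen(b);
        size_t *prev = malloc((lb + 1) * sizeof *prev);
        size_t *curr = malloc((lb + 1) * sizeof *curr);

        for (size_t j = 0; j <= lb; j++)
            prev[j] = j;                         /* "" -> b[0..j) costs j */

        for (size_t i = 1; i <= la; i++) {
            curr[0] = i;                         /* a[0..i) -> "" costs i */
            for (size_t j = 1; j <= lb; j++) {
                size_t cost = (a[i - 1] == b[j - 1]) ? 0 : 1;
                size_t del = prev[j] + 1;        /* delete a[i-1]     */
                size_t ins = curr[j - 1] + 1;    /* insert b[j-1]     */
                size_t sub = prev[j - 1] + cost; /* substitute/match  */
                size_t best = del < ins ? del : ins;
                curr[j] = best < sub ? best : sub;
            }
            memcpy(prev, curr, (lb + 1) * sizeof *prev);
        }
        size_t result = prev[lb];
        free(prev);
        free(curr);
        return result;
    }

    int main(void)
    {
        printf("%zu\n", edit_distance("kitten", "sitting"));  /* prints 3 */
        return 0;
    }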

First programming language to have an interactive shell? [closed]

Out of sheer curiosity and the pursuit of trivia, I couldn't find an answer on Google quickly.
Dear fellow programmers, what is the first programming language to provide an interactive shell?
I can't prove other systems weren't earlier, but the LISP REPL (read-eval-print loop) is one common name given to this style of interpreter.
The LISP I Programmers Manual from 1960 (PDF) includes a mention on page 2 that is apropos:
Enlargements of the basic system are available for various purposes. The compiler version of the LISP system can be used to compile S-expressions into machine code. Values of compiled functions are computed about 60 times faster than the S-expressions for the functions could be interpreted and evaluated. The LISP-compiler system uses about half of the 32,000 memory.
