I've become involved in a new project where Perl is a must. I'm coming from a solid Ruby foundation and want a quick introduction or mapping between Perl and Ruby, as I've heard that the two are very close in syntax (I know nothing about the features).
Do you have any recommendations for me?
What great Perl book do you recommend as an extended reference?
What is the commonly used version of Perl right now?
I second Nathan's book recs, though I would also mention Beginning Perl. Two bonus features are (1) it's available freely (and legally) online in its first edition (note: this site is timing out right now, and I'm unsure if that's temporary or not) and (2) it covers about as much as Learning Perl and Intermediate Perl combined. A con is that it's at times more elementary than you might want. (Learning Perl goes faster and assumes a bit more, which can be a good thing.)
You might also check out this: To Ruby From Perl from Ruby's website. Just think of it in reverse.
In terms of versions, 5.10.1 is stable, but you will come across a range. Mostly you will find 5.8.x and up, I suspect. (Just as with Ruby: 1.9.1 is stable, but you will find plenty of places still running 1.8.6.)
Since I'm somewhat going in the opposite direction (I know Perl reasonably well, and I'm using Ruby more and more often), I can mention things that stick out to me:
In Perl, you get automatic conversion between strings and numbers (and you don't need to explicitly ask for a float result by using .to_f or making one item a float).
Semicolons are not optional for ending statements in Perl. Similarly, parentheses are optional less often in Perl than they are in Ruby. (This gets complex quickly, but, for example, you must have parentheses around the test in an if or while block.)
0 (string, integer and float), undef and the empty string evaluate as false in boolean tests.
There are no separate booleans true and false.
You distinguish data types with sigils: $foo is a scalar; @foo is an array; %foo is a hash. (Arrays in particular will bug you: they aren't instance variables.)
You need to explicitly scope items in Perl, using the my keyword.
Arrays in Perl are automatically flattened when combined. (This constantly bites me in Ruby.)
Context, context, context. In Perl an enormous amount of what your code actually does depends on understanding what context you're in. Here's a link for a start, but it's a big topic with a lot of nooks and crannies.
(Note that I didn't mention the 1000 pound gorilla in the room. OO is part of what Perl is and can do, but it's not at the center of Perl, as it is in Ruby.)
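If it helps to see these side by side, here is a minimal sketch pulling several of the points above together (the variable names are invented for illustration):

    #!/usr/bin/env perl
    use strict;
    use warnings;

    my $count = "42";              # $ sigil: scalar; explicit scoping with my
    my @items = (1, 2, 3);         # @ sigil: array (not an instance variable)
    my %price = (apple => 1.50);   # % sigil: hash

    # Automatic string<->number conversion: no .to_f or .to_i needed.
    my $total = $count + 8;        # 50, even though $count held a string

    # Parentheses around the test and trailing semicolons are required.
    if ($total > 10) {
        print "big\n";
    }

    # Falsy values: 0, "0", "", and undef; there are no true/false objects.
    print "empty string is false\n" unless "";

    # Arrays flatten when combined.
    my @more = (@items, 4, 5);     # (1, 2, 3, 4, 5)

    # Context: an array in scalar context yields its length.
    my $n = @items;                # 3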
Versions
In The Perl Survey 2007***, the majority of Perl coders used Perl 5.8, with 87% using 5.8.x at least some of the time, and 5.8.8 being the most common single version. A minority used 5.6.x for at least some projects, and a smaller (but occasionally quite vocal) minority used 5.005. Since then Perl 5.10 has been released, and it's not yet clear what the adoption rate is; likely many businesses are being conservative and running 5.8.8 or 5.8.9, but many of what you might call "prominent hackers" are using 5.10.1 and even sometimes requiring its features.
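For instance, code that relies on 5.10 features typically declares that requirement up front; a minimal sketch:

    use 5.010;    # refuse to run on anything older than Perl 5.10
    use strict;
    use warnings;

    # say and the defined-or operator // are both 5.10 features.
    my $name = shift // "world";
    say "Hello, $name";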
References
Programming Perl is a winner for anyone with previous programming experience who wants to get up to speed on Perl quickly. The current edition is the third, which corresponds (unfortunately) to Perl 5.6.0.
The series Learning Perl, Intermediate Perl, Mastering Perl is also recommended; Learning Perl starts off rather slow because it targets beginners, but it eventually covers all of the major features of the language, including some that didn't exist in 2000 when Programming Perl was last revised.
Perldocs. Please, don't neglect the perldocs. The master index is at perldoc perl, or you can read online at perldoc.perl.org. These are nearly as good a reference as Programming Perl because in fact a decent portion of that book is drawn from the Perldocs. I would recommend at least a thorough skim of perlsyn, perlop, perlrun, perlvar, perlre, perlobj, perlmod, perluniintro, perlreftut, and perldsc. Reading straight through perlfunc once isn't a bad idea either.
Effective Perl Programming is a book I've always recommended for learning how to approach problems with a Perl frame of mind. Its 1998 publication date limits its usefulness a bit but I still think it's worth a read -- and fortunately, brian d foy and Josh McAdams are working on an updated edition, due (I think -- don't trust me too far on this) March 2010. [And the second edition is now here -- brian]
Perlmonks is a great resource, especially if you remember to use the search feature. A great many questions have been asked and answered there, and the best answers are indexed for posterity, along with lists of resources such as books and online FAQs.
***: I would love to provide a link to The Perl Survey 2007 but unfortunately the perlsurvey.org domain was allowed to lapse. A copy of the results PDF may be found at github, and the mostly-raw data is on CPAN as Data::PerlSurvey2007.
Do you have any recommendations for me?
Perl.org is a good site for finding resources.
What great Perl book do you recommend as an extended reference?
Programming Perl and Learning Perl are very nice, but also take a look here.
What is the commonly used version of Perl right now?
Depends on your platform; take a look here.
People keep telling me Picking Up Perl is dated, and they are right, but it is such a concise introduction that I would recommend reading it and then jumping straight into Intermediate Perl and Perl Best Practices, as well as the Perl Cookbook. The Cookbook is great if you learn better by concrete, bite-sized examples.
Further, I really recommend reading the FAQ list before doing anything else.
I recommend "Programming Perl", "Perl Best Practices" and the "Perl Cookbook" also.
The "PBP" book is good, not because it teaches you rules, but because it makes you stop and think about why you should do things a certain way, and make an educated decision when you decide to stray from Conway's recommended path.
As for documentation, I often use CPAN's documents as they're the most current and offer hyperlinks, something we don't get from the local perldocs on our drive.
One of the things I loved about Ruby when I first started using it compared to Perl, was gems vs. CPAN. Keeping a set of gems current seems so much easier than a set of CPAN-based modules.
And, like Sinan says, the FAQs are great reading. I've read and reread them many times because the knowledge is good to keep in your head.
And, though they can be rather blunt, the PerlMonks are a wonderful resource. Simply searching and reading how they recommend doing things can raise your Perl consciousness several levels, even if you don't engage them with a direct question.
And, in the way of the monk, contemplate Perl's hashes and learn about slicing them. They are the shiz.
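To make that concrete, a quick sketch of hash slicing (the keys and values are invented for illustration):

    use strict;
    use warnings;

    my %config = (host => "localhost", port => 8080, user => "deploy");

    # A hash slice pulls several values in one go; note the @ sigil,
    # because the result is a list.
    my ($host, $port) = @config{qw(host port)};

    # Slices work on the left-hand side of assignment too.
    @config{qw(user port)} = ("admin", 9090);

    print "$host:$port\n";    # localhost:8080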
So, I have a few questions that I have to ask. I did browse the internet, but there weren't many reliable answers, mostly blog posts that would cancel each other out because they each praised different things and had benchmarks to "prove their viewpoint" (I have never seen so many contradicting benchmarks in my life).
Anyway, my questions are:
Is Rubinius really faster? I was pretty impressed by this apparently honest pro-Rubinius presentation. Another thing that confuses me a little is that a lot of Rubinius is written in Ruby itself, yet somehow it is faster than C-Ruby? It must be a pretty damn good implementation of the language, then!
Does EventMachine work with Rubinius? As far as I know, EventMachine partially relies on Fibers (correct me if I'm wrong), which weren't implemented until 1.9. I know Rubinius will eventually support 1.9, too; I don't mind waiting a little.
Do C extensions work in Rubinius? I have written a C extension which "serializes" binary messages received from a TCP stream into Ruby objects and vice versa (I suppose the details are not important, but if it helps answer this question I will update the post). This can be a lot of messages! I managed to write the same code in Ruby (although it made little sense after a month), but it proved to be a real bottleneck in the application. So, I had to use C as a "solution" to my problem.
EDIT: I just remembered that I use C for another task: a hit-test method for arrays. Basically it just checks whether a "point" is inside a polygon; it was impossibly slow in CRuby.
If the previous answer was a "no," is there an alternative to C extensions in Rubinius? I gather the VM is written in C++, so perhaps that, then.
A few "bonus" questions:
Will C-Ruby (2.0+, YARV) ever get rid of the GIL? Or at least modify it so CRuby supports true parallelism?
What exactly is mruby? I see matz is working on it, and as far as the description goes it seems pretty awesome. How different is it from CRuby (performance-wise)?
I apologize for this text-storm I unleashed upon you! ♥
Is Rubinius really faster?
In most benchmarks, yes.
But benchmarks are... dumb. Apps are what we really care about. So the best thing to do is benchmark your app & see how well it performs. The 2 areas where Rubinius will really shine over MRI are parallelism & memory usage. Rubinius has no GIL, so you can utilize all available threads. It also has a much more sophisticated GC, so in general it could perform better with respect to GC.
I did those benchmarks back in Oct '11 for my talk on MagLev at RubyConf.
Does EventMachine work with Rubinius?
Yes, and if there are parts that don't work, then the issue should be reported. With that said, currently the EM tests don't pass on any Ruby implementation.
Do C extensions work in Rubinius?
Yes. I maintain the compatibility issue for C-exts, so if there is one you have that is tested on Travis, Rubinius would like to see it pass against rbx. Rubinius has historically had good support for the C-api and C-exts, though it would be nice if someday Rubinius could run Ruby so fast one would not need C-exts or the C-api.
Will C-Ruby (2.0+, YARV) ever get rid of the GIL? Or at least modify it so CRuby supports true parallelism?
No, most likely not. Jesse Storimer has a succinct writeup of Matz's opinion (or lack thereof) on threads from RubyConf 2012. Koichi Sasada tried to remove the GIL once and MRI perf just tanked. Evan Phoenix also tried once, before he created Rubinius, but didn't have good results.
What is exactly mruby?
An embeddable Ruby interpreter, akin to Lua. Matt Aimonetti has a few articles that might shed some light for you.
I am not too much into Ruby but I might be able to answer the first question.
Is Rubinius really faster?
I've seen different benchmarks telling different things. However, the fact that Rubinius is partially written in Ruby does not have to mean that it is slower. I thought the same about PyPy, which is Python in Python. After some research and the right classes in college, I knew why.
As far as I know, both are written in a subset of their language, which should be much simpler. An interpreter (e.g., in C) can be optimized much more easily for such a subset than for the whole language.
Writing the Ruby/Python interpreter in its own language allows much more flexibility and quicker prototyping of new interpretation algorithms. The whole point of the existence of Ruby and Python is, among other things, that algorithms can be implemented much more quickly than in, say, C or even assembler. A faster algorithm outweighs the small overhead of an interpreter a lot of the time.
By the way, writing an interpreter for a language in the same language is also a common academic exercise to show how mighty the language is. In one class we wrote Lisp in Lisp in Lisp.
David Korn, a proponent of the Unix philosophy, chided Perl programmers a few years ago in a Slashdot interview for writing monolithic Perl scripts without making use of the Unix toolkit through pipes, redirection, etc. "Unix is not just an operating system," he said, "it is a way of doing things, and the shell plays a key role by providing the glue that makes it work."
It seems that reminder could apply equally to the Ruby community. Ruby has great features for working together with other Unix tools through popen, STDIN, STDOUT, STDERR, ARGF, etc., yet it seems that increasingly, Rubyists are opting to use Ruby bindings and Ruby libraries and build monolithic Ruby programs.
I understand that there may be performance reasons in certain cases for going monolithic and doing everything in one Ruby process, but surely there are a lot of offline and asynchronous tasks that could be well handled by Ruby programs working together with other small programs each doing one thing well in the Unix fashion, with all the advantages that this approach offers.
Maybe I'm just missing something obvious. Is the Unix Philosophy still as relevant today as it was 10 years ago?
The Unix philosophy of pipes and simple tools is for text. It is still relevant, but perhaps not as relevant as it used to be:
We are seeing more tools whose output is not designed to be easily parseable by other programs.
We are seeing much more XML, where there is no particular advantage to piping text through filters, and where regular expressions are a risky gamble.
We are seeing more interactivity, whereas in Unix pipes information flows in one direction only.
But although the world has changed a little bit, I still agree with Korn's criticism. It is definitely poor design to create large, monolithic programs that cannot interoperate with other programs, no matter what the language. The rules are the same as they have always been:
Remember your own program's output may be another program's input.
If your program deals in a single kind of data (e.g., performance of code submitted by students, which is what I've been doing for the last week), make sure to use the same format for both input and output of that data.
For interoperability with existing Unix tools, inputs and outputs should be ASCII and line-oriented. Many IETF Internet protocols (SMTP, NNTP, HTTP) are sterling examples.
Instead of writing a big program, consider writing several small programs connected with existing programs by shell pipelines. For example, a while back the xkcd blog had a scary pipeline for finding anagrams in /usr/share/dict/words. (A Perl sketch of a filter in that spirit follows this list.)
Work up to shell scripts gradually by making your interactive shell one you can also script with. (I use ksh but any POSIX-compatible shell is a reasonable choice.)
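For illustration, here is a hypothetical Perl filter in that spirit: it reads words on STDIN and prints groups of mutual anagrams on STDOUT, so it composes in a pipeline, e.g. sort /usr/share/dict/words | perl anagrams.pl | head (the script name is made up):

    #!/usr/bin/env perl
    use strict;
    use warnings;

    my %groups;
    while (my $word = <STDIN>) {
        chomp $word;
        # Two words are anagrams iff their sorted letters match.
        my $key = join '', sort split //, lc $word;
        push @{ $groups{$key} }, $word;
    }
    for my $set (values %groups) {
        print "@$set\n" if @$set > 1;    # only print true anagram groups
    }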
In conclusion there are really two highly relevant ways of reusing code:
Write small programs that fit together well when connected by shell pipelines (Unix).
Write small libraries that fit together well when connected by import, #include, load, require, or use (Ruby, C++ STL, C Interfaces and Implementations, and many others). (A tiny Perl sketch of this paradigm follows below.)
In the first paradigm, dependency structure is simple (always linear) and therefore easy to understand, but you're more limited in what you can express. In the second paradigm, your dependency structure can be any acyclic graph—lots more expressive power, but that includes the power to create gratuitous complexity.
Both paradigms are still relevant and important; for any particular project, which one you pick has more to do with your clients and your starting point than with any intrinsic merit of the paradigm. And of course they are not mutually exclusive!
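To make the second paradigm concrete in the same vein, here is a minimal Perl module sketch (the file, package, and function names are hypothetical). A client script would then say use WordUtils 'anagram_key'; and call the function directly:

    # WordUtils.pm -- a tiny reusable library (hypothetical)
    package WordUtils;
    use strict;
    use warnings;
    use Exporter 'import';
    our @EXPORT_OK = ('anagram_key');

    # Canonical key: two words are anagrams iff their keys match.
    sub anagram_key {
        my ($word) = @_;
        return join '', sort split //, lc $word;
    }

    1;    # modules must return a true value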
I think that the Unix philosophy started falling out of favor with the creation of Emacs.
My vote is yes. Subjective, but excellent programming question.
Just a personal anecdote from a time when we were rewriting a mass print output program for insurance carriers. We were literally scolded by advisors for "programming" in the shell. We were made aware that it was too disconnected and the languages were too disparate to be complete.
Maybe.
All of a sudden multi-processor Intel boxen became commonplace, and fork() didn't really perform as horribly as everyone had always warned in the new age of applications (think VB days). The bulk print programs (which queried a db, transformed to troff output and then to PostScript via msgsnd, and then off to the LPD queue in the hundreds of thousands) scaled perfectly well across all the systems and didn't require rewrites when the VB runtimes changed.
To your question:
I'm with Mr. Korn, but it is not Perl's fault; it is the Perl programmers who decide that Perl alone is sufficient. In multi-process systems maybe it is good enough.
I hope that Ruby, Perl, Python, and (gasp) even Java developers can keep their edge in the shell pipeline. There is inherent value in the development philosophy for implicit scaling and interfacing, separation of duties, and modular design.
Approached properly, with our massively-cored processing units on the horizon, the Unix philosophy may again gain ground.
It does not appear to be completely lost. I read a recent blog entry by Ryan Tomayko that praised the UNIX philosophy and how it is embraced by the Unicorn HTTP server. However, he did have the same general feeling that the Ruby community is ignoring the UNIX philosophy in general.
I guess the rather simple explanation is that Unix tools are only available on Unix. The vast majority of users, however, run Windows.
I remember the last time I tried to install Nokogiri, it failed because it couldn't run uname -p. There was absolutely no valid reason to do that. All the information that can be obtained by running uname -p is also available from within Ruby. Plus, uname -p is actually not even Unix, it's a non-standard GNU extension which isn't even guaranteed to work on Unix, and is for example completely broken on several Linux distributions.
So, you could either use Unix and lose 90% of your users, or use Ruby.
No. Unix and Ruby are alive and well
Unix and Ruby are both alive and well: Unix vendor Apple's stock is headed into orbit, Linux has an irrevocable dug-in position running servers, development, and lab systems, and Microsoft's desktop-software empire is surrounded by powerful SaaS barbarians like Google and an army of allies.
Unix has never had a brighter future, and Ruby is one of its key allies.
I'm not sure the software-tools pattern is such a key element of Unix anyway. It was awesome in its day, given the generally clunky, borderline-worthless quality of competing CLI tools, but Unix introduced many other things, including an elegant process and I/O model.
Also, I think you will find that many of those Ruby programs use various Unix and software-tools interfaces internally. Check for popen() and various Process methods.
I think Ruby simply has its own sphere of influence.
Today someone asked me what was wrong with their source code. It was obvious: "Use double equals in place of that single equals in that if statement. Um, I think..." As I remember, some languages actually use a single equals for comparison. Since I sometimes forget or mix up the syntax details among the several languages I use, I stepped over to my laptop to try a quickie experiment.
It costs a bit of time and is a break in the flow to try "quick" experiments (though maybe the practice is good for memory.) What tips do you have for keeping straight in your mind the syntax (and other) details of multiple languages?
(And nowadays, this applies just as well to the many wiki-like markups!)
To me, the hardest part isn't the syntax; usually you get into the mode when looking at the code you're working on. The really hard part is remembering the language's library so you don't go reinventing the wheel over and over again. Now if only people would organize their help files so it was easy to search for particular things in the library.
IDEs that can draw red and yellow squiggles can help, until you develop that mental muscle memory.
One of the annoying things with XCode (for Cocoa/ObjectiveC) is that you don't get said squiggles until you compile. (As opposed to Eclipse/Java where you get live squiggles).
In my case it's just experience. I think once you code in a language for long enough your brain seems to be able to do language-context-switching with it.
Indeed, on SO I advised someone not to forget to avoid if (a = b) in Java, and someone reminded me that it is legal only if a and b are boolean! Of course, the advice holds for C, C++, JavaScript, and a number of other C-like languages.
Likewise, I realized only recently that var v in JavaScript has function-level scope only, not brace-level scope.
That's the pitfall of having similar syntaxes but different behaviors.
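Perl happens to share this pitfall, and makes a handy illustration (a minimal sketch):

    use strict;
    use warnings;

    my $actual = 0;

    # Assignment, not comparison: the condition is the assigned value,
    # so this branch always runs. With a constant on the right, Perl
    # even warns: "Found = in conditional, should be ==".
    if ($actual = 5) {
        print "oops, assigned instead of compared\n";
    }

    # Numeric comparison is ==; string comparison is eq.
    if ($actual == 5) {
        print "really equal\n";
    }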
As an aside, some people on the Lua mailing list complain that the language isn't C-like, with the terse and familiar curly braces, += and ++, and the bitwise operators. They say it hurts adoption of the language, because people are more familiar with C-like syntax.
That's nonsense: Basic was (and still is) widely used with its verbose syntax, and so is Pascal (Delphi). And a lot of people find the Lua syntax readable and easy to learn, good for those unfamiliar with programming (game AI specialists, for example).
Moreover, and to the point, Lua is designed to be integrated into C/C++ programs and to be extended with C[++] functions. And people say the quite different syntaxes help with the mindset shift.
Does anyone know if Google uses Ruby for application development?
What are the general job prospects of Ruby compared to other languages like Perl or Python?
Aaron's roughly right. We use C only for kernel work (and other maintenance on 3rd party stuff written in C) so I wouldn't count that as "application development", and Objective C for the very specific case of apps running client-side on Apple gear, etc.
Ruby is the embedded scripting language for Google Sketchup; see http://code.google.com/apis/sketchup/docs/gsrubyapi_examples.html -- that decision was made before Google acquired @Last Software, Sketchup's makers.
Regarding Nishant's second question, in the wider job market, Ruby's kind of OK: still low absolute numbers but good growth, see http://duartes.org/gustavo/blog/post/programming-language-jobs-and-trends and http://blog.timbunce.org/2008/02/12/comparative-language-job-trend-graphs/ -- the data are getting a bit long in the tooth, but it's really hard to do these assessments in a very up-to-the-minute fashion;-).
Does anyone know if Google uses Ruby for any application development?
Nope: they use C/C++/Java/Python/JavaScript (I'll go find a reference).
Here's a post by Steve Yegge that makes it pretty clear they don't do Ruby.
About job prospects: If you want to work for Google, it doesn't matter which of Python, Perl and Ruby you are fluent at: Python hackers don't have an advantage over Ruby hackers etc. when applying for a job as a Software Engineer. If you want to do Perl or Ruby programming a lot, Google is not the place for you.
To get an approximation about programming language popularity in job openings, try searching for programming languages on job offer sites. For example, http://www.itpinoy.com/search/ says Java is more popular than PHP, which is more popular than Ruby.
I had been programming Perl for several years before I started using Ruby. A few years later, I started using Python as well, while still doing Perl and Ruby. In general, I tend to be more productive in Ruby and Python than in Perl, so I don't do much Perl anymore. I like Python because it feels mature, well designed, and clean to me (compared to Ruby, which feels a little hackier), and I like Ruby because I can do powerful operations with very little typing (in contrast, Python doesn't support assignment in the middle of an expression, blocks, regexps as first-class objects, or mutable strings; and the standard library of Python is not as versatile, e.g. the list and dict types have fewer methods than their Ruby counterparts).
So for someone new to Perl, Ruby and Python, I'd recommend spending a day with Ruby, one more day with Python, and choose which of these two to concentrate on learning.
For a long time I've been trying different languages to find the feature-set I want and I've not been able to find it. I have languages that fit decently for various projects of mine, but I've come up with an intersection of these languages that will allow me to do 99.9% of my projects in a single language. I want the following:
Built on top of .NET or has a .NET implementation
Has few dependencies on the .NET runtime both at compile-time and runtime (this is important since one of the major use cases is in embedded development where the .NET runtime is completely custom)
Has a compiler that is 100% .NET code with no unmanaged dependencies
Supports arbitrary expression nesting (see below)
Supports custom operator definitions
Supports type inference
Optimizes tail calls
Has explicit immutable/mutable definitions (nicety -- I've come to love this but can live without it)
Supports real macros for strong metaprogramming (absolute must-have)
The primary two languages I've been working with are Boo and Nemerle, but I've also played around with F#.
Main complaints against Nemerle: The compiler has horrid error reporting, the implementation is buggy as hell (compiler and libraries), the macros can only be applied inside a function or as attributes, and it's fairly heavy dependency-wise (although not enough that it's a dealbreaker).
Main complaints against Boo: No arbitrary expression nesting (dealbreaker), macros are difficult to write, no custom operator definition (potential dealbreaker).
Main complaints against F#: Ugly syntax, hard to understand metaprogramming, non-free license (epic dealbreaker).
So the more I think about it, the more I think about developing my own language.
Pros:
Get the exact syntax I want
Get a turnaround time that will be a good deal faster; difficult to quantify, but I wouldn't be surprised to see 1.5x developer productivity, especially due to the test infrastructures this can enable for certain projects
I can easily add custom functionality to the compiler to play nicely with my runtime
I get something that is designed and works exactly the way I want -- as much as this sounds like NIH, this will make my life easier
Cons:
Unless it can get popularity, I will be stuck with the burden of maintenance. I know I can at least get the Nemerle people over, since I think everyone wants something more professional, but it takes a village.
Due to the first con, I'm wary of using it in a professional setting. That said, I'm already using Nemerle and using my own custom modified compiler since they're not maintaining it well at all.
If it doesn't gain popularity, finding developers will be much more difficult, to an extent that Paul Graham might not even condone.
So based on all of this, what's the general consensus -- is this a good idea or a bad idea? And perhaps more helpfully, have I missed any big pros or cons?
Edit: Forgot to add the nesting example -- here's a case in Nemerle:
def foo =
    if (bar == 5)
        match (baz) { | "foo" => 1 | _ => 0 }
    else
        bar;
Edit #2: Figured it wouldn't hurt to give an example of the type of code that will be converted to this language if it comes to exist (S. Lott's answer alone may be enough to scare me away from doing it). The code makes heavy use of custom syntax (opcode, :=, quoteblock, etc.), expression nesting, and so on. You can check out a good example here.
Sadly, there are no metrics or stories around failed languages, just successful languages. Clearly, the failures outnumber the successes.
What do I base this on? Two common experiences.
Once or twice a year, I have to endure a pitch for a product/language/tool/framework that will Absolutely Change Everything. My answer has been constant for the last 20 or so years. Show me someone who needs support and my company will support them. And that's that. Never hear from them again. Let's say I've heard 25 of these.
Once or twice each year, I have to work with a customer who has orphaned technology. At some point in the past, some clever programmer built a tool/framework/library/package that was used internally for several projects. Then that programmer left. No one else can figure that darn thing out, and they want us to replace/rewrite it. Sadly, we can't figure it out either, and our proposal is to rewrite from scratch. And they complain that their genius built the set of apps in a period of weeks; it can't take us months to rewrite them in Java/Python/VB/C#. Let's say I've written 25 or so of these kinds of proposals.
That's just me, one consultant.
Indeed, one particularly sad situation was a company whose entire IT software portfolio was written by one clever guy with a private language and tools. He hadn't left, but he'd realized that his language and toolset had fallen way behind the times -- the state of the art had moved on, and he hadn't.
And the move was -- of course -- in an unexpected direction. His language and tools were okay, but the world had started to adopt relational databases, and he had absolutely no way to upgrade his junk to move away from flat files. It was something he had not foreseen. Indeed, it was something he could not possibly foresee. [You won't fall into this trap, will you?]
So, we talked. He rewrote a lot of the applications in Plain-Old VAX Fortran (yes, this is a long time ago.) And he rewrote it to use plain old relational SQL stuff (Ingres, at the time.)
After a year of coding, they were having performance problems. They called me back to review all the great stuff they'd done in replacing the home-built language. Sadly, they'd done the worst possible relational database design. Worst possible. They'd taken their file copies, merges, sorts, and what-not, and implemented each low-level file system operation using SQL, duplicating database rows left, right and center.
He was so mired in his private vision of the perfect language, that he couldn't adapt to a relatively common, pervasive new technology.
I say go for it.
It would be an awesome experience regardless of whether it makes it to production or not.
If you make it compile down to IL, then you do not have to worry about not being able to reuse your compiled assemblies with C#.
If you believe that you have valid complaints about the languages you listed above, it is likely that many others will think like you. Of course, for every 1000 interested people there might be only 1 willing to help you maintain it -- but that is always the risk.
But here are a few things to be cautioned about:
Get your language specification IN STONE before development. Make sure any and all language features are figured out beforehand, even things that you may only want in the future. In my opinion, C# is slowly falling into the "oh-just-one-more-language-extension" trap that will lead to its eventual doom.
Be sure to make it optimized. I don't know what you already know, but if you don't know, then learn ;) Nobody will want a language that has nice syntax but runs as slow as IE's JavaScript implementation.
Good luck :D
When I first started my career in the early 90s, there seemed to be this craze of everyone developing their own in-house languages. My first 3 jobs were with companies that had done this. One company had even developed their own operating system!
From experience, I'd say this is a bad idea for the following reasons:
1) You will spend time debugging the language itself in addition to the code base on top of it
2) Any developers you hire will need to go through the learning curve of the language
3) It will be hard to attract and keep developers since working in a proprietary language is a dead-end for someone's career
The main reason I left those three jobs was that they had proprietary languages, and you'll notice that not many companies take this route anymore :).
An additional argument I'd make is that most languages have entire teams whose full time job it is to develop the language. Maybe you'd be an exception, but I'd be very surprised if you'd be able to match that level of development by only working on the language part-time.
Main complaints against Nemerle: The compiler has horrid error reporting, the implementation is buggy as hell (compiler and libraries), the macros can only be applied inside a function or as attributes, and it's fairly heavy dependency-wise (although not enough that it's a dealbreaker).
I see your post has been written more than two years ago.
I advise you to try the Nemerle language today.
The compiler is stable; there are no blocker bugs today.
The VS integration has seen a lot of improvements, and there is also SharpDevelop integration.
If you give it a chance, you won't be disappointed.
NEVER EVER develop your own language.
Developing your own language is a fool's trap, and worse, it will limit you to what your imagination can provide, as well as demanding that you work out both your development environment and the actual programme you're writing.
The cases in which this doesn't apply are pretty much if you're Larry Wall, the AWK guys, or part of a substantial group of people dedicated to testing the boundaries of programming. If you're in any of those categories, you don't need my advice, but I strongly doubt that you're targeting a niche where there is no suitable programming language for the task AND the characteristics of the people doing the task.
If you are as clever as you seem to be (a likely possibility), my advice is to go ahead and design the language first, iterate a couple of times over it, ask some smart fellows you trust in programming-language communities about the concrete design you came up with, and then make the decision.
You might realize in the process of creating the design that just a quick hack on Nemerle would give it all you need, for example. Many things can happen just when thinking hard about a problem, and the final solution might not be what you actually had in mind when beginning the project.
Worst case scenario, you're stuck with actually implementing the design, but by then you will have it proofread and matured, and you'll know with a high degree of certainty that it was a good path to take.
A related piece of advice, start small, just define the features you absolutely need and then build on them to get the rest.
Writing your own language is not an easy project, especially one to be used in any kind of "professional setting".
It is a huge amount of work, and I doubt you could write your own language and still write any big projects that use it -- you will spend so long adding features you need, fixing bugs, and doing general language-design work.
I would strongly recommend choosing a language that is closest to what you want and extending it to do what you need. It'll never be exactly what you want, but compared to the time you'd spend writing your own language, I would say that's a small compromise.
Scala has a .NET compiler. I don't know the status of it, though; it's kind of a second-class citizen in the Scala world (which is more focused on the JVM). But it might be a good tradeoff to adopt the .NET compiler instead of creating a new language from scratch.
Scala is kind of weak in the metaprogramming department at the moment. It's possible that the need for metaprogramming is somewhat reduced by other language features. In any case, I don't think anyone would be sad if you were to implement metaprogramming features for it. Also, there is a compiler plug-in infrastructure on the way.
I think most languages will never fit the bill entirely.
You might want to combine your 2 favourite languages (in my case C# and Scheme) and use them together.
From a professional point of view, this is probably not a good idea, though.
It would be interesting to hear some of the things you feel you can't do in existing languages. What kind of projects are you working on that can't be done in C#?
I'm just curious!