How useful is Alice as a learning tool?

I am currently studying and my next subject is 'Programming Using Alice'. Has anyone had any experience using this? And how would you rate it as a learning tool?

Yeah, I used it in my AP Java 2 class back in high school. It was fun, though primitive (at that point, I was more interested in advancing my programming capabilities and didn't like ANYTHING that limited what I could do). It is a good learning tool for children and those who are just getting started in programming (i.e. have never programmed in any language before). It teaches the basic concepts in a very visual way with immediate results. One very good feature is the 3D environment you get to explore. This makes it applicable even if you are developing your programming skills into something more. You can play around with it and make a fairly decent game; I would expect that a chess game is very possible!
In all, I think it is worthwhile learning, if nothing else to say that you've done it. As it has been a while since I last used it, my response may be a bit dated, and I apologize if something I said is no longer true.
Enjoy!

Alice is a good tool for students who have had no experience with programming. Simple to the point of being childish, but it gets its point across. Many instructors are raving about how much better their students understand programming after using Alice in their 101 courses.
For full disclosure, I have seen Alice used but have never used it myself. My comments come from instructors that I have talked to.

I took a course on Alice, and it really helps you visualize the concepts of programming and, most importantly, the manipulation of classes. But I really don't recommend it for university courses like the one I took, because there is a lot of more helpful and efficient object-oriented software that can help you apply what you learn in programming courses in an easy way while gaining experience at the same time, for example in game development and other areas involving graphics and visual objects...

I had this class recently, and although I have years of programming experience, it was really fun.
Deciding whether a language is useful or not is beside the point when the goal is learning algorithmics. I remember asking the same question about my Turbo Pascal class, 18 years ago. Yes, it's true that I never made a living from it, and the same will go for Alice. However, it provided me with a good foundation in algorithmics.
Yes Alice is a good learning tool. You will learn about functions, parameters, variable settings, lists and arrays, and command functions, not to mention loops. And it's an OOP language too, so you will learn that objects have properties that can be manipulated.
These are the basic principles of any modern language, with a fun layer on top of it since you ultimately will animate characters and objects in 3D.
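To give a rough idea of how those same concepts carry over to a text-based language, here is a minimal Python sketch (the names are made up for illustration): a function with a parameter, a loop over a list, and an object whose properties you can manipulate, much like moving an Alice character.

def greet(name):                        # a function with a parameter
    return "Hello, " + name

for guest in ["Ada", "Grace", "Alan"]:  # a loop over a list
    print(greet(guest))

class Character:                        # an object with properties...
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def move(self, dx, dy):             # ...that can be manipulated
        self.x += dx
        self.y += dy

bunny = Character(0, 0)                 # like positioning an Alice character
bunny.move(3, 4)
print(bunny.x, bunny.y)                 # 3 4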

Related

Which will serve a budding programmer better: A classic book in scheme or a modern language like python?

I'm really interested in becoming a serious programmer, the type that people admire for hacker chops, as opposed to a corporate drone who can't even complete FizzBuzz.
Currently I've dabbled in a few languages, most of my experience is in Perl and Shell, and I've dabbled slightly in Ruby.
However, I can't help but feel that although I know bits and pieces of languages, I don't know how to program.
I'm really in no huge rush to immediately learn a language that can land me a job (though I'd like to do it soon), and I'm considering using PLT Scheme (now called Racket) to work through How to Design Programs or Structure and Interpretation of Computer Programs, essentially, one of the Scheme classics, because I have always heard that they teach people how to write high-quality, usable, readable code.
However, even MIT changed its introductory course from using SICP and Scheme to one in Python.
So, I ask for the sage advice of the many experienced programmers here regarding the following:
Does Scheme (and do those books) really teach one how to program well? If so, which of the two books do you recommend?
Is this approach to learning still relevant and applicable? Am I on the right track?
Am I better off spending my time learning a more practical/common language like Python?
Is Scheme (or lisp in general) really a language that one learns, only to never use? Or do those of you who know a lisp code in it often?
Thanks, and sorry for the rambling.
If you want to learn to really program, start doing it. Quit dabbling and write code. Pick a language and write code. Solve problems and release applications. Work with experienced programmers on open source projects, but get doing. A lot.
Does Scheme (and do those books) really teach one how to program well? If so, which of the two books do you recommend?
Probably. Probably better than any of the Learn X in Y Timespan books.
Is this approach to learning still relevant and applicable? Am I on the right track?
Yes.
Am I better off spending my time learning a more practical/common language like Python?
Only if you plan to get a job in it. Scheme will give you a better foundation though.
Is Scheme (or lisp in general) really a language that one learns, only to never use? Or do those of you who know a lisp code in it often?
I do emacs elisp fiddling to adjust my emacs. I also work with functional languages on the side to try to keep my mind flexible.
My personal opinion is that there are essentially two tracks that need to be walked before the student can claim to know something about programming. Track one is the machine itself, the computer. You should start with assembly here and learn how the computer works. After some work and understanding there - don't skimp - you should learn C and then C++; really getting the understanding of resource management and what really happens. Track two is the very high level language track - Scheme, Prolog, Haskell, Perl, Python, C#, Java, and others that execute on a VM or interpreter lie in this area. These, too, need to be studied to learn how problems can be abstracted and thought about in different ways that do not involve the fiddly bits of a real computer.
However, what will not work is being a language dilettante when learning to program. You will need to find a language - Scheme is acceptable, although I'd recommend starting at the low level first - and then stick with that language for a good year at least.
The most important parts of Scheme are the programming-language concepts you can pick up that modern languages are now just adopting or adding support for.
Lisp and Scheme have supported, before most other languages, features that were often revolutionary for the time: closures and first-class functions, continuations, hygienic macros, and others. C has none of these.
But they're appearing more and more often in programming languages that Get Stuff Done today. Why can you just declare functions seemingly anywhere in JavaScript? What happens to outside variables you reference from within a function? What are these new "closures" that PHP 5.3 is just now getting? What are "side effects" and why can they be bad for parallel computing? What are "continuations" in Ruby? How do LINQ functions work? What's a "lambda" in Python? What's the big deal with F#?
These are all questions that learning Scheme will answer but C won't.
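To make one of those concrete, here is a minimal sketch of a closure and a lambda, in Python rather than Scheme (the names are invented for illustration), since the concept is the same:

def make_counter():
    count = 0
    def increment():       # a first-class function...
        nonlocal count     # ...that closes over the outer variable 'count'
        count += 1
        return count
    return increment

counter = make_counter()
print(counter())           # 1
print(counter())           # 2 -- the captured state persists between calls

square = lambda x: x * x   # a "lambda" is just an anonymous function value
print(square(5))           # 25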
I'd say it depends on what you want to do.
If you want to get into programming, Python is probably better. It's an excellent first language, resembles most common programming languages, and is widely available. You'll find more libraries handy, and will be able to make things more easily.
If you want to get into computer science, I'd recommend Scheme along with SICP.
In either case, I'd recommend learning several very different languages eventually, to give you more ways to look at and solve problems. Getting reasonably proficient in Common Lisp, for example, will make you a better Java programmer. I'd take them one at a time, though.
The best languages to start with are probably:
a language you want to play/learn in
a language you want to work in
And probably in that order, too, unless the most urgent need is to feed yourself.
Here's the thing: the way to learn to program is to do it a lot. In order to do it a lot, you're going to need a lot of patience and more than a little bit of enthusiasm. This is more important than the specific language you pick.... but picking a language that you like working in (whether because you like the features or because you feel it'll teach you something) can be a big boost.
That said, here's a couple of comments on Scheme:
Does Scheme (and do those books)
really teach one how to program well?
The thing about Scheme (or something like it) is that if you learn it, it'll teach you some very useful abstractions that programmers who never really come to grips with a functional programming language don't learn. You'll think differently. The substance of programming languages and computing will look more fluid to you. You'll have a better idea of how to compose your own quasi-primitives out of a very small set of primitives, rather than relying on the generally static set of primitives offered in some other languages.
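For a rough flavor of what "composing quasi-primitives" means, here is a made-up Python sketch (not Scheme, but the habit is the same): building a brand-new operation out of two existing functions.

def compose(f, g):
    # Build a new "primitive" out of two existing functions
    return lambda x: f(g(x))

def double(x):
    return 2 * x

def increment(x):
    return x + 1

double_then_increment = compose(increment, double)
print(double_then_increment(5))  # (5 * 2) + 1 = 11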
The problem is that a lot of what I'm saying might not mean much to you at the moment, and it's a bit more of a mind-bending road than coming into a common dynamic language like Perl, Python, or Ruby... or even a language like C which is close to the Von Neumann mechanics of the machine.
This doesn't mean it's necessarily a bad idea to start there: I've been part of an experiment where we taught Prolog of all things to first-time programmers, and it worked surprisingly well. Sometimes beginner's mind actually helps. :) But Scheme as a first language is definitely an unconventional path. I suspect Ruby or Python would be a gentler road.
Is Scheme (or lisp in general) really
a language that one learns, only to
never use?
It's a language that you're unlikely to be hired to program in. However, while you're learning to program, and after you've learned and are doing it in your free time, you can write code in whatever you want, and because of the Internet, you'll probably be able to find people working on open source projects in whatever language you want. :)
I hate to tell ya, but nobody admires programmers for their "hacker chops". There are people who get shit done, and then there's everyone else. A great many of the former types are the "corporate drones" you appear to hold in contempt.
Now, for your question, I personally love Lisp (and Scheme), but if you want something you're more likely to use in industry "Beginning Python" might be better material for you as Python is found more often in the wild. Or if you enjoy Ruby, find some good Ruby material and start producing working solutions (same with Java or .Net or whatever).
Really, either route will serve you well. The trick is to stick with it until you've internalized the concepts being taught.
Asking whether an approach to learning is relevant and applicable is tricky - there are many different learning styles, and it's a matter of finding out which ones apply to you personally. Bear in mind that the style you like best might not be the one that actually works best for you :-)
You've got plenty of time and it sounds like you have enthusiasm to spare, so it's not a matter of which language you should learn, but which one you should learn first. Personally, I'd look at what you've learnt so far, what types of languages and paradigms you've got under your belt, and then go off on a wild tangent and choose one completely different.
I started programming at a very very young age. When I was in high school, I thought I was a good programmer. That's when I started learning about HOW and WHY the languages work rather than just the syntax.
Before learning the how and why, switching to a new language would have been hell. I had learned a language, but I hadn't learned to program. Now that I know the fundamental concepts well, I can apply them to virtually any language and pick it up with ease.
I would highly recommend a book (or even a school course, if you can afford it) that takes you through the processes of coding without relying on a specific language.
Unfortunately I don't have any books to recommend, but if others agree with me and know of any, maybe they can offer a suggestion.
//Edit: After re-reading your question, I realize that I may have not actually answered any of them... Sorry about that. I think picking up a book that will take you in-depth with best-practices would be extremely helpful, regardless of the language you choose.
There are basic programming concepts (logic flow, data structures), which are easily taught by using languages like Python. However, there are much more complex programming concepts (design patterns, optimization, threading, etc.) which the classic languages don't abstract away for you.
If your search for knowledge leans more toward algorithm development and the science of programming, start with C. If your search is more for a practical ends, I hear Ruby is a good starting point.
I agree with gruszczy. I'd start programming with C.
It may be kind of scary at first (at least for me :S ) but in the long run you'd be grateful for it. I mean, I love Python, but because I learned C first, the learning curve for other languages wasn't very steep at all.
Start with C and make it so.
Just remember to practice, because you'll never improve at something by doing nothing. ;)
To a specific point in your question, the "classics" you mention will help you with exactly what the titles say. SICP is about the structure and interpretation of computer programs. It is not about learning Scheme (though you will learn Scheme). HtDP is about how to design programs, it is not about learning Scheme (though you will learn Scheme).
Scheme, in principle, is a very small and concise language with almost no gotchas. This makes it excellent for moving on to learning how to structure and interpret programs, or how to design them. More traditional "practical" languages like C, C++, Python, or Java do not have this quality. They are rife with syntax. Learning with these languages means you must simultaneously learn syntactical quirks while learning to think like a programmer. In my opinion, this is unfortunate. In some cases the quirks are good, in others they are accidents of history, but in all cases it is unfortunate.
Start coding in C. It may be a horror for you at first, but it teaches you the most important stuff, like pointers, recursion, and memory management. Try reading some classic books about programming like The Art of Computer Programming by Donald Knuth. After you master that, you can think about learning object-oriented programming or functional programming. Basics first. If you manage to learn them, nothing will be hard for you ever again.

Why is GW-BASIC still taught in schools? [closed]

I dunno about USA and the UK, but in India, schools still teach GW-BASIC. Yes, it's:
10 PRINT "HELLO WORLD"
20 GOTO 10
As far as my experience goes, even writing assembler is easier than this mess of a language. It could easily be replaced by something like Python, which would make it easier for students to actually understand the basic concepts of programming and help them to understand the logic behind what they're doing better.
Because BASIC is the most, uhh... basic introduction to the von Neumann architecture, which is what all modern computers and (by extension) programming languages are based on.
Think about it:
Line numbers = Memory Addresses
Variables = CPU Registers
Current Line = CPU Instruction Pointer
Goto = Jump Instruction
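To see the analogy run, here is a toy sketch (hypothetical Python, not a real interpreter) where a dictionary of numbered lines plays the role of memory and a pc variable plays the role of the instruction pointer:

# The classic two-line program: line numbers map to instructions
program = {
    10: ("PRINT", "HELLO WORLD"),
    20: ("GOTO", 10),
}

pc = 10                             # the "CPU instruction pointer"
steps = 0
while pc in program and steps < 4:  # cap the infinite loop for the demo
    op, arg = program[pc]
    if op == "PRINT":
        print(arg)
        pc += 10                    # fall through to the next numbered line
    elif op == "GOTO":
        pc = arg                    # the jump: just reassign the pointer
    steps += 1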
Ever try teaching programming to someone with no idea what it's about?
I did for 4 years. For absolutely starting out, GWBASIC is pretty good. You can get the most action for the least effort, while still conveying basic ideas, like:
The computer finishes one statement before starting the next. (Newbies are inclined to think the computer does everything "at once".)
A program is like something built out of tinker-toys. There are only a few basic pieces, and you assemble them to make it do what you want. (Newbies often think since the language has words like IF and PRINT that it will just understand whatever they type in.)
Variables are a key concept. They have a name that you give them, and they have values that they get when the program runs. That's complicated. The name and the value are not the same thing, and there is a distinction between write-time and run-time.
Once you get past some basic concepts with the help of GWBASIC you can begin to introduce a more modern disciplined language.
GW-Basic was taught to me in 7th grade about 10 years ago. I found it was a great language and easy to experiment with as a beginner. Even the non-pc-freaks had little problem learning the language.
In my opinion it is a great tool to motivate beginners to learn more advanced programming languages.
GW-Basic is a great language for new programmers. If someone has never done any programming before, something simple like GW-Basic will be a lot easier for them to comprehend as compared to something like Python. Also, Java has much better support for object-oriented programming than C++. More commercial applications these days are written in Java than C++.[source] Hence I would say that it's a good thing they are switching to Java over C++.
As far as teaching in India is concerned and why they use GW-Basic, I can only guess (being from the USA):
It's cheap. Perhaps they have received old hardware with GW-Basic on it. Hey, it's there, it's free, why not use it to teach children.
The teacher knows it. If the teacher knows/understands it, he/she can teach it.
At a previous employer, I met a number of people who immigrated to the USA from India and explained that the first time they worked with Windows was when they arrived over here; none of the schools (not even college/university) had it. It might depend on the school they went to, but maybe it's a matter of the available equipment. It's possible this GW-Basic usage you speak of works the same way: they used what technology they had.
Maybe it means they are, well, resourceful.
As to whether it's good that they are learning something so old, I'm not so sure it's such a good idea. But as the famous (American West) folk wisdom says, "Do with what you got. It'll pay off in the end." Better to expose them when they are young.
It's funny how fast humans forget.
Remember the first time you struggled with the concept of a loop? With the idea of a variable and how it retained values? With remembering syntax?
Basic has a relatively small built-in syntax, it has fairly flexible structures for loops and other constructs.
I guess over all it's "loose". This helps a lot in learning.
Loose is very bad for good, stable programs. You want very little flexibility, you want patterns that you can count on and very few options (even if you don't know that this is what you want, you will understand it as soon as you have to lead a team of 5 developers from another country).
If any here haven't really considered it, the reason we don't like BASIC isn't a lack of "power" or speed--it's because it's loose--the exact same reason it's good for teaching.
You don't start out running, you learn to crawl in a wobbly sort of way, then you stumble, etc.
But once you are running sprints, you really want to make sure that every footfall is placed exactly where you want it, and if the guy ahead of you decides he suddenly wants to start crawling, you're screwed.
Of course, if you're running along the track alone or in a small, in-sync team, it doesn't matter much what you do. Feel free to use any language you want :)
If someone is truly interested in programming, they will take what they learn in that class and apply it to a language learned on their own time.
There's also something to be said for starting in a language that is much less powerful than Java or C++.
so you'll learn NOT to use GOTO
That's easy to learn. Schools don't aim to teach new technology; they want to teach the basics of informatics.
I think in my school GW Basic is still taught in years 6-7 (of 10), and the reason for it is that little girls and boys can't understand anything harder than Basic :)
Even more, in my university we program in QBasic o_O omg you say? yeah, i'm shocked too :) oh, and they promise one semester of C++ in 4th year.. yay!
I am from India and GW-BASIC was my first language way back in 1995. It was fun. Things have changed now. My school now teaches another BASIC variant, QBASIC as the first language. Then students move to C++ and Java in standards 8,9,10. Hopefully, Python will take over sometime.
As someone already pointed out, it's plain inertia. It's not so much inexpensive hardware that is the reason; it's just the mindset of continuing to do whatever has been going on. Sigh.
I think GW-BASIC is a good tool to teach programming to children. I have been teaching programming to school children for about 10 years. GW-BASIC provides an easy-to-learn environment without going into technical details.
If we use some hi-fi programming language to teach kids, they will learn the programming language, not programming. Using GW-BASIC it is easy to teach programming, and we can concentrate on programming techniques rather than discussing the structures of programming languages. It has a very easy, English-like syntax, so students understand it easily.
Another thing to keep in mind is that it's an interpreter for BASIC, so we can execute instructions line by line and can execute any part of the program; this gives students a clear understanding.
Direct mode in GW-BASIC is a great help in explaining memory concepts, as we can monitor the changing states of variables (memory addresses and values).
As far as GW-BASIC is concerned I couldn't agree more. This is why a Ruby programmer known only as "_why the lucky stiff" created an amazing platform for learning to program called "Hackety Hack". He in fact had quite a lot of insight into teaching programming to young people at the Art & Code symposium:
http://vodpod.com/watch/2078103-art-code-symposium-hackety-hack-why-the-lucky-stiff-on-vimeo

How long does it take an experienced programmer to become proficient with a new technology / language? [closed]

I realize that the question is likely to get a lot of "it depends", but I am curious anyway. When you hire somebody new (but experienced) to the team, and they don't have expertise in the technology you are using, but know something similar, how much time do you budget for them to "get online."
I am talking about something fairly substantial, like a language, or a framework / product that has a lot of ways of doing things. Obviously, many libraries take very little time to start using.
In my own experience (10 years of experience, including a fair amount of consulting, so learning new technologies is par for the course), it takes me about three to six months of experience to become proficient at a new technology, and about a year to feel like I am approaching expert level where I know all the basics and medium-difficulty issues, along with a few areas very well.
What do you do in your projects? How do you budget the time to account for learning?
It doesn't only depend on the individual involved -- it crucially depends on the specific technology as well as the individual's background; certain technologies, esp. languages, are just harder and slower to get into. I've seen world-class Java gurus with zero previous exposure to C++ take many months, say on the order of six or so, to be fully productive in C++; vice versa (world-class C++ guru with zero previous exposure to Java) I've seen take about 2-3 months; again for extremely experienced and skilled programmers with no previous exposure to dynamic languages, being fully productive in Python can be expected to take 3-4 weeks. In each case I'm talking about 100% full-time involvement in the relevant technology, by a programmer in the world's top one percent in terms of skill and experience, within a team having several other programmers of that caliber who are also gurus in the specific language in use.
Factors that can shorten the time are previous exposure to "similar" languages/technologies, e.g. a solid background in C makes C++ slightly faster to learn, solid background in C# helps with Java, solid background in Ruby or Perl helps with Python. Factors that can lengthen the time include lack of suitably experienced teammates, not being 100%-immersed in the "new thing", and psychological resistance (not really wanting to do it with all one's heart!-).
I've focused on programming languages for my examples, but some technologies can be even harder, i.e., take longer to master -- if you've never written embedded hard-real-time programs (no dynamic allocation of memory allowed, proofs of an upper bound on response time required of all functions) even six months might not suffice; some application areas require mastery of application domains that, all on their own, can take even longer (if, to understand at all what's going on, and therefore be fully productive, you need the equivalent of a BSc in Psychology, or deep knowledge of the Law, or a CPA's qualifications, etc., well, each of those takes years on its own!).
I don't think the language as such is the issue, rather the programming paradigm it encompasses.
e.g. earlier this year I tried C#, coming from a Java perspective. That was all very straightforward. However, I'm now trying Scala. Because of the functional aspect, I expect to be learning and honing my skills for a lot longer (you can write Scala in an imperative fashion, but you don't leverage its strengths doing that).
I suspect the same would apply when (say) migrating from a relational database to an OO database, vs. a MS-SQL/Oracle migration.
It does depend, mainly on how closely the language resembles a language they already know, as well as individual abilities at picking up new things. Moving between similar languages like C++, Java, and C# is very easy. Similarly, moving from (say) Win32 to MFC to .net is going to be easier than from MFC to MacOS.
Moving from C to C++ is likely to take longer, as the programmer has to learn OO methodologies. Moving from C++ to Perl or ML could take a lot longer!
However, you usually don't need to know much to get started. Moving from C++ to C# can be done in a few hours reading (on the main differences) and then you can start writing (or modifying existing) code. That's because (a) you already know how to do OO programming, and (b) 95% of the syntax is identical.
But the main thing it depends on is your definition of "proficient". With similar languages, you will be able to write good code within a few days (an algorithm is usually fairly language-independent), but it usually takes months or years to become truly "proficient" in a language or large library.
So I'd say as a rule of thumb, "up to (a reasonable) speed" in a few weeks, but you might see silly "mistakes" or inefficiencies in their code for months/years until they learn all the little tricks of the language.
In the case of people learning OO, usually it seems to take a few days to get the basic concepts, and then at about the 2-year mark, a moment of epiphany occurs where the programmer suddenly realises that they truly "get" it. (I guess this is when your brain starts thinking fluently in OO rather than trying to think procedurally and then translate that into an OO approach.)
In our environment (US health care revenue cycle) it is more than just learning and becoming proficient in the language or technology stack we use to deliver our solutions to our customers. The developer also has to understand the problem domain. We work with entities that often don't document the behaviors of their systems well enough for external entities (us) to communicate with them and get the data that our customers want. Our developers are forced to think beyond the specs to build a functioning system.
There is also the inevitable "It doesn't work; fix it" problem report from the customer support staff. Frequently the problem isn't a defect in our software; it is an issue with other entities with which our software communicates. Our developers have to be able to identify (and sometimes prove) that it isn't our software so that our business analyst-types can go to that other entity and explain the issue in a way that will get them to resolve the problem.
You were expecting this answer but it all depends on the person/programmer. I have been in a situation where two equally skilled programmers had to pick up something new, one got it right away, while the other one took some time. Previous exposures to other technologies are also a factor.
Personally, in regard to something new, I budget my time to learning everything about it every chance I get. It would take about 6 months to fully be comfortable.
Hope this helps.
I am talking about something fairly substantial, like a language, or a framework / product
that has a lot of ways of doing things. Obviously, many libraries take very little time to start using.
When you hire somebody new (but experienced) to the team, and they don't have expertise in technology you
are using, but know something similar, how much time do you budget for them to "get online."
Twenty-three work days, six hours, forty-three minutes, and seventeen point nine seconds.
What do you do in your projects? How do you budget the time to account for learning?
I think these questions are better!
Try to find an easy project in the new technology, and have them do that. If possible, have the person start by fixing bugs, then adding small features.
Learning is incremental. One can continue learning details of, say, C++ syntax throughout one's life. When one is an "expert" in a topic, it just means that the gains from learning more in that topic are growing smaller.
+1 for it depends.
It depends on such things as
the attitude and capabilities of the person learning it
is the problem area programming/paradigm well understood by that person
the similarity of the new technology / language to other technologies he/she does know
the consistency of the new technology / language in its interface (API, grammar, etc...)
what counts as proficient (knowing just the language, or also the basic library, or also runtime behaviour (interactions with the underlying technology))
Having said that, in my experience a smart person learning a new language/technology will quickly be more productive than other people with more experience in that language/technology.
See Peter Norvig's Teach Yourself Programming in Ten Years for the related question of how long does it take to become proficient in programming.
It so completely depends on whether you already know languages that are similar to the new one, and know something about the problem domain the new language is suited for. I'd say don't expect to be reasonably proficient in less than 3-6 months, but again, it depends.
To take one example I implemented a PHP/MySQL web application a couple years ago (total effort was about 6 months). It was my first reasonably large web application, and my first PHP ever. I've used relational databases, but this was also my first exposure to MySQL. MySQL came very quickly, as expected, since it's really only a dialect of a language I knew well. What surprised me was that PHP also came quickly. I realized that not only did it borrow ideas from PERL and C/C++, but the whole paradigm of coding with integrated SQL statements strongly drew on some experience I had in the 90's with, of all things, Informix 4GL.
At the other end of the spectrum, I've never really learned a functional language, so I'm trying to pick up Scala. This is going to take substantially longer, and there'll be a long period where my Scala will feel like Java in disguise, and not be that functional.
So ... it depends! ;-)
I agree that it depends.
You also run the risk that if the person knows one technology/paradigm, they will code in the new language/technology using the old practices/paradigms.
For example, I picked up Python really fast (I'm a Java/C++ guy), but it took a long time before I stopped writing Java-style code in Python and started thinking functionally.
To get really good, I think there's no replacement for experience. For instance, I'm sure I can easily pick up J2EE, but the experience to build up the best enterprise systems is not something you can pick up that fast.

Minimum CompSci Knowledge Needed for Writing Desktop Apps

Having been a hobbyist programmer for 3 years (mainly Python and C) and never having written an application longer than 500 lines of code, I find myself faced with two choices:
(1) Learn the essentials of data structures and algorithm design so I can become a l33t computer scientist.
(2) Learn Qt, which would help me build projects I have been itching to build for a long time.
For learning (1), everyone seems to recommend reading CLRS. Unfortunately, reading CLRS would take me at least a year of study (or more, I'm not Peter Krumins). I also understand that to accomplish any moderately complex task using (2), I will need to understand at least the fundamentals of (1), which brings me to my question: assuming I use C++ as the programming language of choice, which parts of CLRS would give me sufficient knowledge of algorithms and data structures to work on large projects using (2)?
In other words, I need a list of theoretical CompSci topics absolutely essential for everyday application programming tasks. Also, I want to use CLRS as a handy reference, so I don't want to skip any material critical to understanding the later sections of the book.
Don't get me wrong here. Discrete math and the theoretical underpinnings of CompSci have been on my "TODO: URGENT" list for about 6 months now, but I just don't have enough time owing to college work. After a long time, I have 15 days off to do whatever the hell I like, and I want to spend these 15 days building applications I really want to build rather than sitting at my desk, pen and paper in hand, trying to write down the solution to a textbook problem.
(BTW, a less-math-more-code resource on algorithms will be highly appreciated. I'm just out of high school and my math is not at the level it should be.)
Thanks :)
This could be considered heresy, but the vast majority of application code does not require much understanding of algorithms and data structures. Most languages provide libraries which contain collection classes, searching and sorting algorithms, etc. You generally don't need to understand the theory behind how these work, just use them!
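For instance, here is a minimal Python sketch of that point: the standard library already covers the searching, sorting, and counting you would otherwise write by hand.

from collections import Counter
import bisect

scores = [42, 7, 19, 88, 7]
scores.sort()                         # library sort; no hand-rolled quicksort
print(scores)                         # [7, 7, 19, 42, 88]

pos = bisect.bisect_left(scores, 19)  # binary search from the library
print(pos)                            # 2

print(Counter("mississippi"))         # a ready-made frequency map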
However, if you've never written anything longer than 500 lines, then there are a lot of things you DO need to learn, such as how to write your application's code so that it's flexible, maintainable, etc.
For a less-math, more code resource on algorithms than CLRS, check out Algorithms in a Nutshell. If you're going to be writing desktop applications, I don't consider CLRS to be required reading. If you're using C++ I think Sedgewick is a more appropriate choice.
Try some online comp sci courses. Berkeley has some, as does MIT. Software engineering radio is a great podcast also.
See these questions as well:
What are some good computer science resources for a blind programmer?
https://stackoverflow.com/questions/360542/plumber-programmers-vs-computer-scientists#360554
Heed the wisdom of Don and just do it. Can you define the features that you want your application to have? Can you break those features down into smaller tasks? Can you organize the code produced by those tasks into a coherent structure?
Of course you can. Identify any 'risky' areas (areas that you do not understand, e.g. something that requires more math than you know, or special algorithms you would have to research) and either find another solution, prototype a solution, or come back to SO and ask specific questions.
Moving from 500 LOC to a real (even if small) application is not that easy.
As Don was pointing out, you'll need to learn a lot of things about code (flexibility, reuse, etc.); you need to learn some basics of configuration management as well (Visual SourceSafe, SVN?).
But the main issue is that you need a way to avoid being overwhelmed by your growing pile of functionality and code. That is not easy. What I can suggest is that you put in place something to 'automatically' test your code (even in a very basic way) via some regression tests. Otherwise it's going to be hard.
As you can see, I think it's not related at all to data structures, algorithms or whatever.
Good luck and let us know
I must say that sitting down with a dry old textbook and reading it through is not the way to learn how to do anything effectively, even if you are making notes. Doing it is the best way to learn, using the textbooks as a reference. Indeed, using sites like this as a reference.
As for data structures - learn which one is good for whatever situation you envision: Sets (sorted and unsorted), Lists (ArrayList, LinkedList), Maps (HashMap, TreeMap). Complexity of doing basic operations - adding, removing, searching, sorting, etc. That will help you to select an appropriate library data structure to use in your application.
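In Python terms (a rough sketch; the Java classes named above make similar trade-offs), the choice looks like this:

items = [3, 1, 2]    # list: keeps order; append is O(1), membership test O(n)
unique = {3, 1, 2}   # set: no duplicates; membership test O(1) on average
ages = {"ada": 36}   # dict (a hash map): O(1) average lookup by key

print(2 in items)    # True, but found by scanning the whole list
print(2 in unique)   # True, found by a hash lookup
print(ages["ada"])   # 36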
And also make sure you're reasonably comfortable with MVC - i.e., ensure your model is separate from your view (the Qt front-end) as best as possible. Best would be to have the model and algorithms working on their own, and then put the GUI on top. Or a unit test on top. Etc...
Good luck!
It's like saying you want to move to France, so should you learn French from a book, and what are the essential words - or should you just go to France and find out which words you need to know from experience and from copying the locals?
Writing code is part of learning computer science. I was writing code long before I'd even heard of the term, and lots of people were writing code before the term was invented.
Besides, you say you're itching to write certain applications. That can't be taught, so just go ahead and do it. Some things you only learn by doing.
(The theoretical foundations will just give you a deeper understanding of what you wind up doing anyway, which will mainly be copying other people's approaches. The only caveat is that in some cases the theoretical stuff will tell you what's futile to attempt - e.g. if one of your itches is to solve an NP complete problem, you probably won't succeed :-)
I would say the practical aspects of coding are more important. In particular, source control is vital if you don't use that already. I like bzr as an easy to set up and use system, though GUI support isn't as mature as it could be.
I'd then move on to one or both of the classics about the craft of coding, namely
The Pragmatic Programmer
Code Complete 2
You could also check out the list of recommended books on Stack Overflow.

What are the biggest time wasters for learning programming? [closed]

I've had several false starts in the past with teaching myself how to program. I've worked through several books (mostly C and Python), and ended up just learning the syntax without feeling as though I could sit down and actually write a program for myself. When I try to look through the source trees of a project on CodePlex or SourceForge, I never seem to know where to start reading the code -- the dependencies seem to go in all directions.
I feel as though I'm not learning programming the way it's done "on the street," so I figured I'd take a different approach to asking how a newbie should learn how to code. If you had to learn programming all over again, what are the things you wouldn't do? What did you spend time doing that you now know wasted you weeks or months?
Where I see beginners wasting weeks or months is typing at the keyboard. The computer is very responsive and will cheerfully chew up hours of your time in the edit-compile-run cycle. If you are learning you will save many hours if
You plan out your design on paper before you approach a computer. It doesn't matter what design method you pick or if you have never heard of a design method. Just write down a plan while your brain is fully engaged and not distracted by the computer.
When code will not compile or will not produce the right answer, if you can't fix it in five minutes, walk away from the computer. Go think about what's happening. Print out your code and scribble on it until you believe it's right.
These are just devices for helping to implement the simple but difficult old advice to think before you code.
When I was learning, I solved countless problems on the 15-minute walk from the computing center to my home. Sadly, with modern PCs we don't get that 15 minutes :-) If you can learn to take it anyway, you will become a better programmer, faster.
I certainly wouldn't start by looking at "real" software projects. Like you say, it's too hard to know where to start. That's largely because large projects are more about their large-scale design than about the individual algorithms or about program flow; for one thing, you're probably looking at a complex GUI application with multi-threading, etc. There isn't really anywhere to "start" looking at the code.
The best way to learn programming is to have a problem you want (need) to solve, and then going about solving it. But most importantly, WRITE CODE. When you read programming books, do ALL the exercises. Make sure you did them right. There's no substitute for writing code. No substitute for screwing up and then fixing it.
Stack Over F.. wait no, heh.
The biggest time-sinks for me are generally in respect to "finding the best answer." I often find that I will run into a problem that I know how to solve but feel that there is a better solution, and go on the hunt for it. It is only hours/days later, when I have 7 instances of Firefox, each containing at least 5 tabs sprawled out across 46" of monitor space, that I come to my senses and realize I've been caught in the black hole that is the pursuit of endless knowledge.
My advice to you, and myself for that matter, is to become comfortable with the notion of refactoring. Essentially what this means (in case you are not familiar with the term) is you come up with a solution for a problem and go with it, even if there is quite likely a better way of doing it. Once you have finished the problem, or even the program, you can then revisit your methodology, study it, and figure out where you can make changes to improve it.
This concept has always been hard for me to follow. In college I preferred to write a paper once, print it, and turn it in. Writing code can be thought of very similarly to writing a paper. Simply putting the pen to the pad and pushing out what's on your mind may work - but when you look back over it with a fresh pair of eyes you will, without question, see something you will wish you had done differently.
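A tiny made-up Python illustration of that revisit: both versions below produce the same result, but the second is what the fresh pair of eyes usually finds.

# First draft: works, but verbose
squares = []
for n in range(10):
    if n % 2 == 0:
        squares.append(n * n)
print(squares)  # [0, 4, 16, 36, 64]

# After refactoring: same result, clearer intent
squares = [n * n for n in range(10) if n % 2 == 0]
print(squares)  # [0, 4, 16, 36, 64]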
I just noticed you talked about reading through source trees of other people's projects. Reading other people's code is a wonderful idea, but you must read more selectively. A lot of open-source code is hard to read and not stuff you should emulate anyway. So avoid reading any code that hasn't been recommended by a programmer you respect.
Hint: Jon Bentley, Brian Kernighan, Rob Pike, and P. J. Plauger, who are all programmers I respect, have published a lot of code worth reading. In books.
The only way to learn how to program is to write more code. Reading books is great, but writing / fixing code is the best way to learn. You can't learn anything without doing.
You might also want to look at this book, How to Design Programs, for more of a perspective on design than details of syntax.
The only thing that I did that wasted weeks or months was worry about whether or not my designs were the best way to implement a particular solution. I know now that this is known as "premature optimization" and we all suffer from it to one degree or another. The right way to learn programming is to solve a problem, measure your solution to make sure it performs well enough, then move on to the next problem. After some time you'll have a pile of problems you've solved, but more importantly, you'll know a programming language.
There is excellent advice here, in other posts. Here are my thoughts:
1) Learn to type, the reasons are explained in this article by Steve Yegge. It will help more than you can imagine.
2) Reading code is generally considered a hard task. So, it is better to get an open source project, compile it, and start changing it and learn that way, rather than reading and trying to understand.
I can understand the situation you're in. Reading through books, even many of them, will not make you a programmer. What you need to do is START PROGRAMMING.
Actually, programming is a lot like swimming in my opinion. Even if you know only a little syntax and an even smaller number of coding techniques, start coding anyway. Make a small application, a home inventory, an expense catalog, a datesheet, a CD cataloger, anything you fancy.
The idea is to get into the nitty-gritties of it. Once you start programming you'll run into real-world problems and your problem solving skills will develop as you combat them. That's how you become a better programmer everyday.
So get into the thick of it, and swim right through... That's how you'll make it.
Good luck
I think this question will have wildly different answers for different people.
For myself, I tried C++ at one point (I was about ten and had already been programming for a while), with a click-and-drag UI builder. I think this was a mistake, and I should have gone straight to C and pointers and such. Because I'm just that kind of person.
In your case, it sounds like you want to be led down the right path by someone and feel a bit timid about jumping in and doing something by yourself. (You've read several books and now you're asking what not to do.)
I'll tell you how I learned: by doing plenty of fun, relatively short projects, steadily growing in difficulty. I began with QBasic (which I think is still a great learning tool) and it was there where I developed most of my programming skills. They have of course been expanded and refined since that time but I was already capable of good design back in those days.
The sorts of projects you could take on depend on your interests; if you're mathematically inclined you might want to try a prime number generator or projecting 3D points onto the screen; if you're interested in game design then you could try cloning pong (easy) or minesweeper (harder); or if you're more of a hacker you might want to make a simple chat program or file encryption software.
Work on these projects on your own, and don't worry about whether you're doing things the "right" way. As long as you get it to work, you've learned many things. Some time after you've completed a project you may want to revisit it and try to do it better, or just see how other people have done that sort of thing.
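For example, the prime number generator mentioned above can start as small as this Python sketch (a sieve of Eratosthenes) and grow from there:

def primes_up_to(limit):
    # Sieve of Eratosthenes: repeatedly cross off multiples of each prime
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
    return [n for n, prime in enumerate(is_prime) if prime]

print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]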
Given the way you seem to want to be led along, perhaps you should find yourself a mentor.
Do not learn how to use pointers and how to manually manage memory. You mentioned C, and I spent plenty of time trying to fix bugs that were caused by mixing *x and &x. This is evil...
Find some problem to solve, write or draw a sketch of an algorithm solving the problem, then try to write it. Either use Python (which is much more friendly for beginners) or use C with statically allocated memory only. And use books/tutorials. They offer multiple exercises with solutions, so you can compare yours with them and see other approaches.
Once you feel that you can actually write something simple, see some book/tutorial for object-oriented design. It's not the best the world has to offer, but it might turn out to be intuitive. If not, check out functional programming (languages like Lisp, Scheme or Haskell), or logic programming (like Prolog). Maybe those will suit you better.
Also - find some mate. A person you can talk to about coding, code maintenance and design. Such a person is worth even more than a book.
To all C fans: The C language is great, really. It allows memory usage optimization to an extent impossible in high-level languages such as Python or Ruby. The compiled code is also very fast, and it is the only choice for an RTOS or a modern 3D game engine. But it is not a good entry point for a beginner, that's what I believe.
Oh, and good luck to you! And don't be ashamed to ask! If you don't ask, the answer is much harder to find.
Assuming you have decent math skills, try http://projecteuler.net/. It presents a series of problems to solve of increasing difficulty that should be solvable by writing short programs. This should give you experience in solving specific problems without getting lost in the details of open source projects.
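As a taste of the format, the site's first problem (sum all the multiples of 3 or 5 below 1000) fits in one line of Python:

# Project Euler, problem 1
print(sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0))  # 233168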
After basic language syntax, you need to learn design. Which is hard. This book may help.
I think you should stop thinking you've wasted time so far -- instead I think your education is just incomplete, and you've taken a step you're not really ready for. It sounds like the books you've read are useful; you're learning the intricacies of the language. It sounds like you're just not accustomed to the tools you'd use to package that code together so it runs.
Some books that focus on topics like language syntax, design patterns, algorithms and data structures will never mention the tools you need to actually apply that information. These books are great, but if they're all you've touched, I think it would explain your situation.
What development environment are you using? If you're developing for windows you really should be proficient with creating projects, adding code, running and debugging in Visual Studio. You can download Visual Studio Express for free from Microsoft.
I recommend looking for tutorial like books that actually step you through the UI of development environment you are using. Look for actual screenshots with dropdown menus. Look at what the tutorials walk you through, and if its something you don't know how to do consider buying that book. Preferably it will have code you can copy'n'paste in, not code you write yourself.
I personally don't like these books, as I can anticipate how to do new things in VS based on how I'd do other things. But if your training is incomplete from a tools-usage perspective, this could move you in the right direction.
It is probably harder to find these types of tutorial books for Python or C development. There is an overabundance of them for .Net development though.
As someone who has only been working as a programmer for 6 months, I might not be the best person to help you get going, but since it wasn't that long ago when I knew next to nothing, its quite fresh in my mind.
When I started my current job programming wasn't going to be part of my job description but when the opportunity came up to do some programming on the side, I couldn't pass it up.
I spent about 1 month doing tutorials on About.com's Delphi section. As much as people diss about.com, Zarko Gajic's tutorials were simple to understand and easy to follow. Once I had a basic knack of the language and the IDE, I jumped straight into a project exporting accounting data for a program called "Adept". Took me a while but I got there...
The biggest help for me was taking on a personal project. I developed an IRC bot in Java for a crappy 2D game called Soldat. I learnt a lot by planning out and coding my own project.
Now I'm pretty comfortable with Delphi Pascal, SQL, C# and Java. I think, once you get the hang of one OOP language, you can learn the syntax of another language, and it gets a lot easier to catch on.
Perhaps start with a small existing project, and find some thing within it that handles some core part of what it does - then with a debugger, step through it and follow what it's doing from the point where you ask it to do that thing for you.
This helps you in a number of ways. You start to better grasp all of the various things that are touched by the code as it attempts to complete its request. Also, you learn invaluable debugging techniques which it seems like far too many developers lack - while you can often eventually discover what is wrong either with repeated printf() (or equivalent) calls, if you can debug you can solve issues an order of magnitude faster.
I have found that conceptually, a great mental model for understanding programming in the abstract is a pattern of data flow. When a user manipulates data, how is it altered by a program for digestion and storage? How is it transformed to re-present to the user in a form that makes sense to them? Fundamentally code is about transformation of data, and all code can be broken down into constructs of various sizes whose purpose is to alter data in one way or another, bugs forming around the mismatch between what the programmer was expecting from the data, how high level libraries the coder is using treat the data, and how the data actually arrives. Following code with a debugger helps you fully understand this transformation in action by observing changes as they occur.
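In Python, for instance, a single built-in breakpoint() call (available since Python 3.7) is enough to start that kind of stepping; the function below is a made-up example of a small data transformation you could walk through in the debugger:

def normalize(record):
    # Transform raw user input before storage
    name = record["name"].strip().title()
    breakpoint()  # pauses here in pdb: inspect 'name', step with 'n', continue with 'c'
    return {"name": name}

print(normalize({"name": "  ada lovelace "}))  # {'name': 'Ada Lovelace'}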
Standard answer is to make something; picking an easy language to do it in is good, but not essential. It's more the working out stuff in your own head, fixing it because it won't work, that really teaches you. For me, this always happens when I try my eternal dream projects (games) which I never finish but always learn from.
I think the thing I would avoid is learning a language in isolated snippets that don't really hang together but just teach various facets of a particular language. As others have said, the really hard and important thing is to learn design. I think the best way to do this is through a tutorial that walks you through creating an actual application, teaching design along the way. That way you can learn why certain decisions are made and learn how to accomplish what's needed to implement the design choices.
For example, I found Agile Web Development with Rails to be a really easy way to learn Ruby on Rails, much better than simply reading a Ruby manual or even poking my way around scattered web tutorials.
Another thing that I would avoid is developing code in isolation, that is, not having people look at it as I go along. Getting feedback from a mentor will help keep you on the right track with respect to the choices you are making and the correct use of language idioms.
Find a problem in your life or something you do that you just feel could be more efficient and write a small solution to it. It might just be a single script but you will gain much more confidence in your abilities when you start to see useful results of your work. You will also be more motivated to finish it as you are interested in using the solution. Start simple and small and then gradually move up to bigger projects.
And as you're working on a small project, focus on building everything with quality. I think this is lost on some programmers who feel that their software is more impressive if it contains a ton of features, but usually those features aren't well done or usable. If you focus on building quality solutions to real problems, you'll be a fantastic programmer.
Good luck!
Work on projects/problems that you already know how to solve partially
You should read Mike Clark's article: How I Learned Ruby. Essentially, he used the test framework for Ruby to exercise different elements of the language.
I used this technique to learn Python and it was very, very helpful. Not only did I learn the language, but I was very proficient in the test framework for Python at the end of the exercise. Once you have the basics you can start reading code and then working on building some larger project.
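A minimal sketch of the idea using Python's built-in unittest module (the test names are invented; each one pins down a single language behavior you are trying to learn):

import unittest

class LearnPythonTest(unittest.TestCase):
    def test_slicing_makes_a_copy(self):
        a = [1, 2, 3]
        b = a[:]          # a slice copies the list...
        b.append(4)
        self.assertEqual(a, [1, 2, 3])  # ...so the original is untouched

    def test_dict_get_returns_default(self):
        d = {"x": 1}
        self.assertEqual(d.get("y", 0), 0)  # missing key falls back to default

if __name__ == "__main__":
    unittest.main()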
