Is Test Driven Development good for a starter? [closed]

Expanding on my earlier question about how I learnt to get from a problem description to code: two people there mentioned TDD.
Would it be good for a starter to get into TDD (and avoid bad habits later)? Or would it be too complex at a stage when you're still working out what a programming language even is?

TDD is meant to be simpler than the "traditional" method (of not testing until the end), because the tests clarify your understanding of the problem. If you don't have a clear idea of what the problem is, writing tests is quite hard.
So for a beginner, writing tests gets the thinking juice going in the right direction, which is contractual behaviour, not implementation behaviour.
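To make that concrete, here's a minimal sketch of the idea in Python; the leap-year rule is just a stand-in problem picked for illustration. The tests state the contract first, and the implementation is written afterwards, only to satisfy them:

import unittest

# The tests pin down the contractual behaviour: what a leap year *is*,
# before any thought about how the check will be implemented.
class LeapYearTest(unittest.TestCase):
    def test_ordinary_leap_year(self):
        self.assertTrue(is_leap_year(1996))

    def test_century_is_not_leap(self):
        self.assertFalse(is_leap_year(1900))

    def test_every_400_years_is_leap(self):
        self.assertTrue(is_leap_year(2000))

# Written after the tests, and only enough to make them pass.
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

if __name__ == "__main__":
    unittest.main()

If you can't write those three asserts, you don't understand leap years yet - which is exactly the point.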

I wish TDD were around when I was first learning to program, and that I had picked it up before getting so entrenched in the 'old way' such that it's very difficult for me to learn TDD...

Experiencing TDD Rules All
I also think that ideally TDD would be very helpful in the early stages of learning. In hindsight I know it would have helped me approach problems in a completely different light.
What I'm perplexed about is that when one is learning, there are so many new concepts being absorbed that confusion can set in very early. Therefore, while I do think TDD would be super helpful, I don't think it's something that can be learned successfully on one's own.
Just like anything else in life we tend to learn best when somebody is physically teaching us. Showing us how they approach the problems in a TDD manner can do so much more than reading about it in books or on the web. I mean, this can't hurt but it's not a substitute for a mentor that can truly show you the ropes.
Experiencing TDD is everything so if you can have somebody teach you how to TDD during those early stages, I think learning as a whole would be accelerated beyond what anyone would expect.

def self.learn_tdd_and_programming_together?
  if you_have_tdd_mentor_sitting_next_to_you?
    "go for it"
  else
    if language.ruby?
      "it's possible, there is quite a bit of good stuff out
       there that could give you a chance of learning programming
       with TDD from the start. It's sort of in the ruby culture"
    elsif language.dot_net?
      "learn TDD after you learn the basics of .NET"
    end
  end
end

It's certainly a lot to take in, but having said that, I wish I had started out writing unit tests. What would have been really good is if I'd had a mentor at my workplace who could have guided my TDD progress. I've been teaching myself TDD on and off for about a year; there's a lot to cover, and the more you do it the more involved it gets, but it's really starting to pay off for me now.

I think this comment illustrates that it can be a very good thing for beginners to learn straight up.

My programming motto is:
Make it run -- the program solves the problem
Make it right -- the program is designed cleanly and there is a small amount of duplication
Make it fast -- optimized (if needed)
Test Driven Development handles the first two.
I think a beginner should be taught TDD so that he knows how to make programs run. IMHO, only then can good design techniques be taught.

I think yes. Studies even found that the benefits are largest for beginners. It gives you more guidance for writing the code. You know what the results and behavior should be, and write the tests. Then you write the code. Tests pass. You're done. And you know you're done.

Yes! Definitely.

I think it's not good for someone just learning programming. How will that person know what to assert? :P TDD is for design, not for testing. Once a person knows how to program, it'll be a good time to start studying the TDD approach.

First you need to understand how to code well. Read, study and practice until you have a good handle on it. Once you have that, look into test driven design - it's very powerful.

An important benefit of TDD is defining doneness. In simple algorithmic programming, if you come up with a couple of scenarios where correctness is easily asserted, it's easy to enumerate them in a unit test and keep coding until they all pass.
Sometimes unit testing can be hard for beginners, if there are many dependencies and you start to run into scenarios where mocking objects is necessary.
However, if you can make a simple statement about correctness, and it is easy to type out, then definitely write it down in code.
You may also note that if a simple statement of correctness is not easily described, you may not fully understand your problem.
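As a rough sketch of what "enumerating them in a unit test" can look like (Python; the slugify function is an invented example), doneness is simply "all the scenarios pass":

import unittest

# The scenario table *is* the definition of done: the function is
# finished when every case below passes.
CASES = [
    ("Hello World", "hello-world"),
    ("  extra  spaces  ", "extra-spaces"),
    ("Already-Slugged", "already-slugged"),
    ("", ""),
]

def slugify(text):
    return "-".join(text.lower().split())

class SlugifyDoneness(unittest.TestCase):
    def test_all_scenarios(self):
        for raw, expected in CASES:
            with self.subTest(raw=raw):
                self.assertEqual(slugify(raw), expected)

if __name__ == "__main__":
    unittest.main()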
Good luck...

It really depends on your definition of a "starter". If by "starter" you mean someone with absolutely no programming background, then no, I don't think TDD is a very good way to start out. A programmer needs to learn the basics (avoiding infinite loops, memory allocation, etc.) before worrying about refactoring and test driven development.

Code is code, whether it's the thing you're trying to spike out or a test.
Learning TDD at the very beginning has a lot of value. It's one of those skills that should become a habit. There are a lot of us out there who understand and like the value of TDD, but years of programming have instilled habits that can be hard to break later on.
As for whether TDD is about contract design, code implementation, or testing: it's all of those things. Will TDD bring you to the perfect code? No; experience and studying the craft will help you mature your coding approaches. But TDD is a very important tool for every developer.
The use of TDD will hopefully lead you to a design that is testable. And a design that is testable is, in theory, well encapsulated and should adhere to the open/closed principle.
In my opinion as long as people view TDD as something that's a niche tool or is somehow optional while writing code, those people obviously don't get the value of TDD.

Related

Why is GW-BASIC still taught in schools? [closed]

I dunno about USA and the UK, but in India, schools still teach GW-BASIC. Yes, it's:
10 PRINT "HELLO WORLD"
20 GOTO 10
As far as my experience goes, even writing assembler is easier than this mess of a language. It could easily be replaced by something like Python, which would make it easier for students to actually understand the basic concepts of programming and help them to understand the logic behind what they're doing better.
Because Basic is the most uhh... basic introduction into von-Neumann architecture which is what all modern computers and (by extension) programming languages are based on.
Think about it:
Line numbers = Memory Addresses
Variables = CPU Registers
Current Line = CPU Instruction Pointer
Goto = Jump instruction
Ever try teaching programming to someone with no idea what it's about?
I did for 4 years. For absolutely starting out, GWBASIC is pretty good. You can get the most action for the least effort, while still conveying basic ideas, like:
The computer finishes one statement before starting the next. (Newbies are inclined to think the computer does everything "at once".)
A program is like something built out of tinker-toys. There are only a few basic pieces, and you assemble them to make it do what you want. (Newbies often think since the language has words like IF and PRINT that it will just understand whatever they type in.)
Variables are a key concept. They have a name that you give them, and they have values that they get when the program runs. That's complicated. The name and the value are not the same thing, and there is a distinction between write-time and run-time.
Once you get past some basic concepts with the help of GWBASIC you can begin to introduce a more modern disciplined language.
GW-Basic was taught to me in 7th grade about 10 years ago. I found it was a great language and easy to experiment with as a beginner. Even the non-pc-freaks had little problem learning the language.
In my opinion it is a great tool to motivate beginners to learn more advanced programming languages.
GW-Basic is a great language for new programmers. If someone has never done any programming before, something simple like GW-Basic will be a lot easier for them to comprehend than something like Python. Also, Java has a lot better support for object-oriented programming than C++, and more commercial applications these days are written in Java than C++ [source]. Hence I would say it's a good thing they are switching to Java over C++.
As far as teaching in India is concerned and why they use GW-Basic, I can only guess (being from the USA):
It's cheap. Perhaps they have received old hardware with GW-Basic on it. Hey, it's there, it's free, why not use it to teach children.
The teacher knows it. If the teacher knows/understands it, he/she can teach it.
At a previous employer, I met a number of people who had immigrated to the USA from India and explained that the first time they worked with Windows was when they arrived over here; none of the schools (not even college/university) had it. It might depend on the school they went to, but maybe it's a matter of the available equipment. It's possible the GW-Basic usage you speak of works the same way: they used what technology they had.
Maybe it means they are, well, resourceful.
As to whether it's good that they are learning something so old, I'm not so sure it's such a good idea. But as the famous (American West) folk wisdom says, "Do with what you got. It'll pay off in the end." Better to expose them while they are young.
It's funny how fast humans forget.
Remember the first time you struggled with the concept of a loop? With the idea of a variable and how it retained values? With remembering syntax?
Basic has a relatively small built-in syntax, it has fairly flexible structures for loops and other constructs.
I guess over all it's "loose". This helps a lot in learning.
Loose is very bad for good, stable programs. You want very little flexibility, you want patterns that you can count on and very few options (even if you don't know that this is what you want, you will understand it as soon as you have to lead a team of 5 developers from another country).
If any of you haven't really considered it: the reason we don't like Basic isn't a lack of "power" or speed; it's because it's loose, the exact same reason it's good for teaching.
You don't start out running, you learn to crawl in a wobbly sort of way, then you stumble, etc.
But once you are running sprints, you really want to make sure that every footfall is placed exactly where you want it, and if the guy ahead of you decides he suddenly wants to start crawling, you're screwed.
Of course, if you're running along the track alone or in a small, in-sync team, it doesn't matter much what you do. Feel free to use any language you want :)
If someone is truly interested in programming, they will take what they learn in that class and apply it to a language learned on their own time.
There's also something to be said for starting in a language that is much less powerful than Java or C++.
so you'll learn NOT to use GOTO
That's easy to learn. Schools don't aim to teach new technology; they want to teach the basics of informatics.
I think in my school GW-Basic is still taught in years 6-7 (of 10), and the reason for it is that little girls and boys can't understand anything harder than Basic :)
Even more, in my university we program in QBasic o_O OMG, you say? Yeah, I'm shocked too :) Oh, and they promise one semester of C++ in 4th year... yay!
I am from India and GW-BASIC was my first language way back in 1995. It was fun. Things have changed now. My school now teaches another BASIC variant, QBASIC as the first language. Then students move to C++ and Java in standards 8,9,10. Hopefully, Python will take over sometime.
As someone already pointed out, it's plain inertia. It's not so much inexpensive hardware that's the reason; it's just the mindset of continuing with whatever has been going on. Sigh.
I think GW-BASIC is a good tool to teach programming to children. I have been teaching programming to school children for about 10 years. GW-BASIC provides an easy-to-learn environment without going into technical details.
If we use some hi-fi programming language to teach kids, they will learn the programming language, not programming. Using GW-BASIC it is easy to teach programming, and we can concentrate on programming techniques rather than discussing the structures of programming languages. It has a very easy, English-like syntax, so students understand it easily.
Another thing to keep in mind is that it is a BASIC interpreter, so we can execute instructions line by line and run any part of the program; this gives students a clear understanding.
GW-BASIC's direct mode is also a great help in explaining memory concepts, as we can monitor the changing states of variables (memory addresses and values).
As far as GW-BASIC is concerned I couldn't agree more. This is why a Ruby programmer known only as "_why the lucky stiff" created an amazing platform for learning to program called "Hackety Hack". He in fact had quite a lot of insight into teaching programming to young people at the Art & Code symposium:
http://vodpod.com/watch/2078103-art-code-symposium-hackety-hack-why-the-lucky-stiff-on-vimeo

Premature refactoring? [closed]

We have all heard of premature optimization, but what do you think about premature refactoring? Is there any such thing in your opinion? Here is what I am getting at.
First off, reading Martin Fowler's seminal work "Refactoring" quite literally changed my life in regards to programming.
One thing that I have noticed, however, is that if I start refactoring a class or framework too quickly, I sometimes find myself coded into a corner so-to-speak. Now, I suspect that the issue is not really refactoring per se, but maybe premature/poor design decisions/assumptions.
What are your thoughts, insights and/or opinions on this issue? Do you have any advice or common anti-patterns related to this issue?
EDIT:
From reading your answers and reflecting on this issue more, I think I have come to the realization that my problem in this case is really an issue of "premature design" and not necessarily "premature refactoring". I have been guilty of assuming a design and refactoring in that direction too early in the coding process. A little patience on my part, to maintain a level of design agnosticism and focus on refactoring towards clean code, would keep me from heading down these design rabbit trails.
I actually think the opposite.
The earlier you start thinking about whether or not your design needs refactoring, the better. Refactor constantly, so it's never a large issue.
I've also found that the more I refactor early on, the better I've gotten about writing code more cleanly up front. I tend to create fewer large methods, and have fewer problems.
However, if you find yourself "refactoring" yourself into a corner, I'd expect that is more a matter of lack of initial design or lack of planning for the scope of use of a class. Try writing out how you want to use the class or framework before you start writing the code - it may help you avoid that issue. This is also I think one advantage to test driven design - it helps you force yourself to look at using your object before it's written.
Remember, refactoring technically should NEVER lock you into a corner - it's about reworking the internals without changing how a class is used. If you're trapping yourself by refactoring, it means your initial design was flawed.
Chances are you'll find that, over time, this issue gets better and better. Your class and framework design will probably end up more flexible.
We have all heard of premature optimization, but what do you think about premature refactoring? Is there any such thing, in your opinion?
Yes, there is. Refactoring is a way of paying down technical debt that has accrued over the life of your development process. However, the mere accrual of technical debt is not necessarily a bad thing.
To see why, imagine that you are writing tax-return analysis software for the IRS. Suddenly, new regulations are introduced at the last minute which break several of your original assumptions. Although you designed well, your domain model has fundamentally shifted from under your feet in at least one important place. It's April 14th, and the project must go live tomorrow, come hell or high water. What do you do?
If you implement a nuts-and-bolts solution at the cost of some moderate technical debt, your system will become more rigid and less able to withstand another round of these changes. But the site can go live and proceed onward, and there will be no risk of delivering late; you're confident you can make the required changes.
On the other hand, if you take the time to refactor the solution so that it now supports the new design in more sophisticated and flexible way, you'll have no trouble adapting to future changes. But you run the risk of your company's flagship product running up against the clock; you're not sure if the redesign will take longer than today.
In this case, the first option is the better choice. Assuming you have little previous technical debt, it's worth it to take your lumps now and pay it down later. This is, of course, a business decision, and not a design one.
I think it is possible to refactor too early.
At the nuts and bolts end of design is the code itself. This final stage of the design comes in to existence as you code, it will at times be flawed, and you'll see that as the code evolves. If you refactor too early it makes it harder to change the flawed design.
For example, it's much easier to delete a single long function when you realise it's rubbish or going in the wrong direction than it is to delete a nice well-formed function and the functions it uses and the functions they use, etc., whilst ensuring you're not breaking something else that was part of the refactor.
It could be said that perhaps you should have spent more time designing, but a key element in an agile process is that coding is part of the design process and in most cases, having put some reasonable effort into design, it's better to just get on with it.
Edit: In response to comments:
Design isn't done until you've written code. We can't solve all problems in pre-coding design; the whole point of Agile is that coding is problem solving. If the non-code design solved all problems up front, before coding, there would be no need to refactor: we would simply convert the design to well-factored code in one step.
Anyone remember the late 1980s and early 1990s structured design methods, the ones where you got all the problems solved in clever diagrams before you wrote a line of code?
Premature refactoring is refactoring without unit-tests. You are at that point simply not ready for a refactoring. First get some unit-tests and then start thinking about refactoring. Otherwise you will (might) hurt the project more than help.
I am a strong believer in constant refactoring. There is no reason to wait until some specific time to start refactoring.
Anytime you see something that should be done better, Refactor.
Just keep this in mind: I know a developer (a pure genius) who refactors so much (he is so smart he can always find a better way) that he never finishes a project.
I think any "1.0" project is susceptible to this kind of... let's call it "iterative design". If you don't have a clear spec before you start designing your objects, you'll likely think of many designs and approaches to problems.
So, I think overcoming this specific problem is to clearly design things before you start writing code.
There are a couple of promising solutions to this type of problem, depending on the situation.
If the problem is that you decide something can be optimized in a certain way and you extract a method or something and realize that because of that decision, you are forced to code everything else in a convoluted way, the problem is probably that you didn't think far enough in the design process. If there had been a well written and planned spec, you would have known about this problem ahead of time (unless you didn't read the spec, but that's another issue :) )
Depending on the situation, rapid prototyping can also address this problem, since you'll have a better idea of these implementation details when you start working on the real thing.
The reason premature optimization is bad is that optimization usually leads to a worse design, unlike refactoring, which leads to a better and cleaner design if done thoughtfully and right. What I have found useful for analyzing whether a refactoring is worthwhile is first looking at our UML diagram to visualize the change, and then writing the code doc (e.g. Javadoc) for the class first and adding stubs ahead of any real code. Of course experience helps a lot with that; if in doubt, ask your favorite architect ;)

How to deal with those TDD breaking people? [closed]

Personally, I really like unit tests and write them for "good" coverage (let's say I try as hard as possible to write good tests ;)
As usual, some time later, someone else needs to add some features to the code (add methods to classes and so on). He doesn't break the existing unit tests, but refuses to write additional ones that would cover the new features he wrote.
This leads to a big hole in the TDD process (and even worse, maybe a broken-window effect).
Is there anything I can do to make him write those tests?
How do you deal with such people?
Remember that TDD isn't primarily about generating good unit test coverage; it's about motivating good design first, about ensuring that the code you write does what you expect second, and about providing a body of high quality tests third.
When another programmer extends a class without writing tests, they miss out on these benefits, and you should pity them. But when you work, you will continue to work the best way you know how (test first), because you know that with it you get decoupled code that is easy on the consumer, and that your code does what you expect.
The biggest pain for you is that you have to be careful about what you refactor: if you are refactoring code that is under tests, you can go fast, and design will quickly and safely improve. If you are refactoring code that is not tested, you should be extremely cautious about refactoring it (perhaps only using reliable automated tools to do so) or add the tests.
In the end, you will continue to benefit from your use of TDD, because you produce clearer, correct code, faster, while your TDD-impaired colleague will suffer.
Don't approach this as a confrontation! You're asking how to force a coworker to do something s/he clearly does not see any benefit to. You can't make someone use TDD - as you've already seen yourself. The only way a developer will embrace TDD is when someone else helps them reach that "aha!" moment. Be respectful as one colleague to another and show him/her through your actions and be positive in wanting to help him/her get over the mental hump.
Pair Programming. With two people working on something, programmers are much less likely to take shortcuts like this.
If you have a build process you could use a tool like NCover or PartCover and fail the build if the coverage isn't sufficient.
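The same idea as a rough sketch in Python, using coverage.py and pytest instead of the .NET tools (the package name "myproject" and the 80% threshold are made up for illustration):

import sys

import coverage
import pytest

THRESHOLD = 80.0  # assumed team policy, not a magic number

cov = coverage.Coverage(source=["myproject"])  # "myproject" is hypothetical
cov.start()
exit_code = pytest.main([])   # run the whole test suite in-process
cov.stop()
cov.save()

percent = cov.report()        # prints the table and returns total coverage %
if exit_code != 0 or percent < THRESHOLD:
    sys.exit(1)               # break the build

coverage.py's command line can also do this in one step with coverage report --fail-under=80.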
Aside from a company policy and repercussions from their manager, there's not much you can do about it. Maybe there's some way in your Source Control tool to require that anything public have a unit test that's flagged as such.
You could even write a macro that's part of your build process that looks for anything marked PUBLIC (I'm a VB guy), and then checks to ensure that, somewhere in the solution, there's a unit test with a code comment that sufficiently links it. Failing to have an associated unit test breaks the build and sends out an email to the whole dev group that sufficiently shames said non-tester.
Maybe I'll set that up here, now that I think about it...
Track code coverage with some tool, e.g. for Java there is Emma, and generate a report for management with each release. When numbers are too low or go down management should investigate the causes.
Lead by example. Your coworker may simply not understand how to use TDD appropriately. Next time it happens, write a unit test for them. Make sure to point this out to them: "Hey, I noticed you added x feature to the program without a unit test, so I wrote one for you and put it here." This way they have an example and won't feel embarrassed by having to ask how to unit test.
Only do this once or twice. After that, make sure to mention any future occurrences. You'd be surprised at the difference a polite "Hey, you didn't write a unit test for function y, it'd really help me out if you'd write one" will make. Remember, your goal isn't to try to make them write tests; it's to make writing tests less of a hassle than not writing them.
If the above doesn't work, it's time for a discussion with management. You've already tried to resolve the situation amicably, so it's time to consider a less-than-amicable approach.
Teach your co-workers how to do TDD, so that they can turn their brains upside-down (I had that feeling when I tried TDD the first time) and begin to write tests first.
Once I did an experiment with a programmer friend of mine who did not know TDD. I came to his house and we started writing Tetris using TDD (we spent about 6 hours that day and progressed nicely). First I wrote a test method, and then he wrote the code to pass the test. In the beginning he was slightly opposed to writing "the simplest thing that could possibly work" (such as hardcoding the return values in the first trivial tests) and to not planning much ahead, but anyway he sucked it up and followed my instructions. As we progressed, it appeared that he slowly began to understand what the point of it all was.
Play the video of the "Don't tase me bro!" guy as a warning

What are the biggest time wasters for learning programming? [closed]

I've had several false starts in the past with teaching myself how to program. I've worked through several books (mostly C and Python), and end up just learning the syntax without feeling as though I could sit down and actually write a program for myself. When I try to look through the source trees of a project on Codeplex or Sourceforge, I never seem to know where to start reading the code -- the dependencies seem to go in all directions.
I feel as though I'm not learning programming the way it's done "on the street," so I figured I'd take a different approach to asking how a newbie should learn how to code. If you had to learn programming all over again, what are the things you wouldn't do? What did you spend time doing that you now know wasted you weeks or months?
Where I see beginners wasting weeks or months is typing at the keyboard. The computer is very responsive and will cheerfully chew up hours of your time in the edit-compile-run cycle. If you are learning you will save many hours if
You plan out your design on paper before you approach a computer. It doesn't matter what design method you pick or if you have never heard of a design method. Just write down a plan while your brain is fully engaged and not distracted by the computer.
When code will not compile or will not produce the right answer, if you can't fix it in five minutes, walk away from the computer. Go think about what's happening. Print out your code and scribble on it until you believe it's right.
These are just devices for helping to implement the simple but difficult old advice to think before you code.
When I was learning, I solved countless problems on the 15-minute walk from the computing center to my home. Sadly, with modern PCs we don't get that 15 minutes :-) If you can learn to take it anyway, you will become a better programmer, faster.
I certainly wouldn't start by looking at "real" software projects. Like you say, it's too hard to know where to start. That's largely because large projects are more about their large-scale design than about the individual algorithms or about program flow; for one thing, you're probably looking at a complex GUI application with multi-threading, etc. There isn't really anywhere to "start" looking at the code.
The best way to learn programming is to have a problem you want (need) to solve, and then going about solving it. But most importantly, WRITE CODE. When you read programming books, do ALL the exercises. Make sure you did them right. There's no substitute for writing code. No substitute for screwing up and then fixing it.
Stack Over F.. wait no, heh.
The biggest time-sinks for me are generally to do with "finding the best answer." I often run into a problem that I know how to solve, but I feel there is a better solution and go on the hunt for it. It is only hours or days later, when I have 7 instances of Firefox, each containing at least 5 tabs, sprawled across 46" of monitor space, that I come to my senses and realize I've been caught in the black hole that is the pursuit of endless knowledge.
My advice to you, and to myself for that matter, is to become comfortable with the notion of refactoring. Essentially what this means (in case you are not familiar with the term) is that you come up with a solution to a problem and go with it, even if there is quite likely a better way of doing it. Once you have finished the problem, or even the program, you can then revisit your methodology, study it, and figure out where you can make changes to improve it.
This concept has always been hard for me to follow. In college I preferred to write a paper once, print it, and turn it in. Writing code can be thought of very similarly to writing a paper. Simply putting pen to pad and pushing out what's on your mind may work, but when you look back over it with a fresh pair of eyes you will, without question, see something you wish you had done differently.
I just noticed you talked about reading through source trees of other people's projects. Reading other people's code is a wonderful idea, but you must read more selectively. A lot of open-source code is hard to read and not stuff you should emulate anyway. So avoid reading any code that hasn't been recommended by a programmer you respect.
Hint: Jon Bentley, Brian Kernighan, Rob Pike, and P. J. Plauger, who are all programmers I respect, have published a lot of code worth reading. In books.
The only way to learn how to program is to write more code. Reading books is great, but writing / fixing code is the best way to learn. You can't learn anything without doing.
You might also want to look at this book, How to Design Programs, for more of a perspective on design than details of syntax.
The only thing I did that wasted weeks or months was worry about whether or not my designs were the best way to implement a particular solution. I know now that this is known as "premature optimization", and we all suffer from it to one degree or another. The right way to learn programming is to solve a problem, measure your solution to make sure it performs well enough, then move on to the next problem. After some time you'll have a pile of problems you've solved, but more importantly, you'll know a programming language.
There is excellent advice here, in other posts. Here are my thoughts:
1) Learn to type, the reasons are explained in this article by Steve Yegge. It will help more than you can imagine.
2) Reading code is generally considered a hard task. So, it is better to get an open source project, compile it, and start changing it and learn that way, rather than reading and trying to understand.
I can understand the situation you're in. Reading through books, even many of them, will not make you a programmer. What you need to do is START PROGRAMMING.
Actually, programming is a lot like swimming, in my opinion. Even if you know only a little syntax and even fewer coding techniques, start coding anyway. Make a small application: a home inventory, an expense catalog, a datesheet, a CD cataloger, anything you fancy.
The idea is to get into the nitty-gritties of it. Once you start programming you'll run into real-world problems and your problem solving skills will develop as you combat them. That's how you become a better programmer everyday.
So get into the thick of it, and swim right through... That's how you'll make it.
Good luck
I think this question will have wildly different answers for different people.
For myself, I tried C++ at one point (I was about ten and had already been programming for a while), with a click-and-drag UI builder. I think this was a mistake, and I should have gone straight to C and pointers and such. Because I'm just that kind of person.
In your case, it sounds like you want to be led down the right path by someone and feel a bit timid about jumping in and doing something by yourself. (You've read several books and now you're asking what not to do.)
I'll tell you how I learned: by doing plenty of fun, relatively short projects, steadily growing in difficulty. I began with QBasic (which I think is still a great learning tool) and it was there where I developed most of my programming skills. They have of course been expanded and refined since that time but I was already capable of good design back in those days.
The sorts of projects you could take on depend on your interests; if you're mathematically inclined you might want to try a prime number generator or projecting 3D points onto the screen; if you're interested in game design then you could try cloning pong (easy) or minesweeper (harder); or if you're more of a hacker you might want to make a simple chat program or file encryption software.
Work on these projects on your own, and don't worry about whether you're doing things the "right" way. As long as you get it to work, you've learned many things. Some time after you've completed a project you may want to revisit it and try to do it better, or just see how other people have done that sort of thing.
Given the way you seem to want to be led along, perhaps you should find yourself a mentor.
Do not learn how to use pointers and how to manually manage memory. You mentioned C, and I spent plenty of time trying to fix bugs that were caused by mixing *x and &x. This is evil...
Find some problem to solve, write or draw a sketch of an algorithm solving it, then try to write it. Either use Python (which is much more friendly for beginners) or use C with statically allocated memory only. And use books/tutorials; they offer multiple exercises with solutions, so you can compare yours with them and see other approaches.
Once you feel that you can actually write something simple, find a book or tutorial on object-oriented design. It's not the best the world has to offer, but it might turn out to be intuitive. If not, check out functional programming (languages like Lisp, Scheme or Haskell), or logic programming (like Prolog). Maybe those will suit you better.
Also - find some mate. A person you can talk to about coding, code maintenance and design. Such person is worth even more than a book.
To all C fans: the C language is great, really. It allows memory-usage optimization to an extent impossible in high-level languages such as Python or Ruby. The compiled code is also very fast, and it's the only choice for an RTOS or a modern 3D game engine. But it's not a good entry point for a beginner; that's what I believe.
Oh, and good luck to you! And don't be ashamed to ask! If you don't ask, the answer is much harder to find.
Assuming you have decent math skills, try http://projecteuler.net/ . It presents a series of problems of increasing difficulty that should be solvable by writing short programs. This should give you experience in solving specific problems without getting lost in the details of open source projects.
After basic language syntax, you need to learn design. Which is hard. This book may help.
I think you should stop thinking you've wasted time so far; instead, I think your education is just incomplete, and you've taken a step you're not really ready for. It sounds like the books you've read are useful: you're learning the intricacies of the language. It sounds like you're just not accustomed to the tools you'd use to package that code together so it runs.
Some books that focus on topics like language syntax, design patterns, algorithms and data structures never mention the tools you need to actually apply that information. These books are great, but if they're all you've touched, I think that would explain your situation.
What development environment are you using? If you're developing for windows you really should be proficient with creating projects, adding code, running and debugging in Visual Studio. You can download Visual Studio Express for free from Microsoft.
I recommend looking for tutorial like books that actually step you through the UI of development environment you are using. Look for actual screenshots with dropdown menus. Look at what the tutorials walk you through, and if its something you don't know how to do consider buying that book. Preferably it will have code you can copy'n'paste in, not code you write yourself.
I personally don't like these books, as I can anticipate how to do new things in VS based on how I'd do other things. But if your training is incomplete from a tools-usage perspective, this could move you in the right direction.
It is probably harder to find these types of tutorial books for Python or C development. There is an overabundance of them for .Net development though.
As someone who has only been working as a programmer for 6 months, I might not be the best person to help you get going, but since it wasn't that long ago when I knew next to nothing, its quite fresh in my mind.
When I started my current job programming wasn't going to be part of my job description but when the opportunity came up to do some programming on the side, I couldn't pass it up.
I spent about 1 month doing tutorials on About.com's Delphi section. As much as people diss about.com, Zarko Gajic's tutorials were simple to understand and easy to follow. Once I had a basic knack of the language and the IDE, I jumped straight into a project exporting accounting data for a program called "Adept". Took me a while but I got there...
The biggest help for me was taking on a personal project. I developed an IRC bot in Java for a crappy 2D game called Soldat. I learnt a lot by planning out and coding my own project.
Now I'm pretty comfortable with Delphi Pascal, SQL, C# and Java. I think, once you get the hang of one OOP language, you can learn the syntax of another language, and it gets a lot easier to catch on.
Perhaps start with a small existing project, and find some thing within it that handles some core part of what it does - then with a debugger, step through it and follow what it's doing from the point where you ask it to do that thing for you.
This helps you in a number of ways. You start to better grasp all of the various things that are touched by the code as it attempts to complete its request. Also, you learn invaluable debugging techniques which it seems like far too many developers lack - while you can often eventually discover what is wrong either with repeated printf() (or equivalent) calls, if you can debug you can solve issues an order of magnitude faster.
I have found that conceptually, a great mental model for understanding programming in the abstract is a pattern of data flow. When a user manipulates data, how is it altered by a program for digestion and storage? How is it transformed to re-present to the user in a form that makes sense to them? Fundamentally code is about transformation of data, and all code can be broken down into constructs of various sizes whose purpose is to alter data in one way or another, bugs forming around the mismatch between what the programmer was expecting from the data, how high level libraries the coder is using treat the data, and how the data actually arrives. Following code with a debugger helps you fully understand this transformation in action by observing changes as they occur.
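As a tiny illustration of that stepping-through habit in Python (the function and input are invented), you can pause anywhere and watch the data transform:

# At the (Pdb) prompt: n = next line, s = step into a call,
# p <name> = print a variable, c = continue running.
def parse_row(line):
    breakpoint()              # drops into the built-in debugger (Python 3.7+)
    fields = line.strip().split(",")
    return [field.title() for field in fields]

print(parse_row("  alice,bob,carol "))

Stepping over the two lines inside parse_row and printing fields at each stop shows exactly how the raw string becomes a list - the same data-flow view, but live.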
Standard answer is to make something; picking an easy language to do it in is good, but not essential. It's more the working out stuff in your own head, fixing it because it won't work, that really teaches you. For me, this always happens when I try my eternal dream projects (games) which I never finish but always learn from.
I think the thing I would avoid is learning a language in isolated snippets that don't really hang together but just teach various facets of a particular language. As others have said, the really hard and important thing is to learn design. I think the best way to do this is through a tutorial that walks you through creating an actual application, teaching design along the way. That way you can learn why certain decisions are made and learn how to accomplish what's needed to implement the design choices.
For example, I found Agile Web Development with Rails to be a really easy way to learn Ruby on Rails, much better than simply reading a Ruby manual or even poking my way around scattered web tutorials.
Another thing that I would avoid is developing code in isolation, that is, not having people look at it as I go along. Getting feedback from a mentor will help keep you on the right track with respect to the choices you are making and the correct use of language idioms.
Find a problem in your life or something you do that you just feel could be more efficient and write a small solution to it. It might just be a single script but you will gain much more confidence in your abilities when you start to see useful results of your work. You will also be more motivated to finish it as you are interested in using the solution. Start simple and small and then gradually move up to bigger projects.
And as you're working on a small project, focus on building everything with quality. I think this is lost on some programmers who feel their software is more impressive if it contains a ton of features, but usually those features aren't well done or usable. If you focus on building quality solutions to real problems, you'll be a fantastic programmer.
Good luck!
Work on projects/problems that you already know how to solve partially
You should read Mike Clark's article "How I Learned Ruby". Essentially, he used the test framework for Ruby to exercise different elements of the language.
I used this technique to learn Python, and it was very, very helpful. Not only did I learn the language, but I was very proficient in Python's test framework at the end of the exercise. Once you have the basics, you can start reading code and then work on building some larger project.
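A sketch of the same technique in Python (the cases are arbitrary ones picked for illustration): each test pins down one piece of language behaviour you want to verify for yourself:

import unittest

# Learning the language by asserting what you *think* it does;
# a failing test means your mental model was wrong.
class PythonKoans(unittest.TestCase):
    def test_string_slicing(self):
        self.assertEqual("hello"[1:3], "el")

    def test_dict_get_with_default(self):
        self.assertEqual({}.get("missing", 42), 42)

    def test_integer_division_floors(self):
        self.assertEqual(-7 // 2, -4)

if __name__ == "__main__":
    unittest.main()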

How do you prevent over complicated solutions or designs? [closed]

Many times we find ourselves working on a problem, only to figure out that the solution being created is far more complex than the problem requires. Are there controls, best practices, techniques, etc. that help you keep over-complication in check in your workplace?
Getting someone new to look at it.
In my experience, designing for an overly general case tends to breed too much complexity.
Engineering culture encourages designs that make fewer assumptions about the environment; this is usually a good thing, but some people take it too far. For example, it might be nice if your car design doesn't assume a specific gravitational pull, but nobody is actually going to drive your car on the moon; and if they did, it wouldn't work, because there is no oxygen to make the fuel burn.
The difficult part is that the guy who developed the "works-on-any-planet" design is often regarded as clever, so you may have to work harder to argue that his design is too clever.
Understanding trade-offs, so you can make the decision between good assumptions and bad assumptions, will go a long way into avoiding a needlessly complicated design.
If it's too hard to test, your design is too complicated. That's the first metric I use.
Here are some ideas to get design more simpler:
read some programming books and articles, and then apply them in your work and write code
read lots of code (good and bad) written by other people (like Open Source projects) and learn to see what works and what does not
build safety nets (unit tests) to enable experimentations with your code
use version control to enable rollback, if those experimentations take wrong turn
TDD (test driven development) and BDD (behaviour driven development)
change your attitude, ask how you can make it so, that "it simply works" (convention over configuration could help there; or ask how Apple would do it)
practice (like jazz players -- jam with code, try Code Kata)
write same code multiple times, with different languages and after some time has passed
learn new languages with new concepts (if you use static language, learn dynamic one; if you use procedural language, learn functional one; ...) [one language per year is about right]
ask someone to review your code and actively ask how you can make your code simpler and more elegant (and then make it so)
get years under your belt by doing above things (time helps active mind)
I create a design, and then I look at it and try to remove (aggressively) everything that doesn't seem to be needed. If it turns out I need something later, when I am polishing the design, I add it back in. I do this over several iterations, refining as I go along.
Read "Working Effectively With Legacy Code" by Michael C. Feathers.
The point is, if you have code that works, and you need to change the design, nothing works better than making your code unit testable, and breaking your code into smaller pieces.
Using Test Driven Development and following Robert C. Martin's Three Rules of TDD:
You are not allowed to write any production code unless it is to make a failing unit test pass.
You are not allowed to write any more of a unit test than is sufficient to fail; and compilation failures are failures.
You are not allowed to write any more production code than is sufficient to pass the one failing unit test.
In this way you are not likely to get much code that you don't need. You will always be focused on making one important thing work and won't ever get too far ahead of yourself in terms of complexity.
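To make the rhythm concrete, here is one such micro-cycle sketched in Python (the total() example is invented); the comments mark which rule drives each step:

import unittest

# Rule 2: write only enough test to fail.
#     def test_two_items(self): ...    # fails: total() doesn't exist yet
# Rule 3: write only enough production code to pass.
#     def total(xs): return 3          # hard-coded, but the test is green
# Rule 1: no more production code without a new failing test, so
# test_empty_list is added next, which forces the real implementation:

def total(xs):
    return sum(xs)  # generalized only once a second test demanded it

class TotalTest(unittest.TestCase):
    def test_two_items(self):
        self.assertEqual(total([1, 2]), 3)

    def test_empty_list(self):
        self.assertEqual(total([]), 0)

if __name__ == "__main__":
    unittest.main()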
Test-first may help here, but it is not suitable for every situation. And it's not a panacea anyway.
Start small is another great idea. Do you really need to stuff all 10 design patterns into this thing? Try first to do it "stupid way". Doesn't quite cut it? Okay, do it "slightly less stupid way". Etc.
Get it reviewed. As someone else wrote, two pairs of eyes are better. Even better are two brains. Your mate may just see a room for simplification, or a problematic area you thought was fine just because you spend many hours hacking it.
Use a lean language. Languages such as Java, or sometimes C++, seem to encourage nasty, convoluted solutions. Simple things tend to span multiple lines of code, and you just need to use 3 external libraries and a big framework to manage it all. Consider using Python, Ruby, etc. - if not for your project, then for some private use. It can change your mindset to favor simplicity, and to be assured that simplicity is possible.
Reduce the amount of data you're working with by serialising the task into a series of smaller tasks. Most people can only hold half a dozen (plus or minus) conditions in their head while coding, so make that the unit of implementation. Design for all the tasks you need to accomplish, but then ruthlessly hack the design so that you never have to play with more than half a dozen paths though the module.
This follows from Bendazo's post - simplify until it becomes easy.
It is inevitable, once you have been a programmer for a while, that this will happen. If you have seriously underestimated the effort, or hit a problem where your solution just doesn't work, then stop coding and go talk to your project manager. I always like to take the options with me to the meeting: the problem is A; we can do x, which will take 3 days, or we can try y, which will take 6 days. Don't make the choice yourself.
Talk to other programmers every step of the way. The more eyes there are on the design, the more likely an overcomplicated aspect is revealed early, before it becomes too ossified in the codebase.
Constantly ask yourself how you will use whatever you are currently working on. If the answer is that you're not sure, stop to rethink what you're doing.
I've found it useful to jot down thoughts about how to potentially simplify something I'm currently working on. That way, once I actually have it working, it's easier to go back and refactor or redo as necessary instead of messing with something that's not even functional yet.
This is a delicate balancing act: on the one hand you don't want something that takes too long to design and implement, on the other hand you don't want a hack that isn't complicated enough to deal with next week's problem, or even worse requires rewriting to adapt.
A couple of techniques I find helpful:
If something seems more complex than you would like then never sit down to implement it as soon as you have finished thinking about it. Find something else to do for the rest of the day. Numerous times I end up thinking of a different solution to an early part of the problem that removes a lot of the complexity later on.
In a similar vein have someone else you can bounce ideas off. Make sure you can explain to them why the complexity is justified!
If you are adding complexity because you think it will be justified in the future then try to establish when in the future you will use it. If you can't (realistically) imagine needing the complexity for a year or three then it probably isn't justifiable to pay for it now.
I ask my customers why they need some feature. I try and get to the bottom of their request and identify the problem they are experiencing. This often lends itself to a simpler solution than I (or they) would think of.
Of course, if you know your clients' work habits and what problems they have to tackle, you can understand their problems much better from the get-go. And if you "know them" know them, then you understand their speech better. So, develop a close working relationship with your users. It's step zero of engineering.
Take time to name the concepts of the system well, and find names that are related, this makes the system more familiar. Don't be hesitant to rename concepts, the better the connection to the world you know, the better your brain can work with it.
Ask for opinions from people who get their kicks from clean, simple solutions.
Only implement concepts needed by the current project (a desire for future proofing or generic systems make your design bloated).
