There are a lot of homework questions here on SO.
I would guess that 90%+ can be solved by stepping through the code in a debugger, and observing program/variable state.
I was never taught to use a debugger. I simply printed out the GDB manual, read it, and stepped through its examples. When I used Visual Studio for the first time, I remember thinking, wow, how much simpler can this be: click to set a breakpoint, mouse over a variable for its value, press a key to step, the Immediate window, Debug.Print, etc...
At any rate, are students "taught" to use a debugger? If not, why not? (Perhaps a better question is, why can't they learn to use a debugger themselves... maybe they need to be told that there is such a tool that can help them...)
How long does it take to learn to use a debugger?
I don't think the problem is teaching. Using a modern graphical debugger is not rocket science (at least not for most user-mode programs running on a single computer). The problem is with the attitudes of some people. In order to use a debugger effectively, you should:
Admit it's your fault and select isn't broken.
Have the perseverance to spend a couple of nights debugging, without forgetting the previous point.
There's no specific algorithm to follow. You have to make educated guesses and reason effectively from what you see.
Not many non-programmers have these attitudes. At college, I have seen many friends who give up after a relatively short period of time and bring me some code and tell me the computer is doing something wrong. I usually tell them I trust their computer more than them (and this hurts some feelings, but that's the way it is).
In my high school and university, most of the people in the classes didn't really care about programming at all.
If by students you mean Computer Science students, I think the answer is fairly obvious. The subject matter for courses is generally theory, with the programming language / framework / library there as an aid. The professor can't go very far in depth on a particular tool, since it would take away from time he is teaching networking or systems or whatever. Maybe if there were a course called "Real World Programming" or something like that, they'd cover debuggers, but in general I don't see too much wrong with expecting students to read the language / tool documentation in order to accomplish the coursework.
Debuggers were introduced in my second year Intro to C course, if I recall correctly. Of course the problem most students were struggling with at that point was getting their work to compile, which a debugger will not help with. And once their ten line command line program compiles and then crashes, well, they already have some printfs right there. Fighting to master GDB is overkill.
In my experience, it's fairly rare to actually deal with a code base large enough to make more than a cursory familiarization with a debugger worth the time investment in most Comp. Sci curriculums. The programs are small and the problems you face are more along the lines of figuring out the time-space complexity of your algorithm.
Debuggers become much more valuable on real world projects, where you have a lot of code written by different people at different times to trace through to figure out what keeps frotzing foo before the call to bar().
This is a good question to ask to the faculty at your school.
At my university, they gave a very brief example of debugging, then pointed us to the "help" files and the books.
Perhaps they don't teach it because there is sooo much stuff to cover and so little time for the lecturers. The professors aren't going to hold everybody's hand.
Not entirely related, but people need to use debuggers not just for debugging but to understand working code.
I'll put in a cautionary note on the other side. I learned to program with Visual Basic and Visual C (mid 80s), and the debuggers were built-in and easy to use. Too easy, in fact... I generally didn't think about how to solve a problem, I just ran it in the debugger and adjusted the behavior. Oh, that variable is one too high... I must have to subtract one here!
It wasn't until I switched to Linux, with the not-quite-as-easy gcc/gdb combo, that I began to appreciate design and thinking about your code first.
I'll admit, I probably go too far the other way now. I use a debugger to analyze stack traces and that's about it. There should be a middle ground between analyzing the problem and stepping through it in a debugger. Certainly people should be shown all the tools available to them.
I was taught to use a debugger in college. Not much, and late (it should be almost the second thing taught), but they did teach me.
Anyway, it's important to teach DEBUGGING, not only "using a debugger". There are situations where you can't debug with gdb (e.g., try debugging a program running 10 concurrent threads) and you need a different approach, like the old-fashioned printf.
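For instance, here is a minimal sketch of that printf-style approach for threaded code in C (my own illustration; the worker function and thread count are made up). Writing to stderr avoids buffering surprises, and tagging each line with a thread id is what makes the interleaved output readable afterwards:

#include <pthread.h>
#include <stdio.h>

/* Hypothetical worker; the point is the tagged trace output. */
static void *worker(void *arg) {
    int id = *(int *)arg;
    for (int step = 0; step < 3; step++) {
        /* stderr is unbuffered, and the [thread N] tag lets you
           untangle the interleaved lines after the run. */
        fprintf(stderr, "[thread %d] step %d\n", id, step);
    }
    return NULL;
}

int main(void) {
    pthread_t threads[10];
    int ids[10];
    for (int i = 0; i < 10; i++) {
        ids[i] = i;
        pthread_create(&threads[i], NULL, worker, &ids[i]);
    }
    for (int i = 0; i < 10; i++)
        pthread_join(threads[i], NULL);
    return 0;
}

Build with gcc -pthread; with ten threads the output interleaves differently on every run, which is exactly why a stepping debugger struggles here and a trace log does not.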
I can certainly agree with you that one usually learns and makes use of debugging techniques much later than the first time one could have used them.
From a practicality standpoint, most likely (due to policy or technical restrictions) you cannot use a debugger on a production application. Not using the debugger as too much of a crutch promotes adding the proper amount of logging to your application.
Because there is no textbook on debugging, period.
In fact, it is very hard to create a teaching situation where students get an incentive to use a debugger. Typical assignments are too simple to really require one. Greg Wilson raised that topic at last year's SUITE workshop, and the consensus was that it is very hard to get students to use a debugger. You can tell them about debugging, but creating a situation where they will actually feel the pain that makes resorting to the debugger worthwhile is hard.
Maybe a lecture on game cracking could motivate students to step through the game with a debugger? At least, that is how I taught myself to use a debugger as a 12-year-old :)
I have found that there is a lot of negative attitude towards debuggers among academics and seasoned systems programmers. I came up against one quite talented programmer who claimed that "debuggers don't work; I just use log files." Fair enough: for multi-threaded server apps you must have logging, but there's no denying a debugger is useful for the 99% of code that is not multi-threaded.
In answer to your question: yes, debuggers should be covered in the programming syllabus, but as one of the tools for debugging a program. Tracing and logging are important as well.
Having recruited at four universities (Rensselaer, Purdue, Ohio State, University of Washington), I've found that students who write code for money in incubators associated with their university tend to learn the art of debugging really well, because the incubation companies want people who solve problems in fewer hours, and they invest some time in teaching good debugging techniques. Depending on the sophistication of the particular incubator company, they might also invest in mentoring on patterns and performance to help the student be more productive for them, but debugging is often the first investment.
Left to the traditional CS classes, students don't seem to walk away with the same set of skills that help them narrow down a problem, manipulate the data while the program/service/page/site/component is running, and really understand the implications of what they've written vs. what they needed to write to make it right.
I went to Rensselaer and I "learned on my own" because I was paid a flat rate for some projects and I wanted to minimize my own time spent programming - and it was further reinforced by working as an intern at Microsoft in 1994, where I got to see how useful an integrated dev environment really was.
Wagering a hypothesis based on my experience as a TA and aspiring CS prof:
It would actually confuse the kids who have little to no programming experience more than it would help.
Now first of all: I completely agree that teaching how to use a debugger would be a great thing, but I think the barrier to doing so stems from the greater, systematic problem that software engineering and computer science are not separate majors. Most CS programs will require 2-4 classes where learning to code is the focus. After these, coding ability is required but not the topic of the class.
To my main point: It's very hard to teach something under the guise of "you don't get this now, but do it because it'll be useful later." You can try, but I don't think it really works. This is an extension of the idea that people only really learn by doing. Going through the motions without understanding why is not the same as doing.
I think kids learning to code for the first time aren't going to understand why using a debugger is more effective than inserting print lines. Think about a small to medium sized script you code: would you use the debugger barring odd behavior or some bug you couldn't work out quickly? I wouldn't, seems like it would just slow me down. But, when it comes to maintaining the huge project I work on every day, the debugger is invaluable beyond a doubt. By the time students get to the portion of the curriculum that requires big projects, they're not in a class that focuses on general coding anymore.
And all of this brings me to my awesome idea that I think every CS prof should use when teaching how to code: instead of exclusively asking for projects from the kids, every now and then give them a big piece of complex code and ask them to fix the bugs. This would teach them how to use a debugger.
In high school we were taught to debug by writing stuff out to the console.
In college, we were taught a mix of that plus using a debugger.
The tools have only gotten easier to use, so I am really not sure why it is not taught.
I was taught in my first CS class how to use a debugger. It didn't do me much good to be setting breakpoints and stepping through my code when most of what I wrote was "Hello World!". I pretty much ignored the debugger from that point on until I learned to use GDB in a much more advanced course while working on a "binary bomb" homework assignment.
Since I've been out of school and working I've spent a lot more time using a debugger and learning how useful it can be. I would say that learning to use a debugger includes three things - a need for one, being taught/learning how to use one, and experience knowing how to use one to your advantage.
Also, spending some time learning "echo debugging" can be worthwhile for those situations when a debugger isn't available/necessary. That's just my $0.02.
There are more than a few questions here, to my mind - some asked outright, and a few that I'd infer.
I was taught in BASIC and Pascal initially, usually with an interpreter that made it easier to run the program till something blew up. We didn't have breakpoints or many of the fancy things there are now for tracing through code, though this would have been from 1983-1994 using a Commodore 64, Watcom BASIC, and Pascal on a Mac.
Even in my later university years, we didn't have a debugger. If our code didn't work, we used print statements or did manual tracing; time-wise, this would have been 1995-1997.
One caveat with a debugger like Visual Studio's: do you have any idea how long it could take to go through every feature it has for debugging? That could take years in some cases, I think. And that is without getting into all the build options and other things one might use eventually. Another point is that for all the good things a debugger gives you, there is something to be said for how complex it can get; e.g., at a breakpoint in VS there are the call stack, local variables, watch windows, memory, disassembly, and other things one might want to examine while execution is halted.
The basics of using a debugger could be learned in a week or so, I think. However, mastering what a debugger does - how much goes on when code is executing, and where it is executing, since these days things can run in multiple places, like GPUs alongside the CPU - would take a lot longer, and I'd question how many people have that kind of drive, even in school.
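To make those week-one basics concrete, here is a sketch of a first GDB session (my own example; the toy program is hypothetical, but the commands are standard GDB):

/* buggy.c - a deliberately broken toy program to practice on.
 *
 * A typical first session:
 *   gcc -g buggy.c -o buggy
 *   gdb ./buggy
 *   (gdb) break sum        stop whenever sum() is entered
 *   (gdb) run              start the program
 *   (gdb) next             execute one source line, then stop again
 *   (gdb) print i          inspect a variable's current value
 *   (gdb) backtrace        show the call stack
 */
#include <stdio.h>

static int sum(const int *a, int n) {
    int s = 0;
    for (int i = 0; i <= n; i++)  /* bug: reads one past the end; should be i < n */
        s += a[i];
    return s;
}

int main(void) {
    int a[3] = {1, 2, 3};
    printf("sum = %d\n", sum(a, 3));
    return 0;
}

Stepping through sum() and printing i and s each time round the loop exposes the off-by-one immediately; that handful of commands really is about all a beginner needs at first.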
Your question is kind of similar to "Why aren't students taught software testing?" I'm sure they do teach it in some places, but typically universities/colleges stick to teaching the "interesting" theoretical computer science stuff and tend not to teach practical tools. It's like how, if you're taking English in school, they teach you how to write, not how to use MS Word (yeah, I'm sure there are some Word courses, but you get my point).
I wasn't taught to use a debugger in my undergraduate degree, because you cannot use a debugger on a deck of punch cards. Even print statements are a luxury if you have a 15 minute turnaround on "jobs", etcetera.
I'm not saying that people should not be taught to use debuggers. Just that it is also important to learn to debug without this aid, because:
it will help you understand your code better if you don't have to rely on a debugger, and
there are situations where a sophisticated debugger won't be available.
On the latter point, I can also remember debugging a boot prom on an embedded device using a (rather expensive) logic analyzer to capture what was happening on the address / data lines.
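On the first point, one common debugger-free technique is to make the code check its own invariants with assertions, so it stops at the first wrong state instead of at some later symptom. A minimal sketch in C (my own illustration, not from the answer above):

#include <assert.h>
#include <stdio.h>

/* Hypothetical routine: the assertions document the preconditions and
   halt the program where the state first goes wrong. */
static double average(const int *a, int n) {
    assert(a != NULL);  /* precondition: caller must pass a real array */
    assert(n > 0);      /* precondition: the average of zero items is undefined */
    long total = 0;
    for (int i = 0; i < n; i++)
        total += a[i];
    return (double)total / n;
}

int main(void) {
    int data[] = {4, 8, 15};
    printf("average = %f\n", average(data, 3));
    return 0;
}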
The same reason students aren't taught version control, or unit testing, or shell scripting, or text editing, or documentation writing, or even (beyond intro courses) programming languages. The class is about computer science, usually a single concept or family of concepts, not programming. You're expected to learn what you need.
This isn't unique to computer science. My chemistry classes (I also have a chemistry degree) didn't teach me how to use any chemistry lab equipment, either. You learned that by hanging around in the lab and watching other students and asking the grizzled old profs who hung out there.
I have a question which is not strictly-speaking programming related, but nevertheless caused by being an analyst and a programmer at the same time.
It's about starting new projects which seem to be unfeasible, because they have an unknown domain, lack specifications, and/or require technology which I am not familiar with. I get some sort of panic when I approach such a project, and then relax as I build up an understanding of the domain and technology.
Is this something you experience? How do you cope with it?
The best way that I know of to try to contain and control the human factors in a project is to have a clear idea of your own processes.
Start with some Domain Driven Design; work with the users and help them to understand their domain and the business processes that surround it. Developers are often far better at abstraction than the managers/business people, so we can help them to understand their own domain.
Build up a set of acceptance criteria; these form your tests, which actually form your spec.
Once you have an idea of the above, you know much more about feasibility and how long it will take (and even whether the technology that has been specified is the right one).
As for approaching new technologies, start small, build a proof of concept and make your mistakes there rather than on production code. There is a huge amount of best practice on the web and places like StackOverflow are good places to start.
I would suggest working in an agile fashion, get the project owners to prioritise the work that needs to be done, work out what is needed for the next two week sprint and deliver it (which may mean stubbing out a lot of functionality). They'll tell you when it is wrong and it may influence their own decision making.
Don't view the entire project as a nasty whole, break it down into deliverable sections and one step at a time.
Calm down.
If the project is initially infeasible (even if only in your own mind) then start with a feasibility study. This is a sub-project with which you will define the project (or at least the next sub-project).
You've already defined several major tasks within the feasibility study: learn about the domain, write some specifications, learn enough about the new technologies.
As for me, no I never panic about this sort of situation, I love starting with a blank sheet of paper, and experience has taught me how to start filling it in real quick.
So, take a few deep calming breaths and jump in.
Yep, I get this feeling all the time. But I always think of technologies as tools. Once you get the hang of them, the rest will be easy.
Whenever I don't feel like that is when disaster lurks! It's like eating an elephant: just do it one bite at a time. Do some part you do understand, and that gives you a handle on the next bit.
"unfeasible, unknown domain, lack specifications, require technology which I am not familiar with"
I think that's how we start our life too. As long as you are confident that you can pull it off, just stick with it and you will see that things work in your favor, provided:
You understand the importance of being a self-starter
You take responsibility for who you are
You ask the right questions at the right time
All the best!!!
Often the trouble with these infeasible projects is that the client is on a limited budget and will go bust before you complete your feasibility study. In this case it might be worth taking a step back from technology and looking at economics. Maybe sub-contracting to someone with the required knowledge will ease the pain.
I dunno about the USA and the UK, but in India, schools still teach GW-BASIC. Yes, it's:
10 PRINT "HELLO WORLD"
20 GOTO 10
As far as my experience goes, even writing assembler is easier than this mess of a language. It could easily be replaced by something like Python, which would make it easier for students to grasp the basic concepts of programming and help them better understand the logic behind what they're doing.
Because Basic is the most, uhh... basic introduction to the von Neumann architecture, which is what all modern computers and (by extension) programming languages are based on.
Think about it:
Line numbers = memory addresses
Variables = CPU registers
Current line = CPU instruction pointer
GOTO = jump instruction
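As a rough illustration of the analogy (my own example, not from the original answer), the same two-line BASIC program written in C uses a label and a goto, which compile down to exactly the address-and-jump machinery listed above:

#include <stdio.h>

int main(void) {
line10:                        /* the label plays the role of BASIC's line number */
    printf("HELLO WORLD\n");
    goto line10;               /* GOTO becomes a jump instruction */
}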
Ever try teaching programming to someone with no idea what it's about?
I did for 4 years. For absolutely starting out, GWBASIC is pretty good. You can get the most action for the least effort, while still conveying basic ideas, like:
The computer finishes one statement before starting the next. (Newbies are inclined to think the computer does everything "at once".)
A program is like something built out of tinker-toys. There are only a few basic pieces, and you assemble them to make it do what you want. (Newbies often think since the language has words like IF and PRINT that it will just understand whatever they type in.)
Variables are a key concept. They have a name that you give them, and they have values that they get when the program runs. That's complicated. The name and the value are not the same thing, and there is a distinction between write-time and run-time.
Once you get past some basic concepts with the help of GWBASIC you can begin to introduce a more modern disciplined language.
GW-Basic was taught to me in 7th grade about 10 years ago. I found it was a great language and easy to experiment with as a beginner. Even the non-pc-freaks had little problem learning the language.
In my opinion it is a great tool to motivate beginners to learn more advanced programming languages.
GW-Basic is a great language for new programmers. If someone has never done any programming before, something simple like GW-Basic will be a lot easier for them to comprehend than something like Python. Also, Java has much better support for object-oriented programming than C++, and more commercial applications these days are written in Java than in C++ [source]. Hence I would say it's a good thing they are switching to Java over C++.
As far as teaching in India is concerned and why they use GW-Basic, I can only guess (being from the USA):
It's cheap. Perhaps they have received old hardware with GW-Basic on it. Hey, it's there, it's free, so why not use it to teach children?
The teacher knows it. If the teacher knows/understands it, he/she can teach it.
At a previous employer, I met a number of people who had immigrated to the USA from India; they explained that the first time they worked with Windows was when they arrived over here, since none of the schools (not even colleges/universities) had it. It might depend on the school they went to, but maybe it's a matter of the available equipment. It's possible this GW-Basic usage you speak of works the same way: they used what technology they had.
Maybe it means they are, well, resourceful.
As to whether it's good that they are learning something so old, I'm not so sure it's such a good idea. But as the famous (American West) folk wisdom says, "Do with what you got. It'll pay off in the end." Better to expose them while they are young.
It's funny how fast humans forget.
Remember the first time you struggled with the concept of a loop? With the idea of a variable and how it retained values? With remembering syntax?
Basic has a relatively small built-in syntax, it has fairly flexible structures for loops and other constructs.
I guess overall it's "loose". This helps a lot in learning.
Loose is very bad for good, stable programs. You want very little flexibility, you want patterns that you can count on and very few options (even if you don't know that this is what you want, you will understand it as soon as you have to lead a team of 5 developers from another country).
If anyone here hasn't really considered it: the reason we don't like Basic isn't a lack of "power" or speed; it's because it's loose - the exact same reason it's good for teaching.
You don't start out running, you learn to crawl in a wobbly sort of way, then you stumble, etc.
But once you are running sprints, you really want to make sure that every footfall is placed exactly where you want it, and if the guy ahead of you decides he suddenly wants to start crawling, you're screwed.
Of course, if you're running along the track alone or in a small, in-sync team, it doesn't matter much what you do. Feel free to use any language you want :)
If someone is truly interested in programming, they will take what they learn in that class and apply it to a language learned on their own time.
There's also something to be said for starting in a language that is much less powerful than Java or C++.
so you'll learn NOT to use GOTO
That's easy to learn. Schools don't aim to teach new technology; they want to teach the basics of informatics.
I think in my school GW-Basic is still taught in years 6-7 (of 10), and the reason is that little girls and boys can't understand anything harder than Basic :)
Even more, at my university we program in QBasic. o_O OMG, you say? Yeah, I'm shocked too :) Oh, and they promise one semester of C++ in the 4th year... yay!
I am from India and GW-BASIC was my first language way back in 1995. It was fun. Things have changed now. My school now teaches another BASIC variant, QBASIC, as the first language. Then students move to C++ and Java in standards 8, 9, and 10. Hopefully, Python will take over sometime.
As someone already pointed out, it's plain inertia. It's not so much inexpensive hardware that is the reason; it's just the mindset of continuing whatever has been going on. Sigh.
I think GW-BASIC is a good tool for teaching programming to children. I have been teaching programming to school children for about 10 years. GW-BASIC provides an easy-to-learn environment without going into technical details.
If we use some hi-fi programming language to teach kids, they will learn the programming language, not programming. Using GW-BASIC it is easy to teach programming, and we can concentrate on programming techniques rather than discussing the structures of programming languages. It has a very easy, English-like syntax, so students understand it easily.
Another thing to keep in mind is that it's an interpreter for BASIC, so we can execute instructions one by one and can execute any part of the program; this gives students a clear understanding.
Direct mode in GW-BASIC is a great help in explaining memory concepts, as we can monitor the changing states of variables (memory addresses and values).
As far as GW-BASIC is concerned I couldn't agree more. This is why a Ruby programmer known only as "_why the lucky stiff" created an amazing platform for learning to program called "Hackety Hack". He in fact had quite a lot of insight into teaching programming to young people at the Art & Code symposium:
http://vodpod.com/watch/2078103-art-code-symposium-hackety-hack-why-the-lucky-stiff-on-vimeo
This may be a hopelessly vague question. But I am interested to hear whatever logical thought processes people go through when learning a new concept or trying to get their brain around code they might not have ever seen before.
Basically, what general steps does one take to break down problems, and what does it take to "get it"? If you were to diagram a flowchart of how your mental process works when you look at code or try to solve a problem, what might it look like?
What common references, tips, and mental assumptions do you find useful in problem solving?
How is this different between different domains? For example in what ways is a web programmer's thought process similar or different from a traditional desktop app developer's process?
I'm a big believer that no matter what type of application you're looking at for the first time, may it be a web app, a desktop app, a device driver, or whatever else, there are three steps one developer usually follows in order to understand how it works:
Get the big picture:
What kind of app is this (web, desktop, ...)?
How is it layered (standalone, client-server, n-tier, ...)?
What is the app's purpose? What is it supposed to do?
Who is the app made for?
See how it works:
What language(s) is (are) used?
How is the code structured?
How is the data structured?
Understand (or at least try to) the way the app has been thought through:
Has it been thought through at all?
Is the app clearly optimized? (For performance? For readability?)
Is the app finished? Or is there room for evolutions?
Are there signs of multiple releases?
etc...
The 1st and 2nd steps are purely technical, while the 3rd MUST be as untechnical as possible... it's more about psychology and understanding how the app has been built. It obviously requires experience, but as long as you think hard enough and don't waste your brain's time with technical details, you'll eventually get it.
This whole process shouldn't require the use of a keyboard. You're only supposed to read, think, and take notes on a paper (I'm not kidding: pen and paper!).
Ho ho, good luck with this one. It's a great question and I'm sure you'll get a ton of answers. Although I have to say I cannot give a satisfactory answer to this - the last thing I would describe my thought processes as is a flow chart - I don't think there is any golden formula for this.
The only tip in problem solving I can recommend is discussing it with somebody else. In those times when you hit a brick wall, going through it with a colleague is invaluable. Quite often, as well, they will actually not even add much to the discussion - in the process of getting all your thoughts out in the open, the solution can become clear.
People are notoriously bad at examining their own thought processes, but I'll give it a whirl. I test very high for visuo-spatial ability in IQ tests, medium-to-high for verbal skills, and moderate for mathematical skills (which explains my A-level Maths grade, I suppose), and when I start to design software, I think in terms of shapes and the connections between them. When it comes to describing these thoughts to others (or clarifying them for myself), I use simple block diagrams or the object diagrams taken from Jacobson's Objectory method - NOT the overly complex stuff that UML suggests. I sometimes write textual descriptions of complex things, mostly as reminders to myself, but never use numbers or maths.
Of course this is just me - I've worked with maths whizzes who were just as good or even better programmers than myself.
I don't think... I process.
This is actually less flip than it sounds. I always break down tasks into their components and then break these down further, and that doesn't just go for writing software! Much like @Mark Pim, I go through things sequentially.
My wife gets really annoyed when I make dinner because I take so long to get started.
Divide & Conquer
I start by trying to grasp the entire problem as it is, and then start to find patterns I can recognize, and do the same for them in a kind of recursive process, until I have a broken-down solution I can implement and follow more easily.
This is one of the rare times I would answer with "it just works." I learn things by steamrolling through them. I don't have gimmicks or devices to help me. Took me some time to learn PHP, but after that JavaScript was much easier. Once you tackle one thing, the next items become cumulatively easier.
Personally, I conduct an internal dialogue with myself 'OK so we need to loop over this list of integers.' 'But we can break when we find the value we want.' 'OK, will the list definitely be initialised when we start?'
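That little dialogue maps almost one-to-one onto code. A sketch (my own, with made-up values) of the loop being reasoned about:

#include <stdio.h>

int main(void) {
    /* "will the list definitely be initialised when we start?" - here, yes */
    int values[] = {7, 3, 42, 19, 8};
    int n = sizeof values / sizeof values[0];
    int wanted = 42;
    int found_at = -1;

    /* "we need to loop over this list of integers" */
    for (int i = 0; i < n; i++) {
        if (values[i] == wanted) {
            found_at = i;
            break;  /* "we can break when we find the value we want" */
        }
    }

    printf("found at index %d\n", found_at);
    return 0;
}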
I'd be interested to see if any psychological research had been done on problem solving techniques.
Similar to Jonathan Sampson - it kind of just works.
When I'm attacking a real problem, I try to think of the most logical way of getting through it.
Then, when everything goes wrong (as it usually does), I have to make hundreds of sidesteps to get things done. Just keep focusing on that end goal, that logical way and you'll get there.
Eventually though, it decides to work for me and I end up with a finished product that is usually nothing like I planned it out to be. As long as the customers are happy, I am!
Personally, I see code in my head pictorially rather than textually (like Neil Butterworth) - it's a bit hard to describe, since (quoting STIV) "there's no common frame of reference."
My main skill is identifying similarities between models or systems I already know about and the task at hand. The connections between some of these can seem quite abstract; the key is to spot the connections. This leads to the abstraction of common patterns and approaches which are widely applicable. Related to this, the most important thing I learnt about algorithms was that the problem is never 'come up with a smart algorithm to solve X'. It's 'model problem X such that it can be solved by existing smart algorithm Y'.
Am I a lazy developer?
Is it being lazy to use automated tools, such as code generators and such?
Now, I could, if I had to, create all the data layers and entities I needed, but I choose to use CodeSmith to generate my data layers and entities.
I also use Resharper and I would say it fights with MSDeploy as to which gets installed first after Visual Studio.
Again if I had to, I could code without it, but prefer not to.
Both these tools, from my point of view, are no-brainers, as they improve output massively.
But is this lazy?
I'm sure there are purists out there who would say everything should be written by you so you know what everything is doing, but if you can read through the code and see what is happening, is that OK?
So am I being lazy or am I just using all the cards in my hand?
In programmers, laziness is a virtue, so don't worry.
It's only lazy if you use a tool to produce code and use it as-is without verifying that the code meets your needs and abides by your standards.
You don't need to reinvent the wheel for the nth time; that has been done often enough. Briefly, I'd say that using tools like the ones you mentioned (within reason) is absolutely no problem...
For you? No, you're not being lazy.
For the guy that doesn't understand what code generators are doing and how they do it? Yes, it's being lazy.
That's the important distinction: You have to know what you gain and know what you're missing by using a code generator. If you don't, it's only a matter of time before you come across a case where you have to be able to produce those classes and not know how.
Both these tools, from my point of view, are no-brainers, as they improve output massively.
This means you're not being lazy, you are using the appropriate tools to enable you to concentrate on the important aspects of the job.
It's not being lazy - it's being smart. There's nothing wrong with using every tool at your disposal...as long as it makes you more productive. Using tools for the sake of using tools is a bad idea.
However, if you don't know what your tool is doing under the hood, you should learn about it so if you don't have the tool available for some reason, you can get the job done.
I think that's the wrong question. Laziness is a virtue. I've seen too many programmers who do things the hard way rather than sitting back and thinking for a few minutes to come up with an easier way. I've had so many times that I've said to a junior programmer something to the effect of, "Yes, I respect your diligence in working through lunch and staying late to write the code to do X, but if you'd taken a few minutes to check the documentation you might have seen that there is already a function in the library that does that". Or similar stories.
I'm not familiar with the specific tools you describe, but to me, the question always is: does this tool actually save me any work? I've tried plenty of "code generators" that basically just create code stubs. So gee, thanks, you wrote "function x(int, float)"; now all I have to do is fill in the actual parameter names and write the code. What did that save me? I've also seen plenty of code generators that write really awful code. So now I have to try to add the "custom" code to this jumbled mess. Wouldn't it have been easier to just write the whole thing cleanly the first time? I've seen plenty of productivity tools where I found it takes me more time to set up the parameters to run the tool than I actually saved by using it. (Like the old joke that it's been proven that jogging regularly really does make you live longer: for every 60 minutes you spend jogging, it adds 30 minutes to your life.) Some tools may produce code or data structures or whatever that are difficult to maintain, so you save an hour today but it costs you ten hours in maintenance over the life of the project. Etc.
My conclusion isn't that you shouldn't use productivity tools, but rather that you should make sure they really are increasing your productivity, and not just giving an illusion of doing so. If in your case you find these tools really do help you, then using them is not "cheating", it's simply smart.
As everyone else has already pointed out, there's nothing wrong with your use of code generators.
Still, I can see downsides and reasons to avoid them in certain situations.
choice of language. Sometimes the very fact that you need a code generator to get your coding started could imply you're using the wrong language for the task. Most of the time the language cannot really be chosen, so code generators remain the best way to go.
code redundancy. Depending on the actual generators used, generated code could be redundant. If this happens, and generation happens once, isn't automated, and the generated code goes into the main repository, maintenance problems could arise in the long run. Not really a problem with code generation itself, but with the way it should and should not be used.
adding development platform requirements. We have to concede that many programmers out there work on bread-toasters doubling as PCs. It's a bad (and sad) reality of cheap business practices meeting sharp minds (sharp minds go to waste in the process). It could become a concern if our project (which could have a port in store for the future, possibly at an external facility) needs a hefty, RAM-hogging, insufficiently cross-platform IDE handy to compile every little modification.
So, no definitive answer on code generation and laziness in programming: it depends. Then again, using the wrong tools for the job is bad for your health (and business), so... don't.
You're using all the cards in your hand. Why reinvent the wheel when there are tools available to make your job easier. Bear in mind these tools DON'T do your job, they only assist.
What you create is down to you, so using the tools is not lazy... it's just intelligent.
I'd say you're more efficient rather than lazy.
Programming is primarily a thinking exercise not a typing one. So long as you understand what the tools are doing you're shifting the balance away from typing to thinking. Doing more of what your job is about? Doesn't sound like lazy to me!
I'm sure there are purists out there who would say everything should be written by you so you know what everything is doing
This might have been a viable point of view during the early days of programming. But nowadays, this is simply not feasible (or even preferable). After all, you've already obscured a certain level of understanding just by using a high-level language.
That said, I've found it to be a great learning exercise to write some of these things by hand occasionally. Not only do you get to learn more, but they teach you how helpful these tools really are (or aren't). Note that I'd only do this on a personal project though. I wouldn't do this for any project someone was paying me for (unless I were working for a masochist or something).
Ask yourself why there are so many ORM and other code-generation tools around. I'd say go for it with the proviso that you leave it maintainable for the next guy/gal.
Programming is about being lazy, about automating repetitive tasks. If you can't do that inside your language, using code generators and similar things is a useful workaround.
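As a toy illustration of that idea (my own sketch; real tools like CodeSmith work from templates and database schemas, not a hard-coded list), a few lines of C can stamp out repetitive accessor boilerplate:

#include <stdio.h>

/* Hypothetical field list; a real generator would read a schema instead. */
static const char *fields[] = { "id", "name", "email" };

int main(void) {
    int n = sizeof fields / sizeof fields[0];
    for (int i = 0; i < n; i++) {
        /* emit one pair of accessor declarations per field */
        printf("const char *get_%s(const struct record *r);\n", fields[i]);
        printf("void set_%s(struct record *r, const char *value);\n\n", fields[i]);
    }
    return 0;
}

Generating the repetitive part keeps it consistent; the judgment calls discussed in this thread still apply to what you do with the output.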
It depends on what you're writing, of course. I am surprised nobody's brought this up. If you're writing device drivers, operating systems, protocols, or server software (web servers, TCP-driven servers, etc.), you should probably do it by hand.
But what I do, and probably what a lot of us do, is implement business processes in code for web pages or web services. And in those areas, if you can improve on your code with code generators, go for it.
Yes, you are being a lazy developer; be honest with yourself. If you took the time to do it the hard way, you could call yourself less lazy than you are now.
The point is, being lazy isn't inefficient at all.
Lazy people take time to look at problems from different directions before acting on them; this avoids unnecessary errors, which saves you valuable time.
So you're being lazy, but that's OK. People don't hire hyperactive coders who make 10 applications a day but leave a trail of bugs in their path. Bug-fixing costs time, and time is money.
Conclusion:
Laziness = profit.
Go for it.
I think the best developers are also the laziest. Basically, everything you do should be focused on getting the end result with the least amount of work. This often yields the best result and also keeps developers from being distracted by fun things to include in a project. A lazy developer would, for example, never add an easter egg to his code, simply because that would be more code, which could introduce more bugs that need fixing later on. Still, you will need to add code, or you won't get paid. So, as a lazy developer, you would of course choose the most optimized, best-tested code that would almost never fail, and you'd work in a way that reduces the chance of errors to a minimum.
Do keep in mind that lazy developers should focus on avoiding work in the future, not on avoiding work right now! So stop reading here and get back to work! ;-)
Laziness is a trait that most good programmers have. Unless they work for Adobe, in which case they are often lazy in a bad way.