How to justify to your colleagues that they produce crappy code? - coding-style

I am finding it somewhat difficult to carry on working in my current job.
The codebase has become a bit wild lately (but definitely not the worst I've seen), and I'm having a hard time dealing with some parts of the code. I could be stupid, but most likely it's just that it demotivates me a lot to start working on something that is hard to reason about.
My boss is already aware of my thoughts - I expressed what it feels like to work like this. He asked me to provide examples of what was wrong. When I pointed out two or three small issues, he said "yeah, ok" but that refactoring costs him a lot of money, and that we have to get the product out (not the first time I've heard this).
I have to admit that the examples were not the most compelling, but the problem is actually tough to explain. It's made up of a lot of tiny "bad decisions" throughout the codebase. (And admittedly, this issue is quite subjective.) For instance, bad naming, dealing with nulls, boilerplate, not making code reusable (or the opposite) and so on. It can be tiring to re-think someone else's code all over again just to justify that I would have done it differently.
Do you have thoughts on how to deal with this?
I am a bit fed up with having to go hacking around a quick 'n dirty codebase every time!

Sometimes your fellow programmers do things very differently than you, and things you might feel are way wrong might actually have positive aspects. We all come from different schools. I think I've come across programmers who complain about things I don't understand about as often as I've felt something needed to be complained about myself.
Make sure you can translate what you complain about into a concrete disadvantage, if for no other reason than so that you can motivate middle management to back the improvements. Things that are hard to translate into measurable facts usually originate from differences in taste/style rather than quality (there are books to read about this subject). The answer posted by smacl has good and concrete advice!
If you can translate your concern into a real disadvantage, then I really do not agree when people say that one has to "accept" situations like this. I've been exposed to this problem more than once, and let me tell you, refactoring is not the solution to the problem. Refactoring only fixes the symptoms.
Accepting a situation like this is the same as saying "bad quality product lines and expensive and frustrating maintenance are something my company can live with". This is of course seldom the case. However, management (i.e. those with the go/no-go on which projects to prioritize) are very often not technically aware of what the problems are, or why development is expensive. They shouldn't have to be, for that matter.
That's why you need a development organization with technical leads, chief architects, a good organisational structure, a tiered model, etc. - experienced software professionals who have seen where the road leads if you ignore certain aspects of development. It's about changing the "culture" of your team(s).
Either you stick with your company and try to change how you do things from the roots up, or you find another place to work and make sure you find out during the interview exactly how they work in everyday development.
Good luck

I recently faced a very similar problem and a friend gave me some advice that helped a great deal. He said: "keep yourself out of it."
What he meant was, that you must communicate the problems because they are real, costly problems with consequences in terms of time and money. But when you do communicate, talk only about the consequences for the organization. Do not mention the consequences to you, because then it just sounds like whining and will be ignored.
For example:
Not keeping yourself out of it:
"The other developers use these obscure, misleading identifiers and then I have to spend hours going over the code trying to discover what they meant. It's taking up a lot of my time."
Keeping yourself out of it:
"It would be very helpful and cost effective to do some refactoring of class and variable names and also establish some coding standards around identifiers. The immediate payoff will be an easier-to-understand codebase for everyone, leading to better productivity. The longer-term payoff will be that later we'll be able to modify the code and fix things faster. If a critical bug is discovered right before a release, an understandable codebase will be really important."
I hope that helps.

1) Make the problem more visible and get management buy-in
Keep a very detailed diary of the time spent on various coding tasks over the period of about a month. At the end of the month analyse and summarise the contents for your boss, i.e. time wasted and hence money wasted, to illustrate that change of some form is necessary.
2) Think of a cost effective way of moving forward
For example: rather than refactoring the entire code base, separate interfaces from implementations, and enforce tighter standards (unit tests, naming conventions, etc.) at the interface layer. Thus each programmer can have confidence in using code that they have not written. While this is sweeping the crap under the carpet to a certain extent, it is a good way of preparing for larger-scale refactoring.
It is important from a management perspective that workflow is not interrupted, and positive results are visible, so plan accordingly.
3) Agree longer terms improvements with your co-workers
Sit down with the other programmers and agree on reasonable coding standards for future code.

Perhaps you could set up monthly meetings, and at those meetings you could demonstrate good and bad code. Obviously you don't want to point fingers, so you'd want to use generic code examples based on stuff you've seen in your project. This way you can constructively gather support from others for your style. You might want to compile these examples after the meetings so people can easily reference them.
I think it is really easy to point out issues and complain, but mentoring people and helping them change requires effort. It isn't an easy task, but if you are having trouble staying motivated in your job, perhaps this would give you a nice burst of motivation. You might learn some things along the way.

You'll find that this is common-place. What you can do is accept that things are done differently by different people. As you fix bugs or add features, you'll get a brief window into a sub-section of the application that you can improve. When you work on the code, you can make it better, and they don't need to know that you're piecemeal improving the code.
Be very careful though. Sometimes code is written in a way that looks 'hacked', but solves a bug that is not easy to discern. Especially if it is older code which has been tried and tested.
On another note, complaining will only get you viewed as a complainer. Think about what outcome you want, and what actions will most likely produce that outcome. You will always hear the answer 'No' when you ask, 'Can I do X days of work for absolutely no noticeable result?'

You could quit and hope to find something better.
Or, you could stick it out and try to improve the code that you can control, when you can control it. No matter how well intentioned the developers are, if there is more than one developer the code base will be "ugly" by a competent developer's standards. Work with the other developers to improve their abilities and refactor code as you make enhancements.

For starters:
Enforce the use of static code analysis tools. Every language has a few well known tools.
Show some before and after refactored code examples, and explain why you think it's better. Try not to put any one person on the spot.
Code reviews by experienced developers.
Keep in mind that some developers can't be helped no matter how much you try...
If someone critiques your code be polite and open minded, you might learn something.
Track cyclomatic complexity against the number of changesets/bugs. Complex code is more likely to break, causing more bugs, which cause more changes, which cost more money!
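To make that last point actionable, here is a minimal, hypothetical sketch in Python (the threshold and the set of counted constructs are simplifications, not a substitute for a real static analysis tool) that flags functions whose branch count crosses a limit:

import ast
import sys

# Constructs that add a decision point (a simplified take on cyclomatic complexity).
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.With, ast.BoolOp, ast.IfExp)

def complexity(func_node):
    # Start at 1 and add one for every branching construct inside the function.
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(func_node))

def report(path, threshold=10):
    tree = ast.parse(open(path).read(), filename=path)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            score = complexity(node)
            if score > threshold:
                print(f"{path}:{node.lineno} {node.name} has complexity {score}")

if __name__ == "__main__":
    for source_file in sys.argv[1:]:
        report(source_file)

Hooked into a build or a pre-commit step, a report like this turns "the code is too complex" from an opinion into a number you can track over time.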

99% of the time you never get to choose the people you work with. Not all relationships work out, be they work or otherwise.
It would be best if your project was broken up enough so that each developer can contribute to a spec of what the other needs, so programmers don't step on each other's toes.
Getting people to change their coding style is hard. It takes a cast-iron technical lead who is committed to such things, and that will help when you bring these issues up. Management types can't do this; the leadership needs to come from someone who can back it up with technical detail.

It sounds to me like you don't have a problem with the code so much as with your coworkers. It will probably be very difficult for you to force the changes you want to see. Your best bet would probably be to start updating your resume and keep your eyes open for other opportunities.

I think that once you're in the middle of the weeds, you don't really have a good chance of getting things done right; you just have to get them done. I would say most developers do not like firefighting and want the ideal code base, but in my opinion this requires you to spend the time up front planning the system out.
I'd recommend trying to work with your manager to ensure that the areas you feel are lacking now are not lacking in the next project. Maybe it's putting you in the lead, having more code reviews with peers, or further training for the entire team.
Either way, I think this is something that most of us go through. I do agree with the other person advising some caution on this. I know that code I wrote yesterday seemed great at the time, and looking back on it, I can probably find 10 other ways to do it and make it look cleaner.

Have you considered adding FxCop to the automated builds to enforce coding style? Other than that, you could try suggesting TDD, which would give whoever writes the tests the power to enforce that the interfaces for each class are structured in a particular way.
Off the top of my head, that's all that I can think of.

Things in life are not perfect and if you start nitpicking, feathers will be ruffled and relationships soured.
The best method is to pick your battles carefully. If something is small enough ignore it and live with it. If it is big and worthwhile (i.e. the management sees ROI in backing you) go for it.
This is apt for your situation...
God, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.

One thing I try to do, and it may help you: if a part of the code is bad, and the idea you propose to fix it is agreed to be the best but the "no time" excuse is given, why don't you rewrite it, say, on your own time? If you decide on sticking around at that job for a while, it will only help you, and you will learn and become a better programmer.
Note that it is a good idea, and I would even say required, to do a complete code review of that change before check-in, and you should try to time the check-in so that it falls before a complete regression test cycle for a release. That way your refactoring is completely tested out. Over a period of 6 months or so, it will start showing a beneficial impact, and you can then ask for time allocation for this, with proof to back it up.

The only thing that has a chance of convincing management is demonstrating that the things you are citing as perceived problems become actual problems.
To try to take advantage of this, try to keep the "complainer" tone down to a minimum, that is, focus on how this affects the bottom line rather than how it makes you feel. Point out possible consequences of poor decisions that you see being made. If those consequences come to pass, and they cost more than an up-front fix would have, gently remind management that you foresaw the difficulty and provide a helpful suggestion as to how future similar costs can be avoided with a little up-front effort.
The problem is, in many organizations, the problems will never cause enough of a problem for management to care, or if they do, they won't see the connection between your perception of the problem and the actual problem the way it occurs. In these cases, you end up seeming like a needlessly persnickety technical person, which isn't a reputation you want to have.
So my advice is, pick your battles. If there is something very egregious that others are about to let slip, then you can speak up and perhaps be vindicated later. For the little details that just grind away at you, I'm afraid there's not much you can do but put up with it.

Show them their own forgotten code disguised as yours for critique.
Take an old piece of their code they have forgotten about
Pretend you wrote it
Ask them to figure out something with it
Make sure they point out how bad the code is for whatever reason
Add your own items. Brainstorm what should be done since it's your fault.
Let them know you didn't know how to bring it up without offending them, but it's their code.
If they recall that they wrote it, they might catch on..

If you have a good relationship with your manager, you might be able to use this to work yourself into a "Senior" or "Lead" Developer role. You could propose that it would be best if one person on the team takes technical leadership of the code base. It would be your job to review the code of others and ask them to make improvements when you feel it is necessary. If you go this route, just make sure to take it slowly. If you ask for a lot very quickly, then you could end up pissing off all the other developers.

Related

What are some good strategies to fix bugs as code becomes more complex?

I'm "just" a hobbyist programmer, but I find that as my programs get longer and longer the bugs get more annoying--and harder to track. Just when everything seems to be running smoothly, some new problem will appear, seemingly spontaneously. It may take me a long time to figure out what caused the problem. Other times I'll add a line of code, and it'll break something in another unit. This can get kind of frustrating if I thought everything was working well.
Is this common to everyone, or is it more of a newbie kind of thing? I hear about "unit testing," "design frameworks," and various other concepts that sound like they would decrease bugginess, make my apps "robust," and everything easy to understand at a glance :)
So, how big a deal are bugs to people with professional training?
Thanks -- Al C.
The problem of "make a fix, cause a problem elsewhere" is very well known, and is indeed one of the primary motivations behind unit testing.
The idea is that if you write exhaustive tests for each small part of your system independently, and run them on the entire system every time you make a change anywhere, you will see the problem immediately. The main benefit, however, is that in the process of building these tests you'll also be improving your code to have fewer dependencies.
The typical solution to these sort of problems is to reduce coupling; make different parts less dependent on one another. More experienced developers sometimes have habits or design skills to build systems in this manner. For example, we use interfaces and implementations rather than classes; we use model-view-controller for user interfaces, etc. In addition, we can use tools that help further reduce dependencies, like "Dependency injection" and aspect oriented programming.
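As a rough sketch of what that looks like in practice (the class names here are invented for illustration, not taken from any particular framework), the report below depends on a small interface rather than on a concrete database, so a fake can be injected in a unit test:

import unittest

class DataSource:
    # Minimal interface: anything that provides fetch_totals() will do.
    def fetch_totals(self):
        raise NotImplementedError

class SalesReport:
    def __init__(self, source):
        # The report depends on the interface, not on a concrete database.
        self.source = source

    def summary(self):
        return f"total sales: {sum(self.source.fetch_totals())}"

class FakeSource(DataSource):
    # Test double injected in place of a real database connection.
    def fetch_totals(self):
        return [10, 20, 30]

class SalesReportTest(unittest.TestCase):
    def test_summary_adds_totals(self):
        report = SalesReport(FakeSource())
        self.assertEqual(report.summary(), "total sales: 60")

if __name__ == "__main__":
    unittest.main()

Because the coupling is low, a change to the real data source cannot quietly break the report, and a failing test points straight at the part that broke.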
All programmers make mistakes. Good and experienced programmers build their programs so that it is easier to find the mistakes and restrict their effects.
And it is a big deal for everyone. Most companies spend more time on maintenance than on writing new code.
Are you automating your tests? If you do not, you're signing up for creating bugs without finding them.
Are you adding tests for bugs as you fix them? If you do not, you are signing up for creating the same bugs over and over.
Are you writing unit tests? If not, you are signing up for long debugging sessions when a test fails.
Are you writing your unit tests first? If not, your unit tests will be hard to write when your units are tightly coupled.
Are you refactoring mercilessly? If not, every edit will become more difficult and more likely to introduce bugs. (But make sure you have good tests, first.)
When you fix a bug, are you fixing the entire class? Don't just fix the bug; don't just fix similar bugs throughout your code; change the game so you can never create that kind of bug again.
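A tiny, hypothetical example of that "add a test for the bug you just fixed" habit (the average() helper and its bug are made up for illustration):

import unittest

def average(values):
    # Bug fix: the original version divided by len(values) and crashed with
    # ZeroDivisionError on an empty list.
    if not values:
        return 0.0
    return sum(values) / len(values)

class AverageRegressionTest(unittest.TestCase):
    def test_empty_list_no_longer_crashes(self):
        self.assertEqual(average([]), 0.0)

    def test_normal_case_still_works(self):
        self.assertAlmostEqual(average([1, 2, 3]), 2.0)

if __name__ == "__main__":
    unittest.main()

The test now documents the bug and guards against anyone quietly reintroducing it.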
Bugs are a big deal to everyone. I've always found that the more I program, the more I learn about programming in general. I cringe at the code I wrote a few years back!! I started out as a hobbyist and liked it so much that I went to engineering college to get a Computer Science Engineering major (I am in my final semester). These are the things that I have learned :
I take time to actually design what I am going to write and document the design. It really eliminates a lot of problems down the line. Whether the design is as simple as writing down a few points on what I am going to write or full-blown UML modeling (:( ) doesn't matter. It's the clarity of thought and purpose, and having material to look back at when I come back to the code after a while, that matter the most.
No matter what language I write in, keeping my code simple and readable is important. I think that it is extremely important not to overcomplicate the code and at the same time not to oversimplify it. (Hard-learned lesson!!)
Efficiency optimizations and fancy tricks should be applied at the end, and only if they are needed. Another thing is that I apply them only if I really know what I am doing, and I always test my code!
Learning language-dependent details helps me keep my code bug free. For instance, I learned that scanf() is evil in C!
Others have already commented on the zen of writing tests. I would like to add that you should always do regression tests. (i.e. Write new code, test all parts of your code to see if it breaks)
Keeping a mental picture of code is hard at times, so I always document my code.
I use methods to make sure that there is a bare minimum dependence between different parts of my code. Interfaces, class hierarchies etc. (Decoupled design)
Thinking before I code and being disciplined in whatever I write is another crucial skill. I know people who don't format their code so it's readable (shudder!).
Reading other people's source to learn best practices is good. Making my own list is better! When working in a team, there must be a common set of them.
Don't be paralyzed by analysis. Write tests, then code, then execute and test. Rinse, wash, repeat!
Learning to read over my own code and comb it for mistakes is important. Improving my arsenal of debugging skills was a great investment. I keep them sharp by helping my classmates fix bugs regularly.
When there is a bug in my code, I assume it's my mistake, not the computer's, and work from there. That is a state of mind that really helps me.
A fresh pair of eyes aids in debugging. Programmers tend to miss even the most obvious errors in their own code when exhausted. Having someone to show your code to is great.
Having someone to throw ideas at and not be judged is important. I talk to my mom (who is not a programmer), throw ideas at her and find solutions. She helps me bounce my ideas back and forth and refine them. If she is unavailable, I talk to my pet cat.
I am not so discouraged by bugs anymore. I've learned to love removing bugs almost as much as programming.
Using version control has really helped me manage different ideas I get while coding. That helps reduce errors. I recommend using git or any other version control system you might like.
As Jay Bazzuzi said - Refactor code. I just added this point after reading his answer, to keep my list complete. All credit goes to him.
Try to write reusable code. Reuse code, both yours and from libraries. Using libraries which are bug free to do some common tasks really reduces bugs (sometimes).
I think the following quote says it best - "If debugging is the art of removing bugs, programming must be the art of putting them in."
No offense to anyone who disagrees. I hope this answer helps.
Note
As Peter has pointed out, use Object Oriented Programming if you are writing a large amount of code. There is a limit to code length after which it becomes harder and harder to manage if written procedurally. I like procedural for smaller stuff, like playing with algorithms.
There are two ways to write error-free programs; only the third one works. ~Alan J. Perlis
The only way for errors to occur in a program is by being put there by the author. No other mechanisms are known. Programs can't acquire bugs by sitting around with other buggy programs. ~Harlan Mills
Obviously, bugs are a big deal to any programmer. Just look through the list of questions on Stack Overflow to see this illustrated.
The difference between a hobbyist and an experienced professional is that the pro will be able to use his experience to code in a more "defensive" way, avoiding many types of bugs in the first place.
All the other answers are great. I'll add two things.
Source control is mandatory. I'm assuming you're on Windows here. VisualSVN Server is free and takes maybe 4 clicks to install. TortoiseSVN is also free, and it integrates into Windows Explorer, getting around the VS Express limitation of no add-ins. If you create too many bugs, you can revert your code and start over. Without source control, this is next to impossible. Plus you can sync your code if you have a laptop and a desktop.
People are going to recommend many techniques like unit testing, mocking, Inversion of Control, Test Driven Development, etc. These are great practices, but don't try to cram it all into your head too quickly. You have to write code to get better at writing code, so work these techniques slowly into your code writing. You have to crawl before you walk and walk before you can run.
Best of luck in your coding adventures!
This is a common newbie thing. As you get more experience, of course, you'll still have bugs, but they'll be easier to find and fix because you'll learn how to make your code more modular (so that changing one thing doesn't have ripple effects everywhere else), how to test it, and how to structure it to fail fast, close to the source of the problem, rather than in some arbitrary place. One very basic but useful thing that doesn't require complex infrastructure to implement is to check the inputs to all functions that have non-trivial preconditions with asserts. This has saved me several times in cases where I would have otherwise gotten weird segfaults and arbitrary behavior that would have been near impossible to debug.
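For instance, a minimal sketch in Python (the function and its preconditions are invented for illustration):

def schedule_batch(jobs, workers):
    # Fail fast, close to the source of the problem, instead of producing a
    # weird result several layers away.
    assert len(jobs) > 0, "schedule_batch called with no jobs"
    assert workers > 0, f"worker count must be positive, got {workers}"
    return len(jobs) // workers

# Violating a precondition stops immediately with a clear message:
# schedule_batch([], 4)  ->  AssertionError: schedule_batch called with no jobs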
If bugs weren't a problem then I'd be able to write a 100,000 line program in 10 minutes!
Your question is like, "As an amateur doctor, I worry about my patients' health: sometimes when I'm not careful enough, they sicken. Is patients' health a problem for you professional doctors too?"
Yes: it's the central problem, even the only problem (for any sufficiently all-inclusive definition of 'bug').
Bugs are common to everyone -- professional or not.
The larger and more distributed the project, the more careful one must be. One look at any open source bug database (ex: https://bugzilla.mozilla.org/ ) will confirm this for you.
The software industry has evolved various programming styles and standards, which when used right, make wrong code easier to spot or limited in its impact.
Therefore, training has a very positive effect on code quality... but at the end of the day, bugs still sneak through.
If you're just a hobbyist programmer, learning full bore TDD and OOP may involve more time than you're willing to put in. So, going on the assumption that you don't want to put in the time on them, a few easily digestible suggestions to cut down on bugs are:
Keep each function doing one thing. Be suspicious of a function more than, say, 10 lines long. If you think you can break it into two functions, you probably should. Something that will help you control this is naming your functions according to exactly what they are doing. If you find that your names are long and unwieldy, then your function is probably doing too many things.
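A small, hypothetical before/after sketch in Python of what that splitting looks like:

# Before: one function parses lines and totals them at once,
# so it is harder to name, reuse and test.
def process(lines):
    total = 0
    for line in lines:
        name, amount = line.split(",")
        total += float(amount)
    return total

# After: each function does one thing, and its name says exactly what.
def parse_entry(line):
    name, amount = line.split(",")
    return name, float(amount)

def sum_amounts(lines):
    return sum(parse_entry(line)[1] for line in lines)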
Turn magic strings into constants. That is, instead of using:
people["mom"]
use instead
var mom = "mom";
people[mom]
Design your functions to either do something (command) or get something (query), but not both.
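A short, made-up sketch of that command/query split in Python:

class Cart:
    def __init__(self):
        self.items = []

    def add_item(self, name, price):
        # Command: changes state, returns nothing.
        self.items.append((name, price))

    def total(self):
        # Query: computes an answer, changes nothing.
        return sum(price for _, price in self.items)

cart = Cart()
cart.add_item("book", 12.50)   # command
cart.add_item("pen", 1.20)     # command
print(cart.total())            # query: prints 13.7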
An extremely short and digestible take on OOP is here http://www.holub.com/publications/notes_and_slides/Everything.You.Know.is.Wrong.pdf. If you get this, you've got the gist of OOP and are quite frankly ahead of a lot of professional programmers.
The prevailing wisdom seems to be that the average programmer creates 12 bugs per 1000 lines of code - the exact number depends on who you ask, but it's always per lines of code - so the bigger the program, the more the bugs.
Subpar programmers tend to create way more bugs.
Newbies are often trapped by idiosyncrasies of the language, and lacking experience tends towards more bugs too. As you go on, you will get better, but never will you create bug-free code... well I still have bugs, even after 30 years, but that could be just me.
Nasty bugs happen to everyone from pros to hobbyists. Really good programmers get asked to track down really nasty bugs. It's part of the job. You'll know you've made it as a software developer when you stare at a nasty bug for two days and in frustration you shout, "Who wrote this crap!?!?" ... only to realize it was you. :-)
Part of the skill of a software developer is the ability to keep a large set of interrelated items straight in his/her head. It sounds like you're discovering what happens when your mental model of the system breaks down. With practice you will learn to design software that doesn't feel so brittle. There are tons of books, blogs, etc. out there on the subject of software design. And Stack Overflow of course for specific questions.
All that said, here's a couple of things you can do:
A good debugger is invaluable. Often you have to step through your code line by line to figure out what went wrong.
Use a garbage-collected language such as Python or Java if it makes sense for your project. GC will help you focus on making things work instead of getting bogged down by maddening memory errors.
If you write C++, learn to love RAII.
Write LOTS of code. Software is somewhat of an art form. Lots of practice will make you better at it.
Welcome to Stack Overflow!
What really changed my odds against code complexity and bugs was using a coding standard - how to place brackets and so on. It may seem like just a boring and useless thing, but it really unifies all the code and makes it much easier to read and maintain. So, do you use a coding standard?
If you're not well organized, your codebase will become your very own Zebra Puzzle. Adding more code is like adding more people/animals/houses to your puzzle, and soon you have 150 various animals, people, houses and cigarette brands in your puzzle and you realize that it just took you a week to add 3 lines of code because everything is so inter-related that it takes forever to make sure the code still executes how you want it to.
The most popular organizational paradigm seems to be Object Oriented Programming, if you can break your logic down into small units which can be constructed and used independently of each other, then you will find bugs far less painful when they occur.

How not to rush yourself? [closed]

I often find that I do less-than-complete work on a feature, especially in the design phase. I detect several reasons:
I'm over-optimistic
I feel the need to provide quick solutions, so sometimes I fool myself into thinking the design is fool-proof when in fact it's still full of holes, just to get the job done faster. Of course I end up paying dearly later.
I'm aware of this behavior of mine for some time, yet I still find I don't manage to compensate. Have you encountered similar problems? How do you approach solving them?
I use a couple of techniques. The first is a simple paper to-do list. In the morning I write down my tasks for the day. I try to work on a task until I can cross it off. I cross it off only when I'm done to my own satisfaction. My to-do list helps me stay focused. When an interruption comes in, I can consciously choose whether it is important enough to interrupt what I'm doing now.
The second technique I use is to give up on the idea of "done" for a design. Instead, I focus on what I've started calling "successions", where a design goes through predictable stages. Each stage supports the current functionality well and will be succeeded at some point by the next stage. This lets me do a good job, a job I can be proud of, without over-designing.
I have the intuition that there is a small catalog of such successions (like http://www.threeriversinstitute.org/FirstOneThenMany.html) that would cover most of design. In the meantime, I try to remember that "sufficient to the day are the troubles thereof".
I run into this problem a lot.
My solution is a notebook. (The old fashioned paper kind).
I write out how I'm planning on implementing the solution as a bulleted overview list, and then I try to flesh out each point on the list.
Often, during that process, I come across issues I hadn't thought of.
Of course, the 80/20 rule still applies... I still come across things when I'm actually doing the implementation that hadn't occurred to me, but with experience these tend to diminish.
EDIT: If I'm still not sure at the end of this process, I put together a throwaway prototype testbed... It's important to make sure it's throwaway, because otherwise you run the risk of including some nasty hacks in your real codebase.
It's very common to miss edge-cases and detail when you're in the planning phase of a project, especially in the software development field. Please don't feel that this is a personal failing; it's something endemic.
To counter this, many software development methodologies have emerged. Most recently there has been a shift by many development teams to 'agile' methods, where there is a focus on rapid development with little up-front technical design (after all, many complexities are only discovered when you actually begin developing). I'm currently using the Scrum system, which has been excellent in my small team:
http://en.wikipedia.org/wiki/Agile_methods
http://en.wikipedia.org/wiki/Scrum_%28development%29
If you find that your organisation will not accept what they may regard as a radical shift in approach, it may be worth investigating whether they will agree to the development of a prototype system. This means that you could code up a feature to investigate the technologies involved and judge whether it's feasible, without having to commit to full development, a quality bar, testing schedules etc. The prototype should be thrown away once the feasibility has been proved or disproved, then proper development may begin, including all that you've learned in the process.
If your problem is more related to time management, then I'd recommend the Getting Things Done approach (http://en.wikipedia.org/wiki/Getting_things_done). This is pragmatic and simple, concentrating on making you productive without overloading you with information that isn't immediately relevant to your current work. I've found that I get overwhelmed with project/feature ideas at times and it really helps to write everything down and file it for a later time when I have the resources available to work effectively.
I hope this helps and best of luck!
Communication.
The best way to not rush yourself into programming mistakes is communication. Yes, good ol' fashioned accountability. If another person in the office is involved in the process, the better the outcome. If a programmer just takes on the task without any concern for anybody else, then there is a higher possibility of mistakes.
Accountability Checklist:
How do we support this?
Who needs to know what has changed?
Why are we doing this in the first place?
Will there be anybody who doesn't want this changed?
Will someone else understand how I did this?
How will the user perceive and use this change?
A skeptical comrade is usually good enough to help. Functional specifications are good; they usually answer all of these thoughts. But sometimes a conversation with another person can help you with it, and you can get changes out the door faster.
I have learned, through years of mistakes (though still making them), that almost anything I want to use repeatedly, or distribute, needs to be designed properly. So getting burned enough times will end your optimism.
When getting pressure from management, I tell them I will have to put in the thought anyway, so I should do it when it's cheap. I think on paper as well, so I can actually prove that I'm doing something and it keeps my fingers on the keyboard, both of which provides a soothing effect to management. ;-)
At the risk of sounding obvious - be pessimistic. I had a few experiences where I thought "that should take a few hours" and it ended up taking a couple days because of all the little things that pop up unexpectedly.
By far the best way I've found to manage things is to (much like Andrew's answer) write out the design and requirements as a starting point. Then I go through and look for weak points in the design, gotchas and additional use cases etc. I try to look at this as a critical exercise - there's no code written yet, so this is the time to be totally ruthless and look for every weak point. Look for error conditions you'll have to handle, and whatever amount of time you think it will take to complete each feature/function, pad that amount by a lot. I've had times where I've doubled my initial estimate and still not been that far off the mark.
It's very hard as a programmer to realistically project debugging time - writing the code is easy to estimate, but debugging that into functioning, valid code is something else entirely. Therefore I find there's no exact science to it but I just pad tasks by a whole bunch, so that I have plenty of breathing room for debugging.
See also Evidence Based Scheduling which is a fascinating concept in scheduling developed by FogCreek for their FogBugz product.
You and the rest of the world.
You need a more detailed design, a more accurate estimate, and the willingness to accept that sometimes the optimal solution is not necessarily the best solution (e.g., you could code some loop in assembler to get optimal performance, but that's going to take a lot longer than just writing
for (i = 1; i <= 10; i++) {}
). Is the time spent doing it really worth it for an accounting package, as opposed to a missile system?
I like designing, but over time I've found that doing a lot of design up front is like building castles in the sky - it's too much speculation, however well-educated, missing the critical feedback from actually implementing and using the design.
So today I'm much more into accepting that while implementing a design I will learn a lot of new stuff about it, and need to feed that learning back into the design. Doing that is a skill that is fun to learn, including the skills to keep a design flexible by keeping it simple, free of duplication, cohesive and decoupled; to change the design in small, controlled steps (i.e. refactoring); and to write the extensive suite of automated tests that makes this kind of change safe.
This seems to be a much more effective approach to me than getting better at "up-front design speculation" - and additionally it makes me equally well prepared for the inevitable moment when the design needs to be changed due to a simply unforeseeable change in the requirements.
Divide, divide, divide. List all the steps that will be required to finish the project, then list all the steps those steps will require to be concluded, and so on until you reach atomic items you are absolutely sure you can finish in a day or less. Add up the durations of all these items to arrive at a length of time.
Then double it. Now you have a number that, if depressing, is at least somewhat realistic.
If possible "Sleep on your design" before publishing it. I find after I leave work, I usually think of things I have missed. This usually happens while I am lying in bed before falling asleep or even while showering the next day.
I also find it valuable to have a peer/friend that I trust review what I have before distributing it. Somebody else almost always sees something I didn't think of or miscommunicated.
I like to do as others have stated here: write down in pseudo code what the flow of your app will be. This immediately highlights some detailed areas that may require further attention that were not apparent up front.
Pseudo code is also readable to business users who can verify your approach meets their needs.
Using pseudo code also creates a nice set of methods that could be put to use as an interface in the final solution. Once the pseudo code is fairly tight, look for patterns and review some common GoF patterns. They do not have to be perfect, but using them will shield you from having to rewrite the code later during the revisions that are bound to come along.
Just taking an hour or two to write pseudo code yields some invaluable time-saving pieces later on:
1. An object model emerges
2. The program's flow is clearly defined for others
3. It can be used as documentation of your design with some refinement
4. Comments are easier to add and will be clearer for someone else reviewing your code.
Best of luck to you!
I've found that the best way to make sure you've chosen a good design is to make sure that you understand the problem, know the limitations you have, and know what things are must-haves vs. nice-to-haves.
Understanding the problem will involve talking to the people who have the need and keeping them anchored to what needs to get done first instead of how they think it ought to get done. Once you know what actually has to happen, you can go back and talk over requirements about how.
Knowing your limitations may be quite easy: needs to run on the iPhone; has to be a web application; needs to integrate with the already-existing Java code and deployment setup; and so on. It may be quite difficult: you don't know what the potential size of your user base is (hundreds? thousands? millions?); you don't know whether you'll need to localize it (though if you're not sure, assume you will have to).
Must-haves vs. nice-to-haves: this is possibly the most difficult part. Users very often have emotional attachments to "requirements" ("It should look just like Excel") that are not actually part of the "has to happen" stuff. You often have to juggle functionality vs. desires to get an acceptable implementation. You can't always give everyone a pony.
Make sure you write all this down! Even if it evolves along the way, or the design is small, having a "this is what we're planning to do now" guide to refer to when you need to make a decision about committing resources makes it easier to restrain yourself from implementing a really cool whiz-bang feature instead of a boring must-do.
Since you recognize that you feel the need to provide a quick solution, perhaps it will slow you down to realize that you can probably solve the problem faster and deliver it sooner if you spend more time up front on design. For instance, if you spend 3 hours designing and 30 hours writing code, it probably means that if you spend 6 hours designing you might only need to spend 10 hours writing code. (These are not actual figures, just examples.) You might try to quantify this for yourself on the next few projects you do. Do a couple where you behave as you normally would and see what ratio of design/code-writing/testing-and-debugging you actually do. Then on the next project deliberately increase the percentage of time you spend on the design phase and see if it does shorten the time needed for the other phases. You will have to try this for several projects to get a true baseline, since the projects may be quite different. Do it as a test to see if you can improve your performance on the other phases, and thus deliver a faster product, if you spend 20% more time, 50% more time or 100% more time on design.
Remember the later in the process you find the problem with a design the harder (and more time-consuming) it is to fix.

How do you fight design complexity? [closed]

I often find myself fighting overengineering -- the person in charge of designing the software comes up with an architecture that's, way, way overcomplicated.
It's all fine and dandy to have all the esoteric features that users will never know about and get that sense of achievement when you're doing something that all the magazine articles are telling you is the latest, cool thing, but we are going to spend half of our engineering time on this monument to our cleverness, and not, you know, the actual product that our users need and upper management expects to be completed within a reasonable or at least a bounded time frame.
And you'll probably just have to revert back to the simpler solution anyway when you start running out of time, that is, if you get that chance.
We've all heard the refrain: Keep It Simple, Stupid™.
How do you fight with overcomplexity in your team?
One example I've had to work with repeatedly lately is when the decision has been made to go to a fully denormalized database design rather than an RDBMS, "because it's faster!" Fully denormalized databases are really hard to get right, are only appropriate for really specialized data problems like Flickr or eBay, and can be extremely expensive in terms of developer time relative to the rest of your development.
Tooth and nail, and sometimes you lose. The problem is that it's always easy to be tempted to build something cool.
Why build something simple and efficient when it can be complicated and wonderful?
Try to remind people of the XP rule of building the simplest thing that can possibly work.
Make sure to bounce any ideas you have off of someone else. Oftentimes, we get so wrapped up in doing things a certain way that it takes another set of eyes to set you right. There've been many times that I've figured out difficult problems by having somebody else there to say "do we really need that?" This helps make my code simpler.
On the point of dealing with people you disagree with, Ward Cunningham has a good point:
It was a turning point in my programming career when I realized that I didn't have to win every argument. I'd be talking about code with someone, and I'd say, "I think the best way to do it is A." And they'd say, "I think the best way to do it is B." I'd say, "Well no, it's really A." And they'd say, "Well, we want to do B." It was a turning point for me when I could say, "Fine. Do B. It's not going to hurt us that much if I'm wrong. It's not going to hurt us that much if I'm right and you do B, because we can correct mistakes. So let's find out if it's a mistake. ... Usually it turns out to be C. It's a learning experience for both of us. If we decide without the experience, neither of us really learns. Ward won, and somebody else didn't. Or vice versa. It's too much of a battle. Why not say, 'Well, let's just code it up and see what happens. If it doesn't work, we'll change it.'"
My advice? If you want to do something better, come up with a simple prototype that demonstrates that it's better. Opinions are great, but code talks.
I have seen this formula somewhere:
skill = complexity of problem / complexity of solution
In other words, it requires skill to create a simple solution to a complex problem. If somebody purposefully designs and takes pride in complex overengineered solutions, then he is unconsciously incompetent.
Personally, what helps me to keep my designs simple, is the TDD cycle. First write a test that specifies what you're trying to reach, and then produce "the simplest thing that could possibly work". And every now and then, reflect on what you have produced, and think about how to make it more simple.
Never build extra flexibility and abstraction layers into the system, until it is required by something that you have now. Changing the code is easy, when you have a good unit test suite, so you can add those abstraction layers later, when the need arises, if it ever arises. Otherwise, "you ain't gonna need it".
Some symptoms of too complex a design are when writing tests is complicated. If the tests require a lot of setup code, maybe you have too many dependencies or too much complexity in some other way. If you run into concurrency bugs, then maybe you should think about how to design the system so that concurrency is restricted to the absolute minimum number of classes. Maybe use a message-passing architecture, such as the Actor model, and make practically every component single-threaded, even though the system as a whole is multi-threaded.
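To make the message-passing suggestion a bit more concrete, here is a deliberately tiny Python sketch (nowhere near a full Actor framework; the class is invented for illustration): one worker thread owns its state and is talked to only through a queue, so no locks are needed around that state:

import queue
import threading

class CounterActor:
    # Owns its state; other code talks to it only through messages.
    def __init__(self):
        self.inbox = queue.Queue()
        self.count = 0  # touched only by the actor's own thread
        self._thread = threading.Thread(target=self._run)
        self._thread.start()

    def _run(self):
        while True:
            message = self.inbox.get()
            if message == "stop":
                break
            self.count += message  # single-threaded access, no lock needed

    def send(self, message):
        self.inbox.put(message)

    def stop(self):
        self.inbox.put("stop")
        self._thread.join()

actor = CounterActor()
for n in range(1, 6):
    actor.send(n)
actor.stop()
print(actor.count)  # 15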
At least for me, the bigger issue is that it's often hard to tell which feature is in there because of its buzzword-friendly, magaziney, enterprisey goodness and which is in there because it adds a level of flexibility that will be useful in the future.
It's been shown that people are generally terrible at anticipating future complexity, and side-effects of current decisions. Unfortunately this doesn't always mean simplest is best - in my case, there've been loads of things I thought were too complicated at first and didn't see the value of until much later (er... spring). Also things I thought made sense that turned out to be wildly overcomplicated (EJB1). So I know that my intuition about these things is faulty.
Best bet: any kind of indirection layer should be supported by an argument weighing the value of the flexibility it adds against its added dev complexity.
However, people who are dogmatically maintaining a particular db setup on abstract grounds are probably in the "building it because I read that it's the right thing" camp. It might be unrealistic, but some people might be convinced if you build a test version and benchmark, especially if the results show more effort leading to an insignificant performance increase.
It's all fine and dandy to have all the esoteric features that users will never know about and...
This would be feature creep, not unnecessarily complicated design. It's different from your example on databases.
One example I've had to work with repeatedly lately is when the decision has been made to go to a fully denormalized database design rather than an RDBMS. "because it's faster!"
In this case several things may be going on. One of them is, you might be wrong and these people could really know what they are talking about because they have worked with very similar examples. Another is they might be wrong, i.e. their design doesn't offer the speed advantages they claim. In this case there could be two different situations: (1) they are giving speed too much weight in their design, or (2) speed is really critical. If speed is indeed so relevant, the team shouldn't rely only on assumptions - they should try different prototypes and evaluate their speed in the critical paths. You don't build an F1 car in one way just "because it's faster"; instead you keep trying several alternative design solutions and pick the fastest one which still doesn't increase maintenance costs too much.
Sometimes you can argue it and reach an agreement, sometimes you can't. It's life.
A final word, though. You don't fight complexity. You treat it. You identify the really important things and act accordingly.
I assume you mean "fully denormalized database design rather than a normalized (e.g., third or fourth normal form) model", because a relational model is managed by an RDBMS regardless of how normalized it is.
Can't judge without knowing more about your requirements, your abilities, and those of your teammates.
I fear that your KISS admonition might not work in this case, because one big, denormalized table might be defended as the simplest thing possible.
How does anybody solve these kinds of problems? Communication, persuasion, better data, prototypes of alternative technologies and techniques. This is what makes software development so hard. If there was only one way to do these things, and everyone agreed on them, we truly could script it or get anyone to develop systems and be done with it.
Get some data. Your words might not be enough. If a quick prototype can demonstrate your point, create it.
"Strong opinions, lightly held" should be your motto.
Sometimes a technical point isn't worth alienating your entire team. Sometimes it is. Your call.
You fight someone else's overdesign/feature creep in several ways:
Request feature priority, based on actual user requirements. Mock up features for alpha and beta testers, and ask if they would trade N months of delay for it.
Aggressively refactor to avoid special-casing. Break code into layers or modular components whenever appropriate. Find a balance between "works just fine now" and "easy to extend later".
Notify your management when you disagree with design decisions, prepare to be overruled, and accept the decision. Don't go over anyone's head or sabotage code.
The best way I have found is to relentlessly ask - again and again - 'What is the business problem we are trying to solve' and 'How does this decision help to solve that problem'.
I find that too often folks jump to solutions instead of being crystal clear on what the problem is.
So in your example of how to organize a database, my question would be 'What do we think are the transaction requirements for this project today, next month, next year, five years from now?' It could be that it makes sense to spend a lot of time to get the data model right, or it could be a waste of time. You don't know what the parameters are until you get the problem definition clear.
You may suffer from "too many architects in the team" syndrome. One or two people at most should design/architect a system which will be coded by a team of 5 to 10 people. Input is welcome from everyone but architectural decision makers should be few and experienced.
(the numbers are semi-random and could be different depending on other factors as well)
I try to be open when discussing matters. But when I am discussing with someone else the choice between something that seems simple and something complicated, I get as stubborn as can be. It helps quite a lot, so long as you are very consistent from one decision to another.
Your example isn't really a complicated design, it's a design choice that you don't agree with. Since you're working on the code, you could easily be right, because many of these decisions are made by people who read about something in an article and think it sounds like a good goal, or the decision could have been made by someone who ran into problems before and was trying to prevent them from happening again.
Personally I've done a lot of stuff the easy way and a lot of it the hard way, and I'm never happy when I choose to do something the easy way over the hard way. Now I've learned tricks like "never pass around a naked collection, always wrap it in a business class".
If I were to explain the reasoning behind that to someone who hadn't been through the same experiences, they wouldn't understand it until they tried comparing the "easy way" to the "hard way" a few times.
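As a hypothetical illustration of that "never pass around a naked collection" trick (the class and its rules are invented for the example):

class OrderBook:
    # Wraps the raw list so the business rules live in one place.
    def __init__(self):
        self._orders = []

    def add(self, order):
        # The invariant is enforced here, not re-checked by every caller.
        if order["quantity"] <= 0:
            raise ValueError("quantity must be positive")
        self._orders.append(order)

    def open_orders(self):
        return [o for o in self._orders if not o.get("shipped", False)]

# The "easy way" would be passing the bare list around and hoping every caller
# remembers the same validation and filtering rules.
book = OrderBook()
book.add({"sku": "A-100", "quantity": 3})
print(len(book.open_orders()))  # 1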
The solution should be no more complex than the problem.
The question intertwines itself with the idea of essential complexity. A sort, by its essence, must touch each element. How much more complex than that must the solution get, given the technical constraints on it?
Do the people involved have enough time and incentive to find a simple solution? Without care, complexity will increase. If you spend most of your time trying to do the quickest possible bug fix or feature addition then saying "keep it simple" will not be enough.
Ensure that there are some people on the team with war wounds from maintaining large programs, and people with experience of refactoring, and then give them time to sort the software out. If you can arrange that certain features and opportunities are out of scope, that will help people remove unneeded code. If you want a metric, aim to reduce lines of code; but try not to obsess over it. Improve the test coverage; consider eliminating bits of code which are hard to test.
Don't try to do everything at a stretch. Break every problem/task into manageable chunks. Then prioritize, keeping KISS and YAGNI in mind. This will help you focus on building what you need. If you've done it right, you'll have a good core you can add to later, given time, money, resources and inspiration.

Debugging is a bad smell - how to persuade them?

I've been working on a project that can't be described as 'small' anymore (40+ months), with a team that can't be defined as 'small' anymore (~30 people). We've been using Agile/Scrum (1) practices all along, and a healthy dose of TDD.
I'm not sure if I picked this up from Agile or TDD, more likely a combination of the two, but I'm now clearly in the camp of people that looks at debugging as a bad smell. By 'debugging' I'm not referring to the more abstract concept of figuring out what might be wrong with the system, but the specific activity of running the system in Debug mode, stepping through the code to figure out details that are otherwise inscrutable.
Since I'm fairly convinced, this question is not about whether debugging is a bad smell or not. Rather, I'd like to know how I can persuade my team-mates about this.
People that believe debugging mode is the 'standard' mode tend to write code that can be understood only by debugging through it, which leads to a lot of time wasted, since every time you work an item on top of code developed by someone else, you first get to spend a considerable amount of time debugging it (and, since there's no bug involved... the term is becoming increasingly ridiculous) - and then silos happen. So I'd love to convince a few of my team-mates that avoiding debug mode is a Good Thing (2). Since they are used to living in Debug mode, however, they don't seem to see the problem; to them, spending hours debugging someone else's code before they even start doing anything related to their new item is the norm; they don't see anything wrong with it. Plus, as they spend time 'figuring it out' they know eventually the developer that worked that area will become available and the item will be passed on to them (leading to yet another silo).
Help me come up with a plan to turn them from the Dark Side!
Thanks in advance.
(1) Also referred to as SCRUM (all caps). Capitalization arguments aside, I think an asterisk after the term must be used since - unsurprisingly - our organization 'tweaked' the Agile and Scrum process to fit the perceived needs of all stakeholders involved. So, in all honesty, I won't pretend this has been 100% according to theory, but that's beside the point of my question.
(2) Yes, there will always be times when we'll have to get in debug mode, I'm not trying to absolutely avoid it, just.. trying to minimize the number of times we have to dive into it.
If you want to persuade your coworkers that your programming practices are better, first demonstrate by your productiveness that you are more effective than they are, at least for some tasks. Then they'll believe you when you explain how you get so much done.
It's also sometimes easier to focus on something concrete. Do your coworkers even talk in terms of "code smell"? Perhaps you could focus on specifics like "When the ABC module fails, it takes forever to debug it; it's much faster to use technique XYZ. Here, let me demonstrate." Then afterwards you can mention your basic principle, which is: yeah, the debugger is a useful tool, but there are usually other, more useful ones.
This is a cross-post, because the first time around it was more of an aside on someone else's answer to a different question. To this question it's a direct answer.
Debugging degrades the quality of the code we produce because it allows us to get away with a lower level of preparation and less mental discipline. I learnt this from an accidental controlled experiment in early 2000, which I now relate:
I took on a contract as a Delphi coder, and the first task assigned was to write a template engine - conceptually similar to a reporting engine - using Java, a language with which I was unfamiliar.
Bizarrely, the employer was quite happy to pay me contract rates to spend months becoming proficient with a new language, but wouldn't pay for books or debuggers. I was told to download the compiler and learn using online resources (the Java Trails were pretty good).
The golden rule of arts and sciences is that whoever has the gold makes the rules, so I proceeded as instructed. I got my editor macros rigged up so I could launch the Java compiler on the current edit buffer with a single keystroke, I found syntax-colouring definitions for my editor, and I used regexes to parse the compiler output and put my cursor on the reported location of compile errors. When the dust settled, I had a little IDE with everything but a debugger.
To trace my code I used the good old-fashioned technique of inserting writes to the console that logged the position in the code and the state of any variables I cared to inspect. It was crude, it was time-consuming, it had to be pulled out once the code worked, and it sometimes had confusing side-effects (eg forcing initialisation earlier than it might otherwise have occurred, resulting in code that only works while the trace is present).
Under these conditions my class methods got shorter and more and more sharply defined, until typically they did exactly one very well-defined operation. They also tended to be specifically designed for easy testing, with simple and completely deterministic output so I could test them independently.
The long and the short of it is that when debugging is more painful than designing, the path of least resistance is better design.
What turned this from an observation into a certainty was the success of the project. Suddenly there was budget and I had a "proper" IDE with an integrated debugger. Over the course of the next two weeks I noticed a reversion to prior habits, with "sketch" code made to work by iterative refinement in the debugger. Having noticed this, I recreated some earlier work using a debugger in place of thoughtful design. Interestingly, taking away the debugger slowed development only slightly, and the finished code was of vastly better quality, particularly from a maintenance perspective.
Don't get me wrong: there is a place for debuggers. Personally, I think that place is in the hands of the team leader, to be brought out in times of dire need to figure out a mystery, and then taken away again before people lose their discipline.
People won't want to ask for it because that would be an admission of weakness in front of their peers, and the act of explaining the need and the surrounding context may well induce peer insights that solve the problem - or even better designs free from the problem.
So, FOR, I not only agree with your position, I have real data from a controlled experiment to support it. It is, however, a rather small sample. More elaborate tests are required before my conclusions are supportable.
Why don't you take what I've said to your team and suggest trials? You have more data than they do (I just gave it to you), and in order to have a credible basis for disagreeing with you they basically have to test the idea, and the only way to do that is to give your idea a go.
You should be ready for it to all fall apart, though, because the whole thing is predicated on the assumption that the developers have the talent and experience to rise to the challenge of stronger design in the absence of step-through debugging.
Step-through debugging was created to make debugging easier. The direct effect of lowering the bar is that people with less talent can participate - if you build a tool that even jackasses can use, you will get jackasses using it -- a lot of them, if the newly accessible activity is well-remunerated.
This causes an exodus of people with talent because they generally use that talent to do rare and precious things in order to be well paid without working too hard, and the market doesn't want to pay for excellence because it cannot distinguish talent well enough to know when paying for it is justified.
Another thought: more recent work with problems on production servers, where it was impossible to install a debugger, has shown the importance of having a codebase for which maintenance doesn't depend on the availability of a debugger. Code that's grown in the absence of debuggers is much less hassle. Choose not to use them when you can change your mind, and then when you can't change your mind it won't be so awful.
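For what it's worth, the console-trace technique described in that answer can be as low-tech as the following sketch. This is only a hypothetical Java illustration - the class, method and variable names are invented, and it assumes nothing beyond plain System.out:
    // Hypothetical example of "tracing by console writes" when no debugger is available.
    // All names here are made up for illustration; the real template engine is not shown.
    import java.util.HashMap;
    import java.util.Map;

    public class TemplateRenderer {

        public String render(String template, Map<String, String> values) {
            System.out.println("TRACE render() entered: template=" + template + ", values=" + values);
            StringBuilder out = new StringBuilder();
            for (String token : template.split(" ")) {
                String replacement = values.getOrDefault(token, token);
                // Log the state we care about; these lines get pulled out (or demoted
                // to a proper logger) once the code is proven to work.
                System.out.println("TRACE token=" + token + " -> " + replacement);
                out.append(replacement).append(' ');
            }
            String result = out.toString().trim();
            System.out.println("TRACE render() exiting: result=" + result);
            return result;
        }

        public static void main(String[] args) {
            Map<String, String> values = new HashMap<>();
            values.put("{name}", "world");
            System.out.println(new TemplateRenderer().render("hello {name}", values));
        }
    }
Crude, as the answer says - but it forces you to decide in advance which state actually matters, which is part of the discipline being argued for.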
Since I'm fairly convinced, this question is not about whether debugging is a bad smell or not.
Well, your local Church might be a more appropriate place for your question, then.
That aside, convince them by arguments. You might want to reconsider your fundamentalist stance, however, because this is the very opposite of persuasive. One thing you might want to do is drop the term “debugging” in your whole discussion and replace it with “stepping through the code” or the like, emphasizing that what you oppose is the uninformed guesswork/patchwork practice of probing, rather than an informed reflection about the code.
(I would still disagree with you, but that's beside the point since you didn't want a discussion.)
I think the real problem here is:
People that believe debugging mode is the 'standard' mode tend to write code that can be understood only by stepping through it
This, if true, should be self-evidently wrong, and there should be no need to discuss it. If it's not evident, it's because they don't see how the badly written code could be improved. Show them: do code reviews where you show how that code could be refactored in a way that is clear without stepping through it.
Code stepping will automatically diminish once better code is written, it just doesn't work the other way around. People will still write bad code and if they avoid stepping through it that will only lead to more wasted time (damn I wish I could step through this spaghetti mess), not to better code.
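To make that kind of review concrete, a tiny before/after along these lines can help. This is a hypothetical Java sketch with invented names, not anyone's real code:
    // Hypothetical before/after used in a code review: the behaviour is identical,
    // but the second version can be read without stepping through it.
    public class ReviewExample {

        // Before: the intent is buried, so readers reach for the debugger.
        static double calc(double[] a, boolean f) {
            double t = 0;
            for (double x : a) t += x;
            return f ? t * 0.9 : t;
        }

        // After: the names carry the explanation.
        static double invoiceTotal(double[] lineAmounts, boolean loyalCustomer) {
            double subtotal = sum(lineAmounts);
            return loyalCustomer ? applyLoyaltyDiscount(subtotal) : subtotal;
        }

        static double sum(double[] amounts) {
            double total = 0;
            for (double amount : amounts) total += amount;
            return total;
        }

        static double applyLoyaltyDiscount(double subtotal) {
            return subtotal * 0.9; // the 10% discount is assumed purely for illustration
        }

        public static void main(String[] args) {
            double[] amounts = {10.0, 20.0};
            System.out.println(calc(amounts, true));         // 27.0
            System.out.println(invoiceTotal(amounts, true)); // 27.0
        }
    }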
There is something wrong here, but it's hard to put my finger on it. Perhaps the real issue is that the code has other smells that make it difficult to readily understand. I agree that with TDD one ought to use the debugger less rather than more, since you'll be developing the code in small increments. But, if you can't look at the code and understand it, perhaps it's because the design is too coupled -- there are too many interrelated classes required to make things work.
If the code really needs to be so complex that observation won't suffice, then maybe you need to invest in some good commenting, explaining what is happening -- though I would prefer to see things refactored to the point where comments are not needed. My suspicion is that the debugger may be a symptom rather than the problem.
I know that for me, switching from traditional, code-first development to test-first development has resulted in less time spent debugging...and it's not something I miss. Typically I'll only involve the debugger when it's not obvious why the code I just wrote to pass a test didn't.
This is going to sound like the argument you said you don't want to have, but I think if you want to convince your teammates, you're going to have to make a stronger case. I don't understand your objection. I frequently step through code I'm trying to understand with the debugger. It's a great way to see what's going on. You have not established your claim that people who use the debugger in this way tend to write code which is otherwise difficult to understand. The only convincing way to do so would be through some kind of case/control study which tried to measure and compare the readability of code written by people with varying approaches to the debugger. And you have not even told a plausible story explaining why you think using a tool to understand code execution tends to lead to sloppier code construction. For me it's a complete non sequitur.
A "plan" to convince them of the advantage of another approach is by establishing metrics linked to the number of time you debug the same function for different bugs.
By analysis the trend of that metric, you may convince them that non-regression tests are more useful to spend time writing, and will help them to debug more efficiently.
That way, you do not write completely off the "debug" habit, but you convince them of establishing a solid set of test, allowing them to focus on really useful debug session, if needed.
Should you consider this course of action (metrics), you should know its implementation involves the all hierarchy (stakeholder, project manager, architect, developers). They all need to be implicated in those metrics in order to act on them.
Regarding developers, you could try to suggest:
some new ways of closing a bug case (close it only when the test scenario that reproduces the bug has been played, meaning they need an independent test before launching any debug session; see the sketch after this list)
a clear relationship between those metrics and their evaluation by management (debugging the same function over and over would count as bad practice)
a larger involvement in architectural decisions: sometimes, knowing some functional or applicative features rather than just classes and code can incite a developer to think more in terms of black-box tests rather than white-box ones (which more easily lead to debug sessions)
participation in the "operational architecture" process (where you need to deploy your app and run full front-to-back integration tests). Again, a larger picture of the whole system can help a developer get more interested in features rather than 'lines of code'
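A minimal sketch of what "close the bug only with a reproducing test" could look like, assuming JUnit 4 is available; the production class, the bug number and the scenario are all invented for illustration:
    // Hypothetical regression test attached to a closed bug case.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class PriceFormatterBug1234Test {

        // Minimal stand-in for the real production class, included so the sketch compiles.
        static class PriceFormatter {
            String format(double amount) {
                return String.format(java.util.Locale.US, "%.2f", amount);
            }
        }

        // Bug #1234 (invented): negative amounts were reported as losing their minus sign.
        // The test replays the reported scenario; while it stays green, nobody has to
        // re-debug the same function for the same defect, and the metric above stays flat.
        @Test
        public void negativeAmountKeepsItsSign() {
            assertEquals("-12.50", new PriceFormatter().format(-12.5));
        }
    }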
I think a better phrasing of this question would be "Is non-TDD a code smell?" TDD seems to lead to less time spent in the debugger due to more time spent writing/failing/passing tests. Without TDD, you are more likely to spend time in the debugger to diagnose errors.
At least within Visual Studio, using the debugger is not that painful, so the challenge for you would be to explain to your teammates how TDD would make their development more enjoyable, productive and successful. Just avoiding the debugger is probably not reason enough for a team to switch their development methodology.
Right on roadwarrior.
Debugging isn't the problem; it's poorly commented and/or documented code and bad architecture. I work on a smaller team, but when a bug does surface, I do step through the code. Frequently it's a very small job because the app is well planned out and the docs on the code are clear.
That said, let's get to my point. Want the team to not debug? Comment, comment, comment. Nothing beats down the urge to debug faster. Sure, they'll still do it, but they'll be more likely to step over well-documented code.
Oh, and though it should go without saying, I'll say it anyway: don't have bugs in your code. :)
I agree with those above who expressed the relative irrelevance of this "debugger issue."
IMO, the 2 most important goals of a developer are:
1) Make the software do what it's supposed to do.
2) Write the code so that a maintenance developer 2 years down the road enjoys the experience of changing existing or adding new features.
Before you make a plan, you should decide how important this change is to you. Although I agree that debugging is a smell, it is also a very well accepted and ingrained practice for developers, so convincing them that they should stop doing it won't be easy or quick - and for good reasons. How much energy do you want to put into this topic?
Second, why do you want to persuade them in the first place? If your motivation is to help them, is it really their top priority problem? When you help people in ways they want to be helped, change becomes easy.
Once you have decided that you want to go on with your change initiative, you need to take into account that different people are convinced by different things. Some people will already be convinced by trying something new and exciting. Some will be convinced by numbers (metrics). Some by getting told about it while eating their favorite type of cookie (seriously!), some by hearing about it from their favorite guru. Some by reading about it in a magazine. Some by seeing that "everyone else is doing it, too". And so on.
There is an insightful interview with Linda Rising on this topic at InfoQ: http://www.infoq.com/interviews/Linda-Rising-Fearless-Change. She can say it much better than me. The book is quite good, too.
Whatever you do, don't press too much, but also don't give up. Change can happen - especially if you take resistance as a resource - and sometimes it happens at unexpected times, so always keep a sense of wonder.
@FOR: You have a second problem too; here it is:
sadly it doesn't seem the devs are interested in being more productive (they get paid the same anyway)
How do you intend to make them want to be more productive when there is nothing (visible) for them to gain?
Designing software by debugging is a good practice.
The number of environments supporting this way of developing is very small: the best known is Smalltalk. In Smalltalk, you can write a test describing your objects' protocol without the methods being implemented. Running this test will then trigger the debugger, and you can add the method to the right class in the debugger, and can continue stepping through the code until all functionality is implemented and the test is green.
This needs a compiler to be available at run-time, and first-class invocations. It offers a very short feedback cycle, and is one of the primary reasons for Smalltalk's productivity.

How do you tell someone they're writing bad code? [closed]

I've been working with a small group of people on a coding project for fun. It's an organized and fairly cohesive group. The people I work with all have various skill sets related to programming, but some of them use older or outright wrong methods, such as excessive global variables, poor naming conventions, and other things. While things work, the implementation is poor. What's a good way to politely ask or introduce them to use better methodology, without it coming across as questioning (or insulting) their experience and/or education?
Introduce questions to make them realise that what they are doing is wrong. For example, ask these sort of questions:
Why did you decide to make that a global variable?
Why did you give it that name?
That's interesting. I usually do mine this way because [Insert reason why you are better]
Does that way work? I usually [Insert how you would make them look silly]
I think the ideal way of going about this is subtly asking them why they code a certain way. You may find that they believe that there are benefits to other methods. Unless I knew the reason for their coding style was due to misinformation I would never judge my way as better without good reason. The best way to go about this is to just ask them why they chose that way; be sure to sound interested in their reasoning, because that is what you need to attack, not their ability.
A coding standard will definitely help, but if it were the answer to every software project then we'd all be sipping cocktails on our private islands in paradise. In reality, we're all prone to problems and software projects still have a low success rate. I think the problem would mostly stem from individual ability rather than a problem with convention, which is why I'd suggest working through the problems as a group when a problem rears its ugly head.
Most importantly, do NOT immediately assume that your way is better. In reality, it probably is, but we're dealing with another person's opinion and to them there is only one solution. Never say that your way is the better way of doing it unless you want them to see you as a smug loser.
Start doing code reviews or pair programming.
If the team won't go for those, try weekly design reviews. Each week, meet for an hour and talk about a piece of code. If people seem defensive, pick old code that no one is emotionally attached to any more, at least at the beginning.
As JesperE said, focus on the code, not the coder.
When you see something you think should be different, but others don't see it the same way, then start by asking questions that lead to the deficiencies, instead of pointing them out. For example:
Globals: Do you think we'll ever want to have more than one of these? Do you think we will want to control access to this?
Mutable state: Do you think we'll want to manipulate this from another thread?
I also find it helpful to focus on my limitations, which can help people relax. For example:
long functions: My brain isn't big enough to hold all of this at once. How can we make smaller pieces that I can handle?
bad names: I get confused easily enough when reading clear code; when names are misleading, there's no hope for me.
Ultimately, the goal is not for you to teach your team how to code better. It's to establish a culture of learning in your team. Where each person looks to the others for help in becoming a better programmer.
Introduce the idea of a code standard. The most important thing about a code standard is that it proposes the idea of consistency in the code base (ideally, all of the code should look like it was written by one person in one sitting) which will lead to more understandable and maintainable code.
You have to explain why your way is better.
Explain why a function is better than cutting & pasting.
Explain why an array is better than $foo1, $foo2, $foo3.
Explain why global variables are dangerous, and that local variables will make life easier.
Simply whipping out a coding standard and saying "do this" is worthless because it doesn't explain to the programmer why it's a good thing.
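When doing that explaining, a tiny contrast often lands better than the rule alone. The following is a hypothetical Java sketch (the names and numbers are invented; the original $foo examples look more like PHP or Perl, but the point is the same):
    // Hypothetical before/after pairs used when explaining the 'why', not just the rule.
    public class WhyExamples {

        // Before: parallel variables invite copy-and-paste and off-by-one mistakes.
        static double averageBefore() {
            double foo1 = 3.0, foo2 = 5.0, foo3 = 7.0;
            return (foo1 + foo2 + foo3) / 3;
        }

        // After: an array makes the repetition explicit and loopable.
        static double averageAfter() {
            double[] foos = {3.0, 5.0, 7.0};
            double total = 0;
            for (double foo : foos) total += foo;
            return total / foos.length;
        }

        // Before: a global (static mutable) value can be changed from anywhere,
        // so a wrong result is hard to trace back to a cause.
        static double taxRate = 0.2;
        static double taxedBefore(double amount) {
            return amount * (1 + taxRate);
        }

        // After: passing the value in keeps every influence visible at the call site.
        static double taxedAfter(double amount, double rate) {
            return amount * (1 + rate);
        }

        public static void main(String[] args) {
            System.out.println(averageBefore() + " == " + averageAfter());
            System.out.println(taxedBefore(100) + " == " + taxedAfter(100, 0.2));
        }
    }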
First, I'd be careful not to judge too quickly. It's easy to dismiss some code as bad, when there might be good reasons why it's so (eg: working with legacy code with weird conventions). But let's assume for the moment that they're really bad.
You could suggest establishing a coding standard, based on the team's input. But you really need to take their opinions into account then, not just impose your vision of what good code should be.
Another option is to bring technical books into the office (Code Complete, Effective C++, the Pragmatic Programmer...) and offer to lend them to others ("Hey, I'm finished with this; would anyone like to borrow it?")
If possible, make sure they understand that you're criticizing their code, not them personally.
Suggest a better alternative in a non-confrontational way.
"Hey, I think this way will work too. What do you guys think?" [Gesture to obviously better code on your screen]
Have code reviews, and start by reviewing YOUR code.
It will put people at ease with the whole code review process because you are beginning the process by reviewing your own code instead of theirs. Starting off with your code will also give them good examples of how to do things.
They may think your style stinks too. Get the team together to discuss a consistent set of coding style guidelines. Agree to something. Whether that fits your style isn't the issue; settling on any style, as long as it's consistent, is what matters.
By example. Show them the right way.
Take it slow. Don't thrash them for every little mistake right off the bat, just start with things that really matter.
The code standard idea is a good one.
But consider not saying anything, especially since it is for fun, with, presumably, people you are friends with. It's just code...
There's some really good advice in Gerry Weinberg's book "The Psychology of Computer Programming" - his whole notion of "egoless programming" is all about how to help people accept criticism of their code as distinct from criticism of themselves.
Bad naming practices: Always inexcusable.
And yes, do not always assume that your way is better... It can be difficult, but objectivity must be maintained.
I've had an experience with a coder who had such horrible naming of functions that the code was worse than unreadable. The functions lied about what they did, and the code was nonsensical. And they were protective of, and resistant to, having someone else change their code. When confronted very politely, they admitted it was poorly named, but wanted to retain their ownership of the code and would go back and fix it up "at a later date."
This is in the past now, but how do you deal with a situation where the error is ACKNOWLEDGED, but then protected? This went on for a long time and I had no idea how to break through that barrier.
Global variables: I myself am not THAT fond of global variables, but I know a few otherwise excellent programmers that like them A LOT. So much so that I've come to believe they are not actually all that bad in many situations, as they allow for clarity and ease of debugging. (please don't flame/downvote me :) ) It comes down to this: I've seen a lot of very good, effective, bug-free code that used global variables (not put in by me!) and a great deal of buggy, impossible-to-read/maintain/fix code that meticulously used proper patterns. Maybe there IS a place (though shrinking, perhaps) for global variables? I'm considering rethinking my position based on evidence.
Start a wiki on your network using some wiki software.
Start a category on your site called "best practices" or "coding standards" or something.
Point everyone to it. Allow for feedback.
When you do releases of the software, have the person whose job it is to put code into the build push back on developers, pointing them to the Wiki pages on it.
I've done this in my organization and it took several months for people to really get the hang of using the Wiki, but now it's an indispensable resource.
If you have even a loose coding standard, it may be worthwhile to point to that, or to say that you can't follow the code because it doesn't match the agreed format.
If you don't have a coding format, now would be a good time to get one in place. Something like the answers to this question may be helpful: https://stackoverflow.com/questions/4121/team-coding-styles
I always go with the line 'This is what I would do'. I don't try and lecture them and tell them their code is rubbish but just give an alternative viewpoint that can hopefully show them something that is obviously a bit neater.
Have the person(s) in question prepare a presentation to the rest of the group on the code for a representative module they have written, and let the Q&A take care of it (trust me, it will, and if it's a good group, it shouldn't even get ugly).
I do love code, and I never took any course in my life about anything related to informatics. I started out very badly and learned from examples, but what I have always remembered and kept in mind since reading the "Gang of Four" book is:
"Everyone can write code that is understood by a machine, but not all can write code that is understood by a human being"
With this in mind, there is a lot to be done in the code ;)
I can't emphasize patience enough. I've seen this exact sort of thing completely backfire mostly because someone wanted the changes to happen NOW. Quite a few environments need the benefits of evolution, not revolution. And by forcing change today, it can make for a very unhappy environment for all.
Buy-in is key. And your approach needs to take into account the environment you are in.
It sounds like you're in an environment that has a lot of "individuality" to it. So... I wouldn't suggest a set of coding standards. It will come across that you want to take this "fun" project and turn it into a highly structured work project (oh great, what's next... functional documents?). Instead, as someone else said, you'll have to deal with it to a certain extent.
Stay patient and work toward educating others in your direction. Start with the edges (points where your code interacts with others) and when interacting with their code try to take it as an opportunity to discuss the interface they've created and ask them if it would be okay with them if it was changed (by you or them). And fully explain why you want the change ("it will help deal with changing subsystem attributes better" or whatever). Don't nit-pick and try to change everything you see as being wrong. Once you interact with others on the edge, they should start to see how it would benefit them at the core of their code (and if you get enough momentum, go deeper and truly start to discuss modern techniques and the benefits of coding standards). If they still don't see it... maybe you'll need to deal with that within yourself (especially on a "fun" project).
Patience. Evolution, not revolution.
Good luck.
I don a toga and open a can of Socratic method.
The Socratic Method named after the Classical Greek philosopher Socrates, is a form of philosophical inquiry in which the questioner explores the implications of others' positions, to stimulate rational thinking and illuminate ideas. This dialectical method often involves an oppositional discussion in which the defense of one point of view is pitted against another; one participant may lead another to contradict himself in some way, strengthening the inquirer's own point.
A lot of the answers here relate to code formatting which these days is not particularly relevant, as most IDEs will reformat your code in the style you choose. What really matters is how the code works, and the poster is right to look at global variables, copy & paste code, and my pet peeve, naming conventions. There is such a thing as bad code and it has little to do with format.
The good part is that most of it is bad for a very good reason, and these reasons are generally quantifiable and explainable. So, in a non-confrontational way, explain the reasons. In many cases, you can even give the writer scenarios where the problems become obvious.
I'm not the lead developer on my project and therefore can't impose coding standards, but I have found that bad code usually causes an issue sooner rather than later, and when it does I'm there with a cleaner idea or solution.
By not interjecting at the time and taking a more natural approach, I've gained more trust with the lead, and he often turns to me for ideas and includes me in the architectural design and deployment strategy used for the project.
People writing bad code is just a symptom of ignorance (which is different from being dumb). Here are some tips for dealing with those people.
People's own experience leaves a stronger impression than anything you will say.
Some people are not passionate about the code they produce and will not listen to anything you say.
Pair programming can help share ideas, but switch who's driving or they'll just be checking email on their phone.
Don't drown them with too much; I've found even Continuous Integration needed to be explained a few times to some older devs.
Get them excited again and they will want to learn. It could be something as simple as programming robots for a day.
TRUST YOUR TEAM; coding standards and the tools that check them at build time are often never read, or are just annoying.
Remove code ownership. On some projects you will see code silos or ant hills where people say "that's my code and you can't change it". This is very bad, and you can use pair programming to remove it.
Instead of having them write code, have them maintain their code.
Until they have to maintain their steaming pile of spaghetti, they will never understand how bad they are at coding.
Nobody likes to listen to someone saying their work sucks, but any sane person would welcome mentoring and ways of avoiding unnecessary work.
One school of teaching even says that you should not point out mistakes, but focus on what is done right. For instance, instead of pointing out incomprehensible code as bad, you should point out where their code is particularly easy to read. In the first case you are priming others to think and act like crappy programmers. In the latter case you are priming them for thinking like skilled professionals.
I have a similar scenario with the guys I work with. They don't have as much exposure to coding as I do, but they are still useful coders.
Rather than letting them do what they want and then going back to edit the whole thing, I usually just sit them down and show them two ways of doing things: their way and my way. From this we discuss the pros and cons of each method and come to a better understanding, and a better conclusion, about how we should go about programming.
Here is the really surprising part: sometimes they will come up with questions that even I don't have answers to, and after some research we all get a better concept of methodology and structure.
Discuss.
Show them why.
Don't ever think you are always right. Sometimes even they will teach you something new.
That's what I would do if I were you :D
Probably a bit late after the fact, but that's where an agreed coding standard is a good thing.
I frankly believe that someone's code is better when it's easier to change, debug, navigate, understand, configure, test and publish (whew).
That said, I think it is impossible to tell someone their code is bad without first having them explain what it does, or how anyone is supposed to enhance it afterwards (like creating new functionality or debugging it).
Only then does it click, and anyone will be able to see that:
Changes to global variables' values are almost always untrackable
Huge functions are hard to read and understand
Patterns make your code easier to enhance (as long as you obey their rules)
(etc...)
Perhaps a session of pair programming should do the trick.
As for enforcing coding standards - it helps, but standards are too far removed from really defining what good code is.
You probably want to focus on the impact of the bad code, rather than what might be dismissed as just your subjective opinion of whether it's good or bad style.
Privately inquire about some of the "bad" code segments with an eye toward the possibility that it is actually reasonable code, (no matter how predisposed you may be), or that there are perhaps extenuating circumstances. If you are still convinced that the code is just plain bad -- and that the source actually is this person -- just go away. One of several things may happen: 1) the person notices and takes some corrective action, 2) the person does nothing (is oblivious, or doesn't care as much as you do).
If #2 happens, or #1 does not result in sufficient improvement from your point of view, AND it is hurting the project, and/or impinging on you enough, then it may be time to start a campaign to establish/enforce standards within the team. That requires management buy-in, but is most effective when instigated from grass roots.
Good luck with that. I feel your pain brother.

Resources