Best way to perform software testing - debugging

I just finished (mostly) a major application that I've been working on for a little over a year (off and on). It is around 86k lines of code, 50k of which are from Visual Studio's auto-generated dataset code. It's largely a GUI for interacting with the database, generating reports, etc. It deals with money and manages contracts, so it is important for it to be as bug-free as possible.
I've walked through the code and run the program myself, and for the most part I cannot find any more bugs. I am, however, sure there are some; I've just been working on the system for so long that I can't see them anymore. I know there are some because of intermittent issues I run across but can never pinpoint.
How should I go about software testing in order to discover the remaining bugs?

I know this is a little late, but have you heard of Test-Driven Development?
There are lots of tests you could build to discover the "remaining" bugs:
1) Unit tests
2) Integration tests
3) Behavior/Business/Acceptance tests
You could always attend a Developer Testing Bootcamp to get more ideas.
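For instance, a unit test for one of the money calculations might look something like this. This is only a minimal JUnit 5 sketch; ContractCalculator and its lateFee method are hypothetical stand-ins for your own domain classes:

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.math.BigDecimal;
    import org.junit.jupiter.api.Test;

    class ContractCalculatorTest {

        // ContractCalculator is a hypothetical domain class; substitute your own.
        @Test
        void lateFeeIsFivePercentOfOutstandingBalance() {
            ContractCalculator calc = new ContractCalculator();

            BigDecimal fee = calc.lateFee(new BigDecimal("200.00"));

            // Assumed business rule, purely for illustration: 5% of 200.00 = 10.00
            assertEquals(new BigDecimal("10.00"), fee);
        }
    }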

You can involve some of your end users and do a beta test that way. The less experience they have had with the application so far, the more likely they are to try things you didn't think of.

Since you didn't use TDD to write it, your best bet now is to add as many automated tests as possible to cover common scenarios. That way, when you do find bugs (and there are ALWAYS bugs in programs), you can hopefully minimize the risk to the rest of the system when you fix them.
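As a sketch of what covering common scenarios could look like, a parameterized JUnit 5 test pins down several known-good cases in one place (InvoiceCalculator and its tax rule are hypothetical placeholders):

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.math.BigDecimal;
    import org.junit.jupiter.params.ParameterizedTest;
    import org.junit.jupiter.params.provider.CsvSource;

    class InvoiceCalculatorTest {

        // Each row is one common scenario we want to lock down before changing anything.
        @ParameterizedTest
        @CsvSource({
            "100.00, 0.20, 120.00",  // standard rate
            "100.00, 0.00, 100.00",  // tax-exempt contract
            "0.00, 0.20, 0.00"       // empty invoice
        })
        void totalIncludesTax(String net, String rate, String expected) {
            InvoiceCalculator calc = new InvoiceCalculator();

            BigDecimal total = calc.totalWithTax(new BigDecimal(net), new BigDecimal(rate));

            // compareTo ignores scale differences such as 120.0 vs 120.00
            assertEquals(0, new BigDecimal(expected).compareTo(total));
        }
    }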

Related

What's the benefits of TDD from project management point of view?

While learning more deeply about TDD, I clearly see the benefits of writing tests first from a software engineering point of view. Could someone tell me what I should say to my project manager when I propose the idea of switching to TDD?
What are the benefits of TDD on a project management level?
There are quite a few advantages for project managers as it applies to TDD.
Bugs are notorious for knocking a project off schedule. Things seem to be going smoothly until the end of the project, when the developers and QA team start working together to find and fix all these issues. It isn't uncommon to find issues that require major refactoring, eating up time and resources. TDD helps mitigate this by enforcing testing from the beginning and throughout the project. It won't catch all the bugs, but the list will be more manageable, and that makes Project Managers very happy.
With the proper tools, project managers can track productivity and code quality with TDD. At any time she can see the status of the project as it applies to the tests being written, and how many are passing and failing. This metric helps the project manager gauge where the developers, and indeed the entire team, are in the project. Project managers love this sort of thing.
TDD helps project managers identify areas where the team may be weak or not performing well. If you have a developer who isn't working well or keeping up, the project manager knows early on and can address the issue before it begins to affect the product schedule.
Here's a decent article about the subject.
It's not so easy to convince project managers to go with TDD, and maybe developers can use TDD even without informing PMs about it.
But I guess such points can be helpful:
It gives you near-100% code coverage of the code you write, which decreases the probability of bugs (decreases it, but does not eliminate it entirely).
It pushes you towards following the YAGNI principle, so you don't write any code you don't need right now.
It helps you reduce or even eliminate over-design, over-complication, and premature optimization.

Should TDD be applied at initial prototype stage?

I have a question about applying TDD in the early stages of development. Frequently, when starting a project, the client does not know exactly what the precise requirements are and consequently changes them after seeing the first prototypes. If we apply TDD from the very beginning of the project, it turns out that a big portion of our tests (acceptance, integration, unit) will soon be either deleted or updated. Is this normal? If not, how should we proceed in this initial phase of product development?
I will comfort you, Markus: it is normal that clients change their minds at the beginning. The funny thing is that, even with time, the only thing they don't change their minds about is changing their minds about stuff all the time.
So do not be worried that some portion of your tests will go in the bin along with the implementation, because it will be the same in later stages. What you can do as a developer/BA/whatever is to try to point them in the right direction as soon as possible and discuss things with them so you don't develop too much "useless" stuff.
Especially if you're working in a very agile fashion, the requirements might change from iteration to iteration; this by no means should make you think that tests are useless at any stage of the project.
Also, it is very normal for tests to get updated when the requirements change. People need to start taking tests more seriously (it is srs business, k?!) and realize that they are something that: a) is not there just to annoy you, and b) should evolve with the project, because it will most probably save you a lot of trouble.
Prototypes, as suggested by Yishai, are a good solution. Sometimes. BUT you really need to watch out. In a lot of situations, when a client sees a prototype he likes or that is very similar to what he wants, he will think, "Wow, you're almost done! When can we launch it?" And then it is really hard to explain to him that it is only a prototype and that you need to start from scratch. In many cases people just start using the prototype as the main project, and they do not feel like adding missing tests or improving the existing codebase. That is almost exactly how the application I am currently working on got created (10+ years ago now!).
Yes, it is normal.
Another option, though, is to build true prototypes to get some feedback and nail down technical architecture questions that can't be changed later, and build them without tests. The requirement, though, is that those prototypes be literally deleted when the real project work is underway (they may stay around initially as a reference for how some things were done, but not a drop of their code ends up in the final product).
You should also be sure you are writing acceptance tests that represent functionality the user actually wants, not just artifacts you happen to build. It seems odd that a lot of acceptance tests would get thrown out. They may need to be updated to reflect new requirements, but in general they should remain quite accurate, since not that much should change if you do the most important things first.
But sometimes they truly don't know what they want, or what the technology is capable of, so things can change radically. Those are high risk/high cost projects no matter what.
Things always change; if anything, unit tests will help you clarify requirements earlier. Integration tests can probably come a bit later, but unless your prototypes are throwaway I'd try to write unit tests from the start. Like anything, there are compromises, but writing tests sooner rather than later helps. Changing tests when requirements change will help validate those changes.
I agree with Zenzen - showing a client a working prototype earns a puzzled, suspicious look when you state it needs re-writing.
And like Yishai, I believe in making disposable prototypes first, as putting a visual mock-up in front of the client reveals so much about what they actually want (most haven't got a clue until you show them).
However, this visual prototype inspires a whole world of new thoughts in the client, so, inevitably, requirements change.
So, my method:
Keep the prototype as simple and static a mock-up as possible (No TDD).
The moment I'm asked to demonstrate real-world functionality, I consider this the beginning (Full TDD).

TDD: possible to bootstrap when enhancing existing large app?

The chapter about TDD in Martin's Clean Code caught my imagination.
However.
These days I am mostly expanding or fixing large existing apps.
TDD, on the other hand, seems to work only when writing from scratch.
Talking about these large existing apps:
1. they were not written with TDD (of course).
2. I cannot rewrite them.
3. writing comprehensive TDD-style tests for them is out of the question in the timeframe.
I have not seen any mention of a TDD "bootstrap" into a large monolithic existing app.
The problem is that most classes of these apps, in principle, work only inside the app.
They are not separable. They are not generic. Just to fire them up, you need half of the whole app, at least. Everything is connected to everything.
So, where is the bootstrap?
Or is there an alternative technique, with the same results as TDD, that would work for expanding existing apps that were not developed with TDD?
The bootstrap is to isolate the area you're working on and add tests for behavior you want to preserve and behavior you want to add. The hard part of course is making it possible, as untested code tends to be tangled together in ways that make it difficult to isolate an area of code to make it testable.
Buy Working Effectively with Legacy Code, which gives plenty of guidance on how to do exactly what you're aiming for.
You might also want to look at the answers to this related question, Adding unit tests to legacy code.
Start small. Grab a section of code that can reasonably be extracted and made into a testable class, and do it. If the application is riddled with so many hard dependencies and terrifying spaghetti logic that you can't possibly refactor without fear of breaking something, start by making a bunch of integration tests, just so you can confirm proper behavior before/after you start messing around with it.
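One concrete form of that safety net is a characterization test: run the existing code on known input and assert whatever it currently produces, so any refactoring that changes the output gets flagged. A rough sketch, where ReportGenerator and the golden file are hypothetical:

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.nio.file.Files;
    import java.nio.file.Path;
    import org.junit.jupiter.api.Test;

    class ReportGeneratorCharacterizationTest {

        // Not a spec of what the report *should* be, just a snapshot of what it *is* today.
        @Test
        void monthlyReportMatchesCurrentOutput() throws Exception {
            ReportGenerator generator = new ReportGenerator();  // hypothetical legacy class

            String actual = generator.renderMonthlyReport(2024, 1);
            String expected = Files.readString(Path.of("src/test/resources/monthly-2024-01.golden.txt"));

            assertEquals(expected, actual);
        }
    }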
Your existing application sounds as if it suffers from tight coupling and a large amount of technical debt. In cases such as these you can spend a LOT of time trying to write comprehensive unit tests where time may be better spent doing major refactoring, specifically promoting loose coupling.
In other cases, investing time and effort into unit testing with mocking frameworks can be beneficial, as it helps decouple the application for purposes of testing, making it possible to test single components. Dependency injection techniques can be used in conjunction with mocking to help make this happen as well.
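As a rough illustration of dependency injection plus a mocking framework (Mockito here; ContractRepository and ContractService are hypothetical names), the repository is passed in through the constructor so the test can swap the real database access for an in-memory stub:

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import java.math.BigDecimal;
    import java.util.List;
    import org.junit.jupiter.api.Test;

    class ContractServiceTest {

        @Test
        void outstandingBalanceSumsUnpaidInvoices() {
            // The real implementation talks to the database; the mock keeps the test in memory.
            ContractRepository repository = mock(ContractRepository.class);
            when(repository.unpaidAmounts(42L))
                .thenReturn(List.of(new BigDecimal("10.00"), new BigDecimal("5.50")));

            // Constructor injection is what makes swapping in the mock possible.
            ContractService service = new ContractService(repository);

            assertEquals(new BigDecimal("15.50"), service.outstandingBalance(42L));
        }
    }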

How to convert a software shop to TDD?

I would really love to push for TDD inside the shop I'm working in. Many of the senior developers there haven't worked with unit testing, or have written unit tests that hit the database.
I would love to bring some good arguments, books for training, possible coach to ease the transition.
I have found that it is often very hard to push TDD from the developer up. What I tend to do is talk about the benefits of TDD as much as possible and wherever possible, introduce elements of TDD myself bit by bit.
If they don't mind, start a new project with unit tests in it (managers seldom mind more test coverage) and start developing that way yourself. Slowly show the rest of your team the benefits and try to win some converts. Once you have a few other developers on your side, start pushing management for some training.
You could also offer to run some lunch-n-learns about it for the other developers. Teaching is the best way of learning, and you will hopefully gain allies. If you are lucky, you can talk your boss into buying the pizza for the lunch-n-learn and everyone benefits.
As Rob P said, I also found that preaching left me with a hoarse voice and no one listening. I got results faster and more widespread by doing it and keeping that part visible. Be open to questioning and don't force it. Encourage and praise, but don't preach.
Combine it with publishing the results of your testing, and have that automated (you could send out an email, perhaps). You want many subtle reminders to show people how good your method is.
I think a good way to sneak TDD principles into an existing product is to start writing unit tests for bugs. This way you slowly start to build up a set of unit tests for regression testing, which becomes an integral part of the project, especially if you can get them run as part of your build process.
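For example, when a bug report comes in, the fix can start with a failing test that reproduces it; once it passes, it stays in the suite as a regression guard. The class, method, and ticket number below are made up for illustration:

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.math.BigDecimal;
    import org.junit.jupiter.api.Test;

    class DiscountCalculatorRegressionTest {

        // Hypothetical ticket: a negative quantity used to produce a negative discount.
        @Test
        void bug123NegativeQuantityDoesNotProduceNegativeDiscount() {
            DiscountCalculator calc = new DiscountCalculator();

            assertEquals(BigDecimal.ZERO, calc.discountFor(-3, new BigDecimal("9.99")));
        }
    }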
The only hurdle will be that the existing code might be resistant to testing, but that's just another excuse to do some refactoring.
Once people start to realise the benefits, the momentum will grow, but you need to pioneer the way.
While I can't tell you what will work, I can tell you some things that definitely will not work and should be avoided:
I'll write the code, you write the test
This always comes up at first. People assume that since you're so gung-ho about testing, you should be the one writing the tests. This doesn't work at all and misses the whole point.
You wrote the test that's breaking, so you have to fix it.
If you start writing tests for your code, inevitably someone else will break those tests. Then, if you ask them to fix it, they'll often say it's your responsibility. This isn't necessarily them being a jerk; it could just be that they don't understand the process. This is where you'll need management backup.
I'll just start, and everyone will follow.
Like others have said, TDD without management support is very hard. If there are any devs who don't "drink the Kool-Aid", then they will constantly be breaking your tests and not caring. If you can't make them believe, then you need management telling them it's their job.
What finally brought me around was watching a project collapse due to too many bugs. It convinced me that I was doing something fundamentally wrong. A little research brought me to automated testing, and with a little determination I taught myself the basics. Perhaps talking to your fellow devs about similar projects (we all have at least one...) will help them realize that they might want to try something new.
Lead by example:
use TDD on all the code you write
show them the benefits as soon as you have the opportunity (regression detected by the unit test or incident recreated in your unit testing environment)
deliver "clean code that works"
propose your assistance to others
don't be dogmatic - TDD is not a silver bullet
make your unit tests visible: they should compile along with the code they test
If the project doesn't have enough unit tests, you can point out bugs in the issue database that would probably have been avoided if there had been unit tests.
As for pushing TDD, or some other code religion, don't bother.
For some people (and some types of code), TDD is great. Some people don't work that way, and don't benefit from test-first. As long as they don't avoid testing altogether, I don't think it matters.
A great challenge with TDD that is brought in "bottom up" is that, when push comes to shove (as it inevitably does when a deadline approaches), management is going to override the emphasis on tests: "We can't afford to test! We have to finish the project!"
Of course, this is the very situation (deadline looming, significant backlog, progress not on track with promises, leading to rapidly shifting priorities and tasks) where the benefits of TDD really kick in. Management overrides it, the project/iteration starts to come apart in the same-old same-old, and management looks back and says "We tried TDD and it didn't help at all."

Application Testing

Is the real benefit in TDD the actual testing of the application, or the benefits that writing a testable application brings to the table? I ask because I feel the conversation too often revolves around testing alone, and not the total benefits package.
TDD helps you design your software. The tests become the design. By writing the test first you think about your code from a consumer perspective, resulting in a more user-friendly and more compact software design.
Also, by applying TDD you typically end up writing your code in a way where you can supply test mocks and stubs. This leads to less coupled software, making it easier to change and maintain over time.
So I guess a lot of the talk around TDD is about testing, but by doing that, other big benefits follow, such as quality (coverage), flexibility (decoupling), and better design (thinking as the consumer of the API).
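A small illustration of thinking from the consumer perspective: the test below is written before the class exists, which forces you to decide what the call site should look like first (all names are hypothetical):

    import static org.junit.jupiter.api.Assertions.assertTrue;

    import java.time.LocalDate;
    import org.junit.jupiter.api.Test;

    class ContractTest {

        // Written before Contract exists: the test documents the API we would like to call.
        @Test
        void contractIsExpiredAfterItsEndDate() {
            Contract contract = new Contract(LocalDate.of(2023, 1, 1), LocalDate.of(2023, 12, 31));

            assertTrue(contract.isExpiredOn(LocalDate.of(2024, 6, 1)));
        }
    }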
The real improvement is that it is a good way to force you to really think through the design and implementation. Then, once you've prepared the tests and written the code, solutions to unforeseen problems appear more easily.
Something that usually happens to me is a good analogy: when I'm going to post a question to a forum or IRC channel, I like to have the problem well written and fully described; many times the process of preparing a well-written and complete description of the problem magically makes the solution appear.
The real benefit of TDD is supposed to be that it allows you to modify/refactor/enhance your application without worrying about whether you've broken existing functionality. The fact that writing unit tests tends to result in loosely coupled code and better architecture isn't necessarily the point of TDD, but I think it's hard to have one without the other.
You can't really experience the benefit of TDD unless you have unit tests with good coverage. In order to do that, you're going to have to write testable code. That's why the two are often used in conjunction or in place of one another.
Automated testing is such a time saver and confidence booster when you are developing a product that you'll ship multiple versions of. With automated tests, you know that you haven't broken anything between versions. This is especially helpful when your product is something that people can write add-ons for: you don't want to break their add-ons between versions.
With TDD, you get a good suite of tests as you develop. Without TDD writing those tests is much more difficult.
Michael Feathers has an insightful blog post about this titled The Flawed Theory Behind Unit Testing. Seriously, go read it. The punch line is
All of these techniques have been shown to increase quality. And, if we look closely we can see why: all of them force us to reflect on our code.
but you should read the full post for the context.
Automated testing keeps humans from doing a machine's job.
Test-driven development maximizes the amount of automated testing.
Beyond a certain point, of course, a human is still required. You reach diminishing returns when you try to apply TDD beyond that point.
