I know some research on TDD has been done at North Carolina State University. They have published a paper called 'An Initial Investigation of Test Driven Development in Industry'. Other publications by NCSU can be found here.
Can anybody point me to other good publications on this topic?
On the Effectiveness of the Test-First Approach to Programming, by Hakan Erdogmus, Maurizio Morisio, and Marco Torchiano.
Despite the name it covers TDD:
Abstract:
Test-Driven Development (TDD) is based on formalizing a piece of functionality as a test, implementing the functionality such that the test passes, and iterating the process. This paper describes a controlled experiment for evaluating an important aspect of TDD: in TDD, programmers write functional tests before the corresponding implementation code. The experiment was conducted with undergraduate students. While the experiment group applied a test-first strategy, the control group applied a more conventional development technique, writing tests after the implementation. Both groups followed an incremental process, adding new features one at a time and regression testing them.
We found that test-first students on average wrote more tests and, in turn, students who wrote more tests tended to be more productive. We also observed that the minimum quality increased linearly with the number of programmer tests, independent of the development strategy employed.
The ACM Digital Library has quite a few papers on TDD. Simply search for 'Test Driven Development'.
The top results from a Google search for 'test driven development academic research':
Test-Driven Development: Concepts, Taxonomy, and Future Direction in the IEEE Computer Society.
Software Architecture Improvement through TDD at the ACM
As a TDD practitioner myself, I have launched a new site, WeDoTDD.com, that lists exactly that: companies practicing TDD and the stories behind how they practice it!
I understand that TDD has many advantages (some of them are below). However, I am not sure how it drives the design.
Serves as documentation
Writing tests before the actual code helps maximize test coverage
Helps determine input value boundaries
Usually when we start implementing a new piece of functionality, we already have a rough idea of the design, and we start with a TDD implementation of a class that is used by other classes as per that design. This understanding of mine seems to conflict with the statement "TDD drives the design".
Please help me understand this by an example.
Most people think that Test-Driven Development is a tool for writing code with fewer bugs. But in reality, that is a by-product of TDD. TDD is more of a tool for code design.
Overall, TDD helps in developing quality code in the following ways:
It makes you think about your code design and requirements at every stage, thereby ensuring that you are actually implementing what is required.
You are forced to write testable code, thereby ensuring that your code has loose coupling and high cohesion.
If your code is getting difficult to test, it usually signifies that there is some issue with your design (your code is too tightly coupled or not isolated enough).
That said, I tend to disagree with people who think that following TDD blindly will always leave you with a good code design (because that depends more on you and your knowledge of software design), but I believe there is a good chance it will.
TDD doesn't drive design, but it's an excellent feedback mechanism, which enables you to get rapid feedback on your design ideas.
It also has the beneficial side-effect that it leaves behind a nice regression test suite.
The most basic idea behind this is that when you want to test something, you want to keep the tests simple and focused. This in turn forces you to start with the smallest bits of your problem, driving you towards the Single Responsibility Principle.
Another thing, for larger problems, is that you are forced (because you want your tests to be focused) to use mocks and stubs. This drives your design towards the Dependency Inversion Principle, which in turn makes your code loosely coupled.
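As a sketch of how the pressure for focused tests pushes toward dependency injection, here is a minimal Ruby example (all names - ReportSender, StubMailer, Report - are invented for illustration): the class under test receives its collaborator instead of constructing one, so a hand-rolled stub can stand in.

```ruby
# ReportSender depends on an abstraction passed in, not a concrete SMTP client;
# that is exactly what keeping the test focused pushes the design toward.
class ReportSender
  def initialize(mailer)
    @mailer = mailer
  end

  def send_report(report)
    @mailer.deliver(to: report.owner, body: report.to_s)
  end
end

# A minimal hand-rolled stub that just records what it was asked to do.
class StubMailer
  attr_reader :deliveries

  def initialize
    @deliveries = []
  end

  def deliver(to:, body:)
    @deliveries << { to: to, body: body }
  end
end

Report = Struct.new(:owner, :text) do
  def to_s
    text
  end
end

mailer = StubMailer.new
ReportSender.new(mailer).send_report(Report.new("alice", "Q3 numbers"))
puts mailer.deliveries.first[:to]  # => "alice"
```

The test never touches a real mail server, and the design ends up loosely coupled as a side effect of wanting the test to stay small.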
TDD in general
The goal of Test-Driven Development (TDD) is to create software components that conform precisely to a set of functional specifications. The task of the tests is to validate conformity and to identify deviations from these specifications.
Design Guidance
By writing tests first, you're forced to make up your mind about what your software component should do. Thinking about meaningful test cases first will guide your design, as you're forced to find answers to a set of questions: How should it process inputs? To which components does it interface? What are the outputs I'm expecting? ...
TDD does not replace software design. It sharpens your design!
In essence, TDD will help you to sharpen your design, but one thing is crucial: you need to find reasonable tests. And trying to replace functional specifications - call them requirements documents - with tests alone is rather dangerous. Why?
Unit tests as used in TDD can only guarantee that your components exhibit the expected behavior for the given test data, not for all possible data! As Dijkstra said in the 1960s: "Testing shows the presence, not the absence of bugs".
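A tiny illustration of that limit (the example is mine, not Dijkstra's): the function below passes the only example-based test written for it, yet is wrong for inputs the test never exercises.

```ruby
# Hypothetical buggy absolute-value function: wrong for negative numbers,
# but the single "programmer test" below never notices.
def buggy_abs(x)
  x > 0 ? x : 0
end

raise "test failed" unless buggy_abs(5) == 5  # the only test we wrote: passes
puts buggy_abs(-3)  # => 0, although abs(-3) should be 3
```

The test suite is green, the code is wrong: the tests showed the presence of correct behavior for their data, not the absence of bugs.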
> Usually when we start implementing a new piece of functionality, we already have a rough idea of the design.
That's the core of your question. If you have just a rough idea of the design, you had better spend more time at the drawing board, asking yourself questions like: What are the individual tasks my software should carry out? How can I split the general task into subtasks? How can I map these subtasks to components? Which data needs to be passed among them?
At that time, you might consider doing TDD. But without thinking about a design or software architecture first, you'll end up with a "spaghetti system" that is hard to understand and hard to maintain later on.
Great observation by Mark Seemann. The principle that pushes me to value TDD is fast feedback (and so fast learning). You can't substitute TDD for good design techniques: learn good design principles and use TDD to provide fast feedback on your work.
I find that when using TDD, most of the deeper thinking about my design happens during the refactor step; the tests then provide a space for that step and a safety net for change, but they don't automatically create good design.
TDD allows me to hack out some working code and use the learning I get while doing that to iterate towards a better design. Of course this can be done without TDD too, but the tests help by providing that safety net.
I'm writing a short paper to expound the benefits of unit testing and TDD. I've included a short section at the end entitled "Beyond TDD" in which I'd hoped to cover a few different approaches based on TDD, BDD and ATDD in particular.
I'm sort of familiar with BDD (I've played with SpecFlow) but after reading up on ATDD, it sounds very similar. Are BDD and ATDD just two names for what is essentially the same process: document the behaviours in a 'ubiquitous language', generate an automated acceptance test suite, then go about making the acceptance tests pass?
While I agree generally with gishu's post, there are a couple of areas in which I disagree. In the IMHO section, he presents BDD specification as the user story specification developed by Rachel Davies et al: As a... I want... so that.
The BDD specification is Given... When... Then... as in
Given that the user is logged in, when the user clicks on x, then we should see Y.
This is about conditions, actions, and expectations, and is core to BDD.
ATDD is, as gishu suggests, the practice of driving development through the use of acceptance test specifications, implemented as executable acceptance criteria. The specification, in the BDD form, is neither required nor "best practice." In practice, however, it is effective in focusing the thinking and language on how to verify that the work has been done satisfactorily and meets requirements.
Note that BDD is not particularly based on TDD. ATDD is loosely based on TDD in that tests are written before development begins. Beyond that, it is not focused on developer work, but on the overall direction and validation of the project. ATDD meshes well with Story Mapping, in that it plays well during the discovery phase, when higher-level requirements are being written and it's important to know "how will we know when it has been done properly?"
BDD (Dan North, Dave Astels, Dave Chelimsky, et al.) is a movement to make the whole delivery process agile.
That said, teams doing BDD would be employing the practice of ATDD - i.e. the process of starting with executable specifications of acceptance criteria. An effective graphic to put the point across is where ATDD wraps the inner cycle of TDD.
ATDD is just the practice of starting with executable acceptance criteria before development and using them to shape the design of the underlying code base (much like TDD, but at a chunkier level).
What follows is totally an opinion and may not be entirely accurate:
You could be doing ATDD but still not be doing BDD:
e.g. I could be writing automated acceptance tests which are not readable and which do not convey intent. I could be writing a comprehensive suite of automated 'regression' tests which do not tell me what the system does or how it works.
BDD stresses language and communication strongly, e.g. when specifying behavior. Instead of saying
testXDoesY
BDD would specify it as
As a StakeHolder, X should do Y so that I can Z.
So to close, I think the major difference (which may occur but doesn't have to) is that ATDD could turn into a comprehensive automated suite that just acts as a target for active development plus regression. BDD would implore you to move the needle further: towards a shared language between problem and solution domains, plus living documentation via executable examples that enables future constructive conversation.
ATDD is often used synonymously with Behavior-Driven Development (BDD), Story-Test-Driven Development (SDD) and Specification by Example. The main distinction of ATDD compared to other agile approaches is its focus on making developers, testers, business people, product owners and other stakeholders collaborate and come up with a clear understanding of what needs to be implemented.
I personally like the concept of ATDD as it aligns with the "Shift Left" paradigm, where development and testing should start as early as possible in the SDLC. It helps bring more visibility into automation, as we start writing automated tests right from the beginning of the SDLC, and in turn helps increase collaboration within the team.
Remember, ATDD is not a one-size-fits-all solution. It is one of several agile approaches. There are various other ways to help improve processes in teams, but I specifically found this approach to focus on better acceptance tests and, most importantly, to emphasize collaboration, which is an integral part of this approach.
I would say nothing much. My first assumption would be that ATDD, BDD, specification by example, agile acceptance testing, etc. all mean the same thing. If someone uses these terms to mean separate things, they had better explain the difference in that context.
The 'London Style' of TDD suggests focusing on object roles, responsibilities and collaborations, using mock objects to drive out layers of large-scale component design in a top-down process (as opposed to the 'Classic Style', which focuses on algorithmic refinement of methods).
I understand the basic approach, but am wondering how you reconcile this form of TDD (which still emphasises writing a test first) with the more natural way in which we design components, i.e. sketching logical designs of related classes and defining their roles and responsibilities on paper / in our heads well before we start writing any code.
Appreciate some real-world advice.
I don't see a need to reconcile TDD (in any form) with "natural" component design. Testing implies that you have an idea of what you test, at the very least you have an interface to some level of precision. Starting out at a coarse-grained component definition seems very "natural" to me.
:)
'London Style' is basically good OOP combined with outside-in (acceptance-test-driven) TDD (I am assuming you mean an approach similar to the GOOS book).
That is the way it "should" be done, although the "classical" people should have been more explicit about it. I'm not sure there is such a classification among practitioners (although there are people who are faster with TDD and people who struggle with it).
State-based and interaction-based are styles, not one-size-fits-all approaches. You need to choose the style for the task at hand.
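To illustrate the two styles on a toy example (the Wallet class and its log collaborator are invented for this sketch), here is the same deposit behaviour checked first state-based and then interaction-based:

```ruby
# A toy class with both observable state (balance) and a collaborator (audit log).
class Wallet
  attr_reader :balance

  def initialize(audit_log)
    @balance = 0
    @audit_log = audit_log
  end

  def deposit(amount)
    @balance += amount
    @audit_log.record("deposit #{amount}")
  end
end

# State-based ("classic") check: act, then inspect the resulting state.
class NullLog
  def record(_msg); end
end

wallet = Wallet.new(NullLog.new)
wallet.deposit(10)
raise "state check failed" unless wallet.balance == 10

# Interaction-based ("London") check: verify the collaboration happened,
# using a hand-rolled spy in place of a mocking framework.
class SpyLog
  attr_reader :messages

  def initialize
    @messages = []
  end

  def record(msg)
    @messages << msg
  end
end

spy = SpyLog.new
Wallet.new(spy).deposit(10)
raise "interaction check failed" unless spy.messages == ["deposit 10"]
```

Neither check is "the" right one: the state-based test pins down the arithmetic, the interaction-based test pins down the protocol with the collaborator.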
The problem with doing "TDD in a corner" is that you may end up with well-tested code that works but still does the wrong thing from the customer's perspective.
Evolution has now landed us in an ATDD cycle: TDD done at the customer/acceptance level, which drives an inner TDD cycle for developers to make the acceptance test pass.
On the "reconciliation":
I've found "listening to the tests" quite enlightening once you have tuned your ears: let the tests drive the design.
This is also aligned to the BDD folks. I recommend picking up the RSpec book which has a walkthrough in the first section of the book.
Lately I've been very much into agile methodologies. Some papers I've read on Martin Fowler's website looked thoroughly unflawed to me, to say the least, and I was wondering which agile practices are best suited to game development, particularly for small-budget, small-sized and inexperienced team projects (bold because it's really important).
Concepts such as refactoring look like they could absolutely work for a small and inexperienced team. The idea of "embracing change" also fits inexperienced teams, whose ideas twist and turn all the time. But things like TDD are rather problematic.
It's hard and suboptimal to test a class that interacts with Direct3D, for example. It doesn't make much sense.
I'd be most deeply grateful if you could list a bunch of practices that made sense for game development. Ones that aid in the organization of artistic production are a plus. Citations of real-world cases are another plus.
Thanks in advance.
edit --
Plus, my team is composed of 3 people: one programmer, one graphics designer, and one programmer/graphics designer combo. We do not have a client, so we must make all decisions alone.
I've read in an article by Fowler that agile somewhat depends on developer-client interaction, but he also mentioned that clients unwilling to adhere to agile could still be well served by agile development (the article was called The New Methodology). How does that apply to my case?
Conclusions --
I think questions at StackOverflow can help others too, so I'll try to summarize my thoughts on the subject here.
Through the use of mock objects, even hard-to-test elements such as graphics interfaces and their relationship to client classes might be manageable.
For example, instead of letting every client of the interface truly test its use under many conditions (a fullscreen/windowed mode switch, for example, affects almost everything in the game), clients could be tested against a mock that, as far as they can tell, behaves the same as the original class, with an additional test of that mock's fidelity to the original object.
That way, the slow part (actually opening the window and so on) is done only once, when checking the mock's fidelity to the implementation, and everything else runs smoothly against the mock. [thanks to Cameron]
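A minimal Ruby sketch of that arrangement, with invented names (FakeScreen stands in for the real windowing/graphics layer; a single slow contract test would separately check the fake's fidelity against the real implementation):

```ruby
# Game code talks only to a narrow screen interface, so fast tests can run
# against a fake without ever opening a real window.
class FakeScreen
  attr_reader :mode

  def initialize
    @mode = :windowed
  end

  def toggle_fullscreen
    @mode = (@mode == :windowed ? :fullscreen : :windowed)
  end
end

class Game
  def initialize(screen)
    @screen = screen
  end

  def press_alt_enter
    @screen.toggle_fullscreen
  end
end

screen = FakeScreen.new
game = Game.new(screen)
game.press_alt_enter
puts screen.mode  # => fullscreen, and no real window was ever created
```

The real screen class would implement the same `toggle_fullscreen`/`mode` interface, and one slow test would exercise both implementations side by side to keep the fake honest.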
A BDD mindset helps ease the paranoid quest for exhaustive unit testing, "replacing" testing with specification of actual behaviours rather than of squeezed-out units, which in many cases are better left untested (or only indirectly tested, if you prefer putting it that way) to avoid too much one-to-one test-vs-unit (class, method, variable, etc.) parity, which adds to test (now "specification") fragility. [thanks to Kludge]
I would recommend trying out VersionOne (www.versionone.com). VersionOne is free for a small team working on a single project, and has easy-to-use tools for agile status tracking and project planning. Their site also has links to explanations of the various Agile development methodologies.
There are different flavors of Agile development; I'd recommend taking a look at the Extreme Programming (XP) model, as a good example:
http://www.extremeprogramming.org/map/project.html
Agile development is as much concerned with project planning and requirements tracking as it is with the actual programming practice.
The idea is to make sure you record game features that need to be developed (as "user stories"), give a (very rough) estimate of how long each will take, and figure out which ones are important. Box out a small amount of time for each release, and schedule the most-important, least-expensive features that you can release in that time. This system ensures steady forward progress, protects you against constant, unpredictable priority changes, and makes sure you don't develop a huge monolithic code base that doesn't work.
With regard to Test-Driven Development, I think Cameron's and Andrew Grimm's comments are both on point. You can do a lot more unit testing if you abstract away things like graphics API calls.
You definitely want to look at Extreme Programing (XP), take a look at Kent Beck's Extreme Programming Explained: Embrace Change, 2nd Edition
The most useful thing you can do, though, is to do some research on Behaviour-Driven Development, which is basically Test-Driven Development done right. It takes the focus off tests and puts it back onto specifications. You don't worry about what classes do so much as what behavior your program exhibits.
So saying you aren't going to use TDD, or BDD, is just plain crazy talk. One of the core concepts of Agile development is developing your software from your tests/specs. You have to get out of the mindset that tests/specs are there to test your classes. That's not really what they're for. They are for describing the behaviors your application should exhibit, and then using each test/spec to write that behavior into your application.
You might write something like this
describe "Startup" do
  it "should display a welcome screen" do
    game = Game.new
    game.start
    game.output_buffer.should match("Welcome")
  end
end
Then you go and write the code to make it happen. You describe the code you want, then you go write it. It allows you to write your code in little, bite-sized chunks, and best of all, when someone else picks up your code they can run the tests and see that everything works. When they want to add new functionality they use the same process, so when you go back to the code you can have faith that their code works too.
Agile/Lean methods such as Scrum, XP and Kanban have been successfully applied to game development since 2003.
There are a number of blogs including:
http://blog.agilegamedevelopment.com/
and a book as well. See the book link in the blog above.
If you have good model view controller (MVC) separation, you should be able to test "business logic" without graphics. For example, testing that a given weapon produces the correct amount of damage, can't shoot through walls, and has a given area of effect.
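For instance, a weapon-damage rule of the kind described can be unit tested with no graphics involved at all. The Weapon class and its linear-falloff formula below are invented for illustration:

```ruby
# Pure "business logic": damage falls off linearly with distance and is
# zero beyond the weapon's range (no shooting through walls at range).
class Weapon
  def initialize(base_damage:, range:)
    @base_damage = base_damage
    @range = range
  end

  def damage_at(distance)
    return 0 if distance > @range
    (@base_damage * (1.0 - distance.to_f / @range)).round
  end
end

pistol = Weapon.new(base_damage: 40, range: 10)
raise unless pistol.damage_at(0)  == 40  # point blank: full damage
raise unless pistol.damage_at(5)  == 20  # halfway: half damage
raise unless pistol.damage_at(11) == 0   # out of range: no damage
```

These tests run in milliseconds with no renderer, window, or asset loading, which is exactly what MVC separation buys you.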
Agile is not at all compatible with GameDev. They are two completely opposite methodologies.
Agile is about development being flexible to changing business requirements and breaking projects down into clear and manageable deadlines and work units. GameDev is about regularly and dramatically changing business requirements without caring about the impact on development, and breaking down development teams through unmanageable deadlines and volumes of work.
I don't believe there's actually any element of Agile that is incompatible with game development. I think you're mistaken about the difficulty of testing graphics functions: running a graphics function generates an image, and it's easy to compare images to manually approved "golden masters" (see Approval Tests for one way to go about this).
Refactoring is a fine technique, but it should be done with a safety net of unit tests. The best way to get unit tests is to develop your code test-first, so TDD is another Agile practice you should be following. And it's all better - refactoring, testing, and the rest - if you pair. You have enough people on the team to be pair programming. Try it and see how much faster you get running tested features!
I need to give a short presentation (2-4 hours) on Test-Driven Development and need to come up with a small class that I can build using the TDD methodology. The class has to be relatively small, but "sell" the concept of TDD.
If anyone has read James Newkirk's book, Test-Driven Development in Microsoft .NET, the Stack example is perfect. It's a small class, has a manageable list of tests/requirements, and the creation process sells TDD (IMHO).
I don't want to use the Stack example or similar data structures (queue, list, etc.) for fear of impinging on Newkirk's work.
So, I’m looking for a few good ideas for a sample class.
Thanks.
How about using the first section of Kent Beck's Money example? It starts out very simply, but when you arrive at the addition of two different currencies, TDD suddenly shows you the falsity of up-front design, or YAGNI ("you aren't gonna need it").
Another good example is Uncle Bob's bowling score TDD example. I think this is a good example of how a TDD narrative brings you to a clean solution that would have been unapproachable via an up-front design.
To make it a really exciting presentation, you could challenge the audience up front to design the two scenarios using whatever methods they find appropriate, and then show the TDD way of designing them.
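As a sketch of how the bowling kata's opening moves tend to look (the kata itself is language-agnostic; the Ruby class and its deliberately naive scoring are my own rendering of the first steps): gutter game first, then all ones, with just enough code to pass.

```ruby
# First TDD steps of the bowling kata: score is naively the sum of pins,
# which is all the first two tests require. Strikes and spares come later
# and force the design to grow.
class BowlingGame
  def initialize
    @rolls = []
  end

  def roll(pins)
    @rolls << pins
  end

  def score
    @rolls.sum
  end
end

game = BowlingGame.new
20.times { game.roll(0) }
raise "gutter game should score 0" unless game.score == 0

game = BowlingGame.new
20.times { game.roll(1) }
raise "all ones should score 20" unless game.score == 20
```

The narrative value for a presentation is in what comes next: adding the spare test breaks the naive sum and drives out the frame-by-frame design.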
The real WTF moment for me with TDD was when Beck removed the two subclasses of Money and the tests still passed. This is not a trivial action; the man deleted two classes! The confidence to do something like that can be found in only two ways:
1) Gathering all the senior players in the code base and running through scenarios, followed by extensive follow-up to confirm it works
2) TDD
=D
If you have time for it, I would pick an example with an external dependency of some sort that is going to get abstracted in the test. Either a database, calls to a GUI, calls to a remote system, etc.
The reason is that one objection to TDD is that the examples seem too self-contained. "Sure, when everything is a self-contained unit you can unit test, but when I have 15 systems to integrate, what is the point?" kind of thing.
I would also show at least one example at the end (see Michael Feathers' book Working Effectively with Legacy Code for how-tos) of migrating an existing class to bring it under TDD. Don't dwell on it as an example, but odds are your audience will be thinking about how to migrate the class they wrote that morning; no reason to let that fester as an "unmentionable".
TDD Problems has a list of problems, ranging from simple to less simple.
Some come with a list of tests to start from, with no solutions given.
If you have a perfect example from a book, then you should use it and promote the book. No author would object to that.
Aside from that, I attended a TDD presentation a few years ago where the example was a simple calculator and it worked beautifully.
Three I like, in roughly increasing order of difficulty:
Range (of integers; implement isEmpty(), contains(), intersects(), length())
Natural Sort
Snake
If I had half an hour, I'd do Range; 90 minutes, probably Natural Sort; more: Snake. Depends on the audience, though.
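As a sketch of where the Range exercise might land after the first few red-green cycles, here is a minimal Ruby version of the listed operations (method names adapted to Ruby conventions; the implementation choices are mine):

```ruby
# An inclusive integer range with the four operations from the kata listing:
# isEmpty -> empty?, contains -> contains?, intersects -> intersects?, length.
class IntRange
  attr_reader :low, :high

  def initialize(low, high)
    @low = low
    @high = high
  end

  def empty?
    @low > @high
  end

  def length
    empty? ? 0 : @high - @low + 1
  end

  def contains?(n)
    !empty? && n.between?(@low, @high)
  end

  def intersects?(other)
    !empty? && !other.empty? && @low <= other.high && other.low <= @high
  end
end

r = IntRange.new(1, 5)
raise unless r.contains?(3)
raise unless r.length == 5
raise unless r.intersects?(IntRange.new(4, 9))
raise if r.intersects?(IntRange.new(6, 9))
raise unless IntRange.new(2, 1).empty?
```

Each operation makes a nice single red-green-refactor step, which is why the kata fits a half-hour slot.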
I would try to find something small from a well-known domain. I recently gave a presentation on BDD/TDD based on ASP.NET MVC. It included one controller, one action and a view model. That also gave me the opportunity to introduce a dependency container and a mocking framework.
How about a simple math class, with addition, subtraction, multiplication, and such?
Another classic example from the TDD/Extreme/Agile community is the Bowling Game example; I seem to recall it has been used by both Beck and Martin, as well as multiple times at xprogramming.com for examples and explorations of different techniques within TDD.
Go out on a limb and take requests from the audience. :)
If the goal is selling TDD, you also want to show a small refactoring on top of a large test base. It is easy to make it work with small samples; most developers buy into that now. There is much more doubt about scalability. An advanced topic would then be how to handle a large base of legacy code (code without unit tests).
A simple card game would be nice, especially as you can provide a visual representation of the result.
And I assume you are going to use a coding dojo as the presentation form, aren't you? No fancy PowerPoint. If the audience consists of non-programmers, use the Excel sample.
Essential Skills for Agile Development by Ka Iok Tong. This book is about agile development but contains several chapters about testing, especially TDD. The author explains TDD by coding from a requirement, also noting down his thoughts on how to solve problems in TDD. You may find this book here. To learn more about the concepts and modern tools you may go here.
I'd suggest you buy yourself the book Test-Driven Development: By Example by Kent Beck.
The book focuses almost completely on building one class through TDD.
Roman numerals. Count non-comment lines of source code. Towers of Hanoi. There's plenty of ideas out there.
If the intended audience is new to TDD, then I would recommend using the examples below. They really give you a good understanding of TDD concepts and implementation.
Bank Account
Bowling Game