What is a good sample class to demonstrate TDD?

I need to give a short presentation (2-4 hours) on Test-Driven Development and need to come up with a small class that I can build using the TDD methodology. The class has to be relatively small, but "sell" the concept of TDD.
If anyone has read James Newkirk's book, Test-Driven Development in Microsoft .NET, the Stack example is perfect. It's a small class, has a manageable list of tests/requirements, and the creation process sells TDD (IMHO).
I don't want to use the Stack example or similar data structures (queues, lists, etc.) for fear of impinging on Newkirk's work.
So, I’m looking for a few good ideas for a sample class.
Thanks.

How about using the first section of Kent Beck's Money example? It starts out very simply, but when you arrive at adding two different currencies, TDD suddenly shows you the falsity of up-front design, or YAGNI ("you aren't gonna need it").
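If it helps to see the opening move, here is a from-memory sketch of roughly how the book's first test and first cut of Dollar look (names follow Beck's, but treat the details as approximate, not verbatim from the book):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class MoneyTest {
        // Red: this doesn't even compile until Dollar exists.
        @Test
        public void testMultiplication() {
            Dollar five = new Dollar(5);
            Dollar product = five.times(2);
            assertEquals(10, product.amount);
        }
    }

    // Green: the least code that passes. The design pressure comes later,
    // when multi-currency addition forces the Money hierarchy to emerge.
    class Dollar {
        int amount;
        Dollar(int amount) { this.amount = amount; }
        Dollar times(int multiplier) { return new Dollar(amount * multiplier); }
    }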
Another good example is Uncle Bob's bowling score TDD example. I think this is a good example of how a TDD narrative brings you to a clean solution that an up-front design would likely never have reached.
To make it a really exciting presentation, up-front you could challenge the audience to design the two scenarios using whatever methods they find appropriate. You then would show the TDD way of designing them.
The real WTF moment for me with TDD was when Beck removed the two subclasses of Money, and the tests still passed. This is not a trivial action; the man deleted two classes! The confidence to do something like that can come from only two things:
1) gathering all the senior players in a code base and running through scenarios, followed by extensive follow-up to confirm it works
2) TDD
=D

If you have time for it, I would pick an example with an external dependency of some sort that gets abstracted away in the test: a database, calls to a GUI, calls to a remote system, etc.
The reason is that one of the blocks to TDD is that the example seems too self-contained. "Sure, when everything is a self-contained unit you can unit test, but when I have 15 systems to integrate, what is the point?" kind of thing.
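To make that concrete, here is a minimal sketch of the kind of seam you'd demonstrate. All the names are hypothetical; the point is that the test never touches the real external system:

    import org.junit.Test;
    import static org.junit.Assert.assertFalse;

    // Hypothetical seam over a remote system.
    interface PaymentGateway {
        boolean charge(String account, int cents);
    }

    class OrderProcessor {
        private final PaymentGateway gateway;
        OrderProcessor(PaymentGateway gateway) { this.gateway = gateway; }
        boolean checkout(String account, int cents) {
            return gateway.charge(account, cents); // a real one would do more
        }
    }

    public class OrderProcessorTest {
        @Test
        public void declinedChargeFailsCheckout() {
            // Stub stands in for the remote system: always declines.
            PaymentGateway alwaysDeclines = (account, cents) -> false;
            assertFalse(new OrderProcessor(alwaysDeclines).checkout("acct-1", 500));
        }
    }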
I would also show at least one example at the end (look at Michael Feathers's book Working Effectively with Legacy Code for how-tos) of migrating an existing class to bring it under TDD. Don't dwell on it as an example, but odds are your audience will be thinking about how to migrate the class they wrote that morning; no reason to let that fester as an "unmentionable."

TDD Problems has a list of problems, ranging from simple to less simple.
Some come with a list of tests to start from, but no solution yet.

If you have a perfect example from a book, then you should use it and promote the book. No author would object to that.
Aside from that, I attended a TDD presentation a few years ago where the example was a simple calculator and it worked beautifully.

Three I like, in roughly increasing order of difficulty:
Range (of integers; implement isEmpty(), contains(), intersects(), length())
Natural Sort
Snake
If I had half an hour, I'd do Range; 90 minutes, probably Natural Sort; more: Snake. Depends on the audience, though.
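For what it's worth, here is a sketch of where a Range session might end up, assuming a half-open [lo, hi) convention (the kata leaves that choice to you, and the names are mine):

    import org.junit.Test;
    import static org.junit.Assert.*;

    public class RangeTest {
        @Test
        public void emptyRangeContainsNothing() {
            Range r = new Range(5, 5);
            assertTrue(r.isEmpty());
            assertFalse(r.contains(5));
        }

        @Test
        public void lengthAndIntersection() {
            assertEquals(3, new Range(2, 5).length());
            assertTrue(new Range(2, 5).intersects(new Range(4, 9)));
            assertFalse(new Range(2, 5).intersects(new Range(5, 9)));
        }
    }

    // Half-open range [lo, hi); an empty range intersects nothing.
    class Range {
        private final int lo, hi;
        Range(int lo, int hi) { this.lo = lo; this.hi = hi; }
        boolean isEmpty() { return hi <= lo; }
        boolean contains(int n) { return n >= lo && n < hi; }
        boolean intersects(Range o) {
            return !isEmpty() && !o.isEmpty() && lo < o.hi && o.lo < hi;
        }
        int length() { return Math.max(0, hi - lo); }
    }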

I would try to find something small from a well-known domain. I recently gave a presentation on BDD/TDD based on ASP.NET MVC. That included one controller, one action, and a view model. That also gave me the opportunity to introduce a dependency container and a mocking framework.

How about a simple math class, with addition, subtraction, multiplication, and such?

Another classic example from the TDD / Extreme / Agile community is the Bowling Game; I seem to recall it has been used by both Beck and Martin, as well as multiple times at xprogramming.com for examples and explorations of different techniques within TDD.
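For reference, a from-memory sketch of roughly where the kata converges (not Martin's verbatim code):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class BowlingGameTest {
        private int scoreOf(int... pins) {
            Game g = new Game();
            for (int p : pins) g.roll(p);
            return g.score();
        }

        @Test
        public void gutterGameScoresZero() {
            assertEquals(0, scoreOf(new int[20]));
        }

        @Test
        public void spareAddsNextRollAsBonus() {
            // 5 + 5 spare, then 3: frame one scores 13, total 16.
            int[] pins = new int[20];
            pins[0] = 5; pins[1] = 5; pins[2] = 3;
            assertEquals(16, scoreOf(pins));
        }
    }

    class Game {
        private final int[] rolls = new int[21];
        private int current = 0;

        void roll(int pins) { rolls[current++] = pins; }

        int score() {
            int score = 0, i = 0;  // i indexes a frame's first roll
            for (int frame = 0; frame < 10; frame++) {
                if (rolls[i] == 10) {                       // strike: 10 + next two rolls
                    score += 10 + rolls[i + 1] + rolls[i + 2];
                    i += 1;
                } else if (rolls[i] + rolls[i + 1] == 10) { // spare: 10 + next roll
                    score += 10 + rolls[i + 2];
                    i += 2;
                } else {
                    score += rolls[i] + rolls[i + 1];
                    i += 2;
                }
            }
            return score;
        }
    }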

Go out on a limb and take requests from the audience. :)

If the goal is selling TDD, you also want to show a small refactoring backed by a large test base. It is easy to make TDD work with small samples; most developers buy into that now. There is much more doubt about scalability. An advanced topic would then be how to handle a large base of legacy code (no unit tests).
A simple card game would be nice, especially as you can provide some visual representation of the result.
And I assume you are going to use a coding dojo as the presentation format, aren't you? No fancy PowerPoint. If the audience is non-programmers, use the Excel example.

Essential Skills for Agile Development by Ka Iok Tong. This book is about Agile, but it contains several chapters about testing, especially TDD. The author explains TDD by coding from requirements, and also notes down his thoughts on how to solve problems with TDD.

I'd suggest you buy yourself the book Test-Driven Development: By Example by Kent Beck.
The book almost completely focuses on building one class through TDD.

Roman numerals. Count non-comment lines of source code. Towers of Hanoi. There are plenty of ideas out there.
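Roman numerals in particular opens nicely. A sketch of an early state (my names, not from any particular write-up), where the conversion table only grows when a new test forces it:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class RomanNumeralTest {
        @Test
        public void convertsSingleAndSubtractiveForms() {
            assertEquals("I", RomanNumeral.of(1));
            assertEquals("IV", RomanNumeral.of(4));
            assertEquals("XIX", RomanNumeral.of(19));
        }
    }

    class RomanNumeral {
        // Table-driven greedy conversion; extend the tables as tests demand
        // (L, C, D, M arrive with later tests).
        private static final int[] VALUES = {10, 9, 5, 4, 1};
        private static final String[] GLYPHS = {"X", "IX", "V", "IV", "I"};

        static String of(int n) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < VALUES.length; i++)
                while (n >= VALUES[i]) { sb.append(GLYPHS[i]); n -= VALUES[i]; }
            return sb.toString();
        }
    }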

If the intended audience is new to TDD, then I would recommend the examples below. They really give you a good understanding of TDD concepts and implementation.
Bank Account
Bowling Game

Related

Does TDD preclude designing first?

I've been reading about TDD lately, and it is advocated because it supposedly results in code that is more testable and less coupled (and a bunch of other reasons).
I haven't been able to find much in the way of practical examples, except for a Roman numeral conversion and a number-to-English converter.
Working through these two examples, I observed the red-green-refactor cycles typical of TDD, as well as the application of the rules of TDD. However, this seemed like a big waste of time when normally I would observe a pattern and implement it in code, and then write tests for it after. Or possibly write a stub for the code, write the unit tests, and then write the implementation - which might arguably be TDD - but not this continuous case-by-case refactoring.
TDD appears to incite developers to jump right into the code and build their implementation inductively rather than designing a proper architecture. My opinion so far is that the benefits of TDD can be achieved by a proper architectural design, although admittedly not everyone can do this reasonably well.
So here I have two questions:
Am I right in understanding that using TDD pretty much doesn't allow you to design first (see the rules of TDD)?
Is there anything that TDD gives you that you can't get from doing proper design before you start coding?
Well, I was in your shoes some time ago and had the same questions. Since then I have done quite a bit of reading about TDD and decided to mess with it a little.
I can summarize my experience about TDD in these points:
TDD is unit testing done right, ATDD/BDD is TDD done right.
Whether you design beforehand or not is totally up to you. Just make sure you don't do BDUF (Big Design Up Front). Believe me, you will end up changing most of it midway, because you can never fully understand the requirements until your hands get dirty.
OTOH, you can do enough design to get you started. Class diagrams, sequence diagrams, domain models, actors and collaborators are perfectly fine as long as you don't get hung up in the design phase trying to figure everything out.
Some people don't do any design at all. They just let the code talk and concentrate on refactoring.
IMHO, balance your approach. Do some design till you get the hang of it, then start testing. When you reach a dead end, go back to the whiteboard.
One more thing: some things can't be solved by TDD, like figuring out an algorithm. There is a very interesting post showing that some things just need to be designed first.
Unit testing is hard when you already have the code. TDD forces you to think from your API users' perspective. That way you can decide early on whether the public interface of your API is usable. If you decide to do unit testing after implementing everything, you will find it tedious, and most probably it will cover only some cases. I know people who will write only passing test cases, just to get the feature done. I mean, who wants to break his own code after all that work?
TDD breaks this mentality. Tests are first-class citizens. You aren't allowed to skip tests. You aren't allowed to postpone some tests till the next release because we don't have enough time.
Finally, to answer your question whether there is anything that TDD gives you that you can't get from doing proper design before you start coding: I would say commitment.
As long as you're doing TDD, you are committed to applying good OO principles, so that your code is testable.
To answer your questions:
"Test Driven Development" (TDD) is often referred to as "Test Driven Design", in that this practice will result in a good design of the code. When you have written a failing unit test, you are forced into a test driven design approach, so that you can implement just what is needed to make the test pass i.e. you have to consider the design of the code you are writing to make the test pass.
When using a TDD approach a developer will implement the minimum amount of code required to pass the test. Doing proper design beforehand usually results in waste if the requirements change once the project has started.
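A tiny illustration of that "minimum amount of code" discipline, with hypothetical names; this is Beck's fake-it-then-triangulate move, where the first test could be passed with a hard-coded string and only a second test forces the general solution:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class GreeterTest {
        @Test
        public void greetsAda() { assertEquals("Hello, Ada", new Greeter().greet("Ada")); }

        // Triangulation: this second test is what outlaws `return "Hello, Ada";`
        @Test
        public void greetsBob() { assertEquals("Hello, Bob", new Greeter().greet("Bob")); }
    }

    class Greeter {
        // First green was the hard-coded `return "Hello, Ada";`;
        // the second test drove this generalization.
        String greet(String name) { return "Hello, " + name; }
    }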
You say "TDD appears to incite developers to jump right into the code and build their implementation inductively rather than designing a proper architecture" - If you are following an Agile approach to your software development, then you do still need to do some up front architectural investigation (e.g. if you were using the Scrum methodology you would create a development "Spike" at the start of a project) to ascertain what the minimum amount of architecture needed to start the project. At this stage you make decisions based on what you know now e.g. if you had to work with a small dataset you'd choose to use a regular DB, if you have a huge DB you might to choose to use a NoSQL big data DB etc.
However, once you have a general idea of the architecture, you should let the design evolve as the project progresses leaving further architectural decisions as late in the process as possible; Invariably as a project progresses the architecture and requirements will change.
Further more this rather popular post on SO will give you even more reasons why TDD is definetly worth the effort.

TDD and Responsibility Driven Design - how to reconcile test-first with the design process?

The 'London Style' of TDD suggests focusing on object roles, responsibilities and collaborations, using mock objects to drive out layers of large-scale component design in a top-down process (as opposed to the 'Classic Style', which focuses on algorithmic refinement of methods).
I understand the basic approach, but am wondering how you reconcile this form of TDD (which still emphasises writing a test first) with the more natural way in which we design components, i.e. sketching logical designs of related classes and defining their roles and responsibilities on paper / in our heads well before we start writing any code.
Appreciate some real-world advice.
I don't see a need to reconcile TDD (in any form) with "natural" component design. Testing implies that you have an idea of what you are testing; at the very least you have an interface to some level of precision. Starting out at a coarse-grained component definition seems very "natural" to me.
:)
'London Style' is basically good OOP combined with Outside-in (acceptance test) driven TDD (I am assuming you mean an approach similar to the GOOS book).
That is the way it "should" be done, although the "classical" people should have been more explicit about it. I'm not sure there is such a classification among practitioners (although there are people who are faster with TDD and people who struggle with it).
State-based and interaction-based testing are styles, not one-size-fits-all approaches. You need to choose the style for the task at hand.
The problem with doing "TDD in a corner" is that you may end up with well-tested code that works but still does the wrong thing from the customer's perspective.
Evolution has landed us now into an ATDD cycle which is TDD done at the customer/acceptance level which drives an inner TDD cycle for developers to make the acceptance test pass.
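To make the state-based/interaction-based distinction concrete, a sketch with hypothetical classes; the interaction test uses Mockito-style mocks, but a hand-rolled mock works just as well:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;

    interface EventBus { void post(String event); }

    class Checkout {
        private final EventBus bus;
        private int totalCents;
        Checkout(EventBus bus) { this.bus = bus; }
        void addLine(int cents) { totalCents += cents; }
        int total() { return totalCents; }
        void complete() { bus.post("checkout-completed"); }
    }

    public class CheckoutTest {
        // Classic/state-based: assert on the resulting state.
        @Test
        public void totalIsSumOfLines() {
            Checkout c = new Checkout(mock(EventBus.class));
            c.addLine(300);
            c.addLine(200);
            assertEquals(500, c.total());
        }

        // London/interaction-based: assert on the outgoing message,
        // using the mock to drive out the EventBus role.
        @Test
        public void completingPublishesAnEvent() {
            EventBus bus = mock(EventBus.class);
            new Checkout(bus).complete();
            verify(bus).post("checkout-completed");
        }
    }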
On the "reconcilation":
I've found 'listening to the tests' quite enlightening once you have tuned your ears.. let the tests drive the design.
This is also aligned to the BDD folks. I recommend picking up the RSpec book which has a walkthrough in the first section of the book.

Which agile practices are compatible with game development? [closed]

Lately I've been a lot into agile methodologies. Some papers I've read at Martin Fowler's website looked thoroughly unflawed to me, to say the least, and I was wondering which agile practices are most suited to game development, particularly for small-budget, small-sized and inexperienced team projects (bold because it's really important).
Concepts such as refactoring look like they could absolutely combine with a small and inexperienced team. The idea of "embracing the change" also combines with inexperienced teams whose ideas twist and turn all the time. But things like TDD are rather problematic.
It's hard and suboptimal to test a class that interacts with Direct3D, for example. It doesn't make much sense.
I'd be most deeply grateful if you could list a bunch of practices that made sense for game development. Ones that aid in the organization of artistic production are a plus. Citations of real-world cases are another plus.
Thanks in advance.
edit --
Plus, my team is composed of 3 people: one programmer, one graphics designer, and one programmer/graphics designer combo. We do not have a client, so we must make all decisions alone.
I've read in an article by Fowler that agile sort of depends on developer-client interaction, but he also mentioned that clients unwilling to adhere to agile could still be well served by agile development (the article was "The New Methodology"). How does that apply to my case?
Conclusions --
I think questions at StackOverflow can help others too, so I'll try to summarize my thoughts on the subject here.
Through the use of mock objects, even hard-to-test elements such as graphics interfaces and their relationship to client classes might be manageable.
For example, instead of letting every client of the interface truly exercise it under many conditions (fullscreen/windowed mode switching, for example, which affects almost everything in the game), clients can be tested against a mock that, as far as they can tell, behaves the same as the original class, with an additional test of that mock's fidelity to the original object.
That way, the slow part (actually opening the window and so on) is only done once, when checking the mock's fidelity to the implementation, and everything else runs smoothly over the mock. [thanks to Cameron]
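A sketch of that arrangement (the interface and names are hypothetical; the real implementation would wrap Direct3D or whatever API you use):

    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    public class FakeScreenTest {
        // The seam: game logic talks to this role, never to Direct3D directly.
        interface Screen {
            void setFullscreen(boolean on);
            boolean isFullscreen();
        }

        // Fast fake used by every game-logic test; no window ever opens.
        static class FakeScreen implements Screen {
            private boolean fullscreen;
            public void setFullscreen(boolean on) { fullscreen = on; }
            public boolean isFullscreen() { return fullscreen; }
        }

        // The fidelity check: run this same assertion once (slowly) against
        // the real Direct3D-backed Screen, proving the fake matches it.
        @Test
        public void togglingFullscreenIsObservable() {
            Screen s = new FakeScreen();
            s.setFullscreen(true);
            assertTrue(s.isFullscreen());
        }
    }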
A BDD mindset helps ease the paranoid quest for thorough testing of units, "replacing" testing with the specification of actual behaviours. Narrowly squeezed units are in many cases better off left untested (or only indirectly tested, if you prefer putting it that way), to avoid too much one-to-one test-vs-unit (class, method, variable, etc.) parity, which adds to test (now "specification") fragility. [thanks to Kludge]
I would recommend trying out VersionOne (www.versionone.com). VersionOne is free for a small team working on a single project, and has easy-to-use tools for agile status tracking and project planning. Their site also has links to explanations of the various Agile development methodologies.
There are different flavors of Agile development; I'd recommend taking a look at the Extreme Programming (XP) model, as a good example:
http://www.extremeprogramming.org/map/project.html
Agile development is as much concerned with project planning and requirements tracking as it is with the actual programming practice.
The idea is to make sure you record game features that need to be developed (as "user stories"), give a (very rough) estimate of how long each will take, and figure out which ones are important. Box out a small amount of time for each release, and schedule the most-important, least-expensive features that you can release in that time. This system ensures steady forward progress, protects you against constant, unpredictable priority changes, and makes sure you don't develop a huge monolithic code base that doesn't work.
With regard to Test-Driven Development, I think Cameron's and Andrew Grimm's comments are both on point. You can do a lot more unit testing if you abstract away things like graphics API calls.
You definitely want to look at Extreme Programming (XP); take a look at Kent Beck's Extreme Programming Explained: Embrace Change, 2nd Edition.
The most useful thing you can do, though, is some research on Behaviour-Driven Development, which is basically Test-Driven Development done right. It takes the focus off of tests and puts it back onto specifications. You don't worry about what classes do so much as what behavior your program exhibits.
So saying you aren't going to use TDD, or BDD, is just plain crazy talk. One of the core concepts of Agile development is developing your software from your tests/specs. You have to get out of the mindset that tests/specs are testing your classes. That's not really what they're for. They are for describing the behaviors your application should exhibit then using that test/spec to write the behavior into your application.
You might write something like this:

    describe "Startup" do
      it "should display a welcome screen" do
        game = Game.new
        game.start
        game.output_buffer.should match("Welcome")
      end
    end
Then you go write the code to make it happen. You describe the code you want, then you go write it. It allows you to write your code in little, bite-sized chunks, and best of all, when someone else picks up your code they can run the tests and see that everything works. When they want to add new functionality they use the same process, so when you go back to the code you can have faith that their code works too.
Agile/Lean methods such as Scrum, XP and Kanban have been successfully applied to game development since 2003.
There are a number of blogs including:
http://blog.agilegamedevelopment.com/
and a book as well. See the book link in the blog above.
If you have good model view controller (MVC) separation, you should be able to test "business logic" without graphics. For example, testing that a given weapon produces the correct amount of damage, can't shoot through walls, and has a given area of effect.
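For instance (hypothetical classes, but this is the shape such a test takes once the rules live outside the rendering code):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Pure rules object: no graphics API anywhere in sight.
    class Weapon {
        private final int damage, range;
        Weapon(int damage, int range) { this.damage = damage; this.range = range; }
        int damageAt(int distance, boolean wallBetween) {
            if (wallBetween || distance > range) return 0; // can't shoot through walls
            return damage;
        }
    }

    public class WeaponTest {
        @Test
        public void dealsFullDamageInRange() {
            assertEquals(25, new Weapon(25, 10).damageAt(8, false));
        }

        @Test
        public void wallsAndRangeBlockDamage() {
            assertEquals(0, new Weapon(25, 10).damageAt(8, true));   // wall in the way
            assertEquals(0, new Weapon(25, 10).damageAt(12, false)); // out of range
        }
    }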
Agile is not at all compatible with GameDev. They are two completely opposite methodologies.
Agile is about development being flexible to changing business requirements and breaking projects down into clear and manageable deadlines and work units. GameDev is about regularly and dramatically changing business requirements without caring about the impact on development, and breaking down development teams through unmanageable deadlines and volumes of work.
I don't believe there's actually any element of Agile that is incompatible with game development. I think you're mistaken with respect to the difficulty of testing graphics functions; running a graphics function generates an image, and it's easy to compare images to manually approved "golden masters" (see Approval Tests for one way to go about this).
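A minimal golden-master test might look like the sketch below. The renderer is a hypothetical stand-in, and golden/title.png is assumed to be an image you rendered once and approved by eye:

    import java.awt.image.BufferedImage;
    import java.io.File;
    import javax.imageio.ImageIO;
    import org.junit.Test;
    import static org.junit.Assert.assertArrayEquals;

    public class GoldenMasterTest {
        @Test
        public void titleScreenMatchesApprovedImage() throws Exception {
            BufferedImage actual = new TitleScreenRenderer().render(640, 480);
            BufferedImage golden = ImageIO.read(new File("golden/title.png"));
            // Pixel-for-pixel comparison against the approved master.
            assertArrayEquals(
                golden.getRGB(0, 0, 640, 480, null, 0, 640),
                actual.getRGB(0, 0, 640, 480, null, 0, 640));
        }
    }

    // Hypothetical stand-in for whatever actually draws the frame.
    class TitleScreenRenderer {
        BufferedImage render(int w, int h) {
            BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
            java.awt.Graphics2D g = img.createGraphics();
            g.drawString("Welcome", 20, 40);
            g.dispose();
            return img;
        }
    }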
Refactoring is a fine technique, but it should be done with a safety net of unit tests. The best way to get unit tests is to develop your code test-first, so TDD is another Agile practice you should be following. And it's all better - refactoring, testing, and the rest - if you pair. You have enough people on the team to be pair programming. Try it and see how much faster you get running tested features!

TDD and UML together

I'm new to the TDD approach, so I'm wondering if anyone experienced with it could enlighten me a little. I would like to get some leads on how to use UML and the TDD methodology together.
I've been used to: design with UML --> generate skeleton classes (and then keep them synchronized) --> implement, and finally test. I must admit the testing part was the worst one, so I started to look for something else - TDD. I have some general knowledge of what it is about, but before I proceed further, I am interested in knowing how it goes together with software design, especially UML.
So when I first design/create a test, how can UML fit in? Would it be possible to design classes first, create skeleton classes from them, and generate unit tests from those to be "filled in" before the actual implementation of the UML-pregenerated classes - or would this approach break the whole idea of TDD? Or is there some other way to keep UML and TDD together?
The TDD cycle is test, code, refactor, (repeat) and then ship. As the TDD name implies, the development process is driven by testing, specifically that means writing tests first before developing or writing code.
The first paragraph is purely definitional ... it's how Kent Beck defines TDD ... or how Wikipedians generally understand TDD ... I thought it was necessary to belabor the point with this definition, because I am not certain whether everyone is really discussing the same TDD, or whether others really understand the implications of the [most important part of the] definition of TDD, the implications of writing tests first. In other words, I think the answers to this question should delve a bit deeper into TDD, rather than explaining a bias for/against UML. The lengthy part of my answer relates to my opinion on using UML to support TDD ... UML is just a modelling language, certainly not required to do TDD; it could get in the way if applied inappropriately ... but UML can help with understanding requirements in order to write tests, with refactoring where modeling is useful, and with speeding the documentation of shipped code by collecting the artifacts of the process. I would welcome any comments, criticisms or suggestions, but please don't vote my answer up or down because you agree or don't agree with the first paragraph ... the TDD acronym is Test-Driven Development, which means Test-First Development.
Writing a test first implies that the developer understands the specifications and requirements FIRST ... obviously, any test written should fail until the code gets written, but in TDD the test must be written first -- you can't do TDD without being focused on understanding a requirements specification before you write tests, before you write code. There are situations where the requirements do not exist at all; requirements elicitation involves a bit of hacking on a pre-pre-alpha version to "throw mud at the wall to see what sticks" ... that sort of thing should not be confused with development, certainly not test-driven development; it's basically just one form of requirements elicitation for a poorly understood application domain.
UML diagrams are one form of requirements input to TDD. Not the only one, but probably better than written specifications if people who are knowledgeable in creating UML diagrams are available. It is often better to work with visual diagrams for communication when exploring the problem domain [with users/clients/other systems providers] during pre-implementation requirements modeling sessions ... where simulating performance is necessary for really understanding requirements (e.g. CANbus network interactions), it is often ideal if we can work with a specification language or a CASE tool like Rhapsody or Simulink RT Workshop whose models can be executable, modular and complete ... UML is not necessarily part of TDD, but it is part of an approach to design synthesis that involves expending more effort understanding what is required first, before writing any code [and then wasting time trying to market that code to someone who cares]; generally, more focus on understanding the system's requirements lays the groundwork for more efficient, productive agile TDD iterations.
Refactoring is about improving design -- cleaning up working, tested code to make it simpler and easier to maintain. You want to tighten it up as much as possible, to remove obfuscated tangles where bugs might be hiding or could spawn in future releases -- you don't refactor because a customer requires it; you refactor because it's cheaper to ship clean code than to continue paying the cost of supporting/maintaining a complicated mess. Generally, most of TDD is code-focused; but you could employ UML for looking at a larger scope, or for taking a step back to look at the problem, e.g. creating a class diagram to help identify [missing] tests or possible refactorings. This is not something you'd want to mandate or do on a general basis, but it has its place.
The last step, "ship", is a serious step ... "ship" is not shorthand for "toss it over the wall and forget it, because good code doesn't need support" or "abandon and hope that there are no more iterations." From a financial or business perspective, shipping is the most important step of TDD, because it is where you get paid. Shipping does involve "shifting gears": it includes systems integration, preparation for support and maintenance, getting ready for the next round of development, etc. The primary use of UML diagrams will be to communicate [in abstract terms] how the code does what it does ... UML is useful because, hopefully, the diagrams are an artifact of the requirements and development processes; it's not necessary to start from scratch when the code ships ... as a communication tool, UML is appropriate for reducing integration errors in multi-module systems, larger projects that might involve modules written in different languages, or networks of embedded systems where different companies must collaborate on safety-critical systems but need the abstraction to be stingy with or protective of their "proprietary knowledge."
Just as you should avoid using a big hammer in situations where a tiny screwdriver is appropriate, you aren't going to get anywhere by asking all developers to standardize on using Emacs as their editor. Sometimes the view is not worth the climb -- you don't want to always haul out the UML banner or become known as the guy who is always pushing UML ... particularly not in situations where there is no substitute for writing tests or reading code. If you stick to appropriate situations, you should not be afraid to use UML as a communication language wherever the language helps you.
So when I first design/create a test, how can UML fit in? Would it be possible to design classes first, create skeleton classes from them, and generate unit tests from those to be "filled in" before the actual implementation of the UML-pregenerated classes - or would this approach break the whole idea of TDD? Or is there some other way to keep UML and TDD together?
If you create an entire skeleton class - by which I assume you mean a class with all methods defined but empty - before writing your first test, then I would say you are not doing TDD, and lose the benefits of TDD. As we do TDD - Test-Driven Design - our tests incrementally lead us to the next elements - methods and classes - that our program needs. If you have predetermined, pre-specified in UML, what your classes and methods are, a great deal of your design is already done, and either your subsequent development is constrained to follow it, or you have wasted effort that subsequent work will undo.
There may be ways to use both UML and TDD together for design, but as you have described it, you are doing the design within UML, before TDD ever gets a chance. That won't give you the full benefit of TDD.
Or is there some other way to keep UML and TDD together?
UML and TDD fit wonderfully together:
1. Make the initial design in UML - it doesn't have to be complete, just consistent and self-contained
2. Create the empty tests. This step could also be automated
3. All tests will fail at first, as required by TDD (because the code generated from the UML does not have any implementation)
4. Start writing tests for each class
   - Start with classes which do not have a great deal of associations if you are confident in your software architecture and UML skills (no, you're not doing waterfall, but sometimes you just know what you're doing - you know the application domain already, or you used expert knowledge at step 1)
   - Start with classes which have a lot of associations ("central classes") if you are NOT confident in your understanding of the application domain - this will make it easier to eliminate bad design decisions, because you will notice them as early as possible
5. ... The tests are still failing
6. In parallel with each unit being tested (step 4), write the implementation inside the empty method bodies. DO NOT modify any class, interface or method names or parameter signatures. You ARE only allowed to add private helper methods, NOT more
7. If at step 6 (which runs in tandem with step 4) you realize you need to make changes in the design:
   - Go to step 1 and refine the UML, then generate the code again (good UML tools will not overwrite your implementation). Attention: avoid introducing new classes. You want to finish step 13 within a few weeks
   - Re-run the tests and fix the ones now failing which were previously OK
   - Continue with what you left at step 6
8. Go to step 6 if not all class tests pass
9. Proceed to component, package and subsystem tests
10. Add deployment to the UML and deploy to the integration environment (http://en.wikipedia.org/wiki/Development_environment_%28software_development_process%29)
11. Proceed to integration tests
12. Go through the test/QA stage
13. Go through User Acceptance Testing
14. Reiterate the previous steps as required (according to your iterative development process)
15. ... months pass
16. Deploy version 1.0.0 to production
Do not try to arrive at too many design decisions at step 1 or in later refinements of the design at step 1. You want to finish step 13 in the first iteration, within a few weeks.
While some people think UML is a design methodology, it is first and foremost a communication tool. Hence the name: Unified Modeling Language. The idea is to have a common vocabulary (of shapes) that you can insert in a book and everybody will understand.
TDD on the other hand is a design methodology, the way to construct the system starting from its interface and its consumer, and only then adding the implementation.
Once your design has emerged as a result of applying TDD, you can communicate that design using UML. If you don't need to communicate (like if you're the only developer), you arguably don't need UML.
Some people think of domain analysis (identifying key nouns, verbs and adjectives and building a basic ontological model) as being part of the UML methodology, reflected in the use case & ER diagrams... This may be useful to do before you jump into the TDD red/green/refactor cycle. Then again, strictly speaking this is DDD (Domain-Driven Design), not UML per se.
If you're designing classes using UML, you're designing what you think the code will look like. But what you create using TDD is the reality.
Leave UML until the end, to document the results of the code you created using TDD.
UML is the design part, of course.
I'd only use it on the system I was ending up with. Rendering test classes in UML seems ludicrous to me.
I'd use it to get a general sense of what I was creating. Then I'd start with TDD to implement it.
As far as keeping it in sync, I'd get a UML tool capable of import and export. I'd worry more about the code and less about the UML. It's useful for capturing your thoughts and documenting later, but not much else. I'd always prefer working, tested code over diagrams, any day.

Exercises to enforce good practices such as TDD and Mocking

I'm looking for resources that provide an actual lesson plan or path to encourage and reinforce programming practices such as TDD and mocking. There are plenty of resources that show examples, but I'm looking for something that actually provides a progression that allows the concepts to be learned instead of forcing emulation.
My primary goal is speeding up the process for someone to understand the concepts behind TDD and actually be effective at implementing them. Are there any free resources like this?
It's a difficult thing to encourage because it can be perceived (quite fairly) as a sea-change; not so much a progression to a goal but an entirely different approach to things.
The short-list of advice is:
You need to be the leader: you need to become proficient before you can convince others, and you need to be able to show others the path and settle their uncertainties.
First become proficient in writing unit tests yourself
Practice writing tests for existing methods. You'll probably beat your head on the desk trying to test lots of your code--it's not because testing is hard or you can't understand testing; it's more likely because your existing code and coding style isn't very testable.
If you have a hard time getting started then find the simplest methods you can and use them as a starting point.
Then focus on improving the testability of the code you produce
The single biggest tip: make things smaller and more to the point. This one is the big change--this is the hardest part to get yourself to do, and even harder to convince others of.
Personally, I had my "moment of clarity" while reading Bob Martin's Clean Code book; an early chapter talks about what a clean method should look like, and as an example he takes a ~40-line method that visually resembled something I'd produce and refactors it into a class which is barely larger line-count-wise but consists of nothing but bite-sized methods of perhaps 3-7 lines each.
Looking at these itty-bitty methods, it suddenly clicked that the unit-testing cornerstone "each test only tests one thing" is easiest to achieve when your methods only do one thing (and do that one thing without having 30 internal mechanisms at play).
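A toy version of that shape (my invention, not Martin's actual example): one public method composed of helpers that each do one thing, so each test can too. For instance, new InvoiceReport(199, 250).summary() returns "Total: $4.49", and total() and format() are each testable in isolation.

    // Each helper does one thing, so "each test only tests one thing"
    // falls out naturally.
    class InvoiceReport {
        private final int[] lineCents;
        InvoiceReport(int... lineCents) { this.lineCents = lineCents; }

        String summary() { return format(total()); }

        int total() {
            int sum = 0;
            for (int cents : lineCents) sum += cents;
            return sum;
        }

        String format(int cents) {
            return String.format("Total: $%d.%02d", cents / 100, cents % 100);
        }
    }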
The good thing is that you can begin to apply your findings immediately; practice writing small methods and small classes and testing along the way. You'll probably start out slow, and hit a few snags fairly quickly, but the first couple months will help get you pointed in the right direction.
You could try attending (or hosting, if there is none near you!) a coding dojo.
I attended one such exercise and it was fun learning TDD.
Books are always a good resource - even though they're not free - and, for the money they cost, they may be worth more than the time you'd spend searching for good free resources.
"Test driven development by example" by Kent Beck.
"Test Driven Development in Microsoft .NET" by James W. Newkirk and Alexei A. Vorontsov
please feel free to add to this list
One thing I worked through that helped me appreciate TDD more was NHibernate and the Unit of Work Pattern. Although it's specific to NHibernate and .NET, I liked the way it was arranged. Using TDD, you develop something (a Unit of Work) that's actually useful, rather than some simple "this is what a mock looks like" example.
How I learn a concept best is by putting it to use towards an actual need. I suggest you take a look at the structure of the article and see if it's along the lines of what you're looking for.
Geeks are excellent at working to metrics, whether they are good for them or not!
You can use this to your advantage. Set up a CI server and fail the build whenever code coverage drops below 50 percent. Let them know that the threshold will rise 10 percent every month until it reaches 90. You could perhaps use some commit hooks to stop them being able to check code in in the first place, but I've never tried this myself.
Let them know that the team's coverage will be taken into account in any performance reviews, etc. By emphasising that it is the coverage of the team, you should get peer pressure helping you ensure good coverage.
This will only ensure they are testing their code, not how well they are testing their code, nor whether they are writing the tests first. However, it is strongly encouraging (or forcing) them to incorporate testing into their daily development process.
Generally, once people have something in their process, they'll want to do it as easily/efficiently as possible. TDD is the easiest way to write code with high coverage, as you don't write a line of code without it being covered.
Find someone with experience and talk to them. If there isn't a local developer group, then start one.
You should also try pushing things too far to start with, and then learn when to back off. For example, the whole mocking thing started when someone asked, "What if we program with no getters?"
Finally, learn to "listen to the tests". When the tests look dreadful, consider whether it's the code that's at fault, not your testing technique.
