I see the benefit of TDD, and I'm trying to learn how to wrap my head around it. I'm also reading more about DDD and want to start applying both of them to my software projects.
I've purchased a few "hands on" programming books (by "hands on" I mean ones that discuss a real-world application with real solutions, as opposed to small snippets), and I've noticed that they typically start by defining the "infrastructure" layer of the application in traditional code-first fashion rather than using TDD, even though both books go out of their way to discuss how good TDD is and how the case study will make use of it.
For example, in one of the books, ASP.NET 3.5 Social Networking, the entire second chapter develops a Logging wrapper class, an Email wrapper class, and Cache and Session wrapper classes (and their associated interfaces), all without touching a single unit test. Another book, .NET Domain Driven Design with C#: Problem, Design, Solution, does something similar, creating a base class and repository framework code-first before even touching the "real" code.
I understand that you should test the actual logic and functionality of your domain classes. I had thought the "don't test plumbing" advice only applied to code that you didn't write (e.g. built-in .NET classes), but what I'm reading seems to suggest that you should only test the code that actually has to do with your application, not the plumbing that you write to provide a foundation.
Is this an acceptable way of applying TDD?
When learning TDD, apply everything. Afterward, apply what you need.
If you are writing the plumbing from scratch, then you should have tests around it. If you are just using a few interfaces and classes to abstract away your linq2sql calls, then no, I wouldn't necessarily put unit tests around that.
To quote someone smarter than I:
I’m not religious about TDD. I consider it a discipline worth following. I don’t write all my tests first. Some small number of them are simply more convenient to write after the code. There is even some code I don’t write tests for at all, because it’s just not worth it. But those are exceptions to the rule. The vast majority of the code I write, I write test first.
- Uncle Bob Martin, via http://www.infoq.com/news/2009/02/spolsky-vs-uncle-bob
It probably depends on other factors about how you plan to build your project.
If you're following other Agile practices, such as small iterations and deliveries, then you won't have much of an architecture or infrastructure at the start, because you won't have time to develop much while you're implementing and delivering the first few features. Anyway, why spend time on Big Design Up Front, when you don't really know what the code is going to need?
So you'll build your code test-first and over a number of iterations. Test coverage means you can refactor to improve your design, which (in theory) allows exactly the right infrastructure to emerge for the application as it exists at any moment in time. Ron Jeffries explains it well here. Without the tests, you're probably going to hit a point where you need to stop and figure out what the structure should be, which takes time that could better be spent building useful features, and which you're going to have to test at the end anyway.
If you're 100% certain you can map out the design correctly before you have written any code, then it's your choice to do so. Make sure to leave plenty of time in the project for testing, though. I think everyone should build at least one significant piece of work both by the "traditional" Waterfall process and by just diving in and coding in order to have some experience that puts the Agile practices into context. Otherwise you're at risk of knowing "what" without knowing "why", and that makes the lessons harder to learn.
If you test some code, and don't test other code, which code will have the bugs?
Hint: it's the untested code.
I am no expert here.
But if you are developing plumbing components, they should be tested.
If I understand correctly, with TDD code is written against interfaces, and in the absence of the real plumbing components, people use mock objects.
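To make that concrete, here is a minimal C# sketch (IEmailSender, FakeEmailSender and the rest are names made up for illustration, not taken from the books above): the code under test depends only on an interface, and the test hands it a hand-rolled fake instead of the real plumbing.

    using System.Collections.Generic;

    // Hypothetical plumbing interface that the domain code is written against.
    public interface IEmailSender
    {
        void Send(string to, string subject, string body);
    }

    // Hand-rolled mock: it records calls instead of touching real infrastructure.
    public class FakeEmailSender : IEmailSender
    {
        public readonly List<string> Recipients = new List<string>();

        public void Send(string to, string subject, string body)
        {
            Recipients.Add(to);
        }
    }

    // A test can then drive domain logic without any real plumbing, e.g.:
    //   var sender = new FakeEmailSender();
    //   var service = new RegistrationService(sender);   // hypothetical domain class
    //   service.Register("bob@example.com");
    //   Assert.AreEqual("bob@example.com", sender.Recipients[0]);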
I went into this at some length in another question. Basically, the point of using TDD and mocks and all that stuff is to develop more confidence.
TDD should be applied to any code that you develop. When using TDD you needn't test everything -- i.e., the goal isn't 100% coverage -- but coverage should be pretty high over code that you develop. For example, you don't need to test automatic getters/setters in C# -- you can trust that the language will do its job there. When in doubt, though, write the test first.
By all means write code-first to gain experience with a platform - just be brave and throw it away once you're sure you can tackle most tasks on that platform. I've just done that with a small Java program I've started.
Now that I've gained some experience, I can see what I have to do to get very high line coverage as I start to write it test-first, even most of the plumbing.
Related
I understand that TDD has many advantages (some of them are below). However, I am not sure how it drives the design:
Serves as documentation
Writing tests before the actual code helps maximize test coverage
Helps determine input value boundaries
Usually when we start implementing a new piece of functionality, we have a rough idea of the design, and we start with a TDD implementation of a class that is used by other classes as per the design. This understanding of mine seems to be in conflict with the statement "TDD drives the design".
Please help me understand this with an example.
Most people think that Test-Driven Development is a tool for writing code with fewer bugs. But in reality, that is a by-product of TDD. TDD is more a tool for designing code.
Overall, TDD helps in developing quality code in the following ways:
It makes you think about your code design and requirements at every stage, thereby ensuring that you are actually implementing what is required.
You are forced to write testable code, thereby ensuring that your code has loose coupling and high cohesion.
If your code is getting difficult to test, it mostly signifies that there is some issue with your design (your code is too coupled or not isolated enough).
With that said, I tend to disagree with people who think that if you follow TDD blindly you'd always end up with a good code design (because that depends more on you and your knowledge of software design), but I believe there is a good chance you would.
TDD doesn't drive design, but it's an excellent feedback mechanism, which enables you to get rapid feedback on your design ideas.
It also has the beneficial side-effect that it leaves behind a nice regression test suite.
The most basic idea behind this is that when you want to test something, you want to keep the tests simple and focused. This in turn forces you to start with the smallest bits of your problem, driving towards the Single Responsibility Principle.
Another thing, for larger problems, is that when you are forced (because you want your tests to be focused) to use mocks and stubs, this drives your design towards the Dependency Inversion Principle, which in turn makes your code loosely coupled.
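As a rough illustration of that pressure towards dependency inversion, consider this C# sketch (all names here are hypothetical): because the test needs to control the time, the service ends up depending on an abstraction rather than a concrete clock.

    using System;

    // Depending on an abstraction (IClock) instead of a concrete time source
    // is what keeps the unit test focused and deterministic.
    public interface IClock
    {
        DateTime Now { get; }
    }

    public class GreetingService
    {
        private readonly IClock _clock;

        public GreetingService(IClock clock)   // dependency is injected, not newed up
        {
            _clock = clock;
        }

        public string Greet()
        {
            return _clock.Now.Hour < 12 ? "Good morning" : "Good afternoon";
        }
    }

    // A test can then stub the clock to pin down the time:
    //   class StubClock : IClock { public DateTime Now => new DateTime(2020, 1, 1, 9, 0, 0); }
    //   new GreetingService(new StubClock()).Greet();   // "Good morning"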
TDD in general
The goal of Test-Driven Development (TDD) is to create software components that conform precisely to a set of functional specifications. The task of the tests is to validate conformity and to identify deviations from these specifications.
Design Guidance
By writing tests first, you're forced to make up your mind about what your software component should do. Thinking about meaningful test cases first will guide your design, as you're forced to find answers to a set of questions: How should it process inputs? To which components does it interface? What are the outputs I'm expecting? ...
TDD does not replace SW design. It sharpens your design!
In essence, TDD will help you to sharpen your design - but one thing is crucial: you need to find reasonable tests. And trying to replace functional specifications - call them requirements documents - with tests alone is rather dangerous. Why?
Unit tests as used by TDD can only guarantee that your components exhibit the expected behavior for the given test data, but not for any data! As Dijkstra said in the 1960s: "Testing shows the presence, not the absence of bugs".
> Usually when we start implement new piece of functionality, we will have rough idea of the design.
That's the core of your question. If you just have a rough idea of the design, you'd better spend more time at the drawing board, asking yourself questions like: What are the individual tasks my software should carry out? How can I split the general task into subtasks? How can I map these subtasks to components? Which data needs to be passed among them?
At that time, you might consider doing TDD. But without thinking about a design or software architecture first, you'll end up with a "spaghetti system" that is hard to understand and hard to maintain later on.
Great observation by Mark Seemann. The principle that pushes me to value TDD is fast feedback (and so fast learning). TDD is no substitute for good design techniques - learn good design principles and use TDD to provide fast feedback on your work.
I find that when using TDD most deeper thinking about my design happens during the refactor step and the tests then provide a space for that step and a safety net for change, but it doesn't automatically create good design.
TDD allows me to hack some working code and use the learning I get while doing that to iterate towards a better design. Of course this can be done without TDD too, but the test helps by providing that safety net.
Is Agile really different from TDD? If so, what are the main differences?
Agile is anything that is in line with the values listed here - http://agilemanifesto.org/
XP (Extreme Programming) is a methodology that qualifies as agile. There are others too (Scrum, Crystal, etc..)
TDD (Test Driven Development) is a specific engineering practice from XP - a way to write code and drive design in incremental chunks. You write a test first, make it pass with the simplest possible change, and then refactor to improve the structure/design. You do this in a loop until you're done - one pass around that loop is sketched after the links below.
http://en.wikipedia.org/wiki/Agile_software_development
http://en.wikipedia.org/wiki/Test-driven_development
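Here is what one red-green-refactor pass might look like in C# (NUnit-style assertions; Order and the test names are invented for illustration, not from any of the links above):

    using System.Collections.Generic;
    using System.Linq;
    using NUnit.Framework;

    [TestFixture]
    public class OrderTests
    {
        // Red: write the failing test first.
        [Test]
        public void Total_adds_line_item_prices()
        {
            var order = new Order();
            order.AddItem(5m);
            order.AddItem(7m);
            Assert.AreEqual(12m, order.Total);
        }
    }

    // Green: the simplest Order that makes the test pass.
    public class Order
    {
        private readonly List<decimal> _prices = new List<decimal>();
        public void AddItem(decimal price) { _prices.Add(price); }
        public decimal Total { get { return _prices.Sum(); } }
    }

    // Refactor: with the bar green, improve the structure, then re-run the test.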
In short, TDD and Agile really focus on different aspects, and aren't mutually exclusive of one another.
TDD is very focused on how code gets written (and thus tends to be aimed at the work cycles of individual or small groups of developers exclusively).
Agile is really focused on the overall development process, not just how the code gets written and tested (and thus focuses a lot on project management and groups of developers, as opposed to specifically how a given developer writes code).
I'm going to play devil's advocate and say, Yes. TDD and Agile are the same thing, just at different scales.
Bear with me, here.
First off, TDD encourages you to have a plan of what you want to achieve, before you achieve it. You don't think about how you're going to achieve it. You just think about how to demonstrate that the thing you're about to write is valuable, and you work out how you'll know. This is very similar to the way in which projects are ideally sourced and requirements are gathered in Agile methodologies.
Then, you do the minimum necessary to get feedback on the work. With TDD this means a red or green bar. At a higher level, Agile methodologies encourage you to showcase and get feedback from stakeholders, or (preferably) release to production and see if it works and if anyone uses it.
Then you work out what the next bit of work to do is, and you write the next bit of the plan.
Agile methodologies may also use estimation and prioritisation, but these are mostly to do with assigning budgets and working out whether the work is worth doing in the first place - which devs do naturally at a small scale.
Agile methodologies are iterative - that is, the periods of planning, implementation, delivery and feedback are cyclical. So is TDD, with exactly the same steps.
Agile methodologies usually emphasise communication, conversation and lightweight documentation. TDD can be used as a form of documentation, too, especially if you make the test-names into meaningful sentences and use the same language that the business use together with realistic examples. This can also encourage communication and conversation.
Agile methodologies encourage reflection and learning from the past. TDD has this lovely red bar which can help you to do that very quickly.
If you scale TDD up, you get Acceptance Tests or BDD-style scenarios. Further up you start getting into Feature Injection and other forms of vision-driven analysis. Further up still you're looking at whether the released project retains your market share, or provides the options you wanted going forward, or achieves whatever its original vision was. The tests get bigger, but the process of writing just enough, getting feedback on it and learning continuously is still the same, no matter what the Agile methodology is.
All the rules which help Agile to be successful at a large scale can be applied to TDD at a small scale, and vice versa. (I'm trying hard to think of any exceptions to that and can't).
So, no. Agile isn't different to TDD. It's just TDD-done-bigger.
The differences are huge because I think there's a very simple distinction:
Agile is a philosophy whereas TDD is a specific methodology.
There are any number of ways you can work that are agile but by and large you are either doing TDD or you're not.
You can be Agile without using TDD (or a variant thereof) and you can use TDD without being agile (though I'd be somewhat surprised).
Lately I've been a lot into agile methodologies. Some papers I've read at Martin Fowler's website looked thoroughly unflawed to me to say the least, and I was wondering what agile practices are most suited to game development, particularly for small-budget, small-sized and inexperienced team projects (bold because it's really important).
Concepts such as refactoring look like they could absolutely combine with a small and inexperienced team. The idea of "embracing the change" also combines with inexperienced teams whose ideas twist and turn all the time. But things like TDD are rather problematic.
It's hard and suboptimal to test a class that interacts with Direct3D, for example. It doesn't make much sense.
I'd be most deeply grateful if you could list a bunch of practices that made sense for game development. Ones that aid in the organization of artistic production are a plus. Citations of real-world cases are another plus.
Thanks in advance.
edit --
Plus, my team is composed of 3 people: one programmer, one graphics designer and one programmer/graphics designer combomix. We do not have a client, so we must make all decisions alone.
I've read in an article by Fowler that agile sort of depends on developer-client interaction, but he also mentioned that clients unwilling to adhere to agile could still be well served by agile development (the article was called The New Methodology). How does that apply to my case?
Conclusions --
I think questions at StackOverflow can help others too, so I'll try to summarize my thoughts on the subject here.
Through the use of mock objects, even hard-to-test elements such as graphics interfaces and their relationship to client classes might be manageable.
For example, instead of letting every client of the interface truly test its use under many conditions (a fullscreen/windowed mode switch, for example, which affects almost everything in the game), clients could be tested against a mock that, as far as they can tell, behaves the same as the original class, with an additional test of that mock's fidelity to the original object.
That way, the slow part (actually opening the window and so on) is only done once, when checking the mock's fidelity to the implementation, and everything else runs smoothly over the mock. [thanks to Cameron]
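A rough C# sketch of that arrangement (IRenderer and FakeRenderer are hypothetical names, not a real Direct3D wrapper):

    using System.Collections.Generic;

    // Hypothetical abstraction over the graphics API (Direct3D or whatever else).
    public interface IRenderer
    {
        void SetFullscreen(bool fullscreen);
        void DrawSprite(string spriteId, int x, int y);
    }

    // Fake used by the fast unit tests: it just records what the game asked for.
    public class FakeRenderer : IRenderer
    {
        public bool Fullscreen;
        public readonly List<string> DrawnSprites = new List<string>();

        public void SetFullscreen(bool fullscreen) { Fullscreen = fullscreen; }
        public void DrawSprite(string spriteId, int x, int y) { DrawnSprites.Add(spriteId); }
    }

    // One slow integration test checks that the real Direct3D-backed renderer
    // honours the same contract; everything else runs quickly against FakeRenderer.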
A BDD mindset helps ease the paranoid search for thorough testing of units, "replacing" testing with the specification of actual behaviours. Tiny, squeezed-out units are in many cases better left untested (or only indirectly tested, if you prefer putting it that way), to avoid too much one-to-one test-vs-unit (class, method, variable, etc.) parity, which adds to test (now "specification") fragility. [thanks to Kludge]
I would recommend trying out VersionOne (www.versionone.com). VersionOne is free for a small team working on a single project, and has easy-to-use tools for agile status tracking and project planning. Their site also has links to explanations of the various Agile development methodologies.
There are different flavors of Agile development; I'd recommend taking a look at the Extreme Programming (XP) model, as a good example:
http://www.extremeprogramming.org/map/project.html
Agile development is as much concerned with project planning and requirements tracking as it is with the actual programming practice.
The idea is to make sure you record game features that need to be developed (as "user stories"), give a (very rough) estimate of how long each will take, and figure out which ones are important. Box out a small amount of time for each release, and schedule the most-important, least-expensive features that you can release in that time. This system ensures steady forward progress, protects you against constant, unpredictable priority changes, and makes sure you don't develop a huge monolithic code base that doesn't work.
With regard to Test-Driven Development, I think Cameron's and Andrew Grimm's comments are both on point. You can do a lot more unit testing if you abstract away things like graphics API calls.
You definitely want to look at Extreme Programming (XP); take a look at Kent Beck's Extreme Programming Explained: Embrace Change, 2nd Edition.
The most useful thing you can do, though, is to do some research on Behaviour-Driven Development, which is basically Test-Driven Development done right. It takes the focus off of tests and puts it back onto specifications. You don't worry about what classes do so much as what behavior your program exhibits.
So saying you aren't going to use TDD, or BDD, is just plain crazy talk. One of the core concepts of Agile development is developing your software from your tests/specs. You have to get out of the mindset that tests/specs are testing your classes. That's not really what they're for. They are for describing the behaviors your application should exhibit then using that test/spec to write the behavior into your application.
You might write something like this:
    describe Startup do
      it "should display a welcome screen" do
        game = Game.new
        game.start
        game.output_buffer.should match("Welcome")
      end
    end
Then you go write the code to make it happen. You describe the code you want, then you go write it. It allows you to write your code in little, bite sized chunks and best of all when someone else picks up your code they can run the tests and see that everything works. When they want to add new functionality they use the same process so now when you go back to the code you can have faith that their code works too.
Agile/Lean methods such as Scrum, XP and Kanban have been successfully applied to game development since 2003.
There are a number of blogs including:
http://blog.agilegamedevelopment.com/
and a book as well. See the book link in the blog above.
If you have good model view controller (MVC) separation, you should be able to test "business logic" without graphics. For example, testing that a given weapon produces the correct amount of damage, can't shoot through walls, and has a given area of effect.
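For example, a damage rule like that could be pinned down with a plain unit test along these lines (a sketch only; Weapon and its members are invented names):

    using NUnit.Framework;

    [TestFixture]
    public class WeaponRulesTests
    {
        [Test]
        public void Fireball_damage_falls_off_outside_area_of_effect()
        {
            // Weapon is a hypothetical domain class holding pure game rules,
            // so this test runs with no graphics API involved at all.
            var fireball = new Weapon(baseDamage: 50, areaOfEffect: 3);

            Assert.AreEqual(50, fireball.DamageAt(distance: 0));
            Assert.AreEqual(0, fireball.DamageAt(distance: 4));   // outside the blast radius
        }
    }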
Agile is not at all compatible with GameDev. They are two completely opposite methodologies.
Agile is about development being flexible to changing business requirements and breaking down projects into clear and manageable deadlines and work units. GameDev is about regularly and dramatically changing business requirements without caring about the impact on development, and breaking down development teams through unmanageable deadlines and volumes of work.
I don't believe there's actually any element of Agile that is incompatible with game development. I think you're mistaken with respect to difficulty of testing graphic functions; running a graphic function generates an image, and it's easy to compare images to manually-approved "golden masters" (see Approval Tests for one way to go about this).
Refactoring is a fine technique, but it should be done with a safety net of unit tests. The best way to get unit tests is to develop your code test-first, so TDD is another Agile practice you should be following. And it's all better - refactoring, testing, and the rest - if you pair. You have enough people on the team to be pair programming. Try it and see how much faster you get running tested features!
Simple question. Let's put on our engineer/project manager hat for a second:
You have a large project and you will have several different developers working on different parts. You have a solid functional spec and are ready to start figuring out your implementation. You want to practice Test Driven Development. Assume you will be given reasonable but not infinite time and money.
How do you start preparing the modules for implementation by the other developers? Do you start writing interfaces or do you start writing tests? Do you mix n' match?
How would you change your answer if you had a team of veteran coders? A team of less experienced folks? Would your coding environment change your decisions?
Strictly speaking, if you are doing TDD (not just unit testing) then you start with the tests, before writing the function that the unit tests actually exercise.
All functionality that you write needs to have tests written to verify the code that you are about to write.
The developers themselves would be the ones writing the unit tests - functional/acceptance tests are a separate issue and could be (partially) written prior to work commencing.
If the team is not experienced with unit testing, they should start implementing features and then write unit tests as soon as each little class/small piece of functionality is finished.
Unit tests (and TDD as a result) are not for testing modules of systems - they test at a more granular level: that functions and classes do what the dev expects them to.
Testing at a higher level is outside the bounds of TDD and steps into other types of tests.
If you have a good functional spec, I'd split the work...and start working on test cases.
Once you've got the test cases worked out, the developers can begin coding their modules and the tests to go with them.
...and the approach wouldn't change with different developers or coding environment.
Without searching and being fairly new here, I'm guessing there's been a lot of discussion around this, but let me give an answer anyway.
First, I'd ask what you mean by a 'large' project? I've seen some people label a project taking a few months and involving 4 or 5 programmers as a 'large project'. To others, a 'large project' means a multiple year duration and 30 or 40 devs. I'll assume it's somewhere in the middle, given you mention 'several developers'. To be safe, let's assume it's a half year to a year in duration with 5-10 devs.
As others have said, if you're doing TDD you'd start with the tests first, not a lot of design work. The design is emergent. In practice, however, I think there's a balance between the TDD approach (which I see as valuable but not essential) and simply ensuring you have good unit test coverage, which is essential in my view.
If you're a coder yourself and you have experience with TDD, I'd suggest you should be practicing what you'll be preaching. Rather than trying to design the whole system at an abstract level, defining interfaces, and so on, choose a core piece of the system. Do the simplest thing possible, get a green bar, refactor, add another test, and so on.
The biggest impediment to using TDD on a project with multiple developers is lack of experience with the methodology. Give your programmers a concrete example (even if it's a small bit of functionality) that really shows them how to do it the right way, pair with people as they come onto the project, have regular reviews of people's unit tests, and make sure it continues to be a topic that's at the forefront of what you're doing, not just an afterthought. If you're doing Scrum, make it part of your definition of 'done' for any task/story.
I'd also spend some time setting up something like EMMA or Cobertura (big fan of Cobertura), so you have some hard metrics by which to assess people's tests. Coverage numbers don't tell you how effective your tests are, but they are a data point. If you have 20% test coverage, you can be pretty sure people aren't writing the tests they should. Conversely, just because you have 90% test coverage doesn't ensure the tests are good, which is the reason for things like pairing and reviews.
So, the simple answer is: give your guys an example, focus on pairing/reviews, and put some things in place like a code coverage tool to help keep the team honest.
TDD isn't exactly what you're looking for. TDD happens at the beginning of the project and drives development, hence Test Driven Development.
What you're likely looking for is a test writing strategy. From being on numerous big projects that have implemented testing later on, here are some tips:
Start small. Don't try to get 100% coverage on the entire project, choose one class or one set of functions and begin there.
The impulse will be to start with your smallest/simplest class/functions just to get some quick wins and have code coverage at 100% for one file. Don't.
Instead, as you get bug reports and fix them, write tests to support yourself. It is an incredibly effective exercise to write a test that demonstrates the bug, fix the bug, run the tests, and see the bug gone (a sketch of this pattern follows below).
As you get new feature requests, write tests to support yourself. It's the same effect as fixing bugs.
Once you have a few tests written, run them continuously. There's no point to having tests if they're all broken and/or not being run.
True, you don't end up with 100% code coverage, but you can steadily and regularly apply more and more tests, and after a dozen bugs or so, you'll have quite a few useful tests. More importantly, you'll get more and more tests in the "worst" or most problematic areas of the system. We're doing this now with web2project: http://caseysoftware.com/blog/unit-testing-strategy-web2project.
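The bug-driven tip above might look like this in practice (a C# sketch with invented names; the specific prices and discount are illustrative only):

    using NUnit.Framework;

    [TestFixture]
    public class DiscountRegressionTests
    {
        [Test]
        public void Discount_is_not_applied_twice_for_repeat_customers()
        {
            // 1. Reproduce the reported bug as a failing test (Cart is made up).
            var cart = new Cart(isRepeatCustomer: true);
            cart.Add(100m);

            // 2. This assertion failed before the fix (the total came out as 81m);
            //    after the fix it passes and guards against the bug coming back.
            Assert.AreEqual(90m, cart.Total);
        }
    }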
In traditional (non-TDD) development, between having a functional spec and writing the first line of code there is a lot of designing involved. You would still need to go through this process, and this definitely depends on the skill level of the team. The only thing that would be different in TDD is to write the test cases just before writing the code. You obviously cannot write the tests without knowing what classes/interfaces are involved.
I have a project that I have been working on for a while, just one of those little pet projects that I would like to one day release to open source.
Now, I started the project about 12 months ago, but I was only working on it lightly; I have just started to concentrate a lot more of my time on it (almost every night).
Because it is a framework-like application, I sometimes struggle with a sense of direction, since I don't have anything driving my design decisions, and I sometimes end up making features that are hard to use or even find. I have been reading about how to do TDD and thought maybe it would help me with some of the problems I am having.
So the question is: do you think it's a good idea to start using TDD on a project that doesn't already use it?
EDIT: I have just added a bit to clarify what I mean by struggling with a "sense of direction"; it probably wasn't the best thing to say without clarification.
In my opinion, it's never too late to adopt a better practice - or to drop a worse one - so I'd say "Yes, you should start".
However ... (there's always a "but") ...
... one of the biggest gains of TDD is that it impacts your design, encouraging you to keep responsibilities separate, interactions clean, and so on.
At this point in your project, you may find it difficult to get tests written for some aspects of your framework. Don't give up, though: even if you can't test some areas, your quality will be the better for the areas you can test, and your skills will improve with the experience.
Yes.
Basically, you can't do any harm by adding TDD for any new code you write, and any changes you make to existing code. Obviously it would be tricky to go back and retro-fit accurate tests to existing code, but it certainly couldn't hurt to cover the primary use-cases.
Maybe consider having a look at Brownfield Application Development in .NET? It is full of pragmatic and practical advice for exactly this scenario (one of the definitions offered for "Brownfield" is "without proper unit tests").
Yes, absolutely a good idea to start doing TDD.
You will pay a start-up cost for at least two reasons:
Learning a new skill TDD/unit testing.
Retrofitting your code to be testable.
You'll need to do some of both, but as you work, if you find yourself struggling, think about which of those two is the source of the effort.
But the end result is worth it. From what you describe this is a project you intend to live with for quite a while. Remember that when you lose an hour here or there. In a year you'll be very happy that you made this investment both in your skill set and the code base.
At worst, you can just do TDD on new stuff while you slowly create tests for your existing code base.
Yes, it's never too late to start using TDD. I have introduced TDD to a commercial project that was already running for five years when I joined, and it was definitely a good decision.
While you are new to the technique, you should probably concentrate on using it for the code that you are writing from a clean slate - new classes, new methods etc. Once you got a hang on it, start writing tests for code that you change.
For some of the code, the latter might prove to be difficult, because the code you have written until now is unlikely to be written with testability in mind. There are some techniques to deal with that, but it's probably too early to care about them.
If you are missing a sense of direction, though, I doubt that TDD will help you a lot. You might want to look into Acceptance Testing instead, which is at least as important as unit testing, and will help you focus on the functionality of the system instead of single units of code. The TDD book by Lasse Koskela is a good introduction to both techniques.
Another technique that might help you is the Extreme Programming planning game, where you put pieces of functionality on index cards and prioritize them. I typically notice that getting ideas out of my head and in prioritized order helps me a lot in understanding where I want to go next.
As others have said, TDD shouldn't hurt a project in progress, but think carefully if you're tempted to do large-scale refactoring just to allow testing. Make sure the benefits justify the cost.
I'm a little concerned that you "struggle with a sense of direction." I don't know that TDD will help you there. I find it's a great help for low-level design decisions, but not so great for architecture decisions. Adding TDD to a directionless project sounds a bit like having a baby to save a marriage - unwise. Hopefully I misread your intention.
Yes.
TDD makes it easier for other people to understand the code, and it gives the application a better design over time.
In theory you were supposed to test first, but you didn't. In this scenario, contrary to others' opinions, I wouldn't start with new features.
Take advantage of the 80:20 rule: run a profiler and put test cases around the most frequently called pieces of code.
Put tests around the crown jewels - the core, most important code.
Put tests around the annoying, always-breaking, recurrent déjà vu buggy code.
Put tests around every bug you come across: write a failing test before fixing the bug.
Warning: putting test cases around existing code will require refactoring, which means you must change something that isn't broken.
If you still love unit tests at this point, you'd be Red, Green, Refactoring on your own.
Absolutely.
Introduce TDD to new code and if time allows, introduce "Comment Driven Design" with your existing code if it's not already tested.
Comment out the block of existing code you need to test
Write your test
Uncomment your original code one statement at a time (if you have an if block, uncomment the entire block)
Determine if your original code ultimately passes your test and if not, re-write to pass your tests accordingly
Writing tests for existing, working code that you don't plan to change doesn't fit with the thrust of TDD, which is to write tests that teach you about the system you're building.
My approach to bringing in TDD mid-stream has been to:
write tests for all new features, and
when changing a piece of code, write a test that covers the existing functionality (to make sure I understand it), then change the test before changing the code.
It can also be beneficial to write tests for code related to code you're changing - e.g., if you're altering a parent class, you may want to build tests around child classes first to protect yourself from potential damage.
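The second point is sometimes called a characterisation test. A minimal C# sketch, assuming a hypothetical LegacyMath class:

    using NUnit.Framework;

    [TestFixture]
    public class LegacyRoundingTests
    {
        // A characterisation test pins down what the code does today.
        [Test]
        public void Existing_rounding_behaviour_is_preserved()
        {
            // Assert whatever the legacy method currently returns, before touching it.
            Assert.AreEqual(2, LegacyMath.Round(2.5));
        }
    }

    // To change behaviour safely: get this green against the current code first,
    // change the assertion to the new desired value (the bar goes red), then
    // change the code until the bar is green again.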
Yes, you should. I'm currently working on a project that until recently wasn't covered with unit tests, but we decided that we should start testing our code, so we started writing them now. Unfortunately, I'm the only developer that practices TDD, others just write tests after writing their code.
Still, I found that practicing TDD helps me write better code, and I write it faster than before. Now that I learned how to do TDD, I just don't want to go back to writing code the way I used to.