Test-driven development is a relatively new development technique. It originated as part of the Extreme Programming process, but can be applied successfully in other processes such as RUP, Scrum and so on. Test-driven development, or TDD, is an incremental practice that emphasizes writing unit tests first and then refactoring. The tests drive the design and the implementation, so TDD is not really a method for writing unit tests: it is a method for evolutionary design and development. The life-cycle of Test-driven development is:
- Pick a new test and write it; quite possibly it will not even compile. The Eclipse platform, on top of which our NetWeaver Developer Studio (NWDS) is built, helps here: press Ctrl-1 and the studio offers to create the missing class or method for you. Once the test compiles, run it and watch it fail. This is the red bar, i.e. the red part of the title of this blog.
- Now write just enough code to make the test pass; this is JUnit's green bar, or the green word in the title above. Keep the code as simple as possible.
- This is the refactoring part: remove any code duplication or other code smell that has crept in. The goal is to keep the code clean, which also means reducing coupling and increasing cohesion, in line with the Single Responsibility Principle.
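One red-green-refactor cycle might look like the following sketch. The Money example is borrowed from Kent Beck's book and is purely illustrative; in NWDS you would write the test as a JUnit test case, but plain Java asserts are used here so the snippet runs on its own.

```java
// Hypothetical example of one red-green-refactor cycle.
// In NWDS this check would live in a JUnit test method; a main()
// with a plain assertion is used so the file is self-contained.

class Money {
    private final int amount;

    Money(int amount) { this.amount = amount; }

    // "Green" step: just enough code to make the failing test pass.
    Money times(int multiplier) {
        return new Money(amount * multiplier);
    }

    @Override
    public boolean equals(Object other) {
        return other instanceof Money && ((Money) other).amount == amount;
    }

    @Override
    public int hashCode() { return amount; }
}

public class MoneyTest {
    public static void main(String[] args) {
        // "Red" step: this code did not even compile until Ctrl-1
        // generated the Money class and a times() stub that failed.
        Money five = new Money(5);
        if (!five.times(2).equals(new Money(10))) {
            throw new AssertionError("5 * 2 should equal 10");
        }
        System.out.println("green bar");
    }
}
```

The refactoring step would then clean up whatever duplication the quick "green" implementation introduced, with the test guarding behavior.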
Although Test-driven development doesn’t rely on big up-front design, that doesn’t mean there is no design at all. You need to think about the design before you start creating tests and code, but you shouldn’t go into much detail. In fact TDD provides a technique for evolutionary design: the design process continues during the implementation. Another aspect of Test-driven development is that the developer works in short iterations. Since we write source code only to satisfy a failing test, the test coverage is necessarily very extensive, i.e. our code is covered by tests better than with any other development style we use. This has a psychological effect – programmers are more confident about their code, there is less stress and fewer bug reports, and as a consequence we enjoy programming itself more. Testing after the code is completed can be a little boring; with TDD it no longer is. According to Kent Beck, who popularized Test-driven development, there are two core rules in TDD: you write code only if a test has failed, and you remove any duplication and code smells you find. Again according to him, this leads to the following behavior:
- You design organically, with the running code providing feedback between decisions.
- You write your own tests because you can’t wait 20 times per day for someone else to write them for you.
- Your development environment must provide rapid response to small changes (e.g. you need a fast compiler and regression test suite).
- Your designs must consist of highly cohesive, loosely coupled components (e.g. your design is highly normalized) to make testing easier (this also makes evolution and maintenance of your system easier too).
These statements are cited from Kent Beck’s book “Test Driven Development: By Example”. Fortunately our NetWeaver Developer Studio supports development in a test-driven manner; we can also use “Mock Objects” when we create highly cohesive and loosely coupled components. I will dedicate one of my future blogs to “Mock Objects”.
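To give a taste before that future blog: a mock object stands in for a collaborator so a class can be tested in isolation. The sketch below hand-rolls a mock for a hypothetical MailServer interface (all names here are invented for illustration); mock libraries automate this pattern.

```java
// A minimal hand-rolled mock object, assuming a hypothetical
// MailServer interface. Nothing here is NWDS-specific.

interface MailServer {
    void send(String recipient, String body);
}

// The production class depends only on the interface (loose coupling),
// which is exactly what makes it testable in isolation.
class OrderService {
    private final MailServer mail;

    OrderService(MailServer mail) { this.mail = mail; }

    void confirm(String customer) {
        mail.send(customer, "Your order is confirmed");
    }
}

// The mock records the interaction so the test can verify it
// without a real mail server.
class MockMailServer implements MailServer {
    String lastRecipient;

    public void send(String recipient, String body) {
        lastRecipient = recipient;
    }
}

public class OrderServiceTest {
    public static void main(String[] args) {
        MockMailServer mock = new MockMailServer();
        new OrderService(mock).confirm("alice@example.com");
        if (!"alice@example.com".equals(mock.lastRecipient)) {
            throw new AssertionError("mail was not sent to the customer");
        }
        System.out.println("green bar");
    }
}
```

Because OrderService only sees the interface, the test can swap in the mock while production code supplies the real server.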
Here is a short list of ways Test-driven development can improve the design of software:
- Better encapsulation – by writing the test code first, you concentrate on one thing at a time, thus creating smaller and better-defined interfaces.
- Better reusability – since every piece of code is tested in isolation, it has at least two clients: the unit tests and the real client code. This makes it easy to add a third client.
- Design patterns – good test coverage lets us refactor more aggressively, and as you refactor you can discover patterns.
- Fewer and better dependencies between the entities (classes) – since we test the code in isolation, better relationships between the classes come naturally.
Two months ago I was given the task of creating functionality to execute mappings between the parameters of two functions, somewhat similar to the mappings between Web Dynpro contexts. I was given very good requirements and decided to work in a test-driven manner. First I thought about the algorithm I would use, then I set up the test environment and created XML parsers for the test data and metadata. I used JUnit as the testing framework, and in one place I needed to create a mock object to provide better encapsulation. I started by creating unit tests for the simplest use cases, and in the end I had about 38 test cases for about 800-900 lines of code, most of it fairly complex. During the implementation I noticed similarities in certain domain objects, so I extracted interfaces and thus reduced duplication, refactoring aggressively toward the goal of simple and clean code. I tried to follow the “red-green-refactor” mantra, but it is still hard for me since I am still on the learning curve. From time to time my implementation phase was maybe longer than needed, but that was the pace I felt comfortable with.
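The blog doesn't show the actual mapping code, so purely as a hedged sketch, one of those "simplest use case" tests might have looked like this, with every class and rule name invented for illustration:

```java
// Hypothetical sketch of a simple parameter-mapping test case.
// The real NWDS implementation is not shown in the blog; the
// ParameterMapper class and its API are assumptions.

import java.util.HashMap;
import java.util.Map;

// Copies values from named source parameters to named target parameters.
class ParameterMapper {
    private final Map<String, String> rules = new HashMap<String, String>();

    void addRule(String sourceParam, String targetParam) {
        rules.put(sourceParam, targetParam);
    }

    Map<String, Object> map(Map<String, Object> source) {
        Map<String, Object> target = new HashMap<String, Object>();
        for (Map.Entry<String, String> rule : rules.entrySet()) {
            target.put(rule.getValue(), source.get(rule.getKey()));
        }
        return target;
    }
}

public class ParameterMapperTest {
    public static void main(String[] args) {
        // Simplest use case: one rule, one value.
        ParameterMapper mapper = new ParameterMapper();
        mapper.addRule("input", "output");

        Map<String, Object> source = new HashMap<String, Object>();
        source.put("input", Integer.valueOf(42));

        Object mapped = mapper.map(source).get("output");
        if (!Integer.valueOf(42).equals(mapped)) {
            throw new AssertionError("value was not mapped");
        }
        System.out.println("green bar");
    }
}
```

Starting from a trivial case like this and growing toward the 38 real test cases is exactly the incremental rhythm the cycle above describes.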
- [C2 Wiki] You Arent Gonna Need It
- [sdn.sap.com] When should we refactor?
- [wikipedia.org] Single responsibility principle
- [connextra.com] Endo-Testing: Unit Testing with Mock Objects
- [industriallogic.com] Refactoring To Patterns Catalog
- Game development with TDD