Are we harming ourselves by "saving time" instead of building quality into code up front?
One of the concerns about Test Driven Development (TDD) is its impact on short-term velocity: writing a function plus its tests takes more time up front than writing the function alone. The trade-off is quality. But how much velocity do you lose, and how much quality do you gain?
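To make the trade-off concrete, here is a minimal sketch of what "a function plus its tests" means in practice (a hypothetical example, not code from the study): the tests state the expected behavior, and the implementation is only as much code as it takes to satisfy them.

```python
# Hypothetical TDD example: the tests below were written first,
# then just enough implementation to make them pass.

def clamp(value, low, high):
    """Constrain value to the inclusive range [low, high]."""
    return max(low, min(value, high))

# The tests double as a specification of the function's behavior.
assert clamp(5, 0, 10) == 5    # in range: unchanged
assert clamp(-3, 0, 10) == 0   # below range: clamped to low
assert clamp(42, 0, 10) == 10  # above range: clamped to high
```

The extra typing is the up-front cost; the payoff is that a regression in `clamp` is caught the moment it is introduced, not weeks later in a bug report.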
Well, here are some numbers from the study "Realizing quality improvement through test driven development: results and experiences of four industrial teams":
The results of the case studies indicate that the pre-release defect density of the four products decreased between 40% and 90% relative to similar projects that did not use the TDD practice. Subjectively, the teams experienced a 15–35% increase in initial development time after adopting TDD.
Now the question becomes "how much time do we save in the long run in exchange for the extra time we spend doing TDD?" Numerous studies have looked at this, showing productivity gains over the life of the code ranging from 0% to 65%. I believe, and have seen, that for games the benefits are much higher, because game development productivity is more sensitive to defects: we use the game to build the game. Tools are integrated with the game to allow the rapid iteration necessary to get things right.
For example, consider Unreal Engine 3. In UE3, artists, designers, and programmers use the tools embedded in the engine (UnrealEd) to add behaviors and assets and to tune mechanics. If build stability is low, with the editor crashing all the time, the impact falls not only on the programmers but on most of the other disciplines as well.
Think about a team working on a mass-market game. There can be 100 developers on it. How many are programmers? About 25? Even if TDD didn't speed up code development, would a 40% to 90% decrease in the defect rate still improve the velocity of the entire project?
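A back-of-the-envelope calculation suggests why the answer can be yes. The team size, programmer count, and the study's ranges come from the text above; the fraction of time lost to broken builds is an assumption I am inventing purely for illustration.

```python
# Illustrative arithmetic only; the 10% "time lost to defects"
# figure is an assumption, not data from the study.
team_size = 100
programmers = 25

tdd_overhead = 0.25          # midpoint of the 15-35% initial-time increase
defect_reduction = 0.65      # midpoint of the 40-90% defect-density decrease
time_lost_to_defects = 0.10  # assumed fraction of everyone's time wasted on bad builds

# Cost: extra up-front development time, paid only by the programmers.
cost = programmers * tdd_overhead  # 25 * 0.25 = 6.25 person-units

# Benefit: defect-related waste avoided, spread across the whole team,
# because broken builds stall artists and designers too.
benefit = team_size * time_lost_to_defects * defect_reduction  # 100 * 0.10 * 0.65 = 6.5

print(f"cost={cost:.2f}, benefit={benefit:.2f}")
```

Even with these deliberately modest assumptions the team-wide benefit already covers the programmers' up-front cost, and every extra point of build downtime tips the balance further toward TDD.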