13

When I have to implement a new feature or fix a bug, I usually try to recreate the situation with a test. I sometimes spend around 3 hours coming up with fixtures and writing the test. The actual feature implementation or bug fix takes less than 1 hour.
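
For concreteness, here's a minimal sketch of the kind of reproduce-the-bug-first test I mean, in pytest (the billing module, the fixture, and all the names are hypothetical, invented purely for illustration):

    import pytest

    from billing import Invoice  # hypothetical module under test

    @pytest.fixture
    def discounted_invoice():
        # Building a realistic fixture like this is where most of
        # the 3 hours tend to go.
        inv = Invoice(currency="USD")
        inv.add_line(description="widget", unit_price=19.99, quantity=3)
        inv.apply_discount(percent=10)
        return inv

    def test_total_rounds_half_up(discounted_invoice):
        # Reproduces the reported rounding bug; fails until the fix
        # lands, then guards against regressions afterwards.
        assert discounted_invoice.total() == pytest.approx(53.97, abs=0.005)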

Does anyone else out there spend at least 3 times longer writing a test than actually implementing a feature or fixing a bug? What's an acceptable ratio of time spent writing tests to time spent writing code?

Thierry Lam
  • 1,118
  • 3
  • 11
  • 17

9 Answers

20

It varies with the complexity of the bug or feature. I recall one project that once had a 1.5-week development estimate... and a 3-month testing estimate. The code change was small, a handful of lines here and there, but it touched a number of components of an insurance system in a number of ways, so it had to be tested very thoroughly. Another time there was a bug that involved a parenthesis in the wrong place. It took 2 hours to find, 2 seconds to fix, but about a week to test the dozens of scenarios that might have been affected by the change in logic.

In general, I don't worry about the ratio of time spent coding to time spent testing, because there's just no way to estimate it accurately. I find that in some projects a project-specific ratio emerges that stays fairly consistent (for that project), but even then it can change later.

Spend as much time as is needed to say with confidence that the code works properly.

rzickler
  • 129
8

How about you spend enough time writing tests until you've shown that the feature works as intended, or that the bug has been correctly fixed?

Every situation will be different; there cannot be a single ratio. Some tests will take a tenth of the time the implementation does, others will take hundreds of times as long.

7

I once did a survey after introducing unit tests in a project. The result: writing tests added about 40% on top of the time spent implementing. But we weren't aiming for full coverage there, and it was a well-established project with strong structure and conventions.

3

I'd say the time spent coding vs. the time spent writing unit tests should be approximately equal, maybe a little more sometimes. Take a look at this article on SO: Ratio of time spent on coding versus unit testing.

TomHarrigan
  • 226
  • 1
  • 3
2

One thing I have learned through difficult experience is that if your gut feeling is that you're spending too much time testing, it can be a red flag that it's time for a refactor. In other words, proper abstractions go a long way toward making testing easy. Poor engineering or a sloppy implementation makes creating mocks exceedingly difficult, and at times impossible.
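
As a hypothetical sketch of what that looks like in practice (pytest and unittest.mock assumed, all names invented): injecting the collaborator keeps the mock trivial, while a hard-wired dependency forces awkward patching.

    from unittest.mock import Mock

    # Hard to test: the mailer is hard-wired inside the function,
    # so a test has to patch module internals:
    #
    #   def send_report():
    #       mailer = SmtpClient("mail.internal")
    #       mailer.send(build_report())

    # Easy to test: the collaborator is injected.
    def send_report(mailer, build_report):
        mailer.send(build_report())

    def test_send_report_uses_mailer():
        mailer = Mock()
        send_report(mailer, build_report=lambda: "report body")
        mailer.send.assert_called_once_with("report body")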

I would estimate that I spend up to a third of my code-writing time on tests, but as was pointed out by @FrustratedWithFormsDesigner, this will vary from case to case and is directly proportional to the scope of systems or features affected by the code under test.

Writing tests is an excellent rubber-duck debugging technique that has precipitated a number of light-bulb moments for me.

1

Are you counting right? To do an accurate accounting of how much time you spend on tests, you would need to know how long it takes to write the code without the test.

If it really took you three hours to write the test and one to write the code that makes it pass, you may find that fixing the same bug without writing tests takes 5+ hours.

Yes, I very often spend much more time on the test than on the actual fix.

0

Does anyone else out there spend at least 3 times longer writing a test than actually implementing a feature or fixing a bug?

I'm picking up an undercurrent that you resent (or, at least, disapprove of) the time it takes to write tests.

Yes, it takes time to write tests, and this does add to the overall initial development time for a feature, but remember what tests are for ...

  • Debugging can prove that the code is working correctly today.
  • Tests prove that the code continues to work correctly over time.

Writing a test might be a one-off activity, but running that test, to confidently demonstrate that a subsequent change hasn't messed things up, will happen over and over (and over) again ...

Phill W.
  • 13,093
0

You should compare the time spent writing code + convincing yourself that its quality is fine against the time spent writing code + writing tests + being confident that the quality is fine. Most of the time you spend writing tests you save back when you check the quality. And if you fix a bug in this code next week, you save time again, because most of your tests can simply be reused rather than rewritten.

If you are under pressure not to “waste time” writing tests, you can save some time. Some people insist that one test should only test one thing, to make it easier to pinpoint problems. But given the choice between writing ten tests testing ten things (and getting told off by some numpty for wasting time), writing one test testing ten things, and writing no tests at all, the single test is your best choice.
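
In pytest, for example, parametrization gives you a decent middle ground: one test function covering ten cases, with each case still reported individually when it fails (a hypothetical sketch; slugify is a toy stand-in for the code under test):

    import pytest

    def slugify(text):
        # Toy stand-in for the code under test.
        return text.strip().lower().replace(" ", "-")

    # One test function, many cases: cheap to write, but each
    # case still fails and gets reported on its own.
    @pytest.mark.parametrize("raw, expected", [
        ("Hello World", "hello-world"),
        ("  padded  ", "padded"),
        ("already-slugged", "already-slugged"),
        # ... seven more cases ...
    ])
    def test_slugify(raw, expected):
        assert slugify(raw) == expected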

gnasher729
  • 49,096
0

I usually spend between 8 and 16 hours implementing a feature together with its accompanying test suite. I know this because I stop working when the job is done, not because I monitor my activity; monitoring is exhausting and, at the end of the day, useless.

In TDD (test-driven development) the test suite is the equivalent of technical documentation, and development goes faster thanks to it.
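
A small illustration of that idea (feature and names invented for this example): with descriptive test names, the suite reads like a specification.

    class TestPasswordReset:
        # Each test name states one rule of the feature, so the
        # class reads like a spec even before you open the bodies.
        def test_reset_link_expires_after_one_hour(self):
            ...

        def test_used_link_cannot_be_reused(self):
            ...

        def test_unknown_email_still_gets_generic_confirmation(self):
            ...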

If the time spent writing this documentation worries you, stop monitoring it; that will relax you and help you have fun while developing software.