33

In a company I used to work for, executives insisted that unit-test code coverage be 99% or more. This resulted in us writing more test code than production code. It literally took us 3 days to write tests for a single class that took a day to implement.

As a result, however, I learnt a lot about TDD, testing tools, practices, etc.

In the company I worked for afterwards, unit testing was an unknown thing; it was something someone might vaguely have heard of before. I struggled to introduce them to the concept of unit testing, but to no effect.

Now, as a self-employed developer, I wonder: how much time is it really necessary to spend on unit testing? Being mostly an iPhone/Android developer, which parts of the code should be covered by tests?

Maggie

6 Answers

20

The amount of unit testing that is needed depends on several factors:

  • Product Size (The larger the project, the greater the need to include at least some unit testing)
  • Required Quality Level (If you are quickly putting software together that needs to ship as quickly as possible and some minor bugs are acceptable, then you might be forced to skip some testing, such as unit testing)
  • Type of Product (UIs can be unit tested, but it is sometimes easier to skip unit testing on GUI-heavy sections of a project and test them manually instead)
  • Your Coding Ability/History (What type of bugs do you normally create? Are they things that unit testing normally catches, or things that another type of testing normally finds? Knowing this might push you to do more or less unit testing)
jzd
12

Unit tests pay off at maintenance time. If you plan to have a long-lived application, you will spend more time maintaining it than you think you will now (if you have not yet experienced this, you will be surprised how long a successful project can live).

What you want is for your tests to break if you accidentally change functionality, so that you find these things as quickly as possible. Customers strongly dislike it when functionality changes unexpectedly.
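
For instance, a small regression test like the one below (the PriceCalculator class and its discount rule are invented purely for illustration; the test uses JUnit 4) breaks the moment the rule is changed by accident:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class PriceCalculatorTest {

        // Minimal class under test (hypothetical): 10% discount at or above 100.00.
        static class PriceCalculator {
            double finalPrice(double amount) {
                return amount >= 100.00 ? amount * 0.9 : amount;
            }
        }

        @Test
        public void discountIsAppliedAtThreshold() {
            assertEquals(90.00, new PriceCalculator().finalPrice(100.00), 0.001);
        }

        @Test
        public void noDiscountBelowThreshold() {
            // If someone later "tweaks" the discount rule by accident,
            // one of these assertions fails immediately instead of a
            // customer noticing the price change in production.
            assertEquals(99.00, new PriceCalculator().finalPrice(99.00), 0.001);
        }
    }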

12

In our product group we target 50-70% code coverage from unit tests and 90%+ coverage from unit tests and test automation combined. The typical time budgeted for writing unit tests is about 1 day for every feature that takes 3-4 days of heads-down coding. But that can vary with a lot of factors.

99% code coverage is great. Unit tests are great. But 99% code coverage from unit testing alone? I find that hard to believe.

As for the case where you spent 3 days writing tests for a class that otherwise took 1 day to implement: you didn't elaborate on why it took that long or share any code, so I can only speculate. My guess is that you weren't really writing true unit tests for your class, but were actually writing test automation. And there's nothing wrong with that - as long as you recognize the difference between the two types of tests.

But you said the three days of test writing was only for a single class. Perhaps the class itself was not designed for unit testing. Does the class implement UI? Networking? File I/O? If so, you might have ended up writing more code to test the Java runtime than your business logic that interacts with the runtime.

TDD gets you thinking in terms of interfaces and interfaces to dependencies. That single class that implements UI, networking, and file I/O for a single feature might be better served split into multiple classes - one for networking, one for file I/O, and the UI broken into a model-view-controller design. Then you can implement appropriate tests for each, with simple mock objects standing in for the dependencies. Of course, all of this takes more time. So rather than 1 day to code and 3 days to write tests, this type of design may require 3 days of coding and 1 day of writing tests. But the code will be far more maintainable and reusable.
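
As a rough sketch of that kind of split (every name here is made up, and a hand-rolled fake stands in for the real networking layer; JUnit 4 again):

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class FeedViewModelTest {

        // The dependency lives behind an interface, so no real HTTP client
        // ever runs inside the unit test.
        interface FeedService {
            List<String> fetchTitles();
        }

        // Business logic under test: formats whatever the service returns.
        static class FeedViewModel {
            private final FeedService service;

            FeedViewModel(FeedService service) {
                this.service = service;
            }

            String headline() {
                List<String> titles = service.fetchTitles();
                return titles.isEmpty() ? "No news" : titles.get(0).toUpperCase();
            }
        }

        @Test
        public void usesFirstTitleAsHeadline() {
            // Hand-rolled fake instead of the real networking layer.
            FeedService fake = () -> Arrays.asList("hello", "world");
            assertEquals("HELLO", new FeedViewModel(fake).headline());
        }

        @Test
        public void fallsBackWhenFeedIsEmpty() {
            FeedService fake = Collections::emptyList;
            assertEquals("No news", new FeedViewModel(fake).headline());
        }
    }

The same trick works for file I/O: hide it behind a small interface and hand the test a fake or a mock.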

selbie
4

If you're doing TDD, you'll be writing the tests at the same time as the code, switching between them every few minutes (or less). There won't be any distinct time spent on tests. Using TDD makes it much easier to know that you have solid test coverage.
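
A first cycle might look roughly like this (the RomanNumeral example is invented; JUnit 4): the test comes first and fails, then just enough code is written to make it pass, and the next test drives the next bit of behaviour.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class RomanNumeralTest {

        // Step 1: this test is written first and fails ("red").
        @Test
        public void convertsOneToI() {
            assertEquals("I", RomanNumeral.from(1));
        }

        // Step 2: the simplest code that makes it pass ("green");
        // later tests force the implementation to grow.
        static class RomanNumeral {
            static String from(int value) {
                return "I";
            }
        }
    }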

If you are unit testing after the fact, you need to write the tests that will tell you if the code is broken due to changes. I wouldn't rely on coverage metrics here, but would go by use cases and the parameters of public interfaces. This will ultimately be based on your good taste and experience.
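
For example, for a small public API the tests can simply walk the use cases and parameter boundaries rather than chase a coverage number (the validator below is hypothetical):

    import org.junit.Test;
    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    public class UsernameValidatorTest {

        // Hypothetical public API: usernames must be 3 to 15 characters.
        static class UsernameValidator {
            static boolean isValid(String name) {
                return name != null && name.length() >= 3 && name.length() <= 15;
            }
        }

        // Cases picked from the use cases and parameter boundaries,
        // not from a coverage report.
        @Test public void rejectsNull()       { assertFalse(UsernameValidator.isValid(null)); }
        @Test public void rejectsTooShort()   { assertFalse(UsernameValidator.isValid("ab")); }
        @Test public void acceptsLowerBound() { assertTrue(UsernameValidator.isValid("abc")); }
        @Test public void acceptsUpperBound() { assertTrue(UsernameValidator.isValid("abcdefghijklmno")); }
        @Test public void rejectsTooLong()    { assertFalse(UsernameValidator.isValid("abcdefghijklmnop")); }
    }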

3

If you do not spend time on tests, you will spend even more time debugging live code.
So spend as much time as is needed on tests to cover all (or 99%) of the code.

OZ_
2

As others already noted, it depends largely on the type of software. The 3:1 test/development time ratio you mention may be a bit too much for average projects, but may be perfectly OK for mission-critical apps and may even be too little for a life-critical system.

99+% unit test coverage is similarly maybe too much to expect for an average app, but too little for a life-critical project.

In my experience, considering that a significant part of production code is error-handling code, a coverage of 80-90% would be sufficient for most apps, and this could require roughly the same amount of time spent writing unit tests as production code. (Then again, if one is earnestly working in TDD fashion, the two are completely intertwined to practically become one single task, so one can only guesstimate the actual ratio.)