Developer Testing in the IDE: Patterns, Beliefs, and Behavior
Reviewed by Greg Wilson / 2021-09-12
Keywords: Test-Driven Development, Testing
Back in 2016 we reviewed Beller2015, which looked at when, how, and why developers (don't) test in their IDEs. Beller2019, published four years later, is a deeper and more detailed look at how much testing programmers actually do, and it isn't a pretty picture. After studying almost 2,500 software engineers using Java and C# in four different IDEs for two and a half years, the authors found that (among other things):
- half of developers do not test at all in their IDEs;
- most programming sessions end without any test execution;
- 12% of tests show flaky behavior;
- test-driven development (TDD) is not widely practiced; and
- software developers spend only a quarter of their time engineering tests, though they believe they spend half.
That last point is the most important one for me. Telling people to do more testing is pointless if they think they're doing enough already, and gauging the impact of testing based on self-reports is going to give the wrong answer if those reports are systematically skewed. This paper doesn't try to prove that more testing will lead to better software, but it is a valuable baseline for assessing any such claims.
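The flakiness finding deserves a moment's attention too, since 12% is a lot. A flaky test passes or fails nondeterministically without any change to the code under test; a common cause is dependence on real wall-clock time, and the usual fix is to inject a controllable clock. Here is a minimal sketch of both, using hypothetical `Cache` and `FakeClock` classes for illustration (they are not from the paper):

```python
import time

class Cache:
    """Toy cache whose entries expire after ttl seconds."""
    def __init__(self, ttl, clock=time.monotonic):
        self.ttl = ttl
        self.clock = clock          # injectable time source
        self.store = {}

    def put(self, key, value):
        self.store[key] = (value, self.clock())

    def get(self, key):
        value, stored_at = self.store[key]
        if self.clock() - stored_at > self.ttl:
            raise KeyError(key)     # entry has expired
        return value

# A flaky version of an expiry test would sleep and hope the scheduler
# is punctual:
#     cache.put("k", 1); time.sleep(0.011); cache.get("k")  # sometimes expired, sometimes not

class FakeClock:
    """Deterministic clock: the test controls time explicitly."""
    def __init__(self):
        self.now = 0.0

    def advance(self, seconds):
        self.now += seconds

    def __call__(self):
        return self.now

def test_expiry_deterministic():
    clock = FakeClock()
    cache = Cache(ttl=10, clock=clock)
    cache.put("k", 1)
    assert cache.get("k") == 1      # fresh entry is readable
    clock.advance(11)               # jump past the ttl, no sleeping
    try:
        cache.get("k")
        assert False, "expected expiry"
    except KeyError:
        pass                        # expired as intended
```

The deterministic version never sleeps, so it runs fast and gives the same result on every machine, which is exactly the property flaky tests lack.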
Beller2019 Moritz Beller, Georgios Gousios, Annibale Panichella, Sebastian Proksch, Sven Amann, and Andy Zaidman: "Developer Testing in the IDE: Patterns, Beliefs, and Behavior". IEEE Transactions on Software Engineering, 45(3), 2019, doi:10.1109/TSE.2017.2776152.
Software testing is one of the key activities to achieve software quality in practice. Despite its importance, however, we have a remarkable lack of knowledge on how developers test in real-world projects. In this paper, we report on a large-scale field study with 2,443 software engineers whose development activities we closely monitored over 2.5 years in four integrated development environments (IDEs). Our findings, which largely generalized across the studied IDEs and programming languages Java and C#, question several commonly shared assumptions and beliefs about developer testing: half of the developers in our study do not test; developers rarely run their tests in the IDE; most programming sessions end without any test execution; only once they start testing, do they do it extensively; a quarter of test cases is responsible for three quarters of all test failures; 12 percent of tests show flaky behavior; Test-Driven Development (TDD) is not widely practiced; and software developers only spend a quarter of their time engineering tests, whereas they think they test half of their time. We summarize these practices of loosely guiding one's development efforts with the help of testing in an initial summary on Test-Guided Development (TGD), a behavior we argue to be closer to the development reality of most developers than TDD.