The objective of this survey and analysis is to gather information on the actual state of the practice in software testing today. The questions originate from software development team assessments I have executed over the years. A process assessment is an observation and questioning of how and what you and your team do.
In such a long survey, I wanted to keep process questions to a minimum, seek free-form comments and opinions on perceptions of process, and save more survey time for assessing actual practices, strategy, and execution rather than an individual's plan for a process. The following section of questions deals directly with the development process itself. Results from this section have always been informative for managers, as they deal directly with how things ought to be and with perceptions of the process itself; in most cases there is a discrepancy between reality and the best-laid plans! Let's take a look at the Process portion of the survey.
P-1. How would you describe the effort to release your team’s software product or application?
| Response | Percent |
| --- | --- |
| The process is occasionally difficult, but we reach fair compromises to fix problems | 43.40% |
| The projects are smooth with occasional process or product problems | 28.90% |
| It's a repeatable, under-control process | 19.70% |
| It's difficult; problems every time | 6.60% |
| It's a huge, stressful effort | 1.30% |
This is an encouraging response! The overwhelming majority, 92%, say their development process is either under control, smooth, or only occasionally difficult. Only eight percent describe the process as difficult or stressful.
I am surprised at the very positive response teams have for their development process. This is a very common area for teams to complain and express frustration. Current conventional wisdom has teams frustrated with traditional processes, and more than half of the survey respondents self-describe as using a development process other than Agile.
P-2. Do you think your SDLC processes are followed effectively?
Now we see the big disconnect with the first question. This response is a virtual split: just over half think their SDLCs are followed effectively, and the rest do not.
From my experience, what I find in development today is that there are "process docs" detailing how teams should effectively make software, yet these are commonly written outside of development, and many times by consultants. Teams regularly disregard these docs and are more comfortable with their own culture or practice, which is either expressed or implied in tribal knowledge.
P-3. Have people been trained in the process?
These are encouraging results, matching what I expected for internal training. If a team has a process in place, it would be easy for the department to create either a PowerPoint slideshow or a Flash video of the process. Such a tool would make training easy for all employees. The opposite would be a team that does not have a process and works based on either tribal knowledge or a "whatever works" ethic, a problematic approach for long-term goals. Making your SDLC a training standard is also a great opportunity to question why the team does certain practices and to sharpen the process or connect reality to the "shoulds."
P-4. During the testing phase of a project, what percentage of time do you spend executing tests on an average week? Example: 10 hours testing in a 40-hour week = 25%
| Response | Percent |
| --- | --- |
| 74% – 51% | 23.40% |
| Less than 25% of my time is spent on executing test cases | 20.80% |
| 49% – 26% | 18.20% |
| More than 75% of my time is spent on executing test cases | 10.40% |
If there is one big, dark, hidden secret in testing I have chosen as my cause, it is to expose the amount of time testers actually spend testing, as opposed to documenting, sitting in meetings, building systems and data, planning, and maintenance: all those things testers need to do as well. The perception of most managers is that testers spend the overwhelming majority of their time executing tests. This is not the case.
Ten percent of respondents say they spend 75% of their time or more testing. Put another way, only 10% of respondents are testing at least 30 hours in a 40-hour week, with 10 hours a week spent on other efforts.
Just over 20% put themselves in the lowest category, admitting to less than 25% of their time spent testing. This means 10 hours/week or less is spent testing, while 30 hours/week during the test phase go to other initiatives.
Two-thirds of respondents, 66.3%, spend half their time or less (20 hours/week or less) during a testing phase actually testing. This is common, and it always surprises managers.
I do not conclude any fault lies in this. I know what test teams have to do to get a good testing job planned, executed and documented. Yet, in some situations, this is an indication of a problem. Perhaps the test team is forced into too much documentation, must attend too many meetings, or gets called into supporting customers and cannot make up lost testing time.
What is most important here is that everyone concerned (managers and directors, project managers, leads, anyone estimating project size) knows that most testers spend half or less of their time testing.
P-5. How much time do you spend documenting (creating and maintaining) test cases during an average product cycle?
| Response | Percent |
| --- | --- |
| Less than 25% of my time is spent on documenting test cases | 51.30% |
| 49% – 26% | 27.60% |
| 74% – 51% | 5.30% |
| More than 75% of my time is spent documenting test cases | 2.60% |
| We do not document our test cases | 1.30% |
There are many useful pieces of information to glean from this. Few groups spend too much time documenting. This is a great improvement from just a few years ago, when many teams were under tremendous pressure to document every test case. Aside from being completely useless, that pressure led to missed bugs: teams spending so much time documenting were not testing enough to catch them.
Some teams were collapsing under the stress of regulatory pressure to prove requirements traceability to auditors while relying on naïve test case design methods or using MS Word for test cases.
The last few years have seen an explosion of better test case management tools; better test design methods, like action-based testing; and Agile methods, where lean manufacturing ideas have teams agreeing that less is more when it comes to test case documentation.
Very few teams reported not documenting their tests. This is a significant improvement over just a few years ago. During the dot-com boom, when a web application might get a rapid-fire test, there was no time for documenting any tests, no regression testing, and no repeatability. In the next release, you were expected to start from scratch. All business intelligence was left undocumented, kept with a few individuals. Hopefully, those days are gone.
Still, almost 20% of the groups report spending 50% or more of their time during a project documenting. That is pretty high. There has to be an excellent reason those groups are documenting so much; otherwise, this is a problem. If you are managing that time, you must ask yourself: do these testers have enough time to test? Is it a test project or a documentation project?
P-6. If your group receives requirement documents prior to the testing planning and test case design process, how would you characterize the adequacy and accuracy of these documents?
It is a positive result that only a very few respondents find their requirements useless. It is also encouraging that almost half of the respondents find their requirements very useful! This is typically another area where test teams complain about the quality of the requirements docs.
An assumption from these results is that requirements docs may be getting better. Perhaps a balance has developed in many teams as to how much information business analysts or marketing teams need to give both developers and test teams to do their jobs effectively. That, or test teams have stopped complaining about requirements and make do in other ways.
P-7. What is your view on the quality of the code that is handed to you at the start of testing?
| Response | Percent |
| --- | --- |
| Usually stable/testable, no idea about unit testing, no accompanying documentation of the build | 45.90% |
| Stable, unit tested | 21.60% |
| Stable with build notes | 13.50% |
| Often unstable with accompanying documentation of known problems | 6.80% |
| Often unstable, little/no information on unexpected changes | 6.80% |
| Very stable, unit tested, with build notes explaining bug fixes and changes | 5.40% |
To highlight: 40% of respondents appraise their builds as stable; 46% of respondents appraise their builds as usually stable; and 13% found the quality of code often unstable.
This all seems pretty good. There is one area that is particularly troubling in all this data.
Almost half of all the respondents do not get information from development. Testers have no idea about unit testing and no information about what changed in the build. There is no viable reason for this and it hurts product quality.
Agile development, with TDD and daily scrums, is meant to prevent this problematic lack of information. The Continuous Integration practice, which includes automatically re-running the unit tests plus a smoke or build acceptance test, is very effective at speeding up development and delivery.
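To illustrate the build acceptance idea, here is a minimal sketch of a smoke gate a CI job could run after the unit tests. This is a hypothetical example, not any particular tool's API; the function, the check names, and the checks themselves are mine:

```python
# Illustrative sketch: a build-acceptance "smoke" gate run after unit tests.
# Each check is a small callable returning True/False; the build is rejected
# if any check fails, so testers never receive an untestable build.

def run_smoke_gate(checks):
    """Run each (name, check) pair; return (passed, list_of_failed_names)."""
    failures = []
    for name, check in checks:
        try:
            if not check():
                failures.append(name)
        except Exception:
            # A crashing check counts as a failure, not a pipeline error.
            failures.append(name)
    return (len(failures) == 0, failures)

# Hypothetical checks a web team might wire in:
checks = [
    ("app starts", lambda: True),        # e.g. the process launched OK
    ("login page loads", lambda: True),  # e.g. HTTP 200 from /login
]
passed, failures = run_smoke_gate(checks)
```

The point of the sketch is the gate itself: if `passed` is false, the build never reaches the test team, and `failures` gives development an immediate, specific list of what broke.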
The following statements are hand-written responses.
P-8. Please fill-in the blank: My biggest problem with our development process today is:
- “Not all out BA’s are trained in writing effective clear requirements.”
- “Lack of Unit testing. Unit Testing is NOT automated, very few developers write test harnesses. It is all manual Unit Testing.”
- “We have no clue what they are doing.”
- “We test our changes, but do not test the overall product. Regression testing is our biggest problem.”
- “It’s a black hole!”
- “On most projects, there is a lack of collaboration and cooperation between test and development teams (these by and large are not Agile projects, of course!).”
- “No technical documentation of what had been build.”
- “They are rude with testing team.”
- “We need earlier involvement.”
- “They don’t understand the business or the users well enough.”
- “Bad communication.”
- “Bad estimation.”
- “No timely escalation of probable risks on quality delivery.”
- “Too many processes are followed by rote.”
- “Bad scope and requirements management”
- “They are laying off QA staff and I’m not sure how they are going to adequately test the product.”
- “Lots of documentation required that does not increase the quality of the product.”
P-9. If you could do anything to make your projects run smoother, what would that be?
- “Better communication.”
- “More communication with remote team.”
- “More testing by development.”
- “Unit testing to be be mandatory & unit test report should be treated as a exit criteria to start the Testing.”
- “Send bad developers to training.”
- “Spend more time automating regression test cases.”
- “Automate our testing.”
- “More time allowed for test case preparation / documentation.”
- “Re-Focus on planning and requirements gathering. Also we could stand to enforce creation of unit tests by developers. We rely too heavily on QA Automation to catch everything”
- “Get buy-in from key people up-front and work to expose icebergs (blockers to success) as early as possible.”
- “Policy in handling customer requests on change requests. Project management and Sales Team have to follow process so as not to over commit on deliveries.”
- “Plan to reduce last-minute changes.”
- “Lighten the documentation.”
- “Stronger project managers that can lead.”
- “Better project management, better enforcement of standards for SW development, CM and Testing.”
- “Integrate ALM tools.”
P-10. If you have a lessons learned, success or failure story about your team’s development processes that is interesting or might be helpful to others, please write it below:
- “We have done a good job on creating a repeatable build process. We release once a week to Production. Where we fail is in integration and regression testing.”
- “The processes are well-defined. The team’s commitment and unity of the team are critical to the success of the project.”
- “Don’t develop in a vacuum. The less exposure a team has to the business, user needs, how software supports tasks, etc., the less likelihood of success. Get integrated – get informed – get exposed! At a previous company, I used to drag my developers out to customer sites with me and while they dreaded facing customers, they were ALWAYS extraordinarily energized at the end having learned so much and feeling much more connected and *responsible* for satisfying customers. This tactic was ALWAYS valuable for our team.”
- “Maintain high morale on the team. Motivate them to learn and develop technical and soft skills.”