For 2017, LogiGear is conducting a 4-part survey to assess the state of the software testing practice as it stands today.
This is not the first survey we’ve launched; LogiGear conducted one very large state-of-the-practice testing survey in 2008, with over 100 questions. Nearly a decade later, we are launching another. But unlike 2008, we’re doing something different this time: instead of one 100-question survey, we are running 4 much shorter surveys, each with closer to 10 questions on a smaller range of topics.

This is a 4-part series to mirror LogiGear Magazine’s issues this year.

1.) Test Essentials (Back to Basics): topics include test plans, test cases, and attitudes about quality and testing.

2.) DevOps: topics include Continuous Delivery, training, tools, the development lifecycle, and IT/Ops integration.

3.) Automation: topics include tools, attitudes, skills, and use.

4.) Staffing, Outsourcing, Training, Leading and Managing: topics include training, staffing, management, outsourcing, and job satisfaction.

Survey 1, announced in January, is now complete. Below are the results and my comments on them. Survey 2, on DevOps, is now open and collecting results. It is unique in that there are 2 separate surveys to choose from, depending on your business situation: one for teams that are currently practicing Continuous Delivery/Continuous Testing/DevOps, and one for teams that aren’t. Please pass these surveys along to your Test/QA friends so we get the widest set of responses and opinions.

The nature of my work takes me to various organizations, across many industries, to consult and train on software development as well as software testing. I am always asked what practices are common, what other companies do, and what the correct dev-to-test ratios are. The answer is easy: there aren’t any!

Everyone is aware that software development is constantly changing. New tools are introduced into the life cycle every day. Skill sets, technologies, and staffing are fully dynamic. On top of all this, even though quality levels vary greatly across product types, one thing remains constant: everyone cares, to some degree, about product quality and does not want to be left behind on technologies or tools. Some people ask about and even show me standards that they want to follow to ensure they are doing the right things.

These surveys are a great way to understand, at a broad level, what the testing world looks like today, and they can be used as a check-in. They can confirm that something you do is commonly practiced today and that you’re keeping pace with current industry standards. Likewise, if you see that something you do is no longer common or is now outdated, you can use this article series to identify, assess, and change your skill set or practices where applicable.

My favorite feature of this survey is our “CEO Secrets” question, in which we ask participants to tell us the stories from behind the scenes that the CEO or upper management never finds out about.

Test teams always see deep inside development projects. It’s our job to get to the heart of the functions and workflows, identifying the limitations and problems— problems with functions, problems with data, as well as problems with platforms. Also, we see the technical debt firsthand. How many times have we heard: “Let’s just put it out there like this, and fix it in the first patch or next iteration.”  Many times. These issues are not general knowledge; they are kept quiet, swept under the rug.

All responses were given anonymously. On a personal note, I love to hear these stories, and you’ll be shocked to find out how often these problems happen (it’s a lot more than you think), as well as how similar they are across industries. We are including a “CEO Secrets” bonus question in each survey.

I will be publishing the results of each survey in the following issue of LogiGear Magazine. The “CEO Secrets” will be reviewed and separated from the results. When all the surveys are complete we will be releasing the accumulated results as an ebook, where I will also share my final thoughts and views of the software testing industry as a whole.

The Results of Survey 1: Testing Essentials

This survey focuses on the basics of software testing. It is also meant to establish, for the surveys that follow, a picture of the audience and the attitudes of our sample of respondents.

There is some controversy built into some of these questions. For example: what are test plans in the Agile age? How do we document test cases when everyone is trying to be lean and mean? These are not easy questions with fast answers. Test plans and test cases are areas where I am trying to ascertain how diverse the testing world is.

#1 What is your organization’s primary business activity?

It is very common for software development practices to differ greatly depending on the industry.

This is another reason there are no real standards across software development: products are so varied that each one has its own business strategy for how it gets tested, which also factors into the differences between industries. I want to start this survey by looking at what industries the respondents come from in order to get a sense of the range of test teams.

The top 3 industry groups account for over 40% of the responses: Financial Services, Manufacturing (hardware and software), and Medical/Healthcare.

Almost 20% of respondents are consultants. To me, the great thing about this is that these individuals will give, perhaps, more independent and less corporate answers.

Usually, consultants are expected to have a broader range of experience from having worked in various industries, while internal staff more often has a deeper understanding of one area from being an internal employee for some time.

A small number, but wide variety, of participants came from Automotive, Oil & Gas, Education, Defense/Military, Manufacturing (non-hardware or software), Games/Entertainment, and Telecom.

We have a good and current cross section of the software development world.

What is your organization’s primary business activity?


#2 What phase or aspect of testing do you feel the management at your company does not understand?

The number 1 answer: “Automation is not easy.”
It still amazes me how common the misunderstandings between test teams and management are! 47% of survey respondents, nearly half, view misunderstandings about automation as the biggest problem they face.
I see this often in my work. I say it too often: “incidental automation does not work.” As test engineers, we need to educate people that test automation is software development and needs to be planned, designed well, scheduled, and staffed correctly. That projects most often fall behind schedule because of user story problems and requirements creep, not test delays, is well known, yet often overlooked. Testing requires skill, and not everyone is good at testing!
It is worth noting that a very lucky and privileged 14% responded: “None, my management team fully understands testing.”

What phase or aspect of testing do you feel the management at your company does not understand? (select all that apply)

Test automation is not easy 47%
Projects are most often behind schedule because of shifting user stories/ requirements, not test delays 39%
Testing requires skill 38%
How to adequately schedule testing 35%
How to measure testing 32%
Choosing good coverage metrics 27%
Agile/ Scrum process problems with testing 23%
None, my management team fully understands testing 14%
The inability of the team to complete tests 10%
Other 5%

#3 What pain does your product team have regarding quality?

For as much as testing has changed, with quality responsibilities shifting (unit testing, user testing, requirements/user story analysis) and maturing, many problems remain the same.

For nearly half of all respondents, the #1 and #2 pains are: lack of upstream quality practices and not enough schedule time. This is a clue to me that even though we have a wider, and greater, distribution of quality tasks and ownership of quality, there is still not enough upstream testing. Upstream testing should include processes such as requirements analysis, code review, and unit testing. These are all lower cost quality practices than traditional UI testing. People in software engineering should know this by now, and if they do not, then they need to be made aware of it.
That so many people cite insufficient time to test points to estimation or “ScrumBut” problems. The great news for all of us: only 11% of respondents cited low morale as a problem, which shows how much more respected test teams are in most organizations.

 What pain does your product team have regarding quality? (select all that apply)

Lack of upstream quality assurance (user story/requirements analysis and review, code inspection and review, unit testing, code-level coverage analysis) 49%
Insufficient schedule time 47%
Lack of effective or successful automation 38%
Poor project planning and management 31%
No effective Definition of Done (DOD) or, building of Technical Debt to meet sprint schedules 30%
Feature/ user story creep 30%
Poor project communication 26%
Inadequate test support tools 23%
Missing team skills 19%
Project politics 18%
Low morale 11%
Poor development platform 6%

#4 If you could do anything to release a higher quality product or service, what would that be?

I am surprised that, in a free-form answer field, only 2 topics dominated the responses. In a free-form field I would expect great variation in topics, but that isn’t the case. This shows, again, how many test teams are struggling with the same issues across industries, development organizations, and styles.
Most of the answers here centered on 2 areas:

  1. Raising the awareness level and knowledge of quality and testing across the team, and testing earlier in development.
  2. Issues around automation, such as too much pressure, the inability to automate due to time demands, and misunderstandings of the time it takes to automate well.

There are two other answers that were very common: the need for more clear documentation in requirements and user stories, as well as better communication with the business side of development.

A few sample answers:

  • Raise profile of QA team so that they are involved in all aspects of a project; right from the beginning.
  • Include testing perspective right from design phase.
  • Earlier involvement of SQA with user stories and design.
  • Adequately staff QA/Testing role on projects and continue to ensure quality is built into software from the beginning of process, not just in the testing phase.
  • Automation of all regression test cases.
  • Recognize the need for clearer documentation.

#5 How do you measure and report coverage to the team?

44% of all respondents measure user story and test case coverage. Many people are finally realizing that code coverage is less important than it was once held to be.

This is what I see out in the development world. Many teams measure test case coverage, gauging how complete a test effort is by measuring completion of pre-documented test cases, as well as traceability from requirements/user stories/acceptance criteria to test cases to ascertain that every item is tested.

More than 27% do not measure coverage. Test coverage, to me, is the essential test metric.

If you are trying to gauge quality by how many bugs are found and fixed, all of that is relative and without direct meaning unless you have a very clear measure of how much of the product, or release, was exercised to get those results.
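
The traceability idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the story and test-case IDs are invented, not from the survey): map each user story to the test cases that cover it, then report coverage as the share of stories exercised by at least one executed test.

```python
# Minimal sketch of user-story coverage via traceability.
# All IDs below are invented for illustration.

def story_coverage(stories, executed_tests, trace):
    """Percentage of stories covered by at least one executed test case.

    trace maps a story id to the set of test-case ids written for it.
    """
    covered = [s for s in stories if trace.get(s, set()) & executed_tests]
    return len(covered) / len(stories) * 100

stories = ["US-1", "US-2", "US-3", "US-4"]          # US-4 has no tests at all
trace = {"US-1": {"TC-1"}, "US-2": {"TC-2", "TC-3"}, "US-3": {"TC-4"}}
executed = {"TC-1", "TC-2"}                          # TC-4 was written but never run

print(f"User story coverage: {story_coverage(stories, executed, trace):.0f}%")
# -> User story coverage: 50%  (only US-1 and US-2 are covered)
```

The point of the metric is exactly what the paragraph above argues: bug counts alone are relative, but a coverage number like this says how much of the release was actually exercised.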

How do you measure and report coverage to the team?

#6 Where do you document test cases?

58% responded that they use a tool; another 18% use Excel. Application Lifecycle Management (ALM) tool suites have taken over the development world.

For all the process changes, from Agile, Lean, and DevOps to Continuous Delivery (CD), ALM tool suites have impacted development as much as process has. Whether it’s Microsoft’s TFS/Visual Studio, the largest enterprise tool, or a build-your-own suite (e.g. Git, Jira, Jenkins, Zephyr, Selenium), every modern development organization has moved out of Word docs, Excel, and isolated Google Docs into traceable, easy-to-share, global, modern tools. Getting more familiar with the various tool suites will be a big benefit to everyone in development.

Interestingly, a very small number, only 7%, do not document test cases. I had thought, with Lean development practices being so popular these days, that this number would be higher.

Where do you document test cases?

#7 Do you write test plans?

Test plans are a hot item. To do or not to do? Why or why not? It’s why I have devoted a large article to the topic. For 22% of respondents, the answer is no. That is a big number.

Most teams in this Agile Age still write some kind of test plan. A majority of our survey takers said that their teams do write test plans, totaling about 67%. This is actually higher than I thought it would be.  For the teams that do write test plans, the majority write them by release or secondarily, by user story.

Good news. Due to the evolution of software development— the diversity of teams, products, and development style— there are many routes teams can take to writing test plans these days. Hopefully they are also more useful than they had been in the past, where many teams were making “copy & paste” docs to satisfy someone’s doc requirement, rather than creating a useful, thoughtful document describing the plans and risks for testing.

Do you write test plans?

#8 Does your test strategy include: Unit, API/Integration level, and/or UI testing?

This is probably the most fascinating question in the survey to me. With so much talk of the new distribution of quality responsibilities (to developers, to IT/Ops, back to the business), what do test strategies look like today?

By asking this question, I’m looking to see who has implemented “Shift Left.” When a team shifts left, it pushes testing earlier in the development process, where issues are cheaper to find. The majority, over 54% of respondents, have test strategies that include API/integration and UI testing. Only 12% have UI-only test strategies. This is clearly a sign that the days of untested product being tossed over the wall to testers are almost gone!

API/integration testing happens earlier than UI testing, but every modern SDLC assumes unit testing, and we still have far to go to achieve that. Of course, the most cost-effective strategies include unit testing, service/integration/API testing, and UI testing. What I find in practice is that many companies are still struggling with how to implement unit testing. Almost 60% of test strategies now include unit testing. That is a huge increase over the last decade. When will it be much closer to 100%?
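
To make the “unit” layer of the strategy concrete, here is a minimal sketch in Python’s standard unittest framework. The function under test and its behavior are invented for illustration; the shape, however, is what a unit test looks like: it exercises one small function in isolation, far cheaper to run and debug than an API or UI test.

```python
# Hypothetical unit under test: normalize an email address before storing it.
import unittest

def normalize_email(raw: str) -> str:
    """Trim surrounding whitespace and lowercase an email address."""
    return raw.strip().lower()

class NormalizeEmailTest(unittest.TestCase):
    def test_trims_and_lowercases(self):
        self.assertEqual(normalize_email("  Jane@Example.COM "), "jane@example.com")

    def test_clean_input_is_unchanged(self):
        self.assertEqual(normalize_email("jane@example.com"), "jane@example.com")

if __name__ == "__main__":
    # exit=False lets the test run finish without terminating the interpreter.
    unittest.main(argv=["normalize_email_tests"], exit=False)
```

API/integration and UI tests sit above this layer, exercising services and screens rather than single functions; the cost of finding the same defect grows at each step up.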

Does your test strategy include: Unit, API/Integration level, and/or UI testing?

#9 Since Agile/Scrum, has your job gotten better?

I love job satisfaction type questions. They are very telling about the state of our practice.

When I remove the “not applicable” answers, the Yes outnumber the No— but not by much.

Of the Yes/No answers, 58% are happier and 42% are not. It is disheartening that “stayed the same” combined with No forms a majority, 53%. Agile/Scrum is supposed to fix problems and make things better; if it hasn’t, that is a sign it has been implemented wrong. I often say in my consulting work that if teams are not happier doing Agile, they are doing it wrong.

For 22%, testing has gotten better with group ownership of quality. This is great! But this number is actually still pretty small. I had hoped this would be more than half.

“Yes, things are better with less documentation” is a surprising answer: elsewhere in this survey, many responses cite a need for more and better documentation.

Sadly, for a large percentage of teams, 19%, things are the same. This is a problem! Perhaps re-reading the Scrum Guide and getting copies to other team members is a good idea. The Scrum Guide is very short (approximately 25 pages) and an easy read, detailing the framework. It is a great way to see areas for improvement outside the confines of a retrospective.

Since Agile/Scrum, has your job gotten better? (select all that apply)

Yes, group ownership of quality and test coverage 22%
No, there is no good communication/ collaboration/ things are still handed off to us with little information 18%
Yes, the team understands testing more 15%
It’s the same 15%
Not applicable 12%
Yes, the team is more understanding of automation problems 9%
No, the time pressure has made things worse 8%
Other 3%
Yes, less documentation 2%
I did not test before Agile/ Scrum 2%

#10 What percentage of tests are automated?

Still, more than 65% of respondents say less than 50% of their tests are automated. This is a problem test teams must address. Manual testing will never go away; it is important, useful, and serves a different purpose than automated testing. Automation cannot and should not attempt to do everything, and it is a mistake to think automation will ensure quality; that is not even the goal of automation.

The goal of automation is to show consistency, not the absence of bugs. But automation has a very important place in every test strategy, even if only to automate the mundane tasks to free up more time for more interesting exploratory tests. Having so many teams with such low automation percentages will hold back organizations, as more teams look to benefit from the flood of tools to do Continuous Delivery in DevOps.
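
“Automating the mundane” for consistency can be sketched very simply: a table of golden input/expected-output pairs re-checked identically on every build, freeing people for exploratory testing. The discount function and its expected values below are invented for illustration.

```python
# Minimal regression-check sketch: same golden cases, every build.
# The function and data are hypothetical examples.

def apply_discount(price: float, percent: float) -> float:
    """Return the price after a percentage discount, rounded to cents."""
    return round(price * (1 - percent / 100), 2)

# Golden cases: (args, expected result). Checked identically on every run.
REGRESSION_CASES = [
    ((100.00, 10), 90.00),
    ((19.99, 0), 19.99),
    ((50.00, 25), 37.50),
]

def run_regression():
    """Return the list of failing cases; an empty list means consistency holds."""
    return [(args, want, apply_discount(*args))
            for args, want in REGRESSION_CASES
            if apply_discount(*args) != want]

print("failures:", run_regression())
# -> failures: []
```

The check proves consistency with a known baseline, not the absence of bugs, which is exactly the distinction the paragraph above draws.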

26% of all respondents have no automated testing. This number is just too high, especially with advances in tools and business readable test cases. Tools are much easier to use today. The learning curve has dropped and the investment— a big deterrent in the past— is also lower today with the adoption of more open source tools.

What percentage of tests are automated?

We have reached the end of this installment. As promised, we’ll share the full review, with thoughts and opinions, in our upcoming ebook. In the meantime, please take our Testing in Continuous Delivery survey, or peruse the other articles in this issue. Don’t forget to share the survey with your network! The more responses we get, the better a reflection we have of the current state of the industry!

Michael Hackett

Michael is a co-founder of LogiGear Corporation, and has over two decades of experience in software engineering in banking, securities, healthcare and consumer electronics. Michael is a Certified Scrum Master and has co-authored two books on software testing: Testing Applications on the Web: Test Planning for Mobile and Internet-Based Systems (Wiley, 2nd ed. 2003) and Global Software Test Automation (Happy About Publishing, 2006).
He is a founding member of the Board of Advisors at the University of California Berkeley Extension and has taught for the Certificate in Software Quality Engineering and Management at the University of California Santa Cruz Extension. As a member of IEEE, his training courses have brought Silicon Valley testing expertise to over 16 countries. Michael holds a Bachelor of Science in Engineering from Carnegie Mellon University.
