2010 – 2011 LogiGear Global Testing Survey Results – Tools

TOOLS

T1. What testing-support tools do you use? (Please check all that apply.)

Response percent (response count)
Bug tracking/issue tracking/defect tracking: 87.70% (64)
Source control: 54.80% (40)
Automation tool interface (to manage and run, not write, automated tests): 52.10% (38)
Test case manager: 50.70% (37)
Change request/change management/change control system: 47.90% (35)
A full ALM (application lifecycle management) suite: 19.20% (14)

Result analysis: Thirteen percent of respondents do not use a bug tracking tool. This does not surprise me, but it surprises many others that so many test teams do not track their bugs!

About half of the respondents use a test case manager, and roughly the same percentage uses a change request or change control system. Half use an automation tool interface; these tools most commonly contain both manual and automated test cases. Yet only about 20% use a full ALM suite. A few years ago this number would have been much smaller.

With each passing year, especially as more teams go agile or offshore their work, this number will increase dramatically.
T2. I would describe our bug (bug, issue, defect) tracking as:

Response percent (response count)
Effective: 37.70% (26)
Very effective; has a positive impact on product quality: 34.80% (24)
Adequate: 20.30% (14)
A mess: 4.30% (3)
Poor: 2.90% (2)
Hurts product quality/has a negative impact on product quality: 0% (0)

T3. What type of bug tracking tool do you use?

Response percent (response count)
Relational database tool (Bugzilla, TrackGear, TeamTrack, Team Test, ClearQuest, home-built web-based or client-server database): 68.10% (47)
ALM tool that includes defect tracking: 18.80% (13)
Excel: 8.70% (6)
Email: 2.90% (2)
We do not track bugs: 1.40% (1)

Result analysis: This is a very positive move in our profession. Just a few years ago, the number of teams using Excel to track issues was significantly higher.

Excel is not an adequate issue tracking tool: it cannot effectively sort, query, or retrieve old issues from past releases, or control and manage access. With so many good tools available, many of them open source, there is no reason to keep using such a makeshift system.
T4. How many bug tracking systems do you use during a regular test project?

Response percent (response count)
One: 69.60% (48)
Combination of tool and Excel and/or email: 14.50% (10)
Two: 11.60% (8)
More than two: 4.30% (3)

Result analysis: The problem of multiple bug tracking tools is common. In this survey, almost 30% of teams use more than one bug tracking tool. Most problematic, almost 15% use a tool in combination with Excel and/or email. I see this often in my consulting work, and it always causes headaches.

One team will not use another team's tool; developers have a work management tool and will not use the bug tracking tool, so the test team has to use two tools; a remote team may not be allowed access to the internal tool, so all of their bugs get communicated in Excel and email. This is a management problem, but it also leads to a more insidious one: it gives the impression that testing is disorganized.
T5. How do you communicate and manage test cases?

Response percent (response count)
A relational database/repository focused on test case management (TCM, Silk Central, Rational Test Manager, TA, etc.): 41.50% (27)
Excel: 21.50% (14)
ALM tool that includes test case management: 20% (13)
Word: 15.40% (10)
We do not track, communicate, or manage test cases: 1.50% (1)

Result analysis: The problem here is that almost 37% of teams are using Word or Excel; that is dead data. It is difficult to share, edit, maintain, sort, query, and measure test cases with these programs.

There are many good test case management tools, some of them open source, that make writing, editing, maintaining, sharing, and measuring test cases much easier. In my experience, there are very few good reasons not to migrate to an easier and more sophisticated tool set.

There are also easy ways to link test cases and bug tracking in the same tool. With such tool sets, test teams can graduate to a higher level of management, reporting, and efficiency.

T6. How are the test cases used? (Choose the MOST appropriate.)

Response percent (response count)
They are used only for testers to execute tests: 34.30% (24)
They are used to measure and assess test coverage: 30% (21)
They are used to assess proper test execution: 21.40% (15)
They are used to measure and manage project progress: 14.30% (10)
They are not used during the project: 0% (0)

Result analysis: For teams that use test cases only for execution, it may be useful to know that other uses, such as measuring coverage and tracking project progress, are very common.


T7. If you use a test case management tool, how is it used?

Response percent (response count)
It is used to run our manual and automated tests: 57.40% (31)
It is used only for manual tests: 40.70% (22)
It is used to run only automated tests: 1.90% (1)

T8. If you have experience with success or failure regarding test tool use (ALM, bug tracking, test case management, automation tool interface, other) that is interesting or helpful to others, please write it below: (Comments from practitioners)

  1. “The best thing to do is manage the progress of the tests and see the bugs. You can measure the project’s health.”
  2. “I find Bugzilla reporting and commenting adequate communication most of the time. Its only problem is when immediate problems surface – at that point an email to appropriate parties telling them to look at Bugzilla usually works. So does walking over to the developer and showing them the issue.”
  3. “So far Jira was the best bug tracking tool.”
  4. “If you want people to use a TCM or bug management tool, make sure it has good performance and it’s simple.”
  5. “For a large project or program it is crucial to select a single method of tracking defects and what is considered defects versus ‘issues.’ This can lead to a great deal of confusion where defects identified as issues are not handled and addressed properly. I worked on a large project that various efforts had four different ways of tracking defects and issues. The result was that it was hard to assess the overall quality of the product that was being implemented.”
  6. “Testing should be driven by proven testing methodologies; not by the tool itself.”
  7. “Generating quality reports can be difficult using bug tracking systems.”
  8. “Certain automation tools will not be suitable for certain types of projects.”
  9. “Test case management tools are not integrated with requirements management tools, which is why our test cases are sometimes run against obsolete functionality.”
  10. “Rally is very useful.”
  11. “Process discipline matters more than any tool.”
  12. “The tool is difficult to use for non-technical team members.”
Michael Hackett
Michael is a co-founder of LogiGear Corporation and has over two decades of experience in software engineering in banking, securities, healthcare, and consumer electronics. Michael is a Certified Scrum Master and has co-authored two books on software testing: Testing Applications on the Web: Test Planning for Mobile and Internet-Based Systems (Wiley, 2nd ed. 2003) and Global Software Test Automation (Happy About Publishing, 2006). He is a founding member of the Board of Advisors at the University of California Berkeley Extension and has taught for the Certificate in Software Quality Engineering and Management at the University of California Santa Cruz Extension. As a member of IEEE, his training courses have brought Silicon Valley testing expertise to over 16 countries. Michael holds a Bachelor of Science in Engineering from Carnegie Mellon University.

