Formulating a Software Test Strategy

This article was adapted from a presentation titled “How to Optimize Your Web Testing Strategy” to be presented by Hung Q. Nguyen, CEO and founder of LogiGear Corporation, at the Software Test & Performance Conference 2006 at the Hyatt Regency Cambridge, Massachusetts (November 7 – 9, 2006).

Introduction

The following article provides a brief overview of the process for formulating a software test strategy, the key elements it should include, and the critical questions you should be asking yourself along the way.

Formulating a Test Strategy

Some of the key things to remember when formulating a software test strategy are:

  1. It is a team effort, not something done by one individual or handed down from on high to be implemented
  2. It requires all stakeholders to participate
  3. It requires executive support
  4. It requires participants to think outside the box: in essence, they should start with a blank piece of paper, free of preconceived notions or approaches that represent the way things have always been done
  5. It requires a lot of asking, “Why?”
  6. It requires thinking from the bottom up and starting from the end

A formulated software test strategy should include several key elements:

  1. Identifying different product development styles from inception through maintenance, so that we can eventually map the appropriate test strategy to each
  2. Mapping out phases, milestones, and relevant activities on a timeline
  3. Identifying the equivalent type of test strategies for each development method
  4. Prescribing what is involved in each test strategy

When undertaking the process of formulating a test strategy you should be asking yourself:

  1. What are your quality objectives or characteristics? Examples of quality objectives include: functionality, usability, performance, security, compatibility, scalability, and recovery.
  2. What are the requirements for each characteristic?
  3. What are the types of bugs that affect each quality characteristic?
  4. What are the test types or activities needed to support finding the problems described in #3? These may include: design reviews, code inspections/reviews, code walkthroughs, design walkthroughs, unit testing, API testing, external functional testing, usability testing, accessibility testing, configuration testing, compatibility testing, regression testing, performance testing, load testing, stress testing, failover/recovery testing, installation testing, security testing, and compliance testing.
  5. What are the most effective approaches to finding specific types of bugs as early as possible? Approaches may include: requirement-based testing, scenario-based testing, story-based testing, soap opera testing, model-based testing, attack-based testing, risk-based testing, fault injection, Diagnostic Approach to Software Testing (DAST), exploratory testing, and so on.
  6. What is the required application maturity to support #4?
  7. How would #5 and #6 be mapped to the various phases in the Software Development Life Cycle (SDLC)?
  8. How would you qualify the maturity of the software to determine that it has reached its milestone?
  9. How do you quantify and measure your work?
  10. What tools can help you improve your work and which framework is needed to implement the tool successfully?
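As a concrete illustration of the unit-testing activity named in question #4, and of the boundary-value bugs asked about in question #3, consider the following minimal sketch. The `price_with_discount()` function and its rules are invented for this example; the point is the shape of the checks, not the function itself.

```python
# Hypothetical function under test: apply a percentage discount to a price.
# Invented purely to illustrate unit testing of boundary conditions.
def price_with_discount(price: float, percent: float) -> float:
    """Apply a percentage discount; reject out-of-range inputs."""
    if price < 0:
        raise ValueError("price must be non-negative")
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Minimal unit checks, targeting the boundaries where bugs tend to hide
# (0% and 100% discount, and just-out-of-range inputs).
assert price_with_discount(100.0, 0) == 100.0    # no discount
assert price_with_discount(100.0, 100) == 0.0    # full discount
assert price_with_discount(200.0, 25) == 150.0   # typical case

try:
    price_with_discount(100.0, 101)              # just past the boundary
except ValueError:
    pass
else:
    raise AssertionError("out-of-range percent should be rejected")
```

In practice such checks would live in a test framework rather than bare asserts, but even this small set shows how each test maps back to a specific class of bug identified in the strategy.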

The results of such a strategy formulation process can include:

  • A reduction in the number of missed bugs
  • Few or no missed critical bugs
  • Test Automation frameworks deployed for better visibility, maintainability, and productivity
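One way to quantify the "reduction in missed bugs" above, and to answer question #9 on measuring your work, is Defect Removal Efficiency (DRE): the share of total defects caught before release. The sketch below is illustrative only; the defect counts are invented for the example.

```python
# Defect Removal Efficiency (DRE) as a percentage:
#   DRE = defects found before release / total defects found
# The edge case of zero defects is treated here as 100% effective,
# which is a modeling choice, not a standard rule.
def defect_removal_efficiency(found_before_release: int,
                              found_after_release: int) -> float:
    total = found_before_release + found_after_release
    if total == 0:
        return 100.0
    return 100.0 * found_before_release / total

# Example: 95 bugs caught in testing, 5 escaped to the field.
print(defect_removal_efficiency(95, 5))  # → 95.0
```

Tracked release over release, a metric like this makes the effect of a revised test strategy visible to the executive stakeholders whose support the strategy requires.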

Conclusion

“Strategy without tactics is the slowest route to victory. Tactics without strategy is the noise before defeat.”

– Sun Tzu

LogiGear Software Test & Performance Conference 2006 Presentations

Presentations to be delivered by LogiGear at the Software Test & Performance Conference 2006 include:

  • Wednesday, Nov. 8, 8:30 am to 10:00 am – “Effectively Training Your Offshore Test Team” by Michael Hackett
  • Wednesday, Nov. 8, 1:15 pm to 2:30 pm – “How to Optimize Your Web Testing Strategy” by Hung Q. Nguyen
  • Wednesday, Nov. 8, 3:00 pm to 4:15 pm – “Agile Test Development” by Hans Buwalda
  • Thursday, Nov. 9, 8:30 am to 10:00 am – “Strategies and Tactics for Global Test Automation, Part 1” by Hung Q. Nguyen
  • Thursday, Nov. 9, 10:30 am to 12:00 pm – “Strategies and Tactics for Global Test Automation, Part 2” by Hung Q. Nguyen
  • Thursday, Nov. 9, 2:00 pm to 3:15 pm – “The 5% Challenges of Test Automation” by Hans Buwalda

To register or for more information on STP CON, see: http://www.stpcon.com/

Hung Nguyen

Hung Nguyen co-founded LogiGear in 1994, and is responsible for the company’s strategic direction and executive business management. His passion and relentless focus on execution and results have been the driver for the company’s innovative approach to software testing, test automation, testing tool solutions, and testing education programs.

Hung is co-author of the top-selling book in the software testing field, “Testing Computer Software,” (Wiley, 2nd ed. 1993) and other publications including, “Testing Applications on the Web,” (Wiley, 1st ed. 2001, 2nd ed. 2003), and “Global Software Test Automation,” (HappyAbout Publishing, 2006). His experience prior to LogiGear includes leadership roles in software development, quality, product and business management at Spinnaker, PowerUp, Electronic Arts and Palm Computing.

Hung holds a Bachelor of Science in Quality Assurance from Cogswell Polytechnical College, and completed a Stanford Graduate School of Business Executive Program.

Rob Pirozzi

Rob Pirozzi has over 20 years of sales, marketing, management, and technology experience in high technology, with exposure to industries including financial services, healthcare, higher education, government, and manufacturing, and a strong track record of success. He has a proven ability to build and maintain strong relationships, contribute to target organization success, and deliver results. Website: http://www.robpirozzi.com/


