The Test Automation Top 5: Suggested Best Practices

This article was developed from concepts in the book Global Software Test Automation: A Discussion of Software Testing for Executives, by Hung Q. Nguyen, Michael Hackett, and Brent K. Whitlock.

Introduction

The top 5 pitfalls encountered by managers employing software Test Automation are:

  • Uncertainty and lack of control
  • Poor scalability and maintainability
  • Low Test Automation coverage
  • Poor methods and disappointing quality of tests
  • Technology vs. people issues

Following are 5 “best practice” recommendations to help avoid those pitfalls and successfully integrate Test Automation into your testing organization.

1. Focus on the Methodology, Not the Tool

A well-designed Test Automation methodology can resolve many of the problems associated with Test Automation, and it is one of the keys to success. The methodology is the foundation upon which everything else rests: it drives tool selection and the rest of the Automation process. It also shapes any offshoring effort that may be under consideration, helping to guide the placement of the “appropriate” pieces of the testing process both on- and offshore.

When applying a methodology, it is important that testers and Automation Engineers understand and accept it. Other stakeholders, such as managers, business owners, and auditors, should also have a clear understanding of the methodology and the benefits it brings.

2. Choose Extensible Test Tools

Select a test tool that supports extensibility and a team-based Global Test Automation framework (team members are or may be distributed), and that offers a solid management platform.

Surveying test tools can be time-consuming, but it is important to choose the best tool to meet your overall test needs. Before beginning the survey, however, you should have a good idea of what you need in the first place; this is intimately tied to your overall test methodology.

Make sure your chosen test tool has an “appropriate” Automation architecture. Whatever tool is used for the Automation, pay attention to how the various technical requirements of test case execution are implemented in a manageable and maintainable way. In looking at tools and considering your methodology, ask how well each tool addresses reusability, scalability, and team-based Automation (quantitative drivers of productivity), maintainability (a driver for lowering maintenance costs), and visibility (a qualitative driver of productivity and a vehicle for control, measurability, and manageability).

You should strongly consider tools based on Action-Based Testing (ABT). ABT creates a hierarchical test development model that allows Test Engineers (domain experts who may not be skilled in coding) to focus on developing executable tests based on action keywords, while Automation Engineers (highly skilled technically, but who may not be good at developing effective tests) focus on developing the low-level scripts that implement the keyword-based actions used by the test experts.
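
As a rough illustration of this division of labor, the sketch below shows how a keyword-driven action layer might look in Python. The action names, arguments, and registry are assumptions made for this example; they are not the API of any particular ABT tool.

# A minimal, hypothetical sketch of keyword-driven ("action-based") automation.
# The action names and registry are illustrative assumptions, not a real tool's API.

ACTIONS = {}

def action(name):
    """Register a low-level automation script under an action keyword."""
    def register(func):
        ACTIONS[name] = func
        return func
    return register

# --- Owned by the Automation Engineer: low-level scripts behind each keyword ---

@action("open application")
def open_application(url):
    print(f"launching application at {url}")   # real code would drive the UI here

@action("log in")
def log_in(user, password):
    print(f"logging in as {user}")

@action("check balance")
def check_balance(expected):
    actual = "100.00"                          # placeholder for a real UI query
    assert actual == expected, f"expected {expected}, got {actual}"

# --- Owned by the Test Engineer: a test expressed purely as keywords and data ---

TEST_CASE = [
    ("open application", ["https://bank.example.com"]),
    ("log in",           ["alice", "s3cret"]),
    ("check balance",    ["100.00"]),
]

def run(test_case):
    for keyword, args in test_case:
        ACTIONS[keyword](*args)                # dispatch each keyword to its script

if __name__ == "__main__":
    run(TEST_CASE)

Because the tests reference only the keyword vocabulary, new test cases can be added without touching the automation code, and a change in the application’s user interface is absorbed by updating a single action implementation.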

Care should be taken to avoid simplistic “Record-playback 2.0” tools that claim to do Test Automation with no coding. There is nothing wrong with being able to automate without having to code; in fact, it is a genuine benefit. However, the bottlenecks of “Record-playback 2.0” tools quickly show as you get deep into production.

3. Separate Test Design and Test Automation

Test design should be separated from Test Automation so that Automation does not dominate test design. In Test Automation, it is preferable to use a keyword approach, in which the Automation focuses on supplying elementary functionalities that the tester can tie together into tests. This way, the complexity and multitude of the test cases do not lead to an unmanageable number of test scripts.

The testers (domain experts) should fully focus on the development of test cases. Those test cases in turn are the input for the Automation discipline. The Automation Engineers (highly skilled technically) can give feedback to the testers if certain test cases are hard to automate, suggesting alternative strategies, but mainly the testers should remain in the driver’s seat, not worrying too much about the Automation.
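
One way to picture this separation, sketched below under the same assumptions as the earlier example, is to keep the testers’ work product as a plain keyword table (a simple CSV here) that contains no code at all, while the Automation Engineers own the runner and action library that interpret it. The file layout and keyword vocabulary are hypothetical.

# Hypothetical sketch: test design kept as plain data, automation kept as code.
# Testers edit only the keyword table; Automation Engineers own the runner and
# the action library (see the earlier sketch).
import csv
import io

# In practice this table would live in its own file (for example,
# "transfer_funds.csv") maintained by the test designers.
TEST_TABLE = """\
keyword,arg1,arg2
open application,https://bank.example.com,
log in,alice,s3cret
transfer funds,savings,250.00
check balance,750.00,
"""

def load_test(table_text):
    """Turn a keyword table into (keyword, args) steps, dropping empty arguments."""
    reader = csv.DictReader(io.StringIO(table_text))
    return [
        (row["keyword"], [row[col] for col in ("arg1", "arg2") if row[col]])
        for row in reader
    ]

if __name__ == "__main__":
    for keyword, args in load_test(TEST_TABLE):
        print(keyword, args)   # a real runner would dispatch these to the action library

If a step proves hard to automate, the Automation Engineer can propose an alternative keyword, but the table itself stays in the testers’ hands.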

In general, no more than 5% of the effort surrounding testing should be expended in automating the tests.

4. Lower Costs

There are 3 ways that you can look to lower costs:

  1. You can use labor that costs less than your local team
  2. You can use a tool that costs less
  3. You can use training to increase productivity with the tool

It is important, however, when addressing costs, not to focus too closely on one dimension without keeping in mind the overall methodology and considering the impact of any decision on other parts of the process. For example, lowering one cost, such as labor, by outsourcing may actually increase total costs if that labor does not have the proper skills.

5. Jumpstart with a Pre-Trained Team

Jumpstart the process with a pre-trained outsourcing partner that knows more about Test Automation success than you do, and that has a competent, well-trained staff of Software Testers, Automation Engineers, Test Engineers, Test Leads and Project Managers.

A pre-trained team can:

  • Reduce your overall project timeframe, because you don’t need to include training at the beginning of the project schedule
  • Reduce risk, because you don’t need to worry about how well the team members will learn the material and how skilled they will be after the training is complete

Conclusion

To summarize the preceding in a simple list, the 5 suggested best practices for Test Automation success are:

  1. Focus on the methodology, not the tool
  2. Choose extensible test tools
  3. Separate test design and Test Automation
  4. Lower costs
  5. Jumpstart with a pre-trained team
Hung Nguyen

Hung Nguyen co-founded LogiGear in 1994, and is responsible for the company’s strategic direction and executive business management. His passion and relentless focus on execution and results have been the drivers of the company’s innovative approach to software testing, test automation, testing tool solutions and testing education programs.

Hung is co-author of the top-selling book in the software testing field, “Testing Computer Software,” (Wiley, 2nd ed. 1993) and other publications including, “Testing Applications on the Web,” (Wiley, 1st ed. 2001, 2nd ed. 2003), and “Global Software Test Automation,” (HappyAbout Publishing, 2006). His experience prior to LogiGear includes leadership roles in software development, quality, product and business management at Spinnaker, PowerUp, Electronic Arts and Palm Computing.

Hung holds a Bachelor of Science in Quality Assurance from Cogswell Polytechnical College, and completed a Stanford Graduate School of Business Executive Program.

Rob Pirozzi

Over 20 years of sales, marketing, management, and technology experience in high technology with exposure to industries including financial services, healthcare, higher education, government, and manufacturing; demonstrating a strong track record of success. Proven ability to build and maintain strong relationships, contribute to target organization success, and deliver results. Website: http://www.robpirozzi.com/

