Elfriede Dustin Sharing Her Interest in Automated Testing

Elfriede Dustin of Innovative Defense Technologies (IDT) is the author of several books, including Automated Software Testing, Quality Web Systems, and Effective Software Testing. Here she discusses her views on test design, scaling automation, and the current state of test automation tools.

LogiGear: Test design is an important ingredient of successful test automation, and we at LogiGear Magazine are curious how you approach the problem. What methods do you employ to ensure that your teams develop tests that are ready for, and supported by, automation?

Dustin: How we approach test design depends entirely on the testing problem we are solving. For example, we support the testing of mission-critical systems using our (IDT’s) Automated Test and Re-Test (ATRT) solution. For some of these mission-critical systems, automated Graphical User Interface (GUI) testing is important because the system under test (SUT) has operator-intensive GUI inputs that absolutely need to work; there we base the test design on the GUI requirements, GUI scenarios, expected behavior, and scenario-based expected results.

Other mission-critical systems we test need interface testing, where hundreds or thousands of messages are sent back and forth between interfaces and the message data has to be simulated (here we have to test load, endurance, and functionality). We might have to look at detailed software requirements for message values, data inputs, and expected results, and/or examine the SUT’s Software Design Documents (SDDs) for such data, so we can ensure that the tests we are designing, or even tests that already exist, are sufficient to validate those messages against the requirements. Finally, it is of little use to have automated the testing of thousands of messages if we have simply added to the analysis workload; the message data analysis needs to be automated as well, which again calls for a different test design. Many more test design approaches exist and need to be applied, depending on the many other testing problems we are trying to solve, such as testing web services, databases, and so on.
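
To make the analysis-automation point concrete, here is a minimal, hypothetical sketch in Python (the message fields, value ranges, and helper names are illustrative assumptions, not part of ATRT): simulated messages are generated and then checked automatically against requirement-derived limits, so the workload does not simply shift from manual testing to manual analysis.

```python
# Hypothetical sketch, not IDT's ATRT implementation: simulate interface
# messages, then validate each one programmatically so the analysis of
# thousands of messages is automated along with the test itself.
from dataclasses import dataclass

@dataclass
class Message:
    msg_id: int
    track_id: str
    speed_knots: float

def simulate_messages(count: int) -> list[Message]:
    """Generate simulated traffic for load, endurance, and functional testing."""
    return [Message(msg_id=i, track_id=f"TRK-{i:04d}", speed_knots=250.0)
            for i in range(count)]

def validate(messages: list[Message], max_speed: float = 600.0) -> list[str]:
    """Return a list of violations instead of requiring a human to inspect each message."""
    failures = []
    for m in messages:
        if not (0.0 <= m.speed_knots <= max_speed):
            failures.append(f"msg {m.msg_id}: speed {m.speed_knots} out of range")
    return failures

if __name__ == "__main__":
    violations = validate(simulate_messages(10_000))
    print(f"{len(violations)} violations found")
```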

ATRT provides a modeling component for any type of test: GUI, message-based, a combination of GUI and message-based, and more. This modeling component provides a more intuitive view of the test flow than mere text in a test plan or test procedure document. The model view lends itself to producing designs that are more likely to ensure the automated test meets the requirements the testers set out to validate, and it reduces rework or redesign once the test is automated.

LogiGear: What are your feelings on the different methods of test design, from keyword test design and Behavior-Driven Development (e.g., the Cucumber tool) to scripted test automation of manual test case narratives?

Dustin: We view keyword-driven or behavior-driven tests as automated testing approaches, not necessarily test design approaches, and we generally try to separate the nomenclature of the two. Our ATRT solution provides a keyword-driven approach: the tester (automator) selects the required keyword action via an ATRT-provided icon, and the automation code is generated behind the scenes. We find the keyword/action-driven approach to be most effective in helping test automators, in our case ATRT users, build effective tests with ease and speed.
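
For readers unfamiliar with the keyword-driven idea, the following is a minimal illustrative sketch in Python (the keywords, targets, and test steps are hypothetical; ATRT itself generates its automation code behind its icons rather than exposing a table like this). A test becomes an ordered list of keyword rows that map to underlying actions, so it can be assembled without scripting.

```python
# Illustrative keyword-driven sketch: keywords name actions, and a test case
# is just an ordered list of (keyword, arguments) rows.
from typing import Callable

def click(target: str) -> None:
    print(f"click {target}")

def enter_text(target: str, value: str) -> None:
    print(f"type '{value}' into {target}")

def verify_text(target: str, expected: str) -> None:
    print(f"verify {target} shows '{expected}'")

KEYWORDS: dict[str, Callable[..., None]] = {
    "click": click,
    "enter_text": enter_text,
    "verify_text": verify_text,
}

# A test case a non-programmer could assemble from keywords.
login_test = [
    ("enter_text", ("username_field", "operator1")),
    ("enter_text", ("password_field", "secret")),
    ("click", ("login_button",)),
    ("verify_text", ("status_bar", "Logged in")),
]

for keyword, args in login_test:
    KEYWORDS[keyword](*args)
```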

LogiGear: How have you dealt with the problems of scaling automation? Do you strive for a certain percentage of all tests to be automated? If so, what is that percentage and how often have you attained it?

Dustin: Again, the answer to the question of scaling automation depends: are we talking about scaling automated testing for message-based/background testing, or scaling automation for GUI-based testing? Scaling message-based/background automated testing seems to be an easier problem to solve than scaling GUI-based automated testing.

We don’t necessarily strive for a certain percentage of tests to be automated. Instead, we have a matrix of criteria that lets us choose the tests that lend themselves to automation versus those that don’t. We also build an automated test strategy before we propose automating any tests. The strategy begins with achieving an understanding of the test regimen and test objectives, then prioritizes what to automate based on the likely return on investment.

Although everything a human tester does can, with enough time and resources, be automated, not all tests should be automated. It has to make sense; there has to be some measurable return on the investment before we recommend applying automation. “Will automating this increase the probability of finding defects?” is a question that’s rarely asked when it comes to automated testing. In our experience, there are ample tests that are ripe for achieving significant returns through automation, so we are hardly hurting our business opportunities when we tell a customer that a certain test is not a good candidate for automation.

Conversely, for those who have tried automation in their test programs and claim their tests are 100% automated, my first question is, “What code coverage percentage does your 100% test automation achieve?” One hundred percent test automation does not mean 100% code coverage or 100% functional coverage. One of the most immediate returns on automation is the expansion of test coverage without increasing test time (and often while actually reducing it), especially when it comes to automated backend/message testing.
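
As a rough illustration of that kind of prioritization, the weighted-score sketch below is purely hypothetical; the criteria, weights, and candidate tests are assumptions, not IDT's actual matrix. The idea is simply to rank candidates by expected return before automating anything.

```python
# Hypothetical candidate-selection matrix: score each manual test on how often
# it runs, how long it takes, and how stable the feature is, then automate the
# highest-scoring candidates first.
CRITERIA_WEIGHTS = {"runs_per_release": 0.4, "manual_minutes": 0.3, "feature_stability": 0.3}

candidates = {
    "regression: login":        {"runs_per_release": 10, "manual_minutes": 15, "feature_stability": 9},
    "exploratory: new UI flow": {"runs_per_release": 1,  "manual_minutes": 30, "feature_stability": 3},
}

def roi_score(attrs: dict[str, float]) -> float:
    """Weighted sum of the criteria for one candidate test."""
    return sum(CRITERIA_WEIGHTS[name] * value for name, value in attrs.items())

for name, attrs in sorted(candidates.items(), key=lambda kv: roi_score(kv[1]), reverse=True):
    print(f"{roi_score(attrs):5.1f}  {name}")
```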

LogiGear: What is your feeling on the current state of test automation tools? Do you feel that they are adequately addressing the needs of testers? Both those who can script and those who cannot?

Dustin: Most of the current crop of test automation tools focuses on Windows or Web development. Most of the existing tools must be installed on the SUT, thus modifying the SUT’s configuration, and most are dependent on a particular GUI technology. In our test environments we have to work with various operating systems in various states of development, using various programming languages and GUI technologies. Since none of the tools currently on the market met our broad criteria for an automated testing tool, we at IDT developed our own solution, the ATRT product mentioned earlier.

ATRT is GUI-technology neutral, doesn’t need to be installed on the SUT, is OS independent, and uses keyword actions; no software development experience is required. ATRT provides the tester with every GUI action a mouse provides, in addition to various other keywords. Our experience with customers using ATRT for the first time is that it offers a very intuitive user interface and is easy to use. We also find, interestingly, that some of the best test automators with ATRT are younger testers who grew up playing video games and, coincidentally, developed strong hand-eye coordination. Some testers even find working with ATRT fun, since it takes them away from the monotony of manually testing the same features over and over again with each new release.

With ATRT we wanted to provide a solution where testers could model a test before the SUT becomes available and/or see a visual representation of their tests. ATRT provides a test automation modeling component.

We needed a solution that could run various tests concurrently (whether GUI tests or message-based/backend/interface tests) on multiple systems. ATRT has those features.
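
As a loose illustration of running independent test jobs concurrently against multiple target systems (this is not ATRT’s implementation; the job names, hosts, and runner function are made up), a thread pool is one simple way to fan the work out:

```python
# Hypothetical sketch of concurrent test execution across multiple systems.
from concurrent.futures import ThreadPoolExecutor

def run_test(job: tuple[str, str]) -> str:
    test_name, host = job
    # A real runner would drive the GUI or interface on the target host here.
    return f"{test_name} on {host}: PASS"

jobs = [("gui_smoke", "sut-01"), ("interface_load", "sut-02"), ("gui_regression", "sut-03")]

with ThreadPoolExecutor(max_workers=3) as pool:
    for result in pool.map(run_test, jobs):
        print(result)
```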

A tester should be able to focus on test design and new features, not on crafting elaborate automated testing scripts. Asking a tester to do such work puts them at risk of losing focus on what they are trying to accomplish, i.e., verifying that the SUT behaves as expected.

LogiGear: In your career, what has been the single greatest challenge to getting test automation implemented effectively?

Dustin: Test automation can’t be treated as a side activity or a nice-to-have, i.e., “whenever we have some time, let’s automate a test.” Automated testing, which is a mini-lifecycle in itself, needs to be an integral, strategically planned part of the software development lifecycle; it cannot be an afterthought. See our books Automated Software Testing and Implementing Automated Software Testing for more on the automated testing lifecycle.

LogiGear: Finally, what is the greatest benefit that test automation provides to a software development team and/or the test team?

Dustin: Systems have become increasingly complex, and we can’t test them the same way we did 20 years ago. Automated software testing, applied with a return-on-investment strategy built collaboratively with subject matter and testing experts, provides speed, agility, and efficiency to the software development team. When done right, no software or testing team can test anywhere near as much, as quickly, or as consistently and repeatably as it can with automated testing.

 

LogiGear Corporation

LogiGear Corporation provides global solutions for software testing, and offers public and corporate software-testing training programs worldwide through LogiGear University. LogiGear is a leader in the integration of test automation, offshore resources and US project management for fast and cost-effective results. Since 1994, LogiGear has worked with hundreds of companies from the Fortune 500 to early-stage startups, creating unique solutions to exactly meet their needs. With facilities in the US and Vietnam, LogiGear helps companies double their test coverage and improve software quality while reducing testing time and cutting costs.

For more information, contact Joe Hughes + 01 650.572.1400

