Risks of Not Integrating QA Into DevOps

Automated Testing is a huge part of DevOps, but without human-performed quality assurance testing, you’re increasing the risk of lower-quality software making it into production.

Automated Testing is an essential DevOps practice for increasing organizations’ release cadence and code quality. But there are limits to relying on Automated Testing alone. Without human quality assurance (QA) testing, software is released without the end-user experience ever being taken into account. This all but ensures that lower-quality software with a poor end-user experience goes to production.

Simply put, Automation is a huge part of DevOps, but it shouldn’t be confused with eliminating all manual processes. Conflating the two can cause otherwise beneficial DevOps practices to do more harm than good. Here, we explain the top three risks organizations face when human QA is not integrated with DevOps.

1. Lack of Human Intervention Allows Errors to Slip Through the Cracks

Automated Testing has greatly improved both the speed and the code quality of builds and releases. But machines aren’t quite human (yet), and automated delivery practices cannot grasp every human aspect of a project. Customer experience, by its very nature, cannot currently be evaluated by Automation alone.

Automated Testing is a type of ‘functional testing,’ which checks that the software properly satisfies its defined requirements. There are eight different types of functional testing that a build should undergo; some of these tests can be automated, while others require manual intervention. When QA teams are not involved in the testing process and integrated into the DevOps culture, builds can get pushed live with obvious user experience failures.

Just because code passes Automated Testing doesn’t mean the user experience does.

In the same way that a basic spell checker wouldn’t flag “Pear Harbor” when you meant to type “Pearl Harbor,” many automated functional tests miss failures that would be obvious user experience problems to a human. Even large, well-established enterprises sometimes struggle to ensure quality in their builds when humans and machines fail to collaborate on QA testing.
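To make that gap concrete, consider the minimal, hypothetical sketch below (the page content, field names, and checks are invented for illustration). Every defined requirement is asserted and the test passes, yet the build still carries defects a human tester would catch in seconds.

```python
# Hypothetical example: an automated functional check that passes even
# though the page contains defects any human tester would spot on sight.
# The page content and field names are invented for illustration.

rendered_page = {
    "title": "Order Confirmation",
    "body": "Welcom back! Your order was placed successfully.",  # typo: "Welcom"
    "submit_button": {"visible": True, "x": -40, "y": 2150},     # drawn off-screen
}


def test_confirmation_page_functional():
    """Functional requirements: correct title, success message, button present."""
    assert rendered_page["title"] == "Order Confirmation"
    assert "order was placed successfully" in rendered_page["body"]
    assert rendered_page["submit_button"]["visible"] is True


if __name__ == "__main__":
    test_confirmation_page_functional()
    print("All automated checks passed, yet a human would immediately flag the "
          "typo and the button rendered outside the viewport.")
```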

The Impact

Failing to properly test software before it goes live means spending more time and money to fix problems after the build has been released.

In the mid-2000s, Toyota drivers reported their cars accelerating without the driver even touching the pedal, causing several accidents and a recall of millions of affected vehicles. These errors in the cars’ installed software caused stock prices to drop and drivers to migrate toward other car brands.

Builds pushed live with errors that would have been obvious to a human tester drive away current and potential customers through a negative end-user experience. With better testing, Toyota could have avoided a massive recall and kept current and future customers from moving to alternative manufacturers.

The Solution

To ensure that only the highest-quality builds are released, organizations should perform manual QA testing alongside Automated Testing. The “right” Manual Testing will vary from organization to organization and from build to build. Organizations will need to determine what testing is appropriate for them and where the balance between Automated and Manual Testing lies.

Take care to effectively plan, define, and document testing. This helps increase communication between teams and ensures efficiency and efficacy in testing.

An effective plan includes creating a Quality Management Plan, a Test Strategy, and Test Cases. The chart from AltexSoft breaks down this plan (Figure 1).

Ensuring that QA is a priority when testing software can help organizations reduce errors in their builds, saving them time and money and preventing the loss of customers.

Figure 1—This chart from AltexSoft breaks down planning your QA testing journey
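As a rough sketch of what documented test planning can look like in practice (the structure and example values below are illustrative, not a prescribed format), a team might capture each test case as structured data that development, operations, and QA can all review:

```python
from dataclasses import dataclass
from typing import List


# Illustrative test case record; the field names are an assumption,
# not a standard. Adapt them to your own Test Strategy document.
@dataclass
class TestCase:
    case_id: str
    title: str
    preconditions: List[str]
    steps: List[str]
    expected_result: str
    automated: bool = False   # manual by default; flip when automation makes sense
    owner: str = "QA"


checkout_copy_review = TestCase(
    case_id="TC-042",
    title="Confirmation page copy reads correctly",
    preconditions=["User is logged in", "Cart contains at least one item"],
    steps=[
        "Complete checkout with a valid payment method",
        "Read the confirmation page as an end-user would",
    ],
    expected_result="Copy is free of typos and the layout is usable on desktop and mobile",
    automated=False,          # a judgment call best left to a human tester
)

print(f"{checkout_copy_review.case_id}: {checkout_copy_review.title}")
```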

2. Work Silos

“Siloing” refers to the (real or artificial) separation that forms between workers or teams when collaboration isn’t required. While DevOps works to increase communication and collaboration across all teams within an organization, sometimes organizations focus solely on increasing communication between the Development and Operations teams, forgetting about the other teams in the organization. Not fully integrating QA into your DevOps culture permits silos to exist, further breaking down communication between Development/Operations and QA.

The Impact

People who don’t talk to one another on collaborative projects don’t produce excellent work. Ineffective communication and collaboration can lead to assumptions that go to production and ultimately impact end-users.

For example, the Development and Operations teams may assume that QA’s dedicated job is to find and fix any problems in their code. They may therefore be less careful about checking and testing their code before sending it to QA, since that’s “not their job.” QA is then bombarded with lower-quality software, which increases testing times, lengthens feedback loops, and significantly slows release cycles.

While this is an extreme example, it illustrates the poor communication and negative business KPIs that silos produce.

The Solution

The best solution to reduce the QA silo is to integrate QA into your organization’s DevOps culture. Start by asking questions about the current process such as:

  • How does QA get code and changes from development?
  • If issues are found, how does QA communicate this information to development? (Do they share it at all?)
  • After code passes QA, how does it pass to the Operations team? Is this automated or in-person?
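The last question above is worth answering in code as well as in conversation. The sketch below shows one possible automated QA-to-Operations handoff gate; the file location, build ID, and sign-off fields are assumptions for illustration, and a real pipeline would wire this logic into its own promotion step.

```python
import json
from pathlib import Path

# Hypothetical QA sign-off gate: a build is only promoted to Operations
# when QA has recorded an approval and no blocking defects remain open.
SIGNOFF_DIR = Path("qa-signoffs")   # assumed location for sign-off records


def qa_approved(build_id: str) -> bool:
    record = SIGNOFF_DIR / f"{build_id}.json"
    if not record.exists():
        return False
    signoff = json.loads(record.read_text())
    return signoff.get("approved") is True and not signoff.get("open_blockers", [])


def promote_to_operations(build_id: str) -> None:
    if not qa_approved(build_id):
        raise SystemExit(f"Build {build_id} blocked: no QA sign-off on record.")
    print(f"Promoting build {build_id} to the Operations deployment queue.")


if __name__ == "__main__":
    promote_to_operations("1.4.0-rc3")  # fails loudly until QA signs off
```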

There are several best practices for integrating QA into a DevOps culture, which include:

  1. Integrate Testing teams into Technical Teams, which allows QA to focus on the appropriate human tests and to move beyond manual functional testing.
  2. Incentivize excellent quality (and therefore all teams prioritizing QA) by adjusting individual and team KPIs to include QA. This will help strengthen necessary behavior and encourage a cultural shift.
  3. Facilitate and encourage communication and collaboration between development, operations, and QA in order to optimize their efforts. Remember: DevOps is a cultural shift as much as a technological one!

Reducing—or better yet, eliminating—silos within the workplace will help organizations consistently produce high-quality software, faster.

3. Undefined Quality Expectations

Without QA being integrated into DevOps, the end-user experience remains undefined and walled off from development and operations. The result is often that these teams want their software to pass automated tests but may not consider the full experience. In fact, if silos are extreme enough, lack of end-to-end visibility may mean these teams have no concept of the full end-user experience.

The Impact

When quality expectations are not fully defined, users do not receive the highest-quality software possible. When users encounter frustrating software, the negative experience drives them away, costing companies valuable market share. Additionally, SDLC teams have to spend more time and money fixing software after its release.

The Solution

Development, operations, and QA should work together to define quality expectations, and management should make these conversations a priority. Quality expectations will look different from organization to organization, but every organization should define its own standards for excellent software. Teams should also identify metrics to assess whether software meets those standards. This serves the dual purpose of creating measurable data and presenting an opportunity to break down silos and encourage cross-team communication.

Examples of metrics include:

  • Total number of test cases
  • Number of test cases passed/failed
  • Number of defects found/accepted/rejected
  • Number of critical defects
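As a rough illustration of how these metrics can be rolled up per build (the input records and field names below are invented), a short script can turn raw test and defect data into a summary that development, operations, and QA review together:

```python
# Illustrative quality-metrics rollup; the input records are invented.
test_results = [
    {"case": "TC-001", "passed": True},
    {"case": "TC-002", "passed": False},
    {"case": "TC-003", "passed": True},
]
defects = [
    {"id": "D-17", "status": "accepted", "critical": True},
    {"id": "D-18", "status": "rejected", "critical": False},
]

summary = {
    "total_test_cases": len(test_results),
    "test_cases_passed": sum(r["passed"] for r in test_results),
    "test_cases_failed": sum(not r["passed"] for r in test_results),
    "defects_found": len(defects),
    "defects_accepted": sum(d["status"] == "accepted" for d in defects),
    "defects_rejected": sum(d["status"] == "rejected" for d in defects),
    "critical_defects": sum(d["critical"] for d in defects),
}

for metric, value in summary.items():
    print(f"{metric}: {value}")
```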

Having well-defined quality standards ensures that development and operations keep the end-user experience in mind. Additionally, having minimum acceptable metric standards helps QA ensure that only the highest-quality software, software that meets end-user needs, goes out. While defined standards and metrics are important, QA must also stay alert for edge cases that the metrics may not capture.

Don’t Risk Your Competitive Edge

Integrating QA into DevOps allows organizations to reduce silos and release higher-quality software, faster. For more information on integrating QA into DevOps, check out “The Role of QA in DevOps” and “5 QA Best Practices for DevOps.”

This article is a republication and was originally published on Inedo.com.

The Inedo Team
As “the tech behind the tech,” Inedo’s products provide Windows-primary DevOps solutions to organizations of any size and in any industry. Inedo’s products—BuildMaster, ProGet, and Otter—emphasize strong visualization of process, ease-of-use for Developers of all skill levels, and building on the tools and processes you already have in place.
