Risks of Not Integrating QA Into DevOps

Automated Testing is a huge part of DevOps, but without human-performed quality assurance testing, you’re increasing the risk of lower-quality software making it into production.

Automated Testing is an essential DevOps practice for increasing an organization’s release cadence and code quality. But relying on Automated Testing alone has limits: without human quality assurance (QA) testing, software is released without the end-user experience ever being taken into account. This all but ensures that lower-quality software with a poor end-user experience goes to production.

Simply put, Automation is a huge part of DevOps, but it shouldn’t be confused with eliminating all manual processes. Conflating the two can cause otherwise beneficial DevOps practices to do more harm than good. Here, we explain the top three risks organizations face when human QA is not integrated with DevOps.

1. Lack of Human Intervention Allows Errors to Slip Through the Cracks

Automated Testing has greatly improved both the speed and code quality of builds and releases. But machines aren’t quite human (yet…), and automated delivery practices cannot grasp every human aspect of a project. Customer experience, by its very nature, cannot be evaluated by Automation alone.

Automated Testing is a form of ‘functional testing,’ which verifies that the software properly satisfies its defined requirements. There are 8 different types of functional testing that a build should undergo; some can be automated, while others require manual intervention. When QA teams are not involved in the testing process and integrated into a DevOps culture, builds can get pushed live with obvious user experience failures.
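To make the gap concrete, here is a minimal, entirely hypothetical sketch: an automated functional test that verifies a signup form renders with a submit button, and passes, even though a human tester would immediately notice the garbled button label.

```python
# Hypothetical page-rendering function with an obvious UX defect:
# the submit button reads "Sgin Up" instead of "Sign Up".
def render_signup_form():
    return '<form action="/signup"><button type="submit">Sgin Up</button></form>'

def test_signup_form_renders():
    """Automated functional check: the form and its submit button exist."""
    html = render_signup_form()
    assert "<form" in html          # form is present
    assert 'type="submit"' in html  # submit button is present
    # Nothing here checks the button *text* -- the typo sails through.

test_signup_form_renders()
print("automated checks passed")
```

The assertions encode the defined requirements ("a form with a submit button exists"), so the test is doing exactly its job; it simply has no notion of what a human would consider a broken experience.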

Just because code passes Automated Testing doesn’t mean the user experience does.

In the same way that a basic spell checker wouldn’t catch the error when talking about “Pear Harbor” (when you meant to type “Pearl Harbor”), many automated functional tests may miss what are obvious user experience failures to a human user. Even large-scale, well-established enterprises sometimes struggle to ensure quality in their builds when humans and machines fail to collaborate on QA testing.
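The spell-checker analogy can be sketched in a few lines. In this toy example (the dictionary and checker are hypothetical), both “Pear Harbor” and “Pearl Harbor” pass a word-by-word check, because each word is individually valid; only a human reader catches that one of them is wrong.

```python
# A tiny word-list "spell checker" -- every word below is a valid
# English word, which is exactly why the check is fooled.
DICTIONARY = {"remember", "pear", "pearl", "harbor"}

def spell_check(text):
    """Return the list of words not found in the dictionary."""
    return [w for w in text.lower().split() if w not in DICTIONARY]

# Both phrases pass the automated check, but only one is what the
# author meant -- a human reviewer spots the difference instantly.
print(spell_check("remember pear harbor"))   # -> [] (no errors flagged)
print(spell_check("remember pearl harbor"))  # -> [] (no errors flagged)
```

Automated functional tests fail the same way: they validate each requirement in isolation and have no sense of whether the whole is what a user actually needs.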

The Impact

Failing to properly test software before release means spending more time and money fixing problems after the build is live.

In the mid-2000s, Toyota drivers reported their cars accelerating without the pedal being touched, causing several accidents and a recall of millions of affected vehicles. These errors in the cars’ installed software caused Toyota’s stock price to drop and drivers to migrate to other brands.

Builds pushed live with errors that would have been obvious to a human tester drive away current and potential customers through a negative end-user experience. With thorough human testing, Toyota could have avoided a massive recall and kept current and future customers from moving to alternative manufacturers.

The Solution

To ensure only the highest-quality builds are released, organizations should perform manual QA testing alongside Automated Testing. The “right” Manual Testing will vary from organization to organization and from build to build. Each organization will need to determine what testing is appropriate for it and where the balance between automated and Manual Testing lies.

Take care to effectively plan, define, and document testing. This helps increase communication between teams and ensures efficiency and efficacy in testing.

An effective plan includes creating a Quality Management Plan, Test Strategy, and Test Case. This chart from AltexSoft helps break down this plan (Figure 1).

Ensuring that QA is a priority when testing software can help organizations reduce errors in their build, saving them time, money, and the loss of customers.

Figure 1—This chart from AltexSoft breaks down planning your QA testing journey

2. Work Silos

“Siloing” refers to the (real or artificial) separation that forms between workers or teams when collaboration isn’t required. While DevOps works to increase communication and collaboration across all teams within an organization, sometimes organizations focus solely on increasing communication between the Development and Operations teams, forgetting about the other teams in the organization. Not fully integrating QA into your DevOps culture permits silos to exist, further breaking down communication between Development/Operations and QA.

The Impact

People who don’t talk to one another on collaborative projects don’t produce excellent work. Ineffective communication and collaboration can lead to assumptions that go to production and ultimately impact end-users.

For example, the Development and Operations teams may assume that QA’s dedicated job is to find and fix any problems in their code. They may therefore be less careful checking and testing their code before sending it to QA, since that’s “not their job.” QA is then bombarded with lower-quality software, which increases testing times, lengthens feedback loops, and significantly slows release cycles.

While this is an extreme example, it illustrates the poor communication, and the damage to business KPIs, that silos produce.

The Solution

The best solution to reduce the QA silo is to integrate QA into your organization’s DevOps culture. Start by asking questions about the current process such as:

  • How does QA get code and changes from development?
  • If issues are found, how does QA communicate this information to development? (Do they share it at all?)
  • After code passes QA, how does it pass to the Operations team? Is this automated or in-person?

There are several best practices for integrating QA into a DevOps culture, which include:

  1. Integrate Testing teams into Technical Teams, which allows QA to focus on the appropriate human tests and move beyond manual functional testing.
  2. Incentivize excellent quality (and therefore all teams prioritizing QA) by adjusting individual and team KPIs to include QA. This will help strengthen necessary behavior and encourage a cultural shift.
  3. Facilitate and encourage communication and collaboration between development, operations, and QA in order to optimize their efforts. Remember: DevOps is a cultural shift as much as a technological one!

Reducing—or better yet, eliminating—silos within the workplace will help organizations consistently produce high-quality software, faster.

3. Undefined Quality Expectations

Without QA integrated into DevOps, the end-user experience remains undefined and walled off from development and operations. As a result, these teams aim for their software to pass automated tests but may not consider the full experience. In fact, if silos are extreme enough, the lack of end-to-end visibility may mean these teams have no concept of the full end-user experience at all.

The Impact

When quality expectations are not fully defined, users do not receive the highest-quality software possible. When users hit frustrating software, the negative experience drives them away, costing companies valuable market share. Additionally, SDLC teams will have to spend more time and money fixing software after its release.

The Solution

Development, operations, and QA should work together to define quality expectations, and management should make these conversations a priority. Quality expectations will look different from organization to organization, but every organization should define what excellent software looks like for its users. Teams should also identify metrics to assess whether software meets those quality standards. This serves the dual purpose of creating measurable data and presenting an opportunity to break down silos and encourage cross-team communication.

Examples of metrics include:

  • Total number of test cases
  • Number of test cases passed/failed
  • Number of defects found/accepted/rejected
  • Number of critical defects

Having well-defined quality standards ensures that development and operations keep the end-user experience in mind. Additionally, minimum-acceptable metric standards help QA ensure that only the highest-quality software, software that meets end-user needs, goes out. While defined standards and metrics are important, QA must also stay alert for edge cases that those metrics may miss.

Don’t Risk Your Competitive Edge

Integrating QA into DevOps allows organizations to reduce silos and release higher-quality software, faster. For more information on integrating QA into DevOps, check out “The Role of QA in DevOps” and “5 QA Best Practices for DevOps.”

This article is a republication and was originally published on Inedo.com.

The Inedo Team
As “the tech behind the tech,” Inedo’s products provide Windows-primary DevOps solutions to organizations of any size and in any industry. Inedo’s products—BuildMaster, ProGet, and Otter—emphasize strong visualization of process, ease-of-use for Developers of all skill levels, and building on the tools and processes you already have in place.
