Put The ‘Q’ Within Your DevOps

A recent industry report estimated worldwide software failure costs at USD 1.1 trillion, and a 2018 study by the Consortium for IT Software Quality put the cost of defective software for the US alone at USD 2.84 trillion. The cost of fixing errors after a software release is unarguably higher than if they had been uncovered during the design phase.

The 2019 Accelerate State of DevOps report maintains, “DevOps will eventually be the standard way of software development and operations.” This rings true as enterprises acknowledge three drivers: rising demand for superior-quality software, faster product launches, and better return on technology investments through optimized software delivery performance. However, in today’s competitive landscape, is this enough to succeed?

Enterprises need something extra—the strategic integration of QA (Quality Assurance) with DevOps. While DevOps brings speed, innovation, and agility, its success lies in adopting the right QA strategy.

Devising the optimal QA strategy for DevOps

The success of Dev-Q-Ops depends on a robust QA strategy and the essential best practices that support the rapid development and deployment of DevOps applications.

Earlier, interaction between developers and testers was minimal, and both groups relied largely on their own interpretation of written or implied requirements without validating that understanding with each other. This led to endless arguments and counter-arguments that compromised not just the quality but also the speed of product deployment.

Today, most good DevOps implementations feature strong interaction between developers and testers, with rigorous, built-in testing that comprehensively covers every level of the testing pyramid, from robust unit and contract tests to API tests and functional end-to-end tests. However, the shifting line between what is ‘tested’ by the testers and what is automated by the developers often blurs test strategies and approaches, giving a false sense of good test coverage.
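
To make that distinction concrete, here is a minimal sketch of two levels of the pyramid using pytest. The pricing function, the ‘api’ marker, and the staging endpoint are hypothetical stand-ins rather than a prescribed implementation.

```python
# Illustrative only: the pricing logic and the /api/price endpoint are hypothetical.
import pytest
import requests


# --- Unit level: fast, isolated, runs on every commit ---
def apply_discount(price: float, percent: float) -> float:
    """Toy business rule standing in for real application logic."""
    return round(price * (1 - percent / 100), 2)


def test_apply_discount():
    assert apply_discount(100.0, 15) == 85.0


# --- API level: slower, exercises a deployed service over HTTP ---
@pytest.mark.api
def test_price_endpoint():
    resp = requests.get("https://staging.example.com/api/price", params={"sku": "ABC-123"})
    assert resp.status_code == 200
    assert resp.json()["price"] > 0
```

Keeping the fast unit tests in every pipeline run and scheduling the slower API and end-to-end suites appropriately is one way to preserve both speed and meaningful coverage.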

Testing, both automated and manual, is a continuous process that remains active throughout the software development cycle. A definite change in mindset is required to adopt continuous testing and continuous delivery. The Dev-Q-Ops culture is expected to lay the foundation for this and serve as the much-needed value add to your business.

It is therefore imperative to strike a balance and adopt a QA strategy that not only ensures the right coverage and intent of testing but also leverages automation to the maximum extent. The QA strategy should always be assessed on whether it helps attain vital DevOps metrics. Some examples of how these metrics can be effectively improved, with a measurement sketch following the list, are:

  • Lead Time for Changes: for example, by identifying defects early through early testing, leveraging the test pyramid, and improving impact analysis to better identify the ‘necessary’ coverage.
  • Deployment Frequency: by spreading the right automation across the test pyramid, improving test environment and test data management, and enabling parallel testing through rapid environment deployments and teardowns.
  • Time to Restore Services: through shared monitoring of operational parameters and the ability, aided by AI, to quickly narrow down the change, its impact, and therefore the regression required.
  • Change Failure Rate: by, for example, identifying risks, testing the customer experience and touchpoints, and monitoring customer feedback and usage.
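
As a rough illustration of how these four measures might be tracked, the sketch below computes them from a small, hypothetical deployment log; the record format and field names are assumptions, not a standard schema.

```python
from datetime import timedelta
from statistics import median

# Hypothetical deployment log: when each change was committed vs. deployed,
# whether it failed in production, and how long restoration took if it did.
deployments = [
    {"commit_to_deploy": timedelta(hours=6),  "failed": False, "restore": None},
    {"commit_to_deploy": timedelta(hours=30), "failed": True,  "restore": timedelta(minutes=45)},
    {"commit_to_deploy": timedelta(hours=4),  "failed": False, "restore": None},
]

days_in_period = 7  # reporting window used for deployment frequency

lead_time = median(d["commit_to_deploy"] for d in deployments)
deployment_frequency = len(deployments) / days_in_period
change_failure_rate = sum(d["failed"] for d in deployments) / len(deployments)
restores = [d["restore"] for d in deployments if d["restore"] is not None]
time_to_restore = median(restores) if restores else None

print(f"Lead time for changes : {lead_time}")
print(f"Deployments per day   : {deployment_frequency:.2f}")
print(f"Change failure rate   : {change_failure_rate:.0%}")
print(f"Time to restore       : {time_to_restore}")
```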

Effective execution of Dev-Q-Ops also enables Continuous Testing (CT) to zero in on risks, resolve them, offer feedback, and enhance quality.

Therein lies the importance of a structured QA strategy. It not only lays the foundations for a consistent approach toward superior quality but also enhances product knowledge while alerting enterprises to questions that would otherwise have remained unanswered. So, to release high-quality products without expensive rollbacks or staging, it is imperative to add the ‘Q’ to your DevOps.

In conclusion, new-age applications are complex. While DevOps can accelerate their rollout, those rollouts may fail if not bolstered by a robust QA strategy. QA is integral to the DevOps process; without it, continuous development and delivery are inconceivable.

Responsible Testing in the Times of COVID

As a software tester, I have ensured that software products and applications function and perform as expected. My team has been at the forefront of using the latest tools and platforms to deliver minimal-defect products for market release, and we are proud to have exceeded industry standards on defect escape ratios.

The COVID health scare has disrupted almost every industry and process, but society is resilient, never gives up, and life (and business) must go on. We are up to the task and doing our best to adapt to these testing times and situations.

Testing times for a software tester

While we have leveraged existing resources and technology to run umpteen tests in the past, the pandemic that has enveloped the world has put us in uncharted territory. While our clients understand the gravity of the situation, they also need to keep their businesses running. We now work from home and continue testing products just as before, without interruption. There have been challenges, but we have ensured business continuity to protect our clients from any adverse impact of this disruption. On behalf of testers struggling to adapt to the new world order, I would like to share how we sailed through these trying times. It might help you do what you do best: test!

Ensure access, security, and integrity

As testers, we work in different environments: on-premises, in our own cloud, or in a client’s cloud. Working from a secure office network, we had access to all of them. That is no longer the case, as we now connect over public networks. The best and most secure way to reach governed environments is via a VPN, which offers secure, pre-engineered access along with additional levels of bandwidth and control.
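
As a simple illustration, a pre-flight check such as the one below can confirm that the VPN tunnel is up before a test run begins; the internal host names and ports are placeholders, not real endpoints.

```python
import socket
import sys

# Internal hosts that should only be reachable when the VPN tunnel is up.
# Host names and ports are hypothetical placeholders.
REQUIRED_HOSTS = [
    ("test-env.internal.example.com", 443),
    ("testdata-db.internal.example.com", 5432),
]


def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    unreachable = [(h, p) for h, p in REQUIRED_HOSTS if not reachable(h, p)]
    if unreachable:
        print(f"VPN check failed; unreachable: {unreachable}")
        sys.exit(1)  # abort rather than run tests against half-reachable environments
    print("All governed environments reachable; safe to start the test run.")
```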

Use cloud devices for compatibility tests

Testing applications for different platforms and devices is simpler at the workplace, where we have ready access to company-owned devices (some of which are expensive). It is not the same when working from home, and these devices cannot easily be shared. Yet the unavailability of devices cannot become a blocker. I am leveraging cloud resources such as Sauce Labs and AWS Device Farm, alongside simulators and emulators configured on my own system.
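
For browser compatibility checks, a cloud grid can be driven with a standard Selenium Remote WebDriver session. The sketch below assumes a Sauce Labs account; the region endpoint, environment variables, and application URL are assumptions for illustration.

```python
import os

from selenium import webdriver
from selenium.webdriver.chrome.options import Options

# Credentials come from the cloud account; the us-west-1 endpoint and the
# application under test are illustrative assumptions.
SAUCE_URL = (
    f"https://{os.environ['SAUCE_USERNAME']}:{os.environ['SAUCE_ACCESS_KEY']}"
    "@ondemand.us-west-1.saucelabs.com:443/wd/hub"
)

options = Options()
options.browser_version = "latest"
options.platform_name = "Windows 11"
options.set_capability("sauce:options", {"name": "compatibility smoke test"})

# The browser runs on a cloud machine; only commands and results cross the wire.
driver = webdriver.Remote(command_executor=SAUCE_URL, options=options)
try:
    driver.get("https://staging.example.com/login")
    assert "Login" in driver.title
finally:
    driver.quit()
```

The same pattern extends to real mobile devices through the Appium-compatible endpoints these platforms expose.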

Augment access speed for reliable testing

One concern when working from home is the need for a dependable, high-speed internet connection. I signed up with a service provider offering verified speeds and buttressed my connectivity with an alternate connection from a different provider with similar bandwidth. I labelled these networks network1 and network2 and dedicated each to its designated purpose, so that bandwidth issues are avoided.

Coordinate test plans with collaboration utilities

In the initial days of the work-from-home arrangement, I found it difficult to coordinate with the team, and there were productivity concerns. We therefore decided to chalk out a schedule to address the coordination issues and to make better use of the messenger tools provided to us for seamless communication. As a first step, we drew up guidelines on the dos and don’ts of these tools so as to make optimal use of our time. This article penned by a senior colleague served as a handy reference on using one such communication tool.

The future looks uncertain, with COVID’s impact deepening by the day. In such times, we as responsible testers can play our part by remaining available to our partners and helping products and apps reach their respective audiences.