Testing Cloud-Native Applications

Lipika
9 min read · Mar 7, 2021


We have been hearing the term “Cloud Native” quite often, but what exactly is Cloud Native, and how does it impact the testing strategy? In this article, I will explain the basics of Cloud Native, highlight some of the key features to take care of while designing a test strategy for cloud-native applications, and show how Cloud Native can help in testing an application.

Cloud Native is an approach to building and running applications that uses the cloud computing delivery model. It helps in running scalable applications.

It is also considered the future of app development, with an estimated 28% of applications likely to be migrated to Cloud Native by 2022.

With cloud-native architecture being actively adopted for software development, Quality Analysts should also start adapting to it.

Before getting into the testing approach for cloud-native applications, let’s first understand the difference between monolithic architecture and cloud-native architecture.

Monolithic Architecture vs Cloud-Native Architecture

Monolithic architecture is the traditional model of app development, where all the components are tightly coupled. For a small functionality change, the entire application needs to be redeployed, which takes a lot of time and makes it a tedious job even for QAs, who have to run regression on the whole application. And because deployment takes so long, releases often happen only once every six months.

Cloud Native, on the other hand, is pretty much the opposite of monolithic architecture. Every component, packaged as a container, is loosely coupled and independent of the others. These containers are then connected via the cloud. Since every component is independent, it becomes much easier to maintain, and managing it becomes a shared responsibility. For a small functionality change, we only need to change and deploy the affected component, and QAs can focus their testing on that component alone.

Cloud Native also follows the philosophy of releasing early and often. Other major advantages are its ability to scale and the freedom to choose the tech stack.

Testing in Monolith and Cloud-Native Architecture

Now that we understand the difference between monolith and Cloud Native, let’s understand how testing differs in these architectures.

In monolithic architecture, testing is the process of verifying the product after a feature or functionality is developed and before it gets deployed to production.

In cloud-native apps, since everything is hosted in the cloud, we testers need to be alert all the time. Testing should be a continuous process: QAs need to test at every stage of the software development life cycle, whether it is a small bug fix or a functionality change. After every build, developers can push the changes to the cloud, and we can test them immediately and independently. And it is not only the dev environment; in cloud-native architecture QAs can test during deployment and even after the product is released.

Essentially, we QAs are always testing. So, with cloud-native apps, we know we will never run out of work and our job is secure 😜

We can also say that development and testing in Cloud Native happen in a common place. Well, honestly, it’s just a common server that is used!

Testing Approach for Cloud-Native Applications

In general, while designing a testing strategy or coming up with a testing approach, one needs to understand certain characteristics of the technology. Similarly, Cloud Native has certain measures that we need to focus on while coming up with a test plan. These measures not only ensure better development of the application but also enable efficient and effective testing.

Measures in Cloud-Native Apps and Approach to testing

1. Scalability: Scalability is the ability to scale the containers or pods based on real-time demand.

For instance, during Big Billion Days or any e-commerce sale, there is a sudden spike in the number of users. So it is required to ensure that the containers automatically scale up to handle such heavy demand, based on data storage capacity or processing power, and also auto-scale down when usage is low.

As QAs, some of the basic scenarios to cover here would be

  • What happens when instances are no longer required?
  • How is the service scaled?
  • How is traffic routed when nodes are overloaded?

In such auto-scaling testing, watching out for memory leaks becomes a key point, along with verifying that all threads are closed properly, that logging is in place, and that CPU utilisation stays within limits.
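To make this concrete, here is a minimal sketch (Python, using the official `kubernetes` client) of observing replica counts while an external load test runs. The deployment name, namespace, and polling values are hypothetical placeholders, and the assertions only illustrate the scale-up/scale-down checks described above.

```python
import time
from kubernetes import client, config

def watch_replicas(deployment="checkout-service", namespace="shop",
                   duration_s=600, interval_s=15):
    """Poll the ready replica count while an external load test runs,
    so we can later verify the service scaled up and back down."""
    config.load_kube_config()  # or load_incluster_config() when run inside the cluster
    apps = client.AppsV1Api()
    samples = []
    for _ in range(duration_s // interval_s):
        dep = apps.read_namespaced_deployment(deployment, namespace)
        samples.append((time.time(), dep.status.ready_replicas or 0))
        time.sleep(interval_s)
    return samples

if __name__ == "__main__":
    history = watch_replicas()
    peak = max(count for _, count in history)
    final = history[-1][1]
    print(f"peak replicas={peak}, final replicas={final}")
    # Illustrative checks: the service scaled up under load and scaled down afterwards.
    assert peak > history[0][1], "service never scaled up under load"
    assert final < peak, "service did not scale back down after load"
```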

2. Resiliency: The second measure is resiliency, the ability to recover any failed container or cluster without losing state.

To take an example, consider a user in the middle of a payment when the app suddenly crashes. In such a scenario, proper load balancing and redirection to the disaster recovery cluster are required to ensure there is no impact on the user’s experience in terms of functionality or performance. This also helps contain the cost impact on the business.

Handling failure scenarios to build a resilient system should be the focus here (a minimal sketch follows this list):

  • Does the system degrade gracefully?
  • Can we still respond to user requests if legacy systems are unavailable?
  • Kubernetes can perform rolling restarts and automatically spin up disaster recovery clusters in such scenarios.
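As a rough illustration of such a check, the sketch below (Python, official `kubernetes` client plus `requests`) kills one pod and verifies that the service keeps answering while Kubernetes replaces it. The namespace, label selector, and health URL are hypothetical placeholders.

```python
import time
import requests
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

NAMESPACE = "shop"                      # hypothetical namespace
SELECTOR = "app=payment-service"        # hypothetical pod label
HEALTH_URL = "https://example.com/payments/health"  # hypothetical endpoint

# 1. Kill one pod to simulate a container failure.
victim = core.list_namespaced_pod(NAMESPACE, label_selector=SELECTOR).items[0]
core.delete_namespaced_pod(victim.metadata.name, NAMESPACE)

# 2. The service should keep answering while Kubernetes replaces the pod.
for _ in range(30):
    assert requests.get(HEALTH_URL, timeout=5).status_code == 200
    time.sleep(2)

# 3. Eventually a replacement pod must be Running again.
pods = core.list_namespaced_pod(NAMESPACE, label_selector=SELECTOR).items
assert any(p.status.phase == "Running" and p.metadata.name != victim.metadata.name
           for p in pods), "no replacement pod came up"
```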

3. Robust Deployment: When planning automation efforts for cloud-native apps, the team should focus on the valuable areas rather than trying to automate everything. Since time is limited, automating everything is neither feasible nor valuable.

The focus should not be on large, unmaintainable, unreliable regression packs that add no value, but on smart test packs that can:

  • simultaneously validate a deployment, and
  • perform a selected set of functional tasks

This type of automated regression adds constant value to the development effort and can be expanded or reduced as the product matures.
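For illustration, a smart post-deployment smoke pack could look like the following pytest sketch; the endpoints, environment variables, and expected fields are hypothetical placeholders.

```python
import os
import requests

BASE_URL = os.environ.get("BASE_URL", "https://staging.example.com")  # hypothetical target

def test_service_is_up():
    # Validates the deployment itself: the service answers on its health check.
    assert requests.get(f"{BASE_URL}/health", timeout=5).status_code == 200

def test_expected_version_is_live():
    # Confirms the build we just deployed is actually the one serving traffic.
    # EXPECTED_VERSION is assumed to be set by the CI pipeline.
    body = requests.get(f"{BASE_URL}/version", timeout=5).json()
    assert body.get("version") == os.environ.get("EXPECTED_VERSION", "1.0.0")

def test_critical_user_journey():
    # One selected functional task, not the whole regression suite.
    resp = requests.get(f"{BASE_URL}/api/products?limit=1", timeout=5)
    assert resp.status_code == 200 and resp.json()
```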

4. Resource Utilisation: In traditional applications, operations teams would manage the infrastructure resource allocations manually. This is a tedious job and requires heavy maintenance.

In cloud-native applications, resources are allocated automatically by a central orchestrator that dynamically manages and schedules containers. Since this is based on rules set by DevOps, we need to ensure that loads stay balanced; otherwise it can result in low resource utilization or increased response times for our tests.

This reduces the costs associated with maintenance and operations.
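As one possible lightweight check, the sketch below (Python, official `kubernetes` client) flags containers that have no CPU/memory requests or limits set, since missing resource settings are a common cause of unbalanced scheduling. The namespace is a hypothetical placeholder.

```python
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

# Flag containers that the scheduler cannot balance properly because
# resource requests/limits were never declared.
for pod in core.list_namespaced_pod("shop").items:
    for container in pod.spec.containers:
        res = container.resources
        if res is None or not res.requests or not res.limits:
            print(f"{pod.metadata.name}/{container.name}: missing requests/limits")
```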

Cloud Native and Software Testing

While the functional aspects don’t change much, the cloud offers, and sometimes requires, very different ways to meet non-functional requirements, especially while developing the application. Cloud Native also helps us test apps in a better way.

Let’s look at how Cloud Native helps in achieving

  • E2E Automation
  • AB Testing
  • Performance Testing
  • Security Testing

1. E2E Automation: In the traditional model, Selenium Grid is mostly used to achieve end-to-end testing. In Selenium Grid, we have all come across very common errors while automating tests, such as “Error communicating with the remote browser. It may have died” or “UnreachableBrowserException”.

Deploying browsers using Cloud Native, say with Kubernetes, brings in advantages like self-healing, which is nothing but automatically spinning up the node or pod when it goes down.

From a scalability perspective, by simply changing the number of replicas we can scale to any number of browsers we need (which may not be as easy with other technologies).
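For reference, here is a minimal sketch of a test running against a Selenium Grid (or Selenium deployed on Kubernetes) through the Python bindings. The hub URL and target site are hypothetical placeholders; the browser pods themselves would be scaled on the Kubernetes side, for example with `kubectl scale deployment selenium-node-chrome --replicas=10` (deployment name assumed).

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
# Hypothetical in-cluster hub address; replace with your Grid URL.
driver = webdriver.Remote(
    command_executor="http://selenium-hub.selenium:4444/wd/hub",
    options=options,
)
try:
    driver.get("https://example.com")
    assert "Example" in driver.title
finally:
    driver.quit()
```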

2. AB testing: AB testing is nothing but releasing a version of the product to a specific set of users. It allows us to test a feature with real production users, with zero downtime, while also monitoring the traffic. QAs play a vital role, especially in selecting the target audience and running the tests for a long enough period to reach a statistical significance of at least 95%.

When performing an AB test, we route a subset of users to new functionality based on routing rules. Routing rules often include factors such as browser version, user agent, geolocation, and operating system. Based on these factors, and on measuring which version produced better results, the production environment can then be upgraded to that version.
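To illustrate the significance check mentioned above, here is a minimal sketch using only the Python standard library and a two-proportion z-test. The conversion counts are made-up numbers that would, in practice, come from the analytics or monitoring pipeline.

```python
import math
from statistics import NormalDist

def ab_significant(conv_a, n_a, conv_b, n_b, confidence=0.95):
    """Two-proportion z-test: did variant B convert differently from A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return p_value < (1 - confidence), p_value

# Hypothetical numbers: 480/10000 conversions on A, 560/10000 on B.
significant, p = ab_significant(conv_a=480, n_a=10000, conv_b=560, n_b=10000)
print(f"p-value={p:.4f}, significant at 95%: {significant}")
```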

Key challenges for AB Testing:

Several factors can skew the test results, including false positives, biased sampling, or external factors (such as seasonality or marketing promotions).

Observability can also be a challenge when we perform such testing. When we run multiple AB tests on overlapping traffic, monitoring and troubleshooting can become difficult. To put it in simpler terms, if we simultaneously test product page A versus product page B and checkout page C versus checkout page D, distributed tracing becomes important to attribute the metrics to the right variant.

3. Performance Testing: Performance testing in cloud-native apps is a challenge in itself.

Fundamental shift of approach: For monolithic applications, we have always hit URLs to do performance testing, but for cloud-native apps hosted in the cloud with various containers, we now also look at cluster or container performance. So we are moving from a URL-based approach to container-based performance. This is not to say we no longer hit URLs, but in cloud-native apps we also need to consider the performance of the containers, which are the core of this architecture.

For better performance in a cloud-native application, load balancers are configured to ensure that requests are sent to various nodes based on their configuration or rules. Although everything is automated here, observing and keeping track of the container logs is also required to ensure an effective troubleshooting mechanism.

Preparing the right set of test data is crucial to analyze the results correctly.
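As a simple illustration of the client-side half of such a test, the sketch below generates concurrent requests and reports a p95 latency and error count using only the standard library and `requests`. The URL and concurrency numbers are hypothetical, and in a cloud-native setup you would watch container-level metrics (for example via `kubectl top pods`) alongside these figures.

```python
import time
from concurrent.futures import ThreadPoolExecutor
import requests

URL = "https://example.com/api/checkout"  # hypothetical endpoint

def timed_request(_):
    # Measure wall-clock latency of a single request.
    start = time.perf_counter()
    resp = requests.get(URL, timeout=10)
    return time.perf_counter() - start, resp.status_code

# Fire 500 requests with up to 50 in flight at a time.
with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(timed_request, range(500)))

latencies = sorted(t for t, _ in results)
errors = sum(1 for _, code in results if code >= 500)
print(f"p95 latency: {latencies[int(len(latencies) * 0.95)]:.3f}s, 5xx errors: {errors}")
```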

4. Security Testing: Nowadays, most applications are hosted in the cloud, and since everything is hosted there, application security is at stake, as most applications handle highly sensitive data.

The main objective of cloud-based security testing is to stop any threat or malware from accessing, stealing, or manipulating sensitive data. It is responsible for identifying threats and measuring their potential vulnerabilities. One advantage of cloud-based application security testing is that it can be applied to large applications, applications with low to medium risk, and even applications with strict budget or time restrictions.

Cloud-based application security testing also makes it feasible to host the security testing tools themselves on the cloud. In traditional testing, we need on-prem tools and infrastructure to perform any kind of testing. With industries adopting cloud-based testing techniques, the process has also become faster and more cost-effective.
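As one small example of a security check that can run in the pipeline alongside cloud-hosted scanners, the sketch below verifies that common security response headers are present. The target URL and the header list are illustrative assumptions, not a complete security test.

```python
import requests

# Common hardening headers we expect the application (or its ingress) to set.
REQUIRED_HEADERS = [
    "Strict-Transport-Security",
    "X-Content-Type-Options",
    "Content-Security-Policy",
]

resp = requests.get("https://example.com", timeout=10)  # hypothetical target
missing = [h for h in REQUIRED_HEADERS if h not in resp.headers]
assert not missing, f"missing security headers: {missing}"
```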

New Era for QAs

With applications being upgraded to Cloud Native, we QAs also need to upgrade ourselves. Testing has always helped us build applications that are more maintainable, debuggable, reliable, and performant, allowing us to ship faster with more confidence.

But as applications have become more and more complex, new failure modes have been introduced that are more difficult to anticipate or troubleshoot. To efficiently detect, debug, and fix them, we should review how we can combine traditional testing techniques with the new ones and apply them to all stages of the development lifecycle. Only then can testing return to being the valuable ally it once was in delivering high-quality software.

Special Thanks to Manoj for guiding me through this topic :)

If you enjoyed reading, please do support with claps & share.

Do share your thoughts in the comments section!
