Cruise Control: Automation in Performance Testing


When it comes to performance testing, be smart about what and how you automate

Listen closely to the background hum of any agile shop, and you’ll likely hear this ongoing chant: Automate! Automate! Automate! While automation can be incredibly valuable to the agile process, there are some key things to keep in mind when it comes to automated performance testing.

Automated performance testing is important for many different reasons. It allows you to refactor or introduce change and test for acceptance with virtually no manual effort. You can also stay on the lookout for regression defects and test for things that just wouldn’t come up manually. Ultimately, automated testing should save time and resources, so you can release code that is bug-free and ready for real-world use.

Recently, I spoke with performance specialist Brad Stoner about how to fit performance testing into agile development cycles. This week, we’ll use this blog post to follow up with greater detail around performance testing automation and recap which performance tests are good candidates for automation. After all, automation is an important technique for any modern performance engineer to master.

Automation Without Direction

Most of the time, automation gets set up without performance testing in mind. Performance testing is, at best, an afterthought to the automation process. That leaves you, as a performance engineer, stuck with some pretty tricky scenarios. Maybe every test case is a functional use case, and adapting it for performance means going back and modifying it for scale or high concurrency. Or perhaps the data required for a large performance test is never put together, leaving you with a whole new pile of work to do.

Use cases are strung together in an uncoordinated way, so you have to create another document that describes how to use existing functional tests to conduct a load test. And of course, those test cases are stuck on "the happy path," verifying that functionality works properly; they don't exercise edge cases or stress conditions, and therefore don't surface performance defects.

None of these scenarios is desirable, but they can be easily rectified by incorporating performance objectives into your automation strategy from the start. You want to plan your approach to automation intelligently.

What Automation Is – And Isn’t – Good For

You can't automate everything all the time. If you run daily builds, you can't do a massive load test every night, and the idea becomes even less feasible if you build several times a day. Instead, you'll have to pick and choose your test cases, mapping out what you run over time in coordination with the release cycle for the app.

Trying to cover too many use cases at once will kill your environment. Constantly high traffic patterns are next to impossible to maintain. Highly specific test scenarios can also cause difficulty, because you may need to adjust performance tests every time something changes. That's why it pays to be smart about what you automate.

Look for a manageable number of tests that can be run generically and regularly. Then, benchmark those tests. After that, you can focus your manual time on ad hoc testing, bottlenecks, or areas under active development. This approach will catch a ton of defects before they reach production.
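The benchmark-then-compare loop can be sketched in a few lines; the timings and the 20% tolerance are illustrative:

```python
import statistics

def regressed(benchmark_ms, current_ms, tolerance=0.20):
    """Flag a run whose median response time is more than
    `tolerance` (20% by default) slower than the benchmark."""
    baseline = statistics.median(benchmark_ms)
    current = statistics.median(current_ms)
    return current > baseline * (1 + tolerance)

benchmark = [110, 120, 115, 118, 112]  # ms, from an earlier baseline run
tonight   = [150, 160, 155, 149, 158]  # ms, from tonight's automated run

if regressed(benchmark, tonight):
    print("Regression: median response time is >20% over benchmark")
```

Using the median rather than the mean keeps one slow outlier request from triggering a false alarm across an otherwise healthy run.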

Get Automation Working for You

Automation can be great, but it has to identify performance defects and alert you. Just as functional tests validate a defined plan for how an application should behave, performance tests should validate your application's service level agreement. Define the tests for which you want to leverage automation. Is it workload capacity? Are you looking for stress, duration, and soak tests? Will you automate to find defects on the front end?
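One way to make an SLA something automation can validate is to turn it into an executable check. A sketch, assuming a hypothetical agreement that 95% of requests finish under 500 ms:

```python
import math

SLA_P95_MS = 500  # hypothetical SLA: 95% of requests under 500 ms

def p95(samples_ms):
    """95th-percentile response time, by the nearest-rank method."""
    ordered = sorted(samples_ms)
    rank = math.ceil(0.95 * len(ordered))
    return ordered[rank - 1]

def meets_sla(samples_ms):
    """True if the run's p95 latency is within the agreed limit."""
    return p95(samples_ms) <= SLA_P95_MS

# 5% of requests are slow, but the p95 still sits at 120 ms: SLA holds.
run = [120] * 95 + [900] * 5
print(meets_sla(run))  # True
```

A check like this can run after every automated load test, so an SLA breach fails the build instead of surfacing in production.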

These kinds of tests are easy to automate, and you can do it at low cost. You'll want to establish benchmarks and baselines often to see whether performance degrades as the application is further developed. Testing with direction means that you don't test just for the sake of testing. You always test with a purpose and motive: to find and isolate performance defects. This is critical for a performance engineer, because you're always pushing the envelope of the application, and you need to know where that boundary lies.

Get Ready for Smooth Sailing

Automated performance testing can be a huge time saver. To make the most of that time-saving potential, you want to do it right. Work smart by always testing with purpose. Ready to dive even deeper into these topics? Jump right in and check out the full webcast here where we go into greater detail about automation strategies. You can also learn how Neotys can help you with the overall agile performance testing cycle.

This article originally appeared on the Neotys Blog.

Tim Hinds
Tim Hinds is the Product Marketing Manager for NeoLoad at Neotys. He has a background in Agile software development, Scrum, Kanban, Continuous Integration, Continuous Delivery, and Continuous Testing practices. Previously, Tim was Product Marketing Manager at AccuRev, a company acquired by Micro Focus, where he worked with software configuration management, issue tracking, Agile project management, continuous integration, workflow automation, and distributed version control systems.

