Letter From The Editor – March 2020

Methods and strategy have been my favorite topics since I started working in testing. Testing is essentially engineering problem-solving: looking for efficiency while attempting to measure effectiveness. So, how do we develop a set of practices to solve our Software Testing engineering problems?

One aspect of Lean Software Development and Lean Manufacturing that I love is the idea of “amplify learning,” or continuous learning. Finding bugs is a good example: the more bugs you find, the more you learn, and the better you understand your system. Notably, Lean practices do not include the idea of “best practices.” Some people live by best practices, and I am regularly asked, “What are the best testing practices?” My response is always the same: there are no best practices. There are better practices, but they depend on the context of your Software Development practices, tools, customers, platform, skills, release cycles, and other factors; that dependency is exactly why there can be no standard best practices. It also means that as your product matures and you gain more knowledge of your product, platform, and customers, your testing strategy must change. When a team does what it does simply because that is how it has always been done, it tells me the team is neither learning nor growing. Strategies are not static: they evolve and get fine-tuned.

Strategy questions are some of the most frequently asked questions in my consulting work.

From Developers, I get planning questions:

  • How much testing is enough?
  • What’s the best and easiest method to show that everything’s working?
  • Do I have to unit test at all if I test everything at the API level?

From Test Engineers, I get a wide variety of questions:

  • If they’re running all of these tests at the unit level, what do I have to test at the API or UI level?
  • If I have automated tests in a headless browser, do I still need to test specific browsers?
  • Do we have enough Automated Testing? Do we have too much Automated Testing?
  • How can I get the “biggest bang for my buck” with Test Automation?

From Product Owners, I get the same questions I was asked 20 years ago:

  • Why can’t my test team test faster?
  • Why do they miss bugs?
  • Why can’t we automate all of our testing?

The questions I love the most are:

  • What’s the best method to use for the most coverage with a minimum number of test cases?
  • How can I cut maintenance costs on my automated tests?
  • I don’t know our users—how can I test?

With the exception of Developers now doing more testing themselves, these questions are testing questions across development methods. They do not hinge on Agile, DevOps, or Continuous Integration/Continuous Delivery (CI/CD), and they are not unique to today’s faster deployment cycles. The answers to all of these questions revolve around developing a strategy and being able to articulate:

  • The strategy itself, communicated clearly to the team
  • A shared understanding among Developers, Product Owners, and Test Engineers of what is being tested, at what level, and how
  • What will be automated, what will be done manually, and what the biggest risks are

The overarching issue here is that you can go through an entire Software Engineering program at a university and learn little to nothing about test methods and strategy. Most Test Engineers don’t arrive with built-in skills; they rarely come with a working knowledge of 5 or 6 different test methods for tackling common or complex testing problems, so they end up trying to solve every problem with 1 or 2 basic methods. What I find is that most teams will validate acceptance criteria, “mess around” with the function for a while (i.e., perform Ad Hoc Testing), and call it done. However, there are a great number of courses, programs, and certifications surrounding Software Testing today that can give you a solid head start; there is no excuse for not knowing multiple ways to build a strategy these days.

So, this brings us to our March issue of the LogiGear Magazine: Smarter Testing Strategies for the Modern SDLC. This issue features some great articles that can start the conversation of, “Do my testing strategies need some tweaks?” Our cover story, Smoke Testing: An Exhaustive Guide for a Non-Exhaustive Suite, delves deep into Smoke Testing, highlighting some better practices and offering some tips and tricks. Blogger of the Month explores all of the necessary components of an effective Mobile Testing strategy. Considerations for Automating Testing for Data Warehouse and ETL Projects explains just what ETL Testing is, as well as how and why you should automate it. TestArchitect Corner features an in-depth, step-by-step guide to automating Cross-Browser Testing, and our infographic, How to Develop a Test Automation Strategy, provides a roadmap for ensuring your Automation strategy is effective from the start. Finally, Leader’s Pulse offers advice for managers and leaders who may be struggling to promote productivity within their teams.

But, as I stated earlier: there are no overarching best practices. While this issue is full of useful information, it is not a tell-all guide to the perfect testing practice. Rather, use it as a means to “amplify learning.” With that said, please enjoy our newest issue of the LogiGear Magazine!

Michael Hackett
Michael is a co-founder of LogiGear Corporation and has over two decades of experience in software engineering in banking, securities, healthcare, and consumer electronics. Michael is a Certified Scrum Master and has co-authored two books on software testing: Testing Applications on the Web: Test Planning for Mobile and Internet-Based Systems (Wiley, 2nd ed. 2003) and Global Software Test Automation (Happy About Publishing, 2006). He is a founding member of the Board of Advisors at the University of California Berkeley Extension and has taught for the Certificate in Software Quality Engineering and Management at the University of California Santa Cruz Extension. As a member of IEEE, he has brought Silicon Valley testing expertise to over 16 countries through his training courses. Michael holds a Bachelor of Science in Engineering from Carnegie Mellon University.
