Capitalizing Testware as an Asset

Companies generally consider the software they own, whether it is created in-house or acquired, as an asset (something that could appear on the balance sheet). The production of software impacts the profit and loss accounts for the year it is produced: The resources used to produce the software result in costs, and methods, tools, or practices that reduce those costs are considered profitable.

Software Testing is generally regarded as an activity, not a product; the test team tests the products of the development team. In that sense testing is seen in terms of costs and savings: The activity costs money; finding bugs early saves money. Test Automation can reduce the cost of the testing itself.

Managing the testing effort in financial terms of profit and loss (costs and savings) is a good thing, particularly if it leads managers to make conscious decisions about the amount of testing that should be performed: More testing costs more, and less testing increases risks, which are potential (often much higher) costs down the line.

Very few companies think of software tests as products or, in financial terms, as company assets. Test teams are not seen as “producing” anything. This is unfortunate, since it underestimates, particularly in financial terms, the value of good “testware.”

The underlying reasons for not treating testware as a long-term asset are hardly surprising:

  • In Manual Testing, the bulk of the hours are spent executing tests against the system, even if test cases are documented in a test plan.
  • In most Test Automation projects, the test scripts are not well architected and are too sensitive to system changes.

If an organization begins to consider its tests as assets, then it can significantly enhance the way that it approaches testing. Consider the following:

  • Test cases for your application have a definite value, and just like any other capital asset, can depreciate over time as the underlying application changes.
  • Well-written test cases, along with thoroughly documented requirements and specifications, are one of the few ways to consolidate the ‘intellectual capital’ of your team members. With today’s global teams, and the increasing challenge of retaining engineers, especially overseas, being able to retain knowledge as people come and go is critical to the success of your testing effort (and the entire product development).
  • Well-automated tests can be re-used over and over again, thus forming assets which produce profits for the company.

So how can you apply this idea at your company?

Creating automated tests is the best way I’ve found to maximize the output of your investment in software testing. Not only does Test Automation reduce your costs (a positive impact to your P&L), but well-designed Test Automation is also a valuable asset (a positive impact on the balance sheet of the company) that can be used across many different versions of your product, even as you switch between platforms!

  • As much as possible, define your tests at the ‘business process’ level, leaving out unneeded details of the application under test (AUT), like its UI layout or screen flow. Business processes change less frequently than the systems that support them, so your tests will require less maintenance (i.e. depreciate less quickly).
  • The tests should be executable either automatically or manually, so that they still provide value even when the system has changed and some updates to the Automation are required. Keyword-Driven Testing is a great example of how tests can be defined in a format that can be executed either way.
  • Remember that Test Automation tools are not silver bullets. To maximize the output of your investment in Test Automation, you must combine good methodology and technology. A poorly planned Test Automation effort can quickly become a burden on your organization that provides little value.
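To make the keyword-driven format mentioned above concrete, here is a minimal sketch in Python. The action names, the banking scenario, and the `run` engine are all hypothetical illustrations, not the API of any particular tool (real toolsets such as TestArchitect differ in detail). The point is that the test itself is a table of business-level actions: a person can read and execute it manually, while a thin engine maps each keyword to an automation function.

```python
# A test case as a table of business-level actions (keyword, arguments).
# This table is the asset: it survives UI changes, and it doubles as a
# manual test script a human can follow step by step.
test_case = [
    ("open account",  {"customer": "Jane Doe", "account_type": "savings"}),
    ("deposit",       {"amount": 500}),
    ("check balance", {"expected": 500}),
]

# Stand-in for the application under test; in practice these functions
# would drive the real system (UI, API, etc.).
balance = {"value": 0}

def open_account(customer, account_type):
    balance["value"] = 0

def deposit(amount):
    balance["value"] += amount

def check_balance(expected):
    assert balance["value"] == expected, (
        f"expected {expected}, got {balance['value']}")

# The engine: a simple mapping from keyword to automation function.
actions = {
    "open account": open_account,
    "deposit": deposit,
    "check balance": check_balance,
}

def run(test):
    for keyword, args in test:
        actions[keyword](**args)  # dispatch keyword -> implementation

run(test_case)
print("test passed")
```

When the AUT changes, only the small action implementations need updating; the test tables themselves, the capitalized asset, remain intact.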

Hans Buwalda

Hans leads LogiGear’s research and development of test automation solutions, and the delivery of advanced test automation consulting and engineering services. He is a pioneer of the keyword approach for software testing organizations, and he assists clients in strategic implementation of the Action Based Testing™ method throughout their testing organizations.

Hans is also the original architect of LogiGear’s TestArchitect™, the modular keyword-driven toolset for software test design, automation and management. Hans is an internationally recognized expert on test automation, test development and testing technology management. He is coauthor of Integrated Test Design and Automation (Addison Wesley, 2001), and speaks frequently at international testing conferences.

Hans holds a Master of Science in Computer Science from Free University, Amsterdam.

