Do Testers Have to Write Code?

For years, whenever someone asked me if I thought testers had to know how to write code, I’ve responded: “Of course not.”

The way I see it, test automation is inherently a programming activity. Anyone tasked with automating tests should know how to program.

But not all testers are doing test automation.

Testers who specialize in exploratory testing bring a different and extremely valuable set of skills to the party. Good testers have critical thinking, analytical, and investigative skills. They understand risk and have a deep understanding of where bugs tend to hide. They have excellent communication skills. Most good testers have some measure of technical skill, such as system administration, databases, or networking, that lends itself to gray box testing. But some of the very best testers I’ve worked with could not have coded their way out of a for loop.

So unless they’re automating tests, I don’t think that testers should be required to have programming skills.

Increasingly I’ve been hearing that Agile teams expect all the testers to know how to write code. That made me curious. Has the job market really shifted so much for testers with the rise of Agile? Do testers really have to know how to code in order to get ahead?

My assistant Melinda and I set out to find the answer to those questions. Because we are committed to releasing only accurate data, we ended up doing this study three times. The first time we did it, I lost confidence in how we were counting job ads, so we threw the data out entirely. The second time we did it, I published some early results showing that more than 75% of the ads requested programming skills. But then we found problems with our data, so I didn’t publish the rest of the results and we started over. Third time’s a charm, right?

So here, finally, are the results of our third attempt at quantifying the demand for programming skills in testers. This time I have confidence in our data.

We surveyed 187 job ads seeking Software Testers or QA, posted between August 25 and October 16, 2010, from across 29 US states.

The vast majority of our data came from Craigslist (102 job ads) and LinkedIn (69 job ads); the rest came from a small handful of miscellaneous sites.

The jobs represent positions open at 166 distinct, identifiable companies. The greatest number of positions posted by any single company was two.

Although we tried to avoid a geographic bias, there is a bias in our data toward the West Coast. (We ended up with 84 job listings in California alone.) This might reflect where the jobs are, or it might be that doing this research from California skewed our search results. I’m not sure.

In order to make sure that our data reflected real jobs with real employers, we screened out any jobs advertised by agencies. That might bias our sample toward companies that care enough to source their own candidates, but it prevents our data from being polluted by duplicate listings and fake job ads used to garner a pool of candidates.

Here’s what we found: out of the 187 jobs we sampled, 112 indicate that programming of some kind is required, and an additional 39 indicate that programming is a nice skill to have. That’s just over 80% of the test jobs requesting programming skills.

Just in case that sample was skewed by including test automation jobs, I removed the 23 jobs with titles like “Test Automation Engineer” or “Developer in Test.” Of the remaining 164 jobs, 93 required programming and 37 listed it as a nice-to-have. That’s still 79% of QA/Test jobs requesting programming.

It’s important to understand how we counted the job ads.

We counted any job ad as requiring programming skills if the ad required experience with or knowledge of a specific programming language, or stated that the job duties required using a programming language. Similarly, we counted a job ad as requesting programming skills if it indicated that knowledge of a specific language was a nice-to-have.

The job ads mentioned all sorts of things that different people might, or might not, count as a programming language. For our purposes, we counted SQL and shell/batch scripting as programming languages. A tiny number of job ads (6) required programming without naming a specific language, instead listing broad experience requirements like “Application development in multiple coding languages.” Those counted too.
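To make those counting rules concrete, here is a rough, purely illustrative sketch in Python of how one might encode them. The keyword lists and the classify_ad helper are my own invention for this illustration, not the process we actually used, though the arithmetic at the end reproduces the headline numbers from the counts above.

    # Illustrative sketch only -- not the actual process behind this survey.
    # It encodes counting rules similar to the ones described above and then
    # reproduces the headline percentages from the reported counts.

    LANGUAGES = ["sql", "java", "perl", "python", "c++", "c#",
                 "javascript", "ruby", "shell script", "batch file", ".net"]
    NICE_TO_HAVE_MARKERS = ["nice to have", "a plus", "preferred", "desirable"]

    def classify_ad(text):
        """Return 'required', 'nice-to-have', or 'none' for one job ad."""
        lowered = text.lower()
        if not any(lang in lowered for lang in LANGUAGES):
            return "none"
        # Crude heuristic: if the ad frames its language mentions as optional,
        # count programming as requested rather than required.
        if any(marker in lowered for marker in NICE_TO_HAVE_MARKERS):
            return "nice-to-have"
        return "required"

    # The arithmetic behind the numbers reported above:
    print(f"{(112 + 39) / 187:.1%}")  # all 187 ads -> 80.7%
    print(f"{(93 + 37) / 164:.1%}")   # minus the 23 automation-titled ads -> 79.3%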

The bottom line: our numbers indicate that approximately 80% of the job ads you’d find when searching for Software QA or Test positions ask for programming skills.

Regardless of my personal beliefs, that data suggests that anyone who is serious about a career in testing would do well to pick up at least one programming language.

So which programming languages should you pick up? Here are the top 10 programming languages mentioned (counting both required and nice-to-have mentions):

  • SQL or relational database skills (84)
  • Java, including J2EE and EJBs (52)
  • Perl (44)
  • Python (39)
  • C/C++ (30)
  • Shell Scripting (27); an additional 4 ads mentioned batch files
  • JavaScript (24)
  • C# (23)
  • .NET including VB.NET and ASP.NET but not C# (19)
  • Ruby (9)

This data makes it pretty clear to me that at a minimum, professional testers need to know SQL. I will admit that I was a little sad to see that only 9 of the job ads mentioned Ruby. Oh well.
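If you are wondering what “knowing SQL” looks like in practice for a tester, here is a small, purely hypothetical sketch of the kind of back-end verification query that comes up constantly, written in Python against an in-memory SQLite database. The schema, data, and names are all invented for illustration.

    # Hypothetical example: the tables and data are invented; the point is
    # the kind of data-integrity check testers routinely write in SQL.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders    (id INTEGER PRIMARY KEY, customer_id INTEGER);
        INSERT INTO customers VALUES (1, 'Ada');
        INSERT INTO orders    VALUES (10, 1), (11, 2);  -- order 11 is orphaned
    """)

    # Classic back-end check: find orders whose customer no longer exists.
    orphans = conn.execute("""
        SELECT o.id
        FROM orders o
        LEFT JOIN customers c ON c.id = o.customer_id
        WHERE c.id IS NULL
    """).fetchall()

    assert orphans == [(11,)], f"unexpected orphaned orders: {orphans}"
    print("orphaned orders:", orphans)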

In addition, there were three categories of technical skills that aren’t really programming languages but that came up so often that they’re worth calling out:

  • 31 ads mentioned XML
  • 28 ads mentioned general Web Development skills including HTTP/HTTPS, HTML, CSS, and XPath
  • 17 ads mentioned Web Services or referenced SOAP and XSL/XSLT

We considered test automation technologies separately from programming languages. Out of our sample, 27 job ads said that they require knowledge of test automation tools and an additional 50 ads said that test automation tool knowledge is a nice-to-have. (As a side note, I find it fascinating that 80% of the ads requested programming skills, but only about half that number mentioned test automation. I’m not sure if there’s anything significant there, but I find it fascinating nonetheless.)

The top test automation technologies were:

  • Selenium, including SeleniumRC (31)
  • QTP (19)
  • XUnit frameworks such as JUnit, NUnit, TestNG, etc. (14)
  • LoadRunner (11)
  • JMeter (7)
  • WinRunner (7)
  • SilkTest (6)
  • SilkPerformer (4)
  • Visual Studio/TFS (4)
  • Watir or Watin (4)
  • Eggplant (2)
  • FitNesse (2)

Two things stood out to me about that tools list.

First, the number one requested tool is open source. Overall, more than half of all the test automation tool mentions are for free or open source tools. I’ve been saying for a while that the commercial test automation tool vendors ought to be nervous. I believe that this data backs me up. The revolution I predicted in 2006 is well under way, and Selenium has emerged as a winner.

Second, I was surprised at the number of ads mentioning WinRunner: it’s an end-of-lifed product.

My personal opinion (not supported by research) is that this is probably because companies that had made a heavy investment in WinRunner just were not in a position to tear out all their automated tests simply because HP/Mercury decided not to support their tool of choice. Editorializing for a moment: I think that shows yet another problem with closed source commercial products. Selenium can’t ever be end-of-lifed: as long as there is a single user out there, that user will have access to the source and be able to make whatever changes they need.
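For anyone who has never seen what Selenium-style automation looks like, here is a minimal, hypothetical sketch using Selenium’s Python bindings and the WebDriver API (the successor to the Selenium RC mentioned in the list above). The URL, locators, and expected title are all invented for illustration.

    # Minimal, hypothetical Selenium sketch. Requires the selenium package
    # and a ChromeDriver available on the PATH; the page and locators are
    # invented purely for illustration.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/login")           # hypothetical page
        driver.find_element(By.NAME, "username").send_keys("qa_user")
        driver.find_element(By.NAME, "password").send_keys("not-a-real-secret")
        driver.find_element(By.ID, "submit").click()
        assert "Dashboard" in driver.title, driver.title  # hypothetical check
    finally:
        driver.quit()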

But I digress.

As long as we were looking at job ads, Melinda and I decided to look into the pay rates that these jobs offered. Only 15 of the ads mentioned pay, and the pay levels were all over the map.

Four of the jobs paid in the $10-$14/hr range. All four of those positions were part-time or temporary contracts. None of the ads required any particular technical skills. They’re entry-level button-pushing positions.

The remaining 11 positions ranged from $40K/year at the low end to $130K/year at the high end. There just are not enough data points to draw any real conclusions related to salary other than what you might expect: jobs in major technology centers (e.g. Massachusetts and California) tend to pay more. If you want more information about salaries and positions, I highly recommend spelunking through the salary data available from the Bureau of Labor Statistics.

And finally, I wondered how many of the positions referred to Agile. The answer: 55 of the 187 job ads.

Even more interesting, of those 55 ads, 49 requested programming skills. So while 80% of all ads requested programming skills, almost 90% of the ads that explicitly referenced Agile did. I don’t think there’s enough data available to draw any firm conclusions about whether the rise of Agile means that more and more testers are expected to know how to write code. But I certainly think it’s interesting.

So, that concludes our fun little romp through 187 job listings. I realize that you might have more questions than I can answer. If you want to analyze the data for yourself, you can find the raw data here.

Elisabeth Hendrickson

Elisabeth Hendrickson founded her company as Quality Tree Consulting in 1997 to provide training and consulting in software quality and testing. She incorporated the company as Quality Tree Software, Inc. in 1998. In 2003, Elisabeth became involved with the Agile community. In 2005 she became a Certified Scrum Master and in 2006 she joined the board of directors for the Agile Alliance. You can follow her blog at http://testobsessed.com/ or visit her company site Quality Tree Consulting at  http://www.qualitytree.com/. To read the original post, visit http://testobsessed.com/?s=testers+have+to+write+code%3F