Do Testers Have to Write Code?

For years, whenever someone asked me if I thought testers had to know how to write code, I’ve responded: “Of course not.”

The way I see it, test automation is inherently a programming activity. Anyone tasked with automating tests should know how to program.

But not all testers are doing test automation.

Testers who specialize in exploratory testing bring a different and extremely valuable set of skills to the party. Good testers have critical thinking, analytical, and investigative skills. They understand risk and have a deep understanding of where bugs tend to hide. They have excellent communication skills. Most good testers have some measure of technical skill (system administration, databases, networks, and the like) that lends itself to gray box testing. But some of the very best testers I’ve worked with could not have coded their way out of a for loop.

So unless they’re automating tests, I don’t think that testers should be required to have programming skills.

Increasingly, I’ve been hearing that Agile teams expect all of their testers to know how to write code. That made me curious. Has the job market really shifted that much for testers with the rise of Agile? Do testers really have to know how to code in order to get ahead?

My assistant Melinda and I set out to find the answer to those questions. Because we are committed to releasing only accurate data, we ended up doing this study three times. The first time we did it, I lost confidence in how we were counting job ads, so we threw the data out entirely. The second time we did it, I published some early results showing that more than 75% of the ads requested programming skills. But then we found problems with our data, so I didn’t publish the rest of the results and we started over. Third time’s a charm, right?

So here, finally, are the results of our third attempt at quantifying the demand for programming skills in testers. This time I have confidence in our data.

We surveyed 187 job ads seeking Software Testers or QA staff, posted across 29 US states between August 25 and October 16, 2010.

The vast majority of our data came from Craigslist (102 job ads) and LinkedIn (69 job ads); the rest came from a small handful of miscellaneous sites.

The jobs represent positions open at 166 distinct, identifiable companies. The greatest number of positions posted by any single company was two.

Although we tried to avoid a geographic bias, there is a bias in our data toward the West Coast. (We ended up with 84 job listings in California alone.) This might reflect where the jobs are, or it might be that doing the research from California skewed our search results. I’m not sure.

In order to make sure that our data reflected real jobs with real employers, we screened out any jobs advertised by agencies. That might bias our sample toward companies that care enough to source their own candidates, but it prevents our data from being polluted by duplicate listings and fake job ads used to garner a pool of candidates.

Based on our sample, here’s what we found: out of the 187 jobs, 112 indicated that programming of some kind is required, and an additional 39 indicated that programming is a nice skill to have. That’s 151 of 187 jobs, or just over 80% of test jobs, requesting programming skill.

Just in case that sample was skewed by including test automation jobs, I removed the 23 jobs with titles like “Test Automation Engineer” or “Developer in Test.” Of the remaining 164 jobs, 93 required programming and 37 listed it as a nice to have. That’s still 79% of QA/Test jobs requesting programming.
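To make the arithmetic explicit, here is a quick sketch (using only the counts reported above) showing how those percentages fall out:

```python
# Counts reported in the survey above.
total_ads = 187
required = 112            # programming required
nice_to_have = 39         # programming listed as a nice to have

requesting = required + nice_to_have
pct_all = requesting / total_ads * 100
print(f"{pct_all:.1f}% of all ads request programming")

# Excluding the 23 ads with automation-specific titles.
remaining = total_ads - 23
pct_rest = (93 + 37) / remaining * 100
print(f"{pct_rest:.1f}% excluding automation-titled jobs")
```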

It’s important to understand how we counted the job ads.

We counted any job ad as requiring programming skills if the ad required experience or knowledge of a specific programming language or stated that the job duties required using a programming language. Similarly, we counted a job ad as requesting programming skills if it indicated that knowledge of a specific language was a nice to have.

The job ads mentioned all sorts of things that different people might, or might not, count as a programming language. For our purposes, we counted SQL and shell/batch scripting as programming languages. A tiny number of job ads (6) indicated that they required programming without listing a specific language by listing broad experience requirements like “Application development in multiple coding languages.” Those counted too.
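We did the tallying by hand, but the counting rule can be illustrated with a small, hypothetical keyword check. The language list and phrases below are examples for illustration, not our actual screening criteria:

```python
# Hypothetical sketch of the counting rule; the real tally was done manually.
# An ad "requests" programming if it names a specific language (we counted
# SQL and shell/batch scripting) or states broad coding-experience requirements.
LANGUAGES = {"sql", "java", "perl", "python", "c++", "c#", "javascript",
             "ruby", "shell", "batch"}
BROAD_PHRASES = {"coding languages", "application development"}

def requests_programming(ad_text: str) -> bool:
    text = ad_text.lower()
    return (any(lang in text for lang in LANGUAGES)
            or any(phrase in text for phrase in BROAD_PHRASES))

requests_programming("Must have 3+ years of Perl and SQL")    # counted
requests_programming("Manual exploratory testing of web UI")  # not counted
```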

The bottom line is that our numbers indicate approximately 80% of the job ads you’d find if searching for jobs in Software QA or Test are asking for programming skills.

Regardless of my personal beliefs, that data suggests that anyone who is serious about a career in testing would do well to pick up at least one programming language.

So which programming languages should you pick up? Here are the top 10 programming languages mentioned (counting both required and nice-to-have mentions):

  • SQL or relational database skills (84)
  • Java, including J2EE and EJBs (52)
  • Perl (44)
  • Python (39)
  • C/C++ (30)
  • Shell scripting (27; an additional 4 ads mentioned batch files)
  • JavaScript (24)
  • C# (23)
  • .NET including VB.NET and ASP.NET but not C# (19)
  • Ruby (9)

This data makes it pretty clear to me that at a minimum, professional testers need to know SQL. I will admit that I was a little sad to see that only 9 of the job ads mentioned Ruby. Oh well.
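If you are wondering what “knowing SQL” looks like in practice for a tester, it is typically the ability to query the application’s data directly to verify results. Here is a hypothetical example (the schema and data are invented for illustration), run against an in-memory SQLite database:

```python
import sqlite3

# Invented example: verify that no order references a missing customer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (10, 1), (11, 2), (12, 3);  -- 12 is orphaned
""")

# The kind of query a tester might run to find referential-integrity bugs:
orphans = conn.execute("""
    SELECT o.id FROM orders o
    LEFT JOIN customers c ON c.id = o.customer_id
    WHERE c.id IS NULL
""").fetchall()

print(orphans)  # the orphaned order id(s)
```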

In addition, there were three categories of technical skills that aren’t really programming languages but that came up so often that they’re worth calling out:

  • 31 ads mentioned XML
  • 28 ads mentioned general Web Development skills including HTTP/HTTPS, HTML, CSS, and XPath
  • 17 ads mentioned Web Services or referenced SOAP and XSL/XSLT

We considered test automation technologies separately from programming languages. In our sample, 27 job ads required knowledge of test automation tools and an additional 50 listed test automation tool knowledge as a nice to have. (As a side note, I find it fascinating that 80% of the ads requested programming skills, but only about half that number mentioned test automation. I’m not sure if there’s anything significant there, but I find it fascinating nonetheless.)

The top test automation technologies were:

  • Selenium, including Selenium RC (31)
  • QTP (19)
  • xUnit frameworks such as JUnit, NUnit, TestNG, etc. (14)
  • LoadRunner (11)
  • JMeter (7)
  • WinRunner (7)
  • SilkTest (6)
  • SilkPerformer (4)
  • Visual Studio/TFS (4)
  • Watir or WatiN (4)
  • Eggplant (2)
  • FitNesse (2)

Two things stood out to me about that tools list.

First, the number one requested tool is open source. Overall, more than half of all test automation tool mentions were for free or open source tools. I’ve been saying for a while that the commercial test automation tool vendors ought to be nervous. I believe this data backs me up. The revolution I predicted in 2006 is well under way, and Selenium has emerged a winner.

Second, I was surprised at the number of ads mentioning WinRunner: it’s an end-of-lifed product.

My personal opinion (not supported by research) is that companies that had made a heavy investment in WinRunner were simply not in a position to tear out all their automated tests just because HP/Mercury decided to stop supporting their tool of choice. Editorializing for a moment: I think that shows yet another problem with closed-source commercial products. Selenium can never be end-of-lifed: as long as there is a single user out there, that user will have access to the source and be able to make whatever changes they need.

But I digress.

As long as we were looking at job ads, Melinda and I decided to look into the pay rates that these jobs offered. Only 15 of the ads mentioned pay, and the pay levels were all over the map.

Four of the jobs paid in the $10–$14/hr range. All four were part-time or temporary contracts, and none required any particular technical skills. They’re entry-level button-pushing positions.

The remaining 11 positions ranged from $40K/year at the low end to $130K/year at the high end. There just are not enough data points to draw any real conclusions related to salary other than what you might expect: jobs in major technology centers (e.g. Massachusetts and California) tend to pay more. If you want more information about salaries and positions, I highly recommend spelunking through the salary data available from the Bureau of Labor Statistics.

Finally, I wondered how many of the positions referred to Agile. The answer: 55 of the job ads.

Even more interesting, of those 55 ads, 49 requested programming skills. So while 80% of all ads requested programming skills, almost 90% of the ads that explicitly referenced Agile did. I don’t think there’s enough data available to draw any firm conclusions about whether the rise of Agile means that more and more testers are expected to know how to write code. But I certainly think it’s interesting.

So, that concludes our fun little romp through 187 job listings. I realize that you might have more questions than I can answer. If you want to analyze the data for yourself, you can find the raw data here.

 

Elisabeth Hendrickson

Elisabeth Hendrickson founded her company as Quality Tree Consulting in 1997 to provide training and consulting in software quality and testing. She incorporated the company as Quality Tree Software, Inc. in 1998. In 2003, Elisabeth became involved with the Agile community. In 2005 she became a Certified Scrum Master and in 2006 she joined the board of directors for the Agile Alliance. You can follow her blog at http://testobsessed.com/ or visit her company site Quality Tree Consulting at  http://www.qualitytree.com/. To read the original post, visit http://testobsessed.com/?s=testers+have+to+write+code%3F