
Solutions for Testing Web Services

An overview of web service testing solutions for traditional or non-technical testers.

Much has been written on the technical execution of API tests, yet there are gaps in the details of what tests to design and how to design them. Articles tend to either get too technical too fast, or stay too vague to be much help. This article is written for the consumers of web services and addresses what to test, test design, and solutions for traditional testers. The test strategy used by these testers will be quite different from the test strategy for developers or producers of web services. Also, it’s important to understand that when I use the phrase traditional tester, I mean someone who is more of a subject matter/domain expert than a technical expert, but who is well skilled in testing and QA.

A few observations about API testing:

  • Developers now are much more responsible for testing.
  • Most testers today have much more technical ability than traditional testers. Hence, testers can do more API/web service testing.
  • Much more testing happens earlier and at the API level.
  • Most teams don’t have the time to wait to test a web service through the GUI.
  • We all know about the cost of finding a bug: the longer you wait to find and fix bugs, the greater the cost. Waiting until late in a release cycle to test an API through a UI is indeed more expensive.

Problem Definition

Let’s first discuss what an API is, regardless of whether it’s a web service or old style API. It’s important to note that web services are APIs, but not all APIs are web services. An API is a function that someone wrote, which someone else’s application will use or consume. A web service is the call to the function and the data that goes along with it. What you are testing is the integration of the function and the data exchanged with that function.

Consider some of the names for an API: web service, remote procedure call, remote function call, library or
procedure. The function is written by one person, the producer, for another, the API consumer.

The API concept is very simple. From a testing standpoint, I like to think of an API as a function behind a locked door, one which you need a key to unlock so as to get access to the function. You need to use the key that’s an exact fit. You have to test your key to make sure it works in your lock. And you must test other keys that are almost correct, to make sure they will not work, and will, in fact, trigger an error.
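The locked-door idea can be sketched in a few lines of Python. Everything here is invented for illustration — a hypothetical service function that only unlocks for one exact key — but it captures the two sides of the test: the correct key must work, and almost-correct keys must fail with an error rather than succeed silently.

```python
# A minimal sketch of the "locked door" idea. The function name and
# key format are illustrative, not from any real API.

VALID_KEY = "A1B2-C3D4"

def call_service(api_key: str) -> str:
    """Return data only when the exact key is presented."""
    if api_key != VALID_KEY:
        # An almost-correct key must fail loudly, not silently succeed.
        raise PermissionError("invalid API key")
    return "service data"

# The exact key opens the door...
assert call_service("A1B2-C3D4") == "service data"

# ...and near-miss keys (wrong case, truncated, padded) must not.
for bad_key in ["a1b2-c3d4", "A1B2-C3D", "A1B2-C3D4 "]:
    try:
        call_service(bad_key)
        raise AssertionError("near-miss key was accepted")
    except PermissionError:
        pass  # expected: the door stays locked
```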

Producers and Consumers of Web Services

The producers of a web service create a fully functioning service that meets the customer satisfaction level that’s required—for better or worse. They test the functionality, user scenarios, authentication, and error handling of the service, among other things, to the quality level of any other software development organization that publishes its functionality.

Additionally, there are nonfunctional requirements covered by SLAs (service level agreements). These are primarily performance and security requirements: the set of standards that frame the contract consumers have with the producer, specifying how the service will work under various conditions, which test teams then verify. Producers need to test their API for usability, and (very importantly) they need to test the documentation of the web service/API. For API consumers, testing is different: it will focus on the integration, not the functionality of the web service being consumed.

Testing strategy focuses on validation, authentication, varieties of data, and especially various user scenarios. An example of this would be logging into a social API from your site, navigating to another site from there, and then returning to your site. Should you, for instance, still be logged in with the same browser session? There are many other possibilities, and they all need to be tested. And of course, after being created, those tests need to be automated.

What Interface to Test Through?

Many forget that sometimes the easiest way to test a web service is through a GUI (graphical user interface) in a browser.

I bring this up because, if the web service you are integrating is nonstandard or a custom web service, testing at the API level may be too complicated. Testing through the UI in a browser may suffice.

While the UI may be the simplest place for you to automate your API integration, there can be complications. Defect isolation can be complex when testing through the UI. If you find a problem when running the web service through the UI, the problem may actually be in the browser itself, or perhaps the previous function, or in the UI presentation layer, or interaction with another function or state or data access or, well, the list goes on. In short, testing through the UI does have the potential to make identifying the bug more complicated. The idea behind testing earlier is that executing APIs/web services on their own isolates the
defect, making it much cheaper, faster and easier to fix.

Solutions

Testers today are expected to take on web service testing. If you have the technical skill to execute tests of a remote function without a UI – good for you! Skip to the test design section.

  1. Execute the tests through the UI (see above)
  2. Design great tests for someone else to execute – a separation of test writing and test execution.

Before we think about executing tests, let’s think about the tests. Why not have developers test the API, since they integrated it and have more understanding of it? They should! But that does not let testers off the hook.

Test cases can be a great asset to programmers. Skilled testers are usually better at test design and user scenarios than programmers, and can design tests that programmers run at the API level. Equivalence class partitioning and boundary value analysis are essential skills for any tester, but not for every team member. It is better to have your programmers execute the API tests that you are better at writing.

For non-technical testers, the problem remains test design and where to apply those tests.

  3. Learn the technologies

To test web services, traditional testers need technical skills. First, understand the nature of APIs and web services. Find out which technologies your web services use – most likely SOAP or REST – and learn them. In all the web services I have ever tested, these two comprise the overwhelming majority of technologies used. I won’t go into the details here, since there is a vast amount of readily available information on them.

How web services are tested depends on what type of API is under test. The execution and tools for testing are very different for SOAP and REST web services. Learn more about them!

Learning about web services is straightforward. For example, many testers who do database testing use CRUD commands — Create, Read, Update and Delete — as the basis for testing. CRUD has a cousin in REST testing, with the HTTP methods POST, GET, PUT and DELETE.
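The CRUD-to-REST mapping can be sketched with nothing but the Python standard library. The URL below is a placeholder and no request is actually sent; `urllib.request.Request` just lets us build a request object with an explicit HTTP method and inspect it.

```python
# Sketch: mapping CRUD operations onto HTTP methods. Nothing is sent
# over the network; we only build and inspect the request object.
from urllib.request import Request

CRUD_TO_HTTP = {
    "Create": "POST",
    "Read":   "GET",
    "Update": "PUT",
    "Delete": "DELETE",
}

def build_request(operation: str, url: str) -> Request:
    """Build (but do not send) an HTTP request for a CRUD operation."""
    return Request(url, method=CRUD_TO_HTTP[operation])

req = build_request("Update", "https://example.com/api/users/42")
assert req.get_method() == "PUT"
```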

  4. Get a tool

Tools can make the difference between good web service testing and no web service testing at all. Getting a tool to assist you in web service testing is easier than ever. Many web services—especially the most popular web services and social APIs—have tools available to help you run tests, which you can often find by checking the documentation site. These are not automation tools, but viewers that give you an interface, often a graphical one, to make it easier to run API-level tests. Sometimes these tools are called viewers or consoles, and they are pretty easy to learn. A few examples that you can check out are:

Postman (an HTTP client, most commonly used for REST APIs)

https://www.getpostman.com/

REST Console

https://chrome.google.com/webstore/detail/rest-console/cokgbflfommojglbmbpenpphppikmonn?hl=en

restconsole.com: REST Console is an HTTP Request Visualizer and Constructor tool. It helps developers build, debug and test RESTful APIs.

Advanced REST Client

https://chrome.google.com/webstore/detail/advanced-rest-client/hgmloofddffdnphfgcellkdfbfbjeloo

Another possibility is to ask your developer for a viewer, console, or tool. Maybe he or she could create a very simple HTML page with only the data results displayed—not through the designed UI, but the function call, with easy data input and XML or JSON output to validate.
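Once you have raw XML or JSON output in hand, validating it is simple. The sketch below assumes a hypothetical JSON response (the field names are made up for illustration) and checks it directly, with no UI in the way.

```python
import json

# Suppose the developer's bare-bones page returns raw JSON like this
# (the fields are invented for illustration):
raw_output = '{"zip": "94107", "temp_f": 61, "conditions": "fog"}'

result = json.loads(raw_output)

# Validate the output directly, field by field.
assert result["zip"] == "94107"
assert isinstance(result["temp_f"], (int, float))
assert result["conditions"] in {"clear", "rain", "fog", "snow"}
```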

Test Design

Regardless of the solution for web service testing you choose, you must design great tests. Let’s examine that.

The real intelligence in any testing project is in the test design. Organizing tests, analyzing coverage and risk, and – most importantly – writing efficient, effective tests is a core skill for every tester.

So, what tests do you write as a consumer of a web service? As a consumer, remember that the API has been produced by a trusted partner who has already completed functional testing on the service. But it is a mistake to only validate that the integration happened. After validating the integration, testing should focus on user scenarios, testing data and various combinations of data, conditions, situations, scenarios, authentication, and all the error conditions you can think of.

The best way to start thinking about tests to design for web services is the same as through the GUI: boundary, combination, user scenario, and forced error. This strategy is not unique: the tests are the same, but executed at a different interface.

  • Do boundary and data combination testing. It’s not special or different; it’s a normal part of testing any function through the UI as well.
  • Do exploratory testing — explore data, sequence, conditions — as you would explore the function through the UI.
  • For test data, analyze the data for each parameter. Use the API documentation to help get you started. Do equivalence class partitioning, then boundary value analysis.
  • Do forced error testing. Use the same analysis skills you would for testing through the UI. What errors could a user generate? Leave mandatory fields
    empty, check optional fields with and without data, fill fields with incorrect data types (use text in date fields, dates in text fields, etc.)
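The equivalence class and boundary value steps above can be sketched as a small helper. Assume a hypothetical API parameter documented as "quantity: integer, 1–100"; the helper derives the values just outside, on, and just inside each edge, then partitions them by expected outcome.

```python
# Sketch: boundary value analysis for a hypothetical numeric parameter
# documented as "quantity: integer, 1-100".

def boundary_values(low, high):
    """Values just outside, on, and just inside each edge of the range."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

values = boundary_values(1, 100)
assert values == [0, 1, 2, 99, 100, 101]

# Partition by expected outcome before running them against the service:
# the valid class should succeed, the invalid class should trigger errors.
valid = [v for v in values if 1 <= v <= 100]
invalid = [v for v in values if not 1 <= v <= 100]
assert valid == [1, 2, 99, 100]
assert invalid == [0, 101]
```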

A twist on error cases is that you will have to know something about the function to generate them. Get help from your programmers. Ask them, for example, “What tests will generate a false Boolean, an HRESULT less than 0, or a null pointer?”

Think about what combinations will cause failures, a bad return value, or an anomaly in the operating environment.

When you think about the environment for designing tests, think about access and authentication at various steps of the service, such as access to other devices and databases. For example, if your web service needs geolocation/GPS data to execute successfully, what happens when the device battery runs low?

Many web service articles reference call sequencing without detailing what to test. Treat call sequencing as developing user scenarios, use cases, and soap operas: a series of requests and their scheduling.

Good tests should try to execute a task out of the intended order or in a different state. Some examples of these are:

  • Get a Google map with GPS turned off, then turned on.
  • Try to go to Facebook from your app, then go to another site, log in to Facebook while at that app, and then come back to your app and try to access Facebook.
  • Try to execute a web service credit card payment. At some point, or maybe at various points far into the transaction, go back and change credit cards.
  • What state does the user have to be in? Change it. Logged in first? Authenticated? What if not? Go to another site, then come back.

These are normal tests a tester would run through the UI. Try executing these at the service level to find bugs faster.
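An out-of-order scenario like the credit card example can be modeled as a tiny state machine. The states and method names below are invented for illustration: the point is that the service should reject a call made in the wrong state, and the test asserts exactly that.

```python
# Sketch of a call-sequencing test: a toy checkout flow that rejects
# requests made out of the intended order.

class CheckoutFlow:
    def __init__(self):
        self.state = "new"

    def add_card(self):
        if self.state != "new":
            raise RuntimeError(f"cannot add card in state {self.state}")
        self.state = "card_added"

    def pay(self):
        if self.state != "card_added":
            raise RuntimeError(f"cannot pay in state {self.state}")
        self.state = "paid"

flow = CheckoutFlow()

# Out of order: paying before a card is added should fail.
try:
    flow.pay()
    raise AssertionError("out-of-order call was accepted")
except RuntimeError:
    pass  # expected

# The intended order succeeds.
flow.add_card()
flow.pay()
assert flow.state == "paid"
```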

Output or Expected Result

When you design tests, you need to think about the expected result or pass/fail criteria: what will you check to determine if the test passed or failed?

Make sure you have a good definition of the output and
return value for expected results/return values:

  • Return value based on input condition – this is what we expect: if I enter a zip code, I get the weather for that zip code.
  • No return value – you enter data and get no return value. Perhaps there is a state change, access change, data entered in a database, records created or edited, but no return output.
  • The output is a call to another service. Checkout from one bill pay web service calls the credit card validation/protection authentication from another service.
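All three kinds of expected result can be checked against a fake, in-memory service. Every name below is invented for illustration; the lists stand in for a real database and a real downstream service, so each assertion shows what you would verify in each case.

```python
# Sketch: one check per kind of expected result, against a fake service.

class FakeBillPayService:
    def __init__(self):
        self.records = []           # stands in for a database
        self.downstream_calls = []  # stands in for calls to another service

    def get_weather(self, zip_code):
        # 1. Return value based on input condition.
        return {"94107": "fog"}.get(zip_code, "unknown")

    def save_payee(self, name):
        # 2. No return value -- only a state change.
        self.records.append(name)

    def checkout(self, card):
        # 3. Output is a call to another service.
        self.downstream_calls.append(("validate_card", card))

svc = FakeBillPayService()

assert svc.get_weather("94107") == "fog"               # check the return value
svc.save_payee("Acme Utilities")
assert "Acme Utilities" in svc.records                 # check the state change
svc.checkout("test-card")
assert svc.downstream_calls[-1][0] == "validate_card"  # check the downstream call
```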

Automation

Once you have great tests defined, it is important to get some of them automated.

The automation of web service testing needs its own study, and as always, automation follows test design. The goal is to automate as many web service tests as you can. Remember, this is testing of a service, one written by someone else and which you are consuming. The tests for a consumer are different from those for a producer. You are testing the integration, not every aspect of functionality. Once you have a properly designed and implemented test, get it off your plate of things to do by automating it!
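One simple way to automate consumer-side tests is a table-driven loop: each row pairs an input with its expected outcome, so adding a test is just adding a row. The lookup function below is a placeholder standing in for the real service call; the zip codes and data are invented for illustration.

```python
# Sketch: table-driven automation of integration tests, including a
# forced-error case. lookup_weather stands in for the service call.

def lookup_weather(zip_code):
    """Placeholder for the web service call under test."""
    data = {"94107": "fog", "10001": "rain"}
    if not (zip_code.isdigit() and len(zip_code) == 5):
        raise ValueError("malformed zip code")
    return data.get(zip_code, "unknown")

TEST_CASES = [
    ("94107", "fog"),      # known zip: expect its weather
    ("99999", "unknown"),  # valid format, unknown zip
]

for zip_code, expected in TEST_CASES:
    assert lookup_weather(zip_code) == expected

# Forced-error row: a malformed zip must raise, not return garbage.
try:
    lookup_weather("abc")
    raise AssertionError("malformed input was accepted")
except ValueError:
    pass  # expected
```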

Summary

Web service testing is growing in importance in so many applications. For traditional testers, there is a much greater expectation that you will take on part of this testing effort. This can be a problem, or it can be an opportunity.

There are many solutions for testers to get web service testing done. From executing the tests through the UI, to learning the technologies, to designing tests other people may execute, to getting tools — you have choices.

If web service testing is new to you, get help from technical testers and programmers.

For test design, write interesting tests — boring tests don’t find bugs! Use interesting input data, create interesting conditions, generate interesting output.

And, as always, automate, automate, automate! Always look to automate more tests.

Michael Hackett

Michael is a co-founder of LogiGear Corporation, and has over two decades of experience in software engineering in banking, securities, healthcare and consumer electronics. Michael is a Certified Scrum Master and has co-authored two books on software testing. Testing Applications on the Web: Test Planning for Mobile and Internet-Based Systems (Wiley, 2nd ed. 2003), and Global Software Test Automation (Happy About Publishing, 2006).
He is a founding member of the Board of Advisors at the University of California Berkeley Extension and has taught for the Certificate in Software Quality Engineering and Management at the University of California Santa Cruz Extension. As a member of IEEE, his training courses have brought Silicon Valley testing expertise to over 16 countries. Michael holds a Bachelor of Science in Engineering from Carnegie Mellon University.
