
Testing Strategy for the IoT

Experience report: A guide through the challenges and risks of testing the IoT

Embedded software has been around for years, going back to the dawn of computers. Traditionally we tested these devices in isolation and did not worry about user interfaces (if there was one) or things such as internet connectivity. Devices began to be connected not long after the internet's arrival, but in recent years the so-called "internet of things" (IoT) has become more important, and certainly more newsworthy, as its use grows rapidly.

The acronym IoT identifies the advanced connectivity of devices, systems, and services beyond the classic web and network connections of information technology systems (IT and PCs). IoT spans a number of protocols, many device environments, and even more applications. Millions of IoT devices are currently connected, and predictions are that there will be nearly 26 billion devices or more [http://www.gartner.com/newsroom/id/2636073] by 2020. IoT connections include wired and wireless devices using approaches such as low-power radio, Wi-Fi, Bluetooth, and others. Many of these devices will have their own IP address or connect as a group through secondary IP-addressable devices such as hubs, bridges, and/or routers. We are putting IoT in our homes [Time Magazine, Vol. 184, No. 1, 2014], in health care, in businesses, and everywhere else.

IoT devices will share the development and test issues found in embedded software systems as well as those of more traditional IT/Web systems. With the growth in the number of IoT devices and software projects, the need for testers and test approaches for these devices will likewise increase. Testers coming from these historic environments will face different testing challenges and bugs. This article outlines some starting points for those going into IoT testing and offers considerations for those already testing IoT devices. Testing is a large subject with many books and thousands of articles, so readers should follow the links and references to continue their learning. Remember, no one can know it all, but great reference materials are available in many forms.

Examples of Product Test Challenges and Risks that IoT Testers Face

Testers face both new and old potential problems (errors) in IoT devices. These include:

  • Embedded functionality;
  • Web-provided functionality;
  • Performance, both of the network communication and of internal computation;
  • Security, including privacy, autonomy, and control;
  • Smartness of the device, the user interface, or the software in some devices (which may hide bugs);
  • Architecture of the hardware and software, which means more configurations must be tested, e.g., Android fragmentation [http://opensignal.com/reports/fragmentation-2013/];
  • Complexity of the software and system (more bugs may hide in the complexity);
  • Large amounts of code, e.g., smart phones now have 10-to-20 million lines of code (where errors can hide);
  • Development time considerations, such as the time-to-market pressure that exists in IT and mobile, which will continue with IoT;
  • Resource considerations, such as limitations in memory, processing power, bandwidth, battery life, etc.;
  • Unique environments the devices will be used in: hot, cold, wet, noisy, at altitude, etc.

Many testers will be familiar with two or three of these issues but not the others. For example, many historic embedded software testers verified functionality and CPU timing yet did not worry about connectivity, performance, security, usability, or large amounts of code. Historic Web/IT testers worked on those latter items but did not worry about issues common in embedded systems, such as limited resources, unique hardware functionality, and high-risk, critical device control problems.

Additionally, I have heard project stories where historic embedded devices were "updated" with a network card or mobile connection. The embedded device was already working, so all the new testing focused only on the "new" connection. Could there be a problem with this line of thinking, and how much would it cost the company? Consider the possible limitations of this simplistic initial testing and usage:

  • Security holes in the historic code may be missed.
  • Performance testing was CPU-usage based and did not consider the impact of the connection, e.g., long waits (seconds versus milli- or microseconds), loads, slow networks, dropped connections, etc. (a sketch of such a check follows this list).
  • The viability and completeness of recorded data may not be checked.
  • The usability of the system with the new connection may go unexamined.
  • The coupling impact from the new logic to existing functionality may be ignored.
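To make the performance point concrete, here is a minimal sketch of a check that treats the network, not just the CPU, as part of the timing budget. Everything in it is hypothetical: read_device_status, the retry policy, and the two-second budget are stand-ins for whatever your device and its requirements actually specify.

    # Sketch: a performance check that accounts for the network, not just the CPU.
    import random
    import time

    def read_device_status(latency_s: float, drop_rate: float) -> str:
        """Simulated networked read: sometimes slow, sometimes dropped."""
        if random.random() < drop_rate:
            raise ConnectionError("connection dropped")
        time.sleep(latency_s)  # network delay, not CPU time
        return "OK"

    def test_status_under_degraded_network():
        budget_s = 2.0  # assumed response-time requirement, including retries
        start = time.monotonic()
        for attempt in range(3):  # simple retry policy under test
            try:
                assert read_device_status(latency_s=0.5, drop_rate=0.3) == "OK"
                break
            except ConnectionError:
                continue
        else:
            raise AssertionError("device unreachable after retries")
        assert time.monotonic() - start <= budget_s, "response-time budget blown"

    if __name__ == "__main__":
        random.seed(1)
        test_status_under_degraded_network()
        print("degraded-network check passed")

A CPU-only performance test would never see the dropped connection or the half-second waits; a check like this one makes the connection part of the pass/fail criteria.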

Certainly these challenges and risks are not the only ones IoT testers will face, but they are a start. And once the cost of finding issues after a product is released is examined, companies could lose a lot of profit.

A Start: IoT Strategy and Planning Impacts

A team should consider the implications of test strategy for both new and re-hosted IoT devices. I would start by obtaining and using the IEEE 1012 Verification and Validation standard [http://standards.ieee.org/findstds/standard/1012-2012.html]. Using this standard, I would assess device V&V test activities against my estimation of risk and determine an integrity level (defined in IEEE 1012), which in turn determines the amounts and types of test activities. When dealing with a historic device, try analyzing white-box and black-box coverage levels (e.g., statement coverage, requirements coverage, performance analysis, etc.). When dealing with new devices, consider the product's risks, quality characteristics, and functionality. Finally, consider the strategy in light of allocated cost and schedule. The complete strategic information is then reviewed with the stakeholders so that everyone agrees on the strategy before test planning and design efforts begin. A toy sketch of the risk-to-integrity-level idea follows.
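The sketch below is illustrative only: IEEE 1012 defines integrity levels (here taken as 1 = lowest to 4 = highest), but the thresholds and the activity lists in this code are invented for the example. Real assignments come from the standard plus stakeholder review, not from code like this.

    # Toy mapping from estimated risk to an IEEE 1012-style integrity level.
    def integrity_level(likelihood: int, consequence: int) -> int:
        """likelihood and consequence each scored 1 (low) to 4 (high)."""
        risk = likelihood * consequence
        if risk >= 12:
            return 4
        if risk >= 8:
            return 3
        if risk >= 4:
            return 2
        return 1

    ACTIVITIES = {  # hypothetical V&V activity sets per level
        1: ["requirements-based tests"],
        2: ["requirements-based tests", "statement coverage"],
        3: ["requirements-based tests", "branch coverage", "performance analysis"],
        4: ["requirements-based tests", "MC/DC coverage", "performance analysis",
            "independent V&V"],
    }

    level = integrity_level(likelihood=3, consequence=4)
    print(level, ACTIVITIES[level])  # a high-risk device gets the fullest set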

Next Step: IoT Implications for Test Plans

Once there is agreement on the test strategy, use it to guide the IoT software test plan. Here again, if you are new to IoT testing, continue by refining the concepts of IEEE 1012 to the next level of detail. Follow this planning with the test concepts, processes, and techniques from ISO/IEC 29119. When using standards, tailor them to your local context, since a standard is only a basic beginning, not an end or a best practice. A test organization that already has strong test practices and skilled testers may not need these standards, since a skilled group can leverage its history and knowledge to start an IoT test plan. However, for test groups without much IoT history, I would analyze in more detail the testing that has been completed, look for error taxonomy information [Hagar, Software Test Attacks to Break Mobile and Embedded Devices], determine which test concepts to include, and establish a sound method for regression testing [http://www.logigear.com/magazine/issue/3350/] (a toy selection sketch follows).
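One simple regression method is change-based test selection. The sketch below assumes you maintain a map from code modules to the tests that cover them; the module and test names are hypothetical.

    # Minimal sketch of change-based regression selection.
    COVERAGE = {
        "network_stack": ["test_connect", "test_retry", "test_timeout"],
        "sensor_driver": ["test_read_range", "test_calibration"],
        "web_ui": ["test_login", "test_dashboard"],
    }

    def regression_suite(changed_modules):
        """Return the de-duplicated, sorted set of tests touching any changed module."""
        selected = set()
        for module in changed_modules:
            selected.update(COVERAGE.get(module, []))
        return sorted(selected)

    # Adding a network card touches the network stack and, via coupling, the UI.
    print(regression_suite(["network_stack", "web_ui"]))

The weakness of this approach is exactly the "updated device" trap described earlier: if the coverage map ignores coupling between the new connection and old logic, the selected suite will too.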

Both new and historic IoT organizations should consider which test concepts and environments to add to the test plan, including:

  • Risk-based testing [ISO/IEC 29119];
  • Test attacks [Whittaker, How to Break Software; Hagar, Software Test Attacks to Break Mobile and Embedded Devices] to find risky bugs;
  • Exploratory testing times and efforts;
  • Required (regulatory) scripted testing and documentation [ISO/IEC 29119-3];
  • Test tools and automation needed;
  • Test lab(s) set-up;
  • Test tours [Whittaker, Exploratory Software Testing: Tips, Tricks, Tours, and Techniques to Guide Test Design] to use; and
  • Test techniques to apply [ISO/IEC 29119-4].

An example IoT test lifecycle pattern for a test plan (sketched in code after this list) might look like:

  • Strategy
  • Plan
  • Design with regression considerations
  • Act (test)
  • Report and document [ISO/IEC 29119-3]
  • Repeat (within resource boundaries such as test team skill, cost, and schedule).
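Here is a hedged sketch of that lifecycle pattern expressed as a loop. Every function body is a placeholder for real project work, and all of the names are invented for illustration; the point is the repeat-within-resources shape, not the details.

    # Lifecycle pattern as a loop: design, act, report, refine, repeat.
    def design_tests(plan, with_regression=True):
        tests = [f"test_{t}" for t in plan["techniques"]]
        if with_regression:
            tests.append("regression_suite")
        return tests

    def execute(tests):
        return {t: "pass" for t in tests}  # stand-in for real test execution

    def report(iteration, results):
        passed = sum(r == "pass" for r in results.values())
        print(f"iteration {iteration}: {passed}/{len(results)} passed")

    def refine(plan, results):
        return plan  # a real team would re-prioritize based on findings

    plan = {"techniques": ["risk-based", "attacks", "exploratory"]}
    for iteration in range(1, 4):  # repeat within resource boundaries
        results = execute(design_tests(plan))
        report(iteration, results)  # e.g., ISO/IEC 29119-3 style documentation
        plan = refine(plan, results)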

These activities might take days or hours, depending on the project context.

Finally, I find it is easy to forget things in test planning, so I like to use checklists to help me complete my planning, remembering that as soon as I start to execute the plan, the plan will change, and I will have to refer back to my strategy, plans, and checklists frequently. A sample beginning checklist is given in Table 1.

Test Tools Needed to Support IoT

When a tester says "test tools," everyone typically thinks of automated test execution tools. That is part of the story, but when I say "tool," I mean anything that helps me do better testing. A tool can be a piece of software, such as a capture-playback tool, but it can also be a checklist that supports manual testing.

I recommend both white-box and black-box testing, including analysis concepts such as static code analysis tools [Hagar, Software Test Attacks to Break Mobile and Embedded Devices]. These levels and approaches allow testing, verification, and validation to be done throughout the lifecycle. They are also complementary, increasing the likelihood that errors will be found. Finally, many IoT embedded projects may benefit from the use of modeling and mathematical analysis, which, in my experience, more progressive organizations will have the ability to use.

Classic test execution automation will support many IoT issues, such as testing device configurations and capture/playback. Vendor support for embedded, mobile, and IoT testing has been increasing in recent years. Teams working on IoT are advised to conduct tool trade studies and searches to find the best candidate tools for their project's context. A sketch of configuration-matrix testing appears below.
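As one concrete example, here is a sketch of configuration-matrix testing using pytest's parametrize feature. The firmware versions, radio types, and check_boot function are hypothetical stand-ins; the point is that one test body fans out across device configurations, which is exactly where IoT fragmentation bites.

    # Sketch: one test, many device configurations.
    import itertools
    import pytest

    FIRMWARES = ["1.2.0", "1.3.0"]
    RADIOS = ["wifi", "ble", "lora"]

    def check_boot(firmware: str, radio: str) -> bool:
        """Stand-in for bringing up a real device or emulator and probing it."""
        return True  # replace with actual device bring-up in a real lab

    @pytest.mark.parametrize("firmware,radio",
                             list(itertools.product(FIRMWARES, RADIOS)))
    def test_boots_in_configuration(firmware, radio):
        assert check_boot(firmware, radio), f"boot failed on {firmware}/{radio}"

Running this with pytest generates six test cases (two firmwares times three radios) from a single test body, and adding a configuration to the lists grows the matrix automatically.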

The use of tools and automation does not mean that all testing should be automated. I find that advanced groups mix automated test execution with manual testing, particularly guided exploratory testing with tours and attack patterns [Whittaker; Hagar]. This follows the concept of having complementary approaches and activities to guide the testing, and those complementary ideas should be reflected in test plans and designs.

Recommended Tester Knowledge and Skills

More than test processes or tools, IoT needs skilled testers. Software testing is practiced well when the knowledge and skill of the people doing the work drive the effort: a project can have good practices and the right tools, but unless skilled people drive these efforts, good testing may not be realized. A skilled test team would have knowledge in the following areas:

  • Web environments;
  • Embedded environments;
  • General test considerations (knowledge, e.g., ISTQB, and skills such as those outlined in the AST skills guide);
  • Hardware understanding;
  • Systems thinking;
  • Network communication understanding;
  • Performance test experience; and
  • Experience with testing other quality characteristics associated with the IoT context.

Likely no single person will have all of these skills, so a diverse team of experienced and new testers will be optimal. Additionally, training, on-the-job learning, and mentoring should be included in the project test plans.

Summary

IoT has been around for years but lagged in usage behind the Web/PCs and smart phones. Now there are indicators that the sheer number of devices and the amount of software in the IoT are growing rapidly day by day. This means more testing and more testers will be needed to minimize the bugs in IoT devices released to consumers. This article has introduced some of the problems IoT testers may face and made some high-level recommendations in the areas of test strategy and planning. As in all test contexts, there is much more to this subject. More work on strategies, planning, error taxonomies, and tools for IoT is needed.

Jon Hagar

Jon is a senior software person with an M.S. degree in computer science, with specialization in software engineering and testing, from Colorado State University, and a B.S. degree in math, with specialization in civil engineering and software, from Metropolitan State College of Denver, Colorado. He has experience in the domain of real-time, reactive embedded control systems and mobile smart devices, as well as test software development in numerous languages. He has over 100 publications and presentations on software reliability, testing, test tools, formal methods, and critical systems.
