
Interview: Robert V. Binder

CEO and founder of mVerify Corporation, Robert V. Binder tackles questions from field testers on issues such as strategic considerations for single-stack apps versus globalized enterprise mobile apps, and the methods and tools developers and testers should be aware of. He also offers advice drawn from lessons learned in his own experience.

1. What are the popular mobile platforms out there?

The handheld platform installed base is rapidly evolving, according to comScore. From May to August of 2011, Android’s share of US smartphones rose nearly 6 points to 43.7%. Apple’s iOS increased slightly to 27.3% over the same period, while RIM’s BlackBerry OS slipped 5 points to a 19.7% share. Windows Mobile was nearly unchanged at 5.7%, and Symbian shrank to 1.8%, down from 2.1%.

Global shipments of tablet devices for Q3 2011 were up 280% year-over-year to 16.7 million, according to Strategy Analytics. Sixty-seven percent of this was iOS (iPad only), 27% Android (seven versions; shipped on Motorola Xoom, Samsung Galaxy and others), 2% Windows (desktop and mobile OS), and RIM’s BlackBerry Tablet OS at 1%. Also contending are Kindle, MeeGo, and Chromebook. WebOS was discontinued, but could reappear.

Mobile applications are not limited to handheld devices. Telematics and other systems that rely on wireless data links also use mobile platforms. Windows Mobile has the dominant share of the installed base for embedded and industrial applications that do not require a hard real-time OS. For example, Ford’s Sync system and AT&T’s U-verse set-top boxes use variants of the Windows CE stack and run on many millions of endpoints.

A good survey of this rather complex space can be found at http://en.wikipedia.org/wiki/Mobile_operating_system

2a. With Agile development and a continuous integration strategy, along with the wide variation among mobile platforms, what are the challenges for testing and test automation for mobile apps?

First, let’s define “mobile apps” broadly – not just as a simple download from an app store. Let’s also assume that the risks of releasing a buggy mobile app can be much worse than losing a star or two from your ratings.

The typical practices of Agile development and continuous integration can be followed for mobile applications – these strategies are platform-agnostic. However, mobile stack tooling is limited compared with desktop or server environments, so lower productivity should be expected for mobile app development. For example, the features of automated testing tools that run on an actual handheld device are comparable to early 1990s tooling for client/server development. See Déjà vu All Over Again – The Mobile Testing Nightmare for more about that.

In addition to limited tooling, the unique dimensions of mobile apps can significantly increase project scope:

  • Which platform stacks/versions will be supported? If you plan to release on multiple stacks, you’ll need to specialize the code and UI, then test not only on each stack, but also on all supported versions of that stack, even if functionality is identical.
  • Do you plan to seek certification for each stack? For example, will you be seeking Windows Logo certification or acceptance in the Apple App store?
  • Do you plan to seek carrier certification for each supported data link? Major cellular carriers (AT&T, Sprint, Verizon) have certification requirements for certain kinds of applications.
  • Which locales will be supported? Do you need internationalization? Even if your app relies on platform localization capabilities, you’ll need to design, develop, and test for each locale. Business policy and regulatory requirements vary considerably. For example, the Eurozone has stricter privacy regulations than the US.
  • Which wireless data links (air links) will be supported? High speed? Low speed? Legacy? Cellular? CDMA or GSM? 802.x? To what extent is your app dependent on variation in bandwidth, response time, and air link quality/QOS? For example, if you use a real-time streaming capability, what happens when the signal cuts out, or is switched to a different air link stack?
  • What about security? Security hazards for mobile systems are the single greatest business risk mobile apps present. You’ll need to understand the appropriate security profile, design and develop for that, and then provide some evidence that its goals are met.
  • If the mobile endpoint is a server’s client, to what extent can you test round trip transactions? How can you generate workloads that represent typical variations, locales, and modes for peak and off-peak use?
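
To see how quickly these dimensions compound, consider a deliberately small, hypothetical support matrix – the stack, locale, and air-link names below are illustrative, not taken from this interview – and enumerate its Cartesian product:

```python
from itertools import product

# Hypothetical support matrix -- every name and count here is illustrative.
stacks = ["Android 2.3", "Android 4.0", "iOS 5", "BlackBerry OS 7"]
locales = ["en_US", "en_GB", "de_DE", "ja_JP"]
air_links = ["GSM", "CDMA", "LTE", "802.11"]

# Each (stack, locale, air link) triple is a distinct test configuration.
configurations = list(product(stacks, locales, air_links))
print(len(configurations))  # 4 * 4 * 4 = 64 configurations
```

Adding just one more factor of modest size – say, four screen classes – would quadruple the count again, which is why the scoping questions above matter so much.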

Developers of mobile candy – simple, single-stack apps for convenience or amusement – can probably ignore most of these concerns. Developers of globalized enterprise mobile apps cannot. I’d target one slice of all these aspects first, then develop and stabilize that configuration. Next, taking into account your business, operational, and technical situation, lay out and follow a roadmap for additional slices, possibly ramping up with parallel teams.

2b. What resources are available that developers and testers should know about?

Clearly, the team needs sufficient skills in the development environments for the targeted stacks. There are two relatively new services that address some mobile app tooling limitations.

  • Crowdsourcing of alpha and beta testing can provide coverage of handheld configuration variations that would otherwise be very difficult to achieve. Leaders in this space include Mob4Hire and uTest. Developers should be aware that testing done this way tends to be uncontrolled and subjective.
  • Recently offered cloud-based services can also provide a low-cost alternative for covering device and stack combinations, as well as emulating certain aspects of load on the server side. Leaders in this space include Micro Focus, Compuware, and Keynote. This approach too has limitations, which should be taken into account when setting a test plan.

3. What are the various types and/or strategies of testing mobile applications one should consider?

I think the well-established patterns for testing functionality and performance are applicable to mobile apps, including the test design patterns in my Testing Object-Oriented Systems: Models, Patterns, and Tools. The new challenge is in recognizing and managing the hazards arising from the configuration combination explosion, also known as the “mobile testing nightmare.”
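
One widely used way to tame that combination explosion – not described in the interview itself, but a standard technique – is pairwise (all-pairs) testing: instead of covering every full combination, cover every pair of values across any two configuration factors. A minimal greedy sketch, with hypothetical factor values, might look like this:

```python
from itertools import combinations, product

def all_pairs(factors):
    """Greedy pairwise reduction: pick configurations until every value
    pair across any two factors appears in at least one chosen row."""
    # Build the set of pairs that must be covered: ((i, value), (j, value)).
    uncovered = set()
    for (i, a), (j, b) in combinations(list(enumerate(factors)), 2):
        for va, vb in product(a, b):
            uncovered.add(((i, va), (j, vb)))

    chosen, candidates = [], list(product(*factors))
    while uncovered:
        # Pick the candidate covering the most still-uncovered pairs.
        def gain(row):
            return sum(1 for p in combinations(list(enumerate(row)), 2)
                       if p in uncovered)
        best = max(candidates, key=gain)
        if gain(best) == 0:  # safety net; cannot occur with a full candidate set
            break
        chosen.append(best)
        uncovered -= set(combinations(list(enumerate(best)), 2))
    return chosen

# Hypothetical factors: 2 stacks x 3 locales x 2 air links = 12 full combinations.
factors = [["Android", "iOS"], ["en_US", "de_DE", "ja_JP"], ["GSM", "CDMA"]]
print(len(all_pairs(factors)))  # covers every pair in fewer than 12 rows
```

Real projects would use a mature covering-array tool rather than this sketch, but it shows why a pairwise suite can stay tractable while the full product does not.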

4. What benefits do you see Cloud Computing and virtualization bringing to mobile application testing?

The cloud modality has enabled crowdsourcing, shared emulation, and on-demand performance testing. The testing benefits of virtualization are less clear. Virtualization may allow us to run more than one mobile emulator or stack on a single computer. However, even a virtualized stack still supports only a single developer or tester. Although reducing overall hardware costs is useful, I don’t see virtualization having a significant effect on individual productivity. In a large lab, however, it may significantly reduce setup and configuration work.

5. What are the issues with performance testing for mobile apps one should consider, and do you have recommendations for testing solutions?

The ideal configuration for performance and load testing is the deployed target environment. With mobile apps supporting millions of diverse endpoints, it is rarely feasible to replicate this in a dedicated test lab. So the crowdsourcing, emulation, and load-generation services mentioned in 2b are worth considering, but given their limitations, there remains a risk of uncovered failure modes and bugs. Testers are between a rock and a hard place here. A step-by-step approach can help mitigate these risks.

First, devise a configuration roadmap. Then, for each configuration, do alpha testing in your own lab and/or a virtual lab, and follow that with crowdsourced beta testing. Then make a general release to a configuration subset of the user population, timed so that cyclical peak loads can be closely monitored. When that configuration subset is stabilized, repeat for the next roadmap milestone. Subject to budget and other operational considerations, some of these steps can be done in parallel to minimize time to market.

6. Do you have thoughts on security testing for mobile apps? Any practices, methods, and tools that developers and testers should apply?

Why are banks robbed? The (apocryphal) answer is “That’s where the money is.” The huge and growing amount of personal and financial data stored on handhelds and transmitted over the air is a lucrative target for criminals, spies, and vandals. At the same time, handheld ubiquity and personal attachment to our devices (“crack-berry”, etc.) can induce a false sense of security. So, it isn’t surprising that mobile exploits are surging.

For designers and developers, I think this means security should be a first class aspect of a mobile system. Technical strategies to minimize the attack surface are not as well evolved for mobile stacks as they are for desktops or servers, so pay attention to emerging exploits and be paranoid. For testers, this means identifying abuse cases and aggressively probing for security flaws, as well as for typical bugs and failure modes.

7. What are some of the top lessons you have learned testing mobile apps that you’d like to share?

First, be sure to weave the unique characteristics of mobile endpoints into your test plan. For example, try critical updates while powering off or as the battery fades out. Interrupt usage with incoming calls, text messages, Bluetooth conversations, etc. Power cycle and check state. Install/uninstall updates. Set up a streaming video upload/download and then attempt to enter a transaction that should update a remote server. Establish a WiFi connection and repeat. Repeat on the smallest/worst quality display handheld and the biggest/best. For position-sensing devices, find a shaker and rock. Identify usage patterns that will drain the battery quickly – turn on all the radios, upload/download high data rate content, and run an app that saturates the CPU.
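
Even when these tests will be executed by hand, the checklist is worth generating systematically. A small sketch, crossing a hypothetical list of critical operations with interruption events like those mentioned above (both lists are illustrative, not from the interview):

```python
from itertools import product

# Illustrative lists -- substitute your app's own critical operations
# and the platform events that can interrupt them.
operations = ["critical update", "streaming upload", "remote transaction"]
interruptions = ["incoming call", "text message", "Bluetooth pairing",
                 "power cycle", "battery drain", "WiFi-to-cellular handoff"]

# One scripted manual test per (operation, interruption) pairing.
checklist = [f"Start {op}; inject {event}; verify state and data on resume"
             for op, event in product(operations, interruptions)]
print(len(checklist))  # 3 * 6 = 18 scripted interruption tests
```

Generating the matrix this way makes it easy to see what has and hasn’t been exercised, rather than relying on ad hoc poking.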

Second, systematically vary air link quality as part of your test plan. When I started developing mobile test automation ten years ago, I met a tester in New York City who would take a box full of handhelds into the subway. He’d found a spot on a subway platform that got zero bars from any carrier, but only a few feet away, got weak reception. He’d walk from the zero-signal to the weak-signal area to test how robust the apps were to drops, repeating this until he’d worked through all the devices.

Third, do not ignore the well-established patterns and lessons of testing: don’t limit your testing to sunny-day scenarios. Define a complete and realistic usage profile, then exercise each use case in proportion to its real-world use. Start your test planning as soon as possible. Work with developers to evaluate evolving designs/implementations for testability.

Finally, plan for and make the best use of fingers and eyeballs. Automate where it makes sense, but with today’s tooling limitations, plan on a lot of manual testing. This doesn’t mean testing by poking around – think through, document, and follow a strategy that best covers the unique aspects and risks of your app and its configurations. ■

Robert V. Binder
Robert V. Binder is a business leader, serial entrepreneur, and software technologist with extensive systems engineering experience. As President of System Verification Associates, he has provided solutions for clients facing existential regulatory challenges. As CEO and founder of mVerify Corporation, he took a unique solution for mobile app testing to market. He led RBSC Corporation’s consulting practice in software process and advanced software testing, delivering expertise and solutions globally.

Binder has developed hundreds of application systems and advanced automated testing solutions, including two projects released as open source. He was awarded a U.S. Patent for model-based testing of mobile systems.

He is internationally recognized as the author of the definitive Testing Object-Oriented Systems: Models, Patterns, and Tools and two other books.


One Response to Interview: Robert V. Binder

  1. Bob Binder says:

    After doing this interview, I decided to take a fresh look at mobile app testing. It’s clear to me that mobile app devs need an approach to testing that is both practical and effective.

    So, I’ve developed a new course How to Test Mobile Apps, offered online.
    The backstory of the course is at
    http://www.robertvbinder.com/blog/the-genesis-of-how-to-test-mobile-apps/

    The course is aimed at developers with no testing experience who will do manual testing. BTW, it incorporates Buwalda’s action word concept and a new test model.

    For more about the course, go to
    http://www.udemy.com/how-to-test-mobile-apps/
