The 4-Wheeled Monster

5 roadblocks in vehicular autonomy that complicate Software Testing

Experts in the field once pointed to air travel as the gold standard for autonomous safety, but after Boeing's two 737 MAX tragedies, that analogy no longer holds for self-driving cars. Boeing's 737 MAX jets were grounded following software issues that contributed to the deaths of 346 people. Notably, it was not the technology alone that failed in Boeing's case; inadequate pilot retraining and the absence of standard safety features surrounding the software also caused the accidents. Moving forward, consumers are not asking whether they can trust autonomous cars' technology; they are asking, "Can we trust companies to properly develop these technologies, and can we trust government bodies to regulate them?" Yet no one asks how we can trust humans to properly operate non-autonomous vehicles. Rather, we just subject them to quasi-regular tests and send them on their way. Humans are not perfect: they text and drive, apply makeup while driving, eat while driving, sometimes drink and drive, or fall asleep at the wheel; the list goes on. Machines, on the other hand, do not partake in these dangerous behind-the-wheel activities; with their sensors and processors, they can navigate the roads and minimize accidents caused by operator error.

But there is one thing the human mind can still do better than the machine: analyze the unexpected. If a young child suddenly dashes into the street, the human brain has an instinctive reaction: slam on the brakes. A computer, on the other hand, has only a fraction of a second to analyze the situation: Are there surrounding cars that will be hit if it swerves to avoid the child? Are there cars following closely behind that will rear-end the car if it slams on the brakes? Should it just proceed as if nothing is there?

These are the tough choices we as human vehicle operators (drivers) must be prepared to make at all times. Is the technology behind autonomous cars good enough to do the same?

Here are some common qualms consumers and Software Testers alike have regarding autonomous vehicles.

1. Unpredictable Humans

Computer algorithms can equip autonomous driving software to handle the rules of the road: stop at a stop sign, don't cross a double yellow line, obey the speed limit, and so on. But what computers cannot control is the behavior of other, human drivers on the road. As mentioned earlier, humans are not perfect drivers: they speed, tailgate, cross double yellows, and sometimes even run red lights. An evolving solution to this problem is vehicle-to-vehicle (V2V) communication. However, this technology is still in the early stages of development; moreover, it will only be viable once a majority of vehicles on the road are equipped with it. That makes V2V a solution for the more distant future, one largely dependent on consumers buying newer model-year vehicles.
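To make the idea concrete, here is a minimal, hypothetical sketch of V2V cooperation. The message fields and the `proximity_warning` helper are assumptions for illustration only; real V2V systems exchange standardized payloads such as the SAE J2735 Basic Safety Message over dedicated short-range radio, not this simplified structure.

```python
from dataclasses import dataclass
import math

@dataclass
class BasicSafetyMessage:
    """Hypothetical analogue of a V2V safety broadcast (cf. SAE J2735 BSM)."""
    vehicle_id: str
    x_m: float        # position east, meters
    y_m: float        # position north, meters
    speed_mps: float  # current speed, meters per second
    heading_deg: float

def proximity_warning(own: BasicSafetyMessage,
                      others: list[BasicSafetyMessage],
                      radius_m: float = 50.0) -> list[str]:
    """Return the IDs of broadcasting vehicles within the warning radius."""
    return [o.vehicle_id for o in others
            if o.vehicle_id != own.vehicle_id
            and math.hypot(o.x_m - own.x_m, o.y_m - own.y_m) <= radius_m]
```

The point of the sketch is the cooperative model: each car continually broadcasts its own state and reasons about its neighbors' broadcasts, which only works once most vehicles on the road participate.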

2. Weather

Human drivers have enough trouble navigating hazardous weather conditions like rain, fog, snow, or hail, and this is no different for autonomous cars. Autonomous cars maintain their lane using cameras that track the painted lines of the road, and falling snow and rain can make identifying upcoming objects difficult for laser sensors. Reports of on-road tests of autonomous cars consistently cite weather as a primary cause of system failure. While there is no direct fix for this, it is something engineers will need to address as autonomous car companies begin testing their systems in snow-prone states such as Pennsylvania and Massachusetts.
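One common engineering response is graceful degradation: when perception confidence drops, for example because rain obscures the lane lines, the system alerts the driver and hands back control rather than guessing. The function and thresholds below are purely illustrative assumptions, not any vendor's actual logic.

```python
def lane_keep_decision(camera_confidence: float,
                       lidar_confidence: float,
                       takeover_threshold: float = 0.6) -> str:
    """Hypothetical degradation policy: keep the lane only while the best
    available sensor is confident; otherwise request a human takeover."""
    best = max(camera_confidence, lidar_confidence)
    if best >= takeover_threshold:
        return "HOLD_LANE"
    return "REQUEST_DRIVER_TAKEOVER"
```

In clear weather both sensors might report high confidence; in heavy snow both can fall below the threshold at once, which is exactly the scenario testers need to exercise.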

3. Infrastructure

Although we would like them to be, roads are not perfect. Potholes, sinkholes, and cracked pavement are all hazards an autonomous car must negotiate. What is that dark circle 150 feet ahead? Is it a puddle or a pothole? Or is it just a shadow? Whether or not this proves to be a major issue, we must design the technology to work in the world that exists, not the utopia we wish it to be. Currently, multiple states are in the process of removing their installed raised lane markers, known as Botts' dots, and replacing them with painted lines. This is because the dots cannot always be recognized by the sensors on an autonomous vehicle. Additionally, inclement weather can cover the dots, making it nearly impossible for the vehicle's camera system to identify and maintain lanes.

So, as a means of fostering the growth and implementation of autonomous vehicles, California is opting to replace Botts' dots with wider, thicker, reflective lane markings that sensors can easily identify. Yet not all infrastructure problems can be addressed as directly as Botts' dots. How will a vehicle react at sunset in an urban, downtown setting when the shadows of skyscrapers plague the road? Will autonomous vehicles ease traffic congestion, or will they make it worse by stopping at the foot of a shadow?

4. Emergency Situations

Technology can sometimes fail. At the time of this writing, there is no car that is fully autonomous; all of them require a safety driver in the driver's seat to intervene if something goes wrong.

But what happens if the safety driver does not take control of the situation? More importantly, what happens if the safety driver does not know they need to take control? The Information reported exactly such an incident at the self-driving car company Waymo. The safety driver behind the wheel fell asleep after about an hour of testing and, in doing so, inadvertently touched the gas pedal, returning the car to manual mode. With no effective notification to the sleeping driver, the vehicle eventually collided with a median. Stories like this are all too common with current autonomous driving solutions such as Tesla's Autopilot. In one incident in March of 2018, a Tesla Model X owner died after failing to retake control of the vehicle before it fatally collided with a concrete barrier. In the investigation, Tesla stated that vehicle logs showed that in the 6 seconds and 150 meters before the accident, despite numerous audio and visual warnings, the driver's hands were not detected on the wheel and no corrective action was taken.
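Tesla's reported figures imply a rough speed at the moment of the crash, which a quick back-of-the-envelope calculation makes concrete (assuming, for illustration, that the 150 meters were covered in the stated 6 seconds at constant speed):

```python
# Implied average speed from Tesla's reported figures:
# 150 meters covered in 6 seconds, assuming constant speed.
distance_m = 150
time_s = 6

speed_mps = distance_m / time_s    # 25.0 m/s
speed_kmh = speed_mps * 3.6        # 90.0 km/h
speed_mph = speed_mps * 2.23694    # about 56 mph

print(f"{speed_mps} m/s ≈ {speed_mph:.0f} mph")
```

In other words, highway speed: six seconds leaves very little margin for a distracted driver to reorient, look up, and act.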

While Tesla does instruct Autopilot users to remain fully engaged with the drive while Autopilot is active, the situation seems eerily similar to the Boeing story. How will automakers and autonomous vehicle developers properly train users of these cars to use the system?

5. Hacking the Car

When it comes to computers, hacking and hackers are unruly side effects we have to deal with. Given the number of computer systems and the amount of software vital to an autonomous car's function, hacking seems near certain. It is already an issue with non-autonomous vehicles: wireless carjackers can hack into a car's computer systems, toying with the horn, disabling the brakes, and even cutting off acceleration. Most counterarguments to this concern reference big data breaches, such as the Target breach, and note that they have not hindered the growth of the consumer internet; many times, these breaches happen, society shrugs its shoulders, and everyone moves on. However, hacking a two-ton vehicle is exponentially more dangerous, both to the occupants of the vehicle and to everyone around it. It will be up to auto manufacturers and software developers to protect their cars' software to the best of their ability.

Finally, what about system outages? In early May, BMW drivers reported outages in their vehicles' infotainment system, BMW ConnectedDrive; the Apple CarPlay interface was affected. While a rather minor inconvenience in this instance, it does raise the question: What if a future autonomous vehicle's software "goes out," leaving consumers stranded? Or, worse, what if the system shuts down while traveling and carrying passengers?


Despite the qualms, self-driving cars aren't slowing down. According to CB Insights, $4.2 billion was allocated to autonomous driving programs in just the first three quarters of 2018. However, don't expect full autonomy just yet. The Society of Automotive Engineers defines a 0-5 scale for driving automation, with level 0 meaning all major systems are controlled by the human and level 5 meaning the car is capable of driving itself in every situation. Level 5 technology seems to be slipping further and further away, but based on automaker and technology developer estimates, level 4 self-driving cars, capable of autonomous driving in some scenarios but not all, could become available for sale in the next couple of years. Ford Motor Company's CEO Jim Hackett recently acknowledged that the industry overestimated the arrival of autonomous vehicles. Hackett claims that Ford will still deliver on its promise of self-driving cars for commercial services in 2021, but not at the previously stated scale or level of autonomy. This sets up a commonly asked question: Will autonomous cars ever be truly autonomous, with no geographical limitations? Another open question concerns regulations: What are they? For some insight on automotive and software regulations and how they're evolving, check out our cover story!
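For reference, the SAE scale mentioned above can be summarized as a simple lookup table. The one-line descriptions below are condensed paraphrases of the SAE J3016 level definitions, written for illustration rather than quoted from the standard:

```python
# Condensed paraphrase of the SAE J3016 driving automation levels.
SAE_LEVELS = {
    0: "No automation: the human driver controls all major systems",
    1: "Driver assistance: a single assisted function (e.g., adaptive cruise)",
    2: "Partial automation: combined steering and speed control; driver monitors",
    3: "Conditional automation: system drives, driver must take over on request",
    4: "High automation: full self-driving within a limited operational domain",
    5: "Full automation: self-driving in every situation",
}

def describe(level: int) -> str:
    """Return the one-line summary for an SAE automation level."""
    return SAE_LEVELS[level]
```

The gap the article describes is the jump from level 4, self-driving only inside a geofenced operational domain, to level 5, self-driving everywhere.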

Noah Peters
Noah Peters is from the Bay Area. He holds a Bachelor's Degree in Marketing from the University of San Francisco. Noah started at LogiGear as a Marketing Intern and transitioned to a full-time Marketing Associate role post-graduation. Noah is passionate about content creation and SEO, and works closely with all content produced by LogiGear, including the LogiGear Magazine, the LogiGear Blog, and various LogiGear eBooks. In his free time, you can find Noah researching the automotive industry or teaching high school marching band.
