The 4-Wheeled Monster

5 roadblocks in vehicular autonomy that complicate Software Testing

Experts in the field once pointed to air travel as something of a gold standard for autonomous vehicle safety, but after Boeing’s two tragedies, that analogy no longer works for self-driving cars. Boeing’s 737 MAX jets were grounded following software issues that resulted in the deaths of nearly 400 people. However, it was not the technology that failed in Boeing’s case; rather, it was the lack of pilot re-training, along with missing standard safety features around the software, that caused the accidents. Moving forward, consumers are not asking whether they can trust the technology in autonomous cars; they are asking, “Can we trust companies to properly develop these technologies, and can we trust government bodies to regulate them?” Yet no one asks how we can trust humans to properly operate non-autonomous vehicles. We just subject them to quasi-regular tests and send them on their way. Humans are not perfect: they text and drive, apply makeup while driving, eat while driving, sometimes drink and drive, or fall asleep at the wheel; the list goes on. Machines, on the other hand, do not partake in these dangerous behind-the-wheel activities; with their sensors and processors, they can navigate the roads and minimize accidents driven by operator error.

But there is one thing the human mind can still do better than the machine: analyze the unexpected. If a young child suddenly dashes into the street, the human brain has an immediate, instinctive reaction: slam on the brakes. A computer, on the other hand, has mere seconds to analyze the situation: Are there surrounding cars that will be hit if it swerves to avoid the child? Are there cars following closely behind that will rear-end the car if it slams the brakes? Should it just proceed as if nothing is there?

These are the tough choices human drivers must be prepared to make at all times. Is the technology behind autonomous cars good enough to do the same?
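To make that trade-off concrete, here is a deliberately simplified sketch of the kind of choice a planning system faces. The inputs and decision rules are hypothetical and invented for illustration; a real planner weighs far more factors (speeds, trajectories, occupant and pedestrian risk) continuously, not as a handful of booleans.

```python
# A simplified, hypothetical sketch of the split-second trade-off described
# above; real planners reason over continuous trajectories, not booleans.
def choose_maneuver(child_ahead: bool, adjacent_lane_clear: bool,
                    tailgater_behind: bool) -> str:
    if not child_ahead:
        return "continue"
    if adjacent_lane_clear:
        return "swerve"                      # avoid the child without causing a new collision
    if not tailgater_behind:
        return "hard brake"                  # safe to stop sharply
    return "brake as hard as following traffic allows"

print(choose_maneuver(child_ahead=True, adjacent_lane_clear=False, tailgater_behind=True))
```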

Here are some common qualms consumers and Software Testers alike have regarding autonomous vehicles.

1. Unpredictable Humans

Computer algorithms can equip autonomous driving software to handle the rules of the road: stop at a stop sign, don’t cross a double yellow, obey the speed limit, and so on. What computers cannot control is the behavior of the other, human drivers on the road. As mentioned earlier, humans are not perfect drivers: they speed, tailgate, cross double yellows, and sometimes even run red lights. An evolving solution to this issue is vehicle-to-vehicle (V2V) communication. However, this technology is still in the early stages of development, and it will only be a viable solution once a majority of vehicles on the road are equipped with it. That makes V2V a solution for the distant future, one that depends largely on consumers buying newer model-year vehicles.
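To illustrate how readily the “rules of the road” half of the problem can be encoded, here is a minimal rule-checking sketch. The names, fields, and thresholds are invented for this example and are not drawn from any real driving stack; the point is that the rules are easy, and the unpredictable humans are not.

```python
# A minimal, hypothetical sketch of rule-based driving constraints.
from dataclasses import dataclass

@dataclass
class RoadState:
    speed_mph: float
    speed_limit_mph: float
    stop_sign_ahead: bool
    distance_to_stop_sign_ft: float
    crossing_double_yellow: bool

def rule_violations(state: RoadState) -> list[str]:
    """Return the traffic rules the current trajectory would violate."""
    violations = []
    if state.speed_mph > state.speed_limit_mph:
        violations.append("exceeding speed limit")
    if state.stop_sign_ahead and state.distance_to_stop_sign_ft < 50 and state.speed_mph > 0:
        violations.append("failing to stop at stop sign")
    if state.crossing_double_yellow:
        violations.append("crossing a double yellow line")
    return violations

state = RoadState(speed_mph=38, speed_limit_mph=30, stop_sign_ahead=False,
                  distance_to_stop_sign_ft=300, crossing_double_yellow=False)
print(rule_violations(state))   # ['exceeding speed limit']
```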

2. Weather

Human drivers have enough trouble navigating hazardous weather conditions like rain, fog, snow, or hail, and it is no different for autonomous cars. Autonomous cars maintain their lane by using cameras that track the lines of the road, and falling snow and rain can make identifying upcoming objects difficult for laser sensors. Reports of on-road tests of autonomous cars consistently cite weather as a primary cause of system failure. While there is no direct fix for this, it is something engineers will need to address as autonomous car companies begin testing their systems in snow-prone states such as Pennsylvania and Massachusetts.
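One way engineers could approach this is to track how confident the lane-detection system is and hand control back to the driver when weather degrades it. The sketch below is purely illustrative; the function name, inputs, and thresholds are assumptions, not a real perception API.

```python
# A hedged sketch (hypothetical thresholds) of flagging degraded lane
# detection in bad weather and requesting a driver takeover.
def assess_lane_tracking(lane_confidence: float, visibility_m: float) -> str:
    """Decide whether camera-based lane keeping is still trustworthy."""
    if lane_confidence < 0.4 or visibility_m < 30:
        return "request driver takeover"   # sensors can no longer see the lines
    if lane_confidence < 0.7:
        return "reduce speed and widen following distance"
    return "continue autonomous lane keeping"

print(assess_lane_tracking(lane_confidence=0.35, visibility_m=120))
```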

3. Infrastructure

Although we would like them to be, roads are not perfect. Potholes, sinkholes, and cracked pavement are all hazards an autonomous car must contend with. What is that dark circle 150 feet ahead? Is it a puddle or a pothole? Or is it just a shadow? Whether or not this turns out to be a true issue, we must design technology to work in the world that exists, not in the utopia we wish it to be. Multiple states are currently in the process of removing their installed raised lane markers, known as Botts’ dots, and replacing them with painted lines. This is because the dots cannot always be recognized by the sensors on an autonomous vehicle. Additionally, inclement weather can cover the dots, making it nearly impossible for the vehicle’s camera system to identify and maintain lanes.

So, as a means of fostering the growth and implementation of autonomous vehicles, California is opting to replace the dots with wider, thicker, reflective lane markings that sensors can easily identify. Not every infrastructure problem can be addressed as readily as Botts’ dots, though. How will a vehicle react at sunset in an urban, downtown setting when the shadows of skyscrapers fall across the road? Will autonomous vehicles ease traffic congestion, or will they make it worse by stopping at the foot of a shadow?
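For the shadow-versus-pothole question above, one plausible approach is to fuse cues from more than one sensor. The following sketch is a hypothetical illustration of that idea; the inputs and thresholds are invented, not taken from any production perception system.

```python
# A hedged sketch of fusing sensor cues to decide whether a dark patch ahead
# is a shadow, a puddle, or a pothole. Inputs and thresholds are hypothetical.
def classify_dark_patch(lidar_depth_drop_cm: float, reflectivity: float) -> str:
    if lidar_depth_drop_cm > 5:
        return "pothole"            # the road surface actually drops away
    if reflectivity > 0.6:
        return "puddle"             # standing water reflects strongly
    return "shadow"                 # flat, low-reflectivity region

print(classify_dark_patch(lidar_depth_drop_cm=0.0, reflectivity=0.1))  # -> shadow
```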

4. Emergency Situations

Technology can sometimes fail. At the time of this writing, there is no car that is fully autonomous; they all require a safety driver to be in the driver’s seat, ready to intervene if something goes wrong.

But what happens if the safety driver does not take control of the situation? More importantly, what happens if the safety driver does not know they need to take control? The Information reported on an incident of exactly this kind at the self-driving car company Waymo. The safety driver behind the wheel fell asleep after about an hour of testing and, in doing so, inadvertently touched the gas pedal, returning the car to manual mode. With no effective notification reaching the sleeping driver, the vehicle eventually collided with a median. Stories like this are all too common with current driving-automation solutions such as Tesla’s Autopilot. In one incident in March of 2018, a Tesla Model X owner died after failing to regain control of the vehicle before it collided with a concrete barrier. In the investigation, Tesla stated that vehicle logs showed that in the 6 seconds and 150 meters before the accident, following numerous audio and visual warnings, the driver’s hands did not touch the wheel and no corrective action was taken.

While Tesla does instruct Autopilot users to stay fully engaged with the drive while Autopilot is active, the situation feels eerily similar to the Boeing story. How will automakers and autonomous vehicle developers properly train users of these cars to use the system?
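One direction such system design could take is escalating, impossible-to-miss alerts when the driver disengages. The sketch below illustrates the idea with hypothetical states, timings, and actions; it is not drawn from Tesla’s, Waymo’s, or any other manufacturer’s actual implementation.

```python
# A minimal sketch of escalating driver-attention logic. The timings and
# actions are hypothetical, for illustration only.
def attention_response(seconds_hands_off_wheel: float) -> str:
    """Escalate from reminders to a controlled stop as inattention persists."""
    if seconds_hands_off_wheel < 10:
        return "show visual reminder"
    if seconds_hands_off_wheel < 20:
        return "sound audible alert"
    if seconds_hands_off_wheel < 30:
        return "reduce speed and flash hazards"
    return "bring vehicle to a controlled stop"

for t in (5, 15, 25, 40):
    print(t, "->", attention_response(t))
```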

5. Hacking the Car

When it comes to computers, hacking and hackers are an unruly side effect we have to deal with. Given how many computer systems and how much software are vital to an autonomous car’s function, hacking seems near certain. Hacking is already an issue with non-autonomous vehicles: wireless carjackers can break into a car’s computer systems, toying with the horn, disabling the brakes, even cutting off acceleration. Most counterarguments to this issue point to big-data breaches, such as the Target data breach, and note that they have not hindered the growth of the consumer internet; many times, these breaches happen, society shrugs its shoulders, and everyone moves on. However, hacking a 2-ton vehicle is far more dangerous to both the occupants of the vehicle and everyone around it. It will be up to auto manufacturers and software developers to protect their vehicles’ software to the best of their ability.
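One baseline defense is to make sure the vehicle only acts on authenticated commands. The sketch below shows the general idea with an HMAC check; the key handling, message format, and function names are hypothetical, and real in-vehicle network security involves far more than this.

```python
# A hedged sketch of authenticating control messages so a spoofed command
# cannot actuate the brakes or throttle. Key management is glossed over here.
import hashlib
import hmac

SHARED_KEY = b"replace-with-securely-provisioned-key"  # hypothetical key

def sign_command(command: bytes) -> bytes:
    return hmac.new(SHARED_KEY, command, hashlib.sha256).digest()

def accept_command(command: bytes, tag: bytes) -> bool:
    """Reject any command whose authentication tag does not verify."""
    return hmac.compare_digest(sign_command(command), tag)

cmd = b"brake:level=3"
print(accept_command(cmd, sign_command(cmd)))   # True: authentic command
print(accept_command(cmd, b"\x00" * 32))        # False: forged or tampered command
```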

Finally, what about system outages? In early May, BMW drivers reported outages in their vehicles’ infotainment system, BMW ConnectedDrive; the Apple CarPlay interface was affected. While a rather minor inconvenience in this instance, it does raise the question: What if a future autonomous vehicle’s software “goes out,” leaving consumers stranded? Or, worse, what if the system shuts down while the vehicle is traveling with passengers aboard?
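A common pattern for handling this class of failure is a heartbeat watchdog that triggers a minimal-risk fallback when the autonomy software stops responding. The sketch below is a hypothetical illustration of that pattern, with invented timings and actions, not any manufacturer’s implementation.

```python
# A hedged sketch of a heartbeat watchdog: if the autonomy software goes
# silent, fall back to a minimal-risk behavior instead of stranding riders.
import time

HEARTBEAT_TIMEOUT_S = 0.5

def monitor(last_heartbeat: float, now: float) -> str:
    """Return the fallback action based on how stale the last heartbeat is."""
    silence = now - last_heartbeat
    if silence <= HEARTBEAT_TIMEOUT_S:
        return "system healthy: continue"
    if silence <= 5 * HEARTBEAT_TIMEOUT_S:
        return "alert occupants and slow down"
    return "pull over and come to a safe stop"

start = time.monotonic()
print(monitor(last_heartbeat=start, now=start + 0.1))   # healthy
print(monitor(last_heartbeat=start, now=start + 3.0))   # prolonged silence: safe stop
```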

Summary

Despite the qualms, self-driving cars aren’t slowing down. According to CB Insights, $4.2 billion was allocated to autonomous driving programs in just the first three quarters of 2018. However, don’t expect full autonomy just yet. The Society of Automotive Engineers ranks vehicle autonomy on a 0-5 scale, with Level 0 meaning all major systems are controlled by humans and Level 5 meaning the car is capable of driving itself in every situation. Level 5 technology seems to be getting further and further away, but based on automaker and technology developer estimates, Level 4 self-driving cars could become available for sale in the next couple of years, meaning the car would be capable of autonomous driving in some scenarios, though not all. Ford Motor Company’s CEO, Jim Hackett, recently acknowledged that the industry overestimated the arrival of autonomous vehicles. Hackett claims that Ford will still deliver on its promise of self-driving cars for commercial services in 2021, but not at the previously stated scale or level of autonomy. This sets up a commonly asked question: Will autonomous cars ever be truly autonomous, with no geographical limitations? Another question concerns regulations: What are they? For some insight on other automotive and software regulations and how they’re evolving, check out our cover story!
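For quick reference, here is the SAE scale as described above, paraphrased in a small lookup table. The descriptions for Levels 1-3 are abbreviated, informal summaries of the standard SAE definitions, not quotations from this article or from SAE.

```python
# Informal, abbreviated paraphrases of the SAE driving-automation levels.
SAE_LEVELS = {
    0: "No automation: human controls all major systems",
    1: "Driver assistance: a single function automated (e.g., cruise control)",
    2: "Partial automation: steering and speed automated, driver supervises",
    3: "Conditional automation: car drives itself in limited conditions, driver must take over on request",
    4: "High automation: fully self-driving in some scenarios, though not all",
    5: "Full automation: self-driving in every situation",
}

for level, description in SAE_LEVELS.items():
    print(f"Level {level}: {description}")
```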

Noah Peters
Noah Peters is from the Bay Area. He holds a Bachelor's Degree in Marketing from the University of San Francisco. Noah started at LogiGear as a Marketing Intern and transitioned to a full-time Marketing Associate role post-graduation. Noah is passionate about content creation and SEO, and works closely with all content produced by LogiGear, including the LogiGear Magazine, the LogiGear Blog, and various LogiGear eBooks. In his free time, you can find Noah researching the automotive industry or teaching high school marching band.
