We May Have Just Uncovered a Serious Problem With How AIs “See”
Researchers from the University of Washington have discovered that autonomous driving systems can be fooled by physical alterations to street signs, such as the addition of stickers. Future self-driving systems will need to be trained to detect such alterations to avoid potentially devastating problems on the road.
Visually Impaired Cars?
People with certain visual impairments aren’t allowed to drive, for fairly obvious reasons. Now, a study from the University of Washington (UW) has shown that artificial intelligence (AI) isn’t immune to vision problems when operating motor vehicles either.
The researchers determined that machine learning models can be vulnerable to a kind of physical-world attack that impedes their ability to process images. Concretely, an AI can misread street signs that have been subtly defaced.
For their study, the researchers focused on two potential types of physical attacks. In the first, the attacker would place a printed poster over an actual road sign, while in the second, they would use stickers to alter an existing road sign. These attacks were designed to look like graffiti or art, which would make them more difficult for the casual observer to detect.
The attacks were extremely effective at confusing the AI. The printed overlay was 100 percent effective at fooling the system when placed over a Stop sign or a Right Turn sign. The stickers designed to look like abstract art were 100 percent effective on Stop signs, while the stickers placed to mimic graffiti were 66.7 percent effective.
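The effectiveness figures above are simply the fraction of test images in which the classifier output the attacker’s intended label rather than the true sign. A minimal sketch of that metric, using made-up trial outcomes rather than the study’s actual data:

```python
# Attack success rate = fraction of test images where the classifier
# outputs the attacker's target label instead of the true sign.
# The trial data below is illustrative, not from the UW study.

def attack_success_rate(predictions, target_label):
    """Fraction of predictions that match the attacker's target."""
    hits = sum(1 for p in predictions if p == target_label)
    return hits / len(predictions)

# Hypothetical outputs for photos of a graffiti-stickered Stop sign
# taken from six different angles and distances.
graffiti_trials = ["Speed Limit 45", "Stop", "Speed Limit 45",
                   "Speed Limit 45", "Stop", "Speed Limit 45"]
rate = attack_success_rate(graffiti_trials, "Speed Limit 45")
print(f"{rate:.1%}")  # 66.7%
```

A rate of 4 hits out of 6 trials gives the 66.7 percent figure reported for the graffiti-style stickers; the poster overlay and abstract-art stickers fooled the classifier in every trial.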
Vision of the Future
If visually impaired people can correct their vision using glasses, contacts, or surgery, perhaps AI can improve its image recognition capabilities as well. One solution suggested by the UW team is teaching autonomous systems to recognize contextual information. This would mean the systems pay attention to where and how particular signs are placed and not just what’s on the signs themselves. Such a system would know that a Stop sign on a freeway, for example, doesn’t make contextual sense.
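The contextual check the UW team suggests can be sketched as a simple sanity filter: cross-reference the recognized sign against the type of road the vehicle is on before trusting it. The road-type categories and plausibility rules below are assumptions made for illustration, not details from the study.

```python
# A minimal sketch of a contextual plausibility check for road signs.
# The road types and sign lists are hypothetical examples.

# Signs that plausibly appear on each road type (illustrative).
PLAUSIBLE_SIGNS = {
    "freeway": {"Speed Limit", "Exit", "Merge"},
    "residential": {"Stop", "Yield", "Speed Limit", "School Zone"},
    "intersection": {"Stop", "Yield", "Right Turn", "No Left Turn"},
}

def sign_is_plausible(recognized_sign: str, road_type: str) -> bool:
    """Return True if the recognized sign makes contextual sense here."""
    return recognized_sign in PLAUSIBLE_SIGNS.get(road_type, set())

# A Stop sign recognized on a freeway fails the check, flagging a
# possible misread or defaced sign for the system to handle cautiously.
print(sign_is_plausible("Stop", "freeway"))      # False
print(sign_is_plausible("Stop", "residential"))  # True
```

A real system would combine a check like this with map data and sensor history rather than a hard-coded table, but the principle is the same: an implausible sign should lower the system’s confidence in its own reading.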
Thankfully, efforts to improve autonomous vehicles are already underway. Governments are showing support for these efforts, and pending legislation could make it easier to test such cars in real-world scenarios. Governments could also help directly by producing road signs that are difficult to deface, as Engadget has suggested.
This study doesn’t mean that driverless cars aren’t safe. On the contrary, it’s another example of how human activity, in this case vandalism, contributes to danger on the road. Human error is already the leading cause of traffic fatalities: in the U.S. alone, some 40,000 people die in crashes every year, so autonomous driving systems have the potential to save thousands of lives annually, and millions over time, if they supplant human-operated vehicles on roads worldwide.
Given enough studies like this one from the UW team, we’ll be able to detect and address the potential shortcomings of autonomous systems and eventually develop ones that can safely transport us where we need to go.