Self-driving cars are the future of automotive technology. Although these vehicles are well programmed, they remain vulnerable to errors and deliberate trickery. A team of eight researchers successfully altered street signs to confuse self-driving cars: the vehicles’ vision software misclassified the signs and consequently made bad decisions.
Self-Driving Cars Are Not Immune to Error
One aspect of self-driving cars that makes them so appealing is how they can analyze the environment to make their own decisions. Most autonomous vehicles are more than capable of collecting sensory information from street signs, for instance. Cars will analyze those signs, alter their behavior accordingly, and ensure that the passenger arrives at the intended location safely. However, trouble arises when street signs are altered.
A team of eight researchers recently conducted an experiment involving altered street signs and autonomous vehicles. It turns out that the machine learning models these vehicles rely on can be made to misclassify altered signs and consequently make wrong decisions. In the worst-case scenario, such vehicles would risk endangering the lives of their passengers.
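To illustrate the underlying idea, here is a minimal, hypothetical sketch of a gradient-based adversarial perturbation against a toy sign classifier. Every number, weight, and threshold below is invented purely for illustration; the researchers attacked real deep neural networks with physical posters and stickers, not this simplified linear model.

```python
import numpy as np

# Toy stand-in for a sign classifier: a single logistic unit.
# Output > 0.5 means "stop sign"; output <= 0.5 means "speed limit sign".
# The weights are hypothetical values chosen for this sketch.
w = np.array([1.0, -2.0, 3.0, -1.5, 2.5, -0.5])

def predict(v):
    """Return the model's confidence that input v is a stop sign."""
    return 1.0 / (1.0 + np.exp(-w @ v))

# A "clean sign": an input the model classifies confidently as a stop sign.
x = w / np.linalg.norm(w)

# Adversarial tweak, in the spirit of the fast gradient sign method:
# shift every input feature a small step against the gradient of the
# correct-class score. For this linear model, that gradient's sign is sign(w).
eps = 0.6                       # perturbation budget per feature (illustrative)
x_adv = x - eps * np.sign(w)    # the "stickered" sign

print(f"clean sign:     P(stop) = {predict(x):.3f}")
print(f"perturbed sign: P(stop) = {predict(x_adv):.3f}")
```

A small, structured change to every input feature is enough to flip the classification, even though the overall input barely changes. The physical-world version of this attack realizes such a perturbation as printed overlays or stickers on the sign itself.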
Conducting experiments like these is an absolute necessity when it comes to driverless cars. This research shows how an assailant could easily print and place posters on top of street signs, or attach small stickers, to make them look like something else entirely. For example, a sign with an arrow pointing to the right could be overlaid with stickers and made to appear to point to the left. This may seem harmless at first, but it could have widespread consequences.
Fake street signs also fool human drivers, so the problem is not unique to driverless cars. The latter may be more easily tricked, however. In the researchers’ tests, nearly seven in ten sign defacements successfully fooled the cars’ image classifiers. That number is unacceptable, especially considering how badly these signs could be misinterpreted. For example, a stop sign might be erroneously classified as reading “Speed Limit 45.”
Self-driving car technology will become more widespread and smarter over time. However, keeping street signs clear of vandalism, graffiti, and other tampering will be difficult. Thankfully, countermeasures exist. Using an anti-stick coating on street signs, for example, would alleviate a lot of concerns. Car vendors will also need to upgrade their machine learning models to account for these attacks.
All of this research has been documented in the Robust Physical-World Attacks on Machine Learning Models paper. It is not the first time we have seen similar research, and it is obvious that self-driving cars can be disrupted easily. Aside from defaced street signs, vulnerabilities in the cars’ own software leave them exposed. There is still a lot of work to be done.
Disclaimer
The views and opinions expressed in this article are solely those of the authors and do not reflect the views of Bitcoin Insider. Every investment and trading move involves risk - this is especially true for cryptocurrencies given their volatility. We strongly advise our readers to conduct their own research when making a decision.