There have been a lot of interesting developments when it comes to self-driving cars. Not all of them are positive, mind you, and some are pretty disturbing. According to USA Today, some self-driving cars may essentially be programmed to decide who lives and who dies in a crash. While that may sound hypothetical to some people, the reality may prove to be rather different.
Self-Driving Cars Must Make Tough Calls
As most people are well aware, autonomous vehicles are packed with a ton of sensors to collect data from the environment. These sensors observe traffic, cyclists, people walking on the curb, traffic lights, and the like. While some companies may claim their autonomous vehicles can't cause accidents, that doesn't mean they won't be involved in them. When they are, a tough call must be made. The chances of someone dying are very real, after all.
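To make "collecting data from the environment" a bit more concrete, here is a minimal, entirely hypothetical Python sketch of how a perception system might represent one fused snapshot of detected objects. Every class, field, and value below is invented for illustration and does not come from any real vendor's software.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical categories a perception stack might distinguish.
class ObjectKind(Enum):
    VEHICLE = "vehicle"
    CYCLIST = "cyclist"
    PEDESTRIAN = "pedestrian"
    TRAFFIC_LIGHT = "traffic_light"
    ANIMAL = "animal"

@dataclass
class DetectedObject:
    kind: ObjectKind
    distance_m: float           # distance from the vehicle, in meters
    relative_speed_mps: float   # closing speed, in meters per second
    confidence: float           # detector confidence, 0.0 to 1.0

# A single "snapshot" of what the sensors currently report.
snapshot = [
    DetectedObject(ObjectKind.CYCLIST, distance_m=12.0, relative_speed_mps=3.5, confidence=0.92),
    DetectedObject(ObjectKind.PEDESTRIAN, distance_m=25.0, relative_speed_mps=0.0, confidence=0.88),
    DetectedObject(ObjectKind.TRAFFIC_LIGHT, distance_m=40.0, relative_speed_mps=0.0, confidence=0.99),
]
```

Any real system fuses far more than this (maps, trajectories, occlusion reasoning), but even this toy snapshot shows the raw material a split-second decision would have to work from.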
If an autonomous vehicle is faced with a life-or-death situation, how will it respond based on the data to which it has access? That question has proven incredibly difficult, if not impossible, to answer properly. The bigger question is whether a car should even make these decisions on our behalf in the first place. After all, when the choice is between possibly killing the driver or nearby bystanders, a machine isn't necessarily best suited to make the final call.
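One way to see why this is so hard to answer "properly" is a deliberately simplified, hypothetical sketch: treat each evasive maneuver as having an estimated probability of harming occupants and bystanders, then pick the one with the lowest expected harm. Every name and number here is invented for illustration; no manufacturer has published logic like this.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """Predicted consequence of one candidate maneuver (all values hypothetical)."""
    maneuver: str
    p_harm_occupants: float    # estimated probability of serious harm to occupants
    p_harm_bystanders: float   # estimated probability of serious harm to bystanders

def expected_harm(o: Outcome, bystander_weight: float = 1.0) -> float:
    # The entire ethical debate hides inside `bystander_weight`: a value of
    # 1.0 treats occupants and bystanders equally; anything else encodes a
    # preference someone had to choose.
    return o.p_harm_occupants + bystander_weight * o.p_harm_bystanders

candidates = [
    Outcome("brake_straight", p_harm_occupants=0.30, p_harm_bystanders=0.10),
    Outcome("swerve_left",    p_harm_occupants=0.05, p_harm_bystanders=0.40),
    Outcome("swerve_right",   p_harm_occupants=0.60, p_harm_bystanders=0.00),
]

best = min(candidates, key=expected_harm)
print(best.maneuver)  # with equal weights, this picks "brake_straight"
```

The point of the sketch is that the moral question never disappears into the math; it simply gets smuggled into a weighting parameter that a human has to set in advance.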
This may seem like a purely theoretical question, but that is no longer the case. Manufacturers of autonomous vehicles are already contemplating how to tackle this problem moving forward. It poses a massive challenge that can't be addressed or circumvented with a simple yes or no. Tech companies have been working on new solutions involving machine learning and AI to help in these situations. With the demand for and focus on autonomous vehicles increasing all over the world, questions like these will need to be answered sooner rather than later.
Industry experts acknowledge that there will be accidents and crashes involving autonomous vehicles. Such situations are all but unavoidable, no matter how capable the underlying technology becomes, even though self-driving cars may save thousands of lives overall. Unfortunately, these vehicles can't save everyone, and every such incident will be scrutinized intensely.
Moreover, there is the legal side of the discussion to consider as well. If an autonomous vehicle sacrifices another human's life to protect its own driver, should there be legal repercussions for doing so? It is a very disturbing thought, which is partly why so few public debates on the topic are taking place right now. Ethical considerations of this magnitude should not be overlooked whatsoever. Rest assured this topic will be revisited more than once in the coming years.
It is evident that autonomous vehicle manufacturers will need to take these tough decisions into account as well. Programmers are tasked with writing the software for these cars, and they need to be familiar with all of the relevant variables. That is no easy task, as there are thousands of possible outcomes for every accident scenario one can think of. Getting self-driving cars to avoid accidents in the first place is a noble endeavor, but there is no foolproof solution in that regard. Google's self-driving car unit claims its vehicles will go for the "smaller thing" if an accident is unavoidable. Whether that ends up being an animal or a child, there are sure to be severe repercussions.
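Taken literally, the "smaller thing" heuristic reduces to ranking unavoidable impact targets by estimated size. Here is a minimal, purely illustrative Python reading of that idea; the labels and numbers are invented and reflect nothing about Google's actual software.

```python
# Hypothetical reading of the "go for the smaller thing" heuristic:
# if a collision is unavoidable, pick the target with the smallest
# estimated size. All values below are invented for illustration.

targets = [
    {"label": "parked_truck", "estimated_size_m2": 18.0},
    {"label": "trash_can",    "estimated_size_m2": 0.6},
    {"label": "mailbox",      "estimated_size_m2": 0.3},
]

chosen = min(targets, key=lambda t: t["estimated_size_m2"])
print(chosen["label"])  # "mailbox" - the smallest estimated object
```

The sketch also exposes the heuristic's weakness: a size-based ranking alone cannot tell an animal from a small child of similar dimensions, which is exactly the kind of repercussion described above.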