Proponents of self-driving vehicles insist that once autonomous vehicles start rolling down our roads, traffic congestion will ease, gasoline consumption will drop and people with physical limitations will gain mobility. Proponents also believe that motor vehicle accidents, injuries and fatalities will drop significantly once computers and sensors are piloting our cars.
A recent article in the Charlotte Observer reminds us that there are still questions to be answered about the vehicles that appear to be getting closer with every passing day. The paper poses one of the thorny ethical questions yet to be resolved: “Who dies when the car is forced into a no-win situation?”
Even with self-driving cars making instantaneous decisions to avoid wrecks, car accidents will still happen, one attorney noted.
And the makers of those autonomous vehicles are going to have to program some tough choices into the software. The Observer points out that last year, a Daimler executive said that company’s vehicles would prioritize the people inside its vehicles over those outside of the vehicles.
The company later backtracked, however, and said the executive had been misquoted and that it would be wrong “to make a decision in favor of one person and against another.”
A Google executive said recently that cars will be programmed to avoid accidents, but “if it happens where there is a situation where a car couldn’t escape, it’ll go for the smaller thing.”
Sounds good? Maybe. But what if “the smaller thing” is a child?
Tough ethical questions remain for the designers, programmers and producers of self-driving vehicles, that's for sure.
Drivers who make bad decisions about distractions, impairment, speed and other factors can be held accountable. You can speak with a qualified personal injury attorney to discuss the details of your case and your available legal options.