salm wrote: That´s pretty much the point. It is unpredictable for the programmer, so they have to find a way to make the robot decide in unique situations. Simply having a standard reaction like hitting the brakes isn´t a good solution, because the standard reaction might be the worst possible reaction.

Well, since the common human reaction to a problem with the car is hitting the brakes, I don't see why a robot would be at fault for doing the same when it has no better response available. Again - the legal term you are looking for is "average competent person", i.e. a standard driver.

"I am interested in how a robot would declare one person to die and the other to live in a scenario that would require at least one to die."

The robot doesn't care about who dies; dying is not something it could or should calculate. Its job is to drive the vehicle. It cares about avoiding an impact. If there is no way to evade the primary impact without causing another collision, it will simply try to reduce speed as much as possible. If it does this, it has acted exactly according to the letter of the law*, and no matter what happens, you cannot sue anyone for it.
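To make that concrete, the decision rule I mean - evade if a clean evasion exists, otherwise just brake hard - is tiny. A rough sketch (every name here is illustrative; `causes_new_collision` stands in for whatever trajectory prediction the real system would use):

```python
def choose_response(maneuvers, causes_new_collision):
    """Pick an evasive maneuver that avoids the primary impact without
    creating a new collision; otherwise fall back to maximum braking.

    maneuvers: candidate evasive maneuvers (here just labels)
    causes_new_collision: predicate, True if the maneuver would
        create a secondary collision
    """
    safe = [m for m in maneuvers if not causes_new_collision(m)]
    if safe:
        return safe[0]      # a clean evasion exists: take it
    return "full_brake"     # no safe evasion: just shed speed

# Swerving either way would hit someone else -> the robot brakes.
print(choose_response(["swerve_left", "swerve_right"], lambda m: True))
# A clean swerve exists -> the robot takes it.
print(choose_response(["swerve_left"], lambda m: False))
```

Note there is no "who deserves to live" term anywhere in it - the robot never ranks victims, it only ranks maneuvers by whether they create a new collision.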
If somebody dies due to this - well, so what, that's life. Shit happens. With a human driver, the number of dead or injured would have been the same or worse, since the robot will almost certainly have initiated the stop earlier and thus reduced the impact significantly. No liability.
*Why letter of law?
Swerving to avoid one collision and thereby causing another makes you the instigator of that second collision - you could be sued for it. On the other hand, you can't be sued for not driving into the ditch to avoid a collision with someone suddenly crossing into your lane, because THAT OTHER GUY is liable for it.
In fact, even if you crash into someone else while (out of reflex) evading the collision, the other guy still shares liability with you, as you wouldn't have needed to evade if it weren't for him.