Does your programmer know how fast you were going?

It was all over the news that an autonomous Cruise taxi was driving through San Francisco without its headlights on. Local police tried to stop the vehicle and were a little startled to find there was no driver. The car then pulled through the intersection and stopped, further confusing the officers.

The company says the headlights being off was the result of human error, and that the car stopped at the traffic light and then pulled over to a safe spot as intended. This raises the question of how humans, including police officers, will interact with robotic vehicles.

Cruise has a video informing law enforcement and others how to approach one of its vehicles (see the second video below). You have to wonder how many patrol officers have actually seen it. We somehow doubt that telling an officer "we explained it in our YouTube video" would hold up as a defense.

To be honest, we're not sure we'd want to dig through a list of autonomous vehicle companies in an emergency to find the right number to call. At the very least, you'd expect the number to be printed clearly on the vehicle. Why the headlights didn't turn on automatically is another question entirely.

We can't imagine there won't be more regulation once autonomous vehicles become mainstream. Just as fire departments have access to Knox boxes so they can get into buildings, we're pretty sure a failsafe code that stops a vehicle and unlocks its doors, regardless of brand, is probably a good idea. Sure, a hacker could use it for bad ends, but they could also break into Knox boxes. You would need to ensure that the stop code's security was robust.
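To make that "robust security" point a little more concrete, here's a minimal sketch of how a cross-brand stop command might be authenticated with a shared key and a freshness check. Everything here is hypothetical and invented for illustration (the key handling, the command names, the message format); it is not anything Cruise or any regulator has actually proposed, just one way such a scheme could resist forgery and replay.

```python
# Hypothetical sketch: authenticating an emergency "stop and unlock" command
# so only an authorized responder can issue it. All names are invented for
# illustration; no real vehicle API or standard is implied.
import hashlib
import hmac
import os
import time

# In any real scheme this key would be provisioned to dispatch centers and
# vehicles through a proper key-management process, not generated like this.
SHARED_KEY = os.urandom(32)

def issue_stop_command(vehicle_id: str, key: bytes) -> dict:
    """Responder side: build a signed, time-stamped stop command."""
    payload = f"{vehicle_id}|STOP_AND_UNLOCK|{int(time.time())}|{os.urandom(8).hex()}"
    tag = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_stop_command(command: dict, key: bytes, max_age_s: int = 30) -> bool:
    """Vehicle side: accept only fresh, correctly signed commands."""
    expected = hmac.new(key, command["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, command["tag"]):
        return False  # signature mismatch: reject forged commands
    _, _, timestamp, _ = command["payload"].rsplit("|", 3)
    return (time.time() - int(timestamp)) <= max_age_s  # reject stale replays

cmd = issue_stop_command("ROBOTAXI-042", SHARED_KEY)
print(verify_stop_command(cmd, SHARED_KEY))  # True for a fresh, valid command
```

Even a toy like this shows where the hard parts are: it's not the math, it's distributing and revoking keys across every brand and every police department without handing the same capability to an attacker.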

What do you think? What should happen when a robot car gets pulled over? What happens when a robotaxi passenger has a heart attack? We've already talked about the issues around handling anomalies in autonomous driving. Some of these questions have no easy answers.