On the topic of autonomous cars, the discussion meanders between the technology, the infrastructure required, and the legal aspects. What technology is needed to actually drive autonomously, and do we need to change the roads significantly to do so? Also, in case of an accident, who is to blame: the driver or the car? The focus is usually on how the autonomous car can deal with other traffic. New research focuses on an area yet untapped: how the other participants can interact with an autonomous vehicle.
Have a robot drive for you? As scary as it may sound to some, it is closer than you think.
Technology and safety
First of all, the main reason for autonomous driving is safety. The majority of traffic accidents comes from human error. Taking the human out of the decision-making process and putting the quicker computer in place to assess a situation and act accordingly can prevent a lot of accidents. There is also convenience: people who are not able to drive now can see their mobility increase. See the example of the blind man being driven to Taco Bell in Google's autonomous vehicle, but many other alternatives come to mind.

Technically, a lot is already possible, and some more time is needed to refine the processes in the car to make it even better. Google has already predicted that its technology may be common on our roads in a few years. Oh, and that ever-recurring myth that we need to change our roads to facilitate this is busted. There are no external sensors or guides required to have an autonomous car drive around town safely. All recent autonomous vehicles can drive safely within our current cities, no changes required.
Legal and responsibility
The other question that frequently pops up is on the legal side of things. In case of a crash (which will be a lot less likely with a computer on board calling the shots), who is responsible? In traditional law, the driver of the car is held responsible. But if the driver is not actually doing the driving, is the computer then liable? In this discussion, science meets law and ethics. Some argue the person behind the wheel is still responsible, while others argue that the manufacturer of the autonomous car is.
An innovative setup to test autonomous cars' interaction with the world
I see you, do you see me?
Another interesting aspect is not how the autonomous car will deal with the other traffic, but how other traffic will deal with the autonomous car. As a pedestrian in a normal situation, you can make eye contact with the driver of a car. With this you can check whether the driver has seen you, and you can judge whether or not it is safe to cross the street. With an autonomous car, who do you make eye contact with? To overcome this, research is being done on how an autonomous car can signal you, indicating it has seen you.
The colour of the wheels changes when someone comes close
Methods vary from signal lights at the front to directional speakers and coloured lights at the sides. The signal lights at the front resemble normal headlights and can be used to signal other traffic that the car has seen you. I assume it will also be the indication that the car will stop and that it is safe for the pedestrian to cross. Furthermore, the headlights swivel towards you (using the Kinect technology from the Xbox). The directional speakers are used to tell you it is safe to cross, though that would require a language kit if you travel abroad. The last thing they experiment with is coloured lights at the sides. If a person comes near the side of the wheels, the wheels light up green. If the person comes closer, the colour shifts to orange and eventually a deep red. This is used to signal that the car is aware of your presence there.
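To make the idea concrete, the wheel-light behaviour could be sketched as a simple distance-to-colour mapping. Note that the thresholds below are made-up values for illustration only; the research does not specify the actual distances used.

```python
# Hypothetical sketch of the wheel-light signal: the closer a detected
# pedestrian gets, the more urgent the colour. Thresholds are assumptions.

def wheel_colour(distance_m: float) -> str:
    """Map the distance of a detected pedestrian to a wheel light colour."""
    if distance_m > 3.0:
        return "off"      # no one nearby, lights stay off
    if distance_m > 2.0:
        return "green"    # the car has noticed you
    if distance_m > 1.0:
        return "orange"   # you are getting close
    return "red"          # very close: deep red warning

# A pedestrian approaching the side of the car:
for d in (4.0, 2.5, 1.5, 0.5):
    print(f"{d} m -> {wheel_colour(d)}")
```

In practice the car would feed this from its pedestrian-detection sensors, but the core of the signal is just such a graded mapping from proximity to colour.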
What do you think: when will autonomous cars hit the road on a large scale? What other problems, or challenges if you will, need to be addressed?