A comic pair of googly eyes on the front of a self-driving car could reduce traffic accidents, a new study suggests.
Researchers in Japan fitted a golf cart with two large, remote-controlled robotic eyes, making it look like the beloved children’s TV character ‘Brum’.
In experiments in virtual reality (VR), they found pedestrians were able to make ‘safer or more efficient choices’ when the eyes were fitted than when they weren’t.
According to the researchers, pedestrians generally like to look at vehicle drivers to know that they’ve registered their presence.
But in a future where self-driving cars are commonplace, pedestrians won’t be able to do this as the driver’s seat will be empty.
Therefore, having a set of eyes on a self-driving car can help pedestrians judge whether or not to cross the road, and in turn avoid potential traffic accidents.
The team fitted a self-driving golf cart with two large, remote-controlled robotic eyes, making it look not unlike the beloved children’s TV character ‘Brum’ (pictured)
SELF-DRIVING CARS AND THE IMPORTANCE OF EYE CONTACT
Self-driving vehicles often use cameras and depth-sensing ‘LiDAR’ units to recognise the world around them.
One key difference with self-driving vehicles is that there is no human driver behind the wheel.
This makes it difficult for pedestrians to gauge whether a vehicle has registered their presence or not, as there’ll be no eye contact or indication from people inside it.
Therefore, a big pair of eyes on the front of the self-driving vehicle will be an important indicator to a pedestrian of what the vehicle’s autonomous technology is seeing.
‘There is not enough investigation into the interaction between self-driving cars and the people around them, such as pedestrians,’ said study author Professor Takeo Igarashi at the University of Tokyo.
‘So, we need more investigation and effort into such interaction to bring safety and assurance to society regarding self-driving cars.’
If there were eyes on self-driving cars in the future, the direction of the eyes would have to be synced up to the self-driving car’s vision system.
In other words, if a pedestrian were to see the eyes looking at them, they would know that the self-driving technology has ‘seen’ and registered them.
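That syncing step can be sketched in a few lines of Python. This is a simplified illustration only, not the study's system – the cart's eyes were actually remote-controlled by hand, and the function name and coordinate convention here are hypothetical:

```python
import math

def eye_gaze_angle(ped_x, ped_y):
    """Return the yaw angle (in degrees) the robotic eyes would turn to
    face a detected pedestrian, given the pedestrian's position relative
    to the front of the car (x = metres to the right, y = metres ahead).
    Hypothetical helper for illustration."""
    return math.degrees(math.atan2(ped_x, ped_y))

# A pedestrian 3m ahead and 3m to the right: the eyes turn 45 degrees right
print(round(eye_gaze_angle(3.0, 3.0)))  # 45
```

In a real system, the pedestrian's position would come from the car's camera or LiDAR detections, so the eyes only move when the vision system has genuinely registered someone.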
For the study, Professor Igarashi and colleagues wanted to test whether putting moving eyes on the cart would affect risky behaviour – in this case, whether people would still cross the road in front of a moving golf cart when in a hurry.
The golf cart was actually not self-driving, but driven by one of the researchers. The windshield was covered to give the impression that there was no driver inside.
What’s more, the team opted to conduct experiments in VR, rather than real life, on the basis that it would be dangerous to ask volunteers to walk in front of a moving vehicle.
In all, 18 Japanese participants (nine women and nine men, aged 18-49 years) experienced four scenarios in the VR experience – two when the cart was fitted with eyes, and two when it wasn’t.
When the vehicle was fitted with robotic eyes, it either looked at the pedestrian (registering their presence) or away (not registering them).
Participants experienced the scenarios multiple times in random order and were given three seconds each time to decide whether or not they would cross the road in front of the cart.
The researchers recorded their choices and measured the ‘error rates’ of their decisions – that is, how often they chose to stop when they could have crossed safely, and how often they crossed when they should have waited.
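The two error types described above can be computed from the recorded decisions along these lines. This is a minimal sketch of the idea, assuming each trial is logged as a pair of booleans – the paper's actual analysis and variable names may differ:

```python
def error_rates(trials):
    """trials: list of (chose_to_cross, car_was_stopping) boolean pairs.
    Returns (unsafe_rate, inefficient_rate):
      - unsafe: crossed when the car was NOT going to stop
      - inefficient: waited when the car WAS going to stop
    Illustrative only; not the study's published analysis code."""
    unsafe = sum(1 for cross, stopping in trials if cross and not stopping)
    inefficient = sum(1 for cross, stopping in trials if not cross and stopping)
    dangerous_total = sum(1 for _, stopping in trials if not stopping)
    safe_total = sum(1 for _, stopping in trials if stopping)
    unsafe_rate = unsafe / dangerous_total if dangerous_total else 0.0
    inefficient_rate = inefficient / safe_total if safe_total else 0.0
    return unsafe_rate, inefficient_rate

# Four example trials: one unsafe crossing, one inefficient wait
print(error_rates([(True, False), (False, False), (False, True), (True, True)]))
# (0.5, 0.5)
```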
THE PROBLEM WITH SELF-DRIVING CARS
Self-driving cars and vehicles are powered by artificial intelligence (AI) that’s trained to detect pedestrians in order to know when to stop and avoid a collision.
But they can only be widely adopted once they can be trusted to drive more safely than human drivers – and this seems to be years away.
Autonomous vehicle technology is still learning how to master many of the basics – including recognising dark-skinned faces in the dark.
Several self-driving cars have been involved in nasty accidents – in March 2018, for example, an autonomous Uber vehicle killed a female pedestrian crossing the street in Tempe, Arizona in the US.
The Uber engineer in the vehicle was watching videos on her phone, according to reports at the time.
Overall, participants were able to make safer or more efficient choices when the eyes were fitted to the cart, although there was a gender split in the results.
Male participants made many dangerous road-crossing decisions (such as choosing to cross when the car was not stopping), but these errors were reduced by the cart’s eye gaze.
However, there was not much difference in safe situations for men, such as choosing to cross when the car was going to stop.
On the other hand, the female participants made more inefficient decisions (such as choosing not to cross when the car was intending to stop), but likewise these errors were reduced by the cart’s eye gaze.
However, there was not much difference in unsafe situations for women, such as choosing to cross when the car was not stopping.
‘The results suggested a clear difference between genders, which was very surprising and unexpected,’ said study author Chia-Ming Chang.
‘While other factors like age and background might have also influenced the participants’ reactions, we believe this is an important point, as it shows that different road users may have different behaviors and needs, that require different communication ways in our future self-driving world.’
As for how the eyes made the participants feel, some thought they were cute, while others saw them as creepy or scary.
Many participants reported that the situation felt more dangerous when the eyes were looking away, and safer when the eyes were looking at them.
The research team acknowledged that the study is limited by its small number of participants and its single scenario, and that people might make different choices in VR compared to real life.
But eyes on the front of self-driving cars could ultimately save people’s lives, they claim.
‘If eyes can actually contribute to safety and reduce traffic accidents, we should seriously consider adding them,’ said Igarashi.
‘In the future, we would like to develop automatic control of the robotic eyes connected to the self-driving AI instead of being manually controlled, which could accommodate different situations.’
SELF-DRIVING CARS ‘SEE’ USING LIDAR, CAMERAS AND RADAR
Self-driving cars often use a combination of normal two-dimensional cameras and depth-sensing ‘LiDAR’ units to recognise the world around them.
However, others make use of visible light cameras that capture imagery of the roads and streets.
They are trained with a wealth of information and vast databases of hundreds of thousands of clips which are processed using artificial intelligence to accurately identify people, signs and hazards.
In LiDAR (light detection and ranging) scanning – which is used by Waymo – one or more lasers send out short pulses, which bounce back when they hit an obstacle.
These sensors constantly scan the surrounding areas looking for information, acting as the ‘eyes’ of the car.
While the units supply depth information, their low resolution makes it hard to detect small, faraway objects without help from a normal camera linked to it in real time.
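The ranging principle described above – timing how long a laser pulse takes to bounce back – reduces to a one-line calculation: the pulse travels out and back, so the distance is half the round-trip path. A simplified Python illustration (not any vendor's actual firmware):

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def lidar_distance(round_trip_seconds):
    """Distance to an obstacle from a laser pulse's round-trip time.
    The pulse travels to the obstacle and back, so halve the total path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after about 200 nanoseconds means an obstacle roughly 30m away
print(round(lidar_distance(200e-9)))  # 30
```

A real unit fires thousands of such pulses per second across many angles, building the 3D point cloud the car navigates by.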
In November last year Apple revealed details of its driverless car system that uses lasers to detect pedestrians and cyclists from a distance.
The Apple researchers said they were able to get ‘highly encouraging results’ in spotting pedestrians and cyclists with just LiDAR data.
They also wrote they were able to beat other approaches for detecting three-dimensional objects that use only LiDAR.
Other self-driving cars generally rely on a combination of cameras, sensors and lasers.
An example is Volvo’s self-driving cars, which rely on around 28 cameras, sensors and lasers.
A network of computers process information, which together with GPS, generates a real-time map of moving and stationary objects in the environment.
Twelve ultrasonic sensors around the car are used to identify objects close to the vehicle and support autonomous drive at low speeds.
A wave radar and camera placed on the windscreen reads traffic signs and the road’s curvature and can detect objects on the road such as other road users.
Four radars behind the front and rear bumpers also locate objects.
Two long-range radars on the bumper are used to detect fast-moving vehicles approaching from far behind, which is useful on motorways.
Four cameras – two on the wing mirrors, one on the grille and one on the rear bumper – monitor objects in close proximity to the vehicle and lane markings.