Driverless cars battle the weather
No matter where you live in the UK, you’ll have driven in a whole host of weather conditions. Some are enjoyable – a warm summer’s day for instance – and some not so. Some will have put you to the test: strong side winds, torrential rain and black ice to name but a few.
If you’ve ever gone on a road trip abroad, or driven to your holiday destination, you might have encountered road and weather conditions which are alien (or mostly alien) to your common or garden UK variety.
Take Spain for example, which regularly experiences sirocco winds from north Africa. Depending on the exact time of year and prevailing weather conditions, these winds bring with them all manner of effects: strong winds lasting for days at a time, extreme temperatures, high humidity, dust, grit, moisture, rain, fog and cloud. Any one of those conditions can be more than enough for a human driver to contend with on a good day, but how will driverless cars fare?
Google’s dedicated website proclaims their cars “have sensors designed to detect objects as far as two football fields away in all directions, including pedestrians, cyclists and vehicles—or even fluttering plastic shopping bags and rogue birds. The software processes all the information to help the car safely navigate the road without getting tired or distracted.”
This is truly amazing technology, but what about the objects that aren’t hundreds of feet away – which can be detected in good time – or objects smaller than a grain of rice? What about those almost microscopic dust particles brought in on the wind? What about the clouds of exhaust and oil droplets from that dirty old lorry in front of you on the way to work? Solid and liquid particles like these might suddenly cover and impair your driverless car’s LiDAR* sensor and stop it from detecting those “fluttering plastic shopping bags” – or that child darting into the road from behind a parked car barely five feet ahead. Anyone who commutes will testify to how quickly their car becomes covered in dirt and grime. Similarly, anyone who has ever driven in the suburbs knows what danger might lie behind the ice cream van.
In spite of the million road miles Google’s self-driving project may have covered so far, and the 3 million a day its simulators accrue, they’re sorely in need of test experience in a whole host of other real-world road and weather conditions.
To date, their autonomous cars have been clocking up miles in California (a mostly Mediterranean but often foggy climate) and Texas (mostly arid or humid except for the coastal region). More recently they’ve embarked on test drives in Washington State – the north-westerly and rainy state. These three climates are certainly a good start, but Google is going to need to expand its experience rapidly if it wants to offer driverless cars to the public by 2020, or sooner.
The trouble with the weather, of course, is its unreliable nature and variable characteristics. But it’s not just the weather a driverless car will need to account for; it’s the secondary impacts too. People’s driving changes in response to the weather: they can swerve suddenly if they become blinded by a flash of ultra-bright sunshine reflecting off the surface of an oil-slicked road, or drive too closely to other vehicles in drifting fog.
It’s a tough challenge, but all driverless cars will have to go head to head with all types of weather and related driving behaviour at some point. If they don’t, how else will the manufacturers be able to develop cars we can entrust with our lives? Whether lawmakers insist that milestone be reached before driverless cars are allowed to carry passengers – outside of a test scenario – is something that should be debated and decided sooner rather than later.
And let’s not forget about the snow…
As Jim McBride, Ford technical leader for autonomous vehicles says: “It’s one thing for a car to drive itself in perfect weather, but it’s quite another to do so when the car’s sensors can’t see the road because it’s covered in snow.”
Even if the sensors aren’t covered, falling snow can disrupt the car’s LiDAR sensor from building a clear picture of its surroundings, and subsequently interfere with the car’s ability to integrate that data with its in-built mapping system. Combined, all three challenges can make it difficult for the car to know where it is and act accordingly. And that’s just one form of weather in one particular place and time. Who knows whether snow in Moscow behaves entirely differently from snow in Michigan and requires an alternative solution?
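To see why falling snow is such a headache, consider what it does to the sensor data. One common idea – sketched below as a naive illustration, not any manufacturer’s actual pipeline – is that snowflakes tend to appear as isolated, scattered returns in the 3D point cloud, whereas real obstacles produce dense clusters of points, so isolated returns can be filtered out:

```python
# A naive outlier filter for a LiDAR point cloud (illustrative only).
# Assumption: stray "snowflake" returns have few nearby neighbours,
# while genuine obstacles (walls, cars, pedestrians) form tight clusters.
import math


def filter_isolated_points(points, radius=0.5, min_neighbours=2):
    """Keep only points with at least `min_neighbours` other points
    within `radius` metres of them."""
    kept = []
    for i, p in enumerate(points):
        neighbours = sum(
            1 for j, q in enumerate(points)
            if i != j and math.dist(p, q) <= radius
        )
        if neighbours >= min_neighbours:
            kept.append(p)
    return kept


# Three clustered points (an obstacle, say) survive; the two stray
# returns - candidate snowflakes - are discarded.
cloud = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (-3.0, 4.0)]
print(filter_isolated_points(cloud))  # [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1)]
```

The catch, of course, is exactly the problem the article describes: in heavy snow the flakes are no longer sparse, and an aggressive filter risks throwing away real obstacles along with the noise.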
Continually shifting weather patterns and the vast array of local conditions might mean it will be impossible to create a driverless car which is capable of handling all situations. But perhaps this is an opportunity for insurance providers to offer policies protecting people or manufacturers against snow blindness or random acts of freak weather.
*LiDAR sensors are fixed to the car in a variety of places and emit pulses of laser light to build a real-time 3D map of the car’s surroundings.
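For the curious, the core calculation behind those laser pulses is simple time-of-flight: the pulse travels to an object and back, so halving the round trip gives the distance. The sketch below is a minimal illustration of that principle (not any manufacturer’s actual code):

```python
# Time-of-flight distance calculation - the basic principle behind
# LiDAR ranging. Each pulse's round-trip time yields one distance;
# sweeping many pulses builds up the 3D map of the surroundings.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second


def distance_from_pulse(round_trip_seconds):
    """Distance in metres to whatever reflected the pulse.

    The pulse travels out and back, so we halve the round trip.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0


# A pulse that returns after one microsecond has reflected off
# something roughly 150 metres away.
print(round(distance_from_pulse(1e-6), 1))  # 149.9
```

Those round trips are measured in fractions of a microsecond, which hints at why an obscured or snow-dazzled sensor is so hard to compensate for: there is very little signal to begin with.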