MIT researchers create new self-driving system that can steer in low visibility settings, including fog and snow
- Researchers from MIT are testing a new self-driving car system for bad weather
- Instead of lidar and cameras, it uses a sensor system that reads the ground directly below and around the car rather than the space in front of it
- The system works in tandem with GPS data, but struggles with rainy conditions
Researchers from MIT have developed a new self-driving car system capable of navigating in low-visibility settings, including fog and snow.
The system relies on Localizing Ground Penetrating Radar (LGPR), which uses electromagnetic pulses to take readings of the shape and composition of the road directly below and around the car.
Other self-driving car systems use a combination of lidar, radar and cameras to develop a real-time topographical model of the car's surroundings and pinpoint where it is in space.
A team of researchers at MIT have created a new self-driving car system capable of navigating in low-visibility settings, including fog and snow
These systems are generally reliable but have proved vulnerable to visual tricks such as fake road signs and lane markers, and their performance can degrade significantly in bad weather.
The LGPR system aims to sidestep these vulnerabilities by focusing on the road itself rather than the open space in front of the car.
To work, the LGPR system needs access to GPS data about the roads it’s travelling on, as well as a reference set of LGPR data to compare against the live sensor readings from the car.
To do this, the MIT team sent out a car with a human driver to build a reference set of LGPR data, which catalogs small changes in road height, potholes and other minute irregularities that form a fingerprint-like textural map of the road.
‘If you or I grabbed a shovel and dug it into the ground, all we’re going to see is a bunch of dirt,’ MIT’s Teddy Ort told Engadget.
‘But LGPR can quantify the specific elements there and compare that to the map it’s already created, so that it knows exactly where it is, without needing cameras or lasers.’
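For readers who want a feel for how this kind of matching might work, here is a rough Python sketch of comparing a live radar slice against a stored 'fingerprint' map of the road. The array shapes, depth-bin count and correlation score are illustrative assumptions, not details taken from MIT's system.

```python
# Toy sketch only -- not MIT's actual LGPR pipeline. It assumes the reference
# map is a 2D array of radar returns (one row per small step along the road,
# one column per depth bin) and that a live scan is a single row to match.
import numpy as np

def locate_scan(reference_map: np.ndarray, live_scan: np.ndarray) -> int:
    """Return the row index of the reference map that best matches the live
    scan, using normalized correlation as a crude similarity score."""
    ref = reference_map - reference_map.mean(axis=1, keepdims=True)
    ref /= np.linalg.norm(ref, axis=1, keepdims=True) + 1e-9
    live = live_scan - live_scan.mean()
    live /= np.linalg.norm(live) + 1e-9
    scores = ref @ live   # cosine similarity of the live scan with every mapped slice
    return int(np.argmax(scores))

# Toy usage: a 1,000-step map with 64 depth bins, and a noisy re-reading of step 412.
rng = np.random.default_rng(0)
road_map = rng.normal(size=(1000, 64))
noisy_rescan = road_map[412] + 0.1 * rng.normal(size=64)
print(locate_scan(road_map, noisy_rescan))   # -> 412
```

The idea is simply that the subsurface profile at each point along the road is distinctive enough that a fresh reading can be matched back to the pre-built map, even with some sensor noise.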
The car uses ‘Localizing Ground Penetrating Radar’ or LGPR, which takes a detailed reading of the ground directly below the car, not out in front of it
The live LGPR data is compared with stored data that was previously mapped, helping the car maintain its exact position on the road without the possibility of being tricked by fake road signs or old lane markers
The system combines GPS data with LGPR readings of the composition of the exact stretch of road being driven over to make navigational choices, such as how fast to travel and when to speed up or slow down.
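A rough GPS fix can also narrow down which part of the stored map the live scan needs to be compared against. The short sketch below illustrates that idea; the window size and search strategy are assumptions made for illustration, not details from the MIT paper.

```python
# Toy sketch of how a coarse GPS estimate could narrow the LGPR search.
import numpy as np

def locate_with_gps_prior(reference_map: np.ndarray, live_scan: np.ndarray,
                          gps_index: int, window: int = 50) -> int:
    """Compare the live scan only against mapped slices within +/- `window`
    steps of the coarse GPS estimate, and return the best-matching index."""
    lo = max(0, gps_index - window)
    hi = min(len(reference_map), gps_index + window)
    segment = reference_map[lo:hi] - reference_map[lo:hi].mean(axis=1, keepdims=True)
    segment /= np.linalg.norm(segment, axis=1, keepdims=True) + 1e-9
    live = live_scan - live_scan.mean()
    live /= np.linalg.norm(live) + 1e-9
    return lo + int(np.argmax(segment @ live))

# Toy usage: GPS says roughly step 400; the LGPR match refines that to step 412.
rng = np.random.default_rng(1)
road_map = rng.normal(size=(1000, 64))
print(locate_with_gps_prior(road_map, road_map[412] + 0.1 * rng.normal(size=64),
                            gps_index=400))   # -> 412
```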
The maps used by the LGPR system are also significantly smaller than the full 3D maps used by traditional self-driving systems.
‘Intuitively, these are smaller because the sensor measures only a thin slice directly below the vehicle, while typical 3D maps contain a detailed view of the entire environment including surrounding buildings and vegetation,’ the team say, in a paper detailing their research.
‘Thus, LGPR maps can provide precise localization in changing surface conditions without requiring as much storage space.’
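To see why a thin slice of road is so much cheaper to store than a full 3D model, here is a back-of-envelope comparison. Every figure in it is invented purely for illustration, and a real 3D map would compress and merge overlapping scans, so the gap shown here overstates the real one.

```python
# Back-of-envelope map-size comparison with made-up but plausible numbers;
# none of these figures come from the MIT paper.
STEPS_PER_KM = 10_000                  # assume one LGPR slice every 10 cm
lgpr_bytes_per_step = 64 * 2           # assume 64 depth bins at 2 bytes each
lidar_bytes_per_step = 100_000 * 16    # assume ~100k 3D points at 16 bytes each

print(f"LGPR map, 1 km of road: {STEPS_PER_KM * lgpr_bytes_per_step / 1e6:.1f} MB")
print(f"Full 3D map, 1 km of road: {STEPS_PER_KM * lidar_bytes_per_step / 1e9:.0f} GB")
```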
The team admit the system is still early in its development and is likely many years away from being road ready.
Current testing has been limited to low speeds on private country roads near the university.
Researchers acknowledge the system performs poorly in heavy rain, potentially because water absorption alters the shape and composition of many roads in ways that are too subtle for the naked eye to notice but significant enough to throw off the LGPR sensors.