Tuesday, 29 October 2019

MIT Taught Self-Driving Cars to See Around Corners with Shadows

We’re inching ever closer to a world where cars can drive themselves. The most advanced autonomous vehicles in testing by companies like Alphabet’s Waymo can drive in many situations as well as a human, but the hope is to make cars better than human drivers. Ideally, a self-driving car would be able to use all its processing might to avoid an accident, but to do that, it needs data. Researchers from MIT have developed a system that could help cars prevent collisions by, essentially, looking around corners. They call it ShadowCam.

Several teams have developed systems that can give AI-powered vehicles hints about what’s around the next bend in the road. For example, Stanford scientists created a system using lasers that could detect objects around corners. The new MIT system is simpler because it just looks for subtle changes in light and shadow on the ground that indicate another vehicle is approaching. 

This work comes from MIT’s famous Computer Science and Artificial Intelligence Laboratory (CSAIL). The team thinks this approach to monitoring the environment around a vehicle could be the equivalent of x-ray vision for cars. The lidar used for object mapping in current self-driving cars has high resolution and collects more data than a visible-light camera, but it can only detect objects that are directly in its line of sight. A shadow, however, might be enough to trim as much as half a second off the car’s reaction time. That could be the difference between a major accident and a near miss.
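To get a feel for what half a second buys, here is a rough back-of-the-envelope calculation; the speeds below are illustrative assumptions, not figures from the article.

```python
# Hypothetical illustration (speeds are assumptions, not from the article):
# distance a vehicle covers during a half-second head start in reacting.
for speed_kmh in (30, 50, 80):
    speed_ms = speed_kmh / 3.6      # convert km/h to metres per second
    head_start_m = speed_ms * 0.5   # distance travelled during 0.5 s
    print(f"{speed_kmh} km/h -> about {head_start_m:.1f} m earlier braking")
```

At city speeds that head start works out to several metres of extra stopping distance, which is why the researchers frame it as the gap between a collision and a near miss.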

ShadowCam uses a sequence of four video frames from a camera pointed at the region just ahead of the car. The AI maps changes in light intensity over time, and specific patterns of change can indicate that another vehicle is approaching from an unseen area. This is known as Direct Sparse Odometry, a way to estimate motion by analyzing the geometry of sequential images; NASA uses the same technique on its Mars rovers. The system classifies each image as stationary or dynamic (moving). If it thinks the shadow points to a moving object, the AI driving the car can adjust its path or reduce speed.
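To make that classification step more concrete, here is a minimal Python sketch of the general idea, assuming four grayscale crops of the ground region just ahead of the vehicle as input. The function name, motion score, and threshold are illustrative assumptions, not ShadowCam’s actual parameters; the real pipeline also compensates for the car’s own motion (via Direct Sparse Odometry) before looking at intensity changes, which this sketch omits.

```python
import numpy as np

def classify_shadow_motion(frames, threshold=2.0):
    """Label a short sequence of ground-region crops as 'dynamic' or 'static'.

    frames: list of 2-D numpy arrays (e.g. four consecutive grayscale crops
            of the ground just ahead of the vehicle), all the same shape.
    threshold: illustrative cutoff on the mean absolute intensity change.
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    # Per-pixel intensity change between consecutive frames.
    diffs = np.abs(np.diff(stack, axis=0))
    # Average change over all pixels and frame pairs: a crude motion score.
    score = float(diffs.mean())
    return ("dynamic" if score > threshold else "static"), score

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.uniform(100, 110, size=(64, 64)).astype(np.float32)
    # Static scene: four nearly identical frames (sensor noise only).
    static_frames = [base + rng.normal(0, 0.5, base.shape) for _ in range(4)]
    # Dynamic scene: a dark "shadow" band sweeps across the ground patch.
    dynamic_frames = []
    for t in range(4):
        f = base.copy()
        f[:, 10 * t:10 * t + 12] -= 30  # moving darkened strip
        dynamic_frames.append(f)
    print(classify_shadow_motion(static_frames))
    print(classify_shadow_motion(dynamic_frames))
```

In this toy setup the static sequence scores near zero while the sweeping shadow produces a much larger score, which is the signal a planner could use to slow the car before the occluded object comes into view.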

The researchers tested this system with a specially rigged “autonomous wheelchair” that navigated hallways. ShadowCam was able to detect when a person was about to walk out in front of the wheelchair with about 70 percent accuracy. With a self-driving car in a parking garage, the researchers were able to tune ShadowCam to detect approaching vehicles 0.72 seconds sooner than lidar, with an accuracy of about 86 percent. However, the system has been calibrated specifically for the lighting in those test environments. The next step is to make ShadowCam work across varying lighting conditions and settings.

from ExtremeTech https://www.extremetech.com/extreme/301073-mit-taught-self-driving-cars-to-see-around-corners-with-shadows
