Robots Given Human-Like Perception to Navigate Unwieldy Terrain

By Staff

Robots have long relied solely on visual sensors, such as cameras and LiDAR, to move through the world, and multisensory navigation has remained a challenge for machines. The forest, with its beautiful chaos of dense undergrowth, fallen logs and ever-changing terrain, is a maze of uncertainty for traditional robots.

Now, researchers from Duke University have developed a novel framework named WildFusion that fuses vision, vibration and touch to enable robots to “sense” complex outdoor environments much like humans do.

WildFusion, built on a quadruped robot, integrates multiple sensing modalities, including an RGB camera, LiDAR, inertial sensors, and, notably, contact microphones and tactile sensors. As in traditional approaches, the camera and the LiDAR capture the environment’s geometry, color, distance and other visual details. What makes WildFusion special is its use of acoustic vibrations and touch.
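The sensing suite described above can be pictured as one synchronized bundle of readings per time step. The sketch below is a minimal illustration of that idea; the field names and layout are assumptions for this example, not WildFusion's actual interfaces.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SensorFrame:
    """One synchronized snapshot of the robot's sensing suite (hypothetical layout)."""
    rgb_image: List[List[int]]         # camera pixels, simplified as nested lists
    lidar_points: List[Tuple[float, float, float]]  # (x, y, z) range returns
    imu_accel: Tuple[float, float, float]           # body acceleration from the IMU
    contact_audio: List[float]         # vibration waveform from contact microphones
    foot_forces: List[float]           # force reading at each of the four feet

# A toy frame with placeholder values for each modality.
frame = SensorFrame(
    rgb_image=[[128]],
    lidar_points=[(1.0, 0.2, 0.1)],
    imu_accel=(0.0, 0.0, 9.8),
    contact_audio=[0.01, -0.02, 0.03],
    foot_forces=[12.5, 11.8, 13.0, 12.2],
)
```

Keeping all modalities in one time-aligned frame is what lets a downstream model fuse them, rather than treating each sensor stream in isolation.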

As the robot walks, contact microphones record the unique vibrations generated by each step, capturing subtle differences, such as the crunch of dry leaves versus the soft squish of mud. Meanwhile, the tactile sensors measure how much force is applied to each foot, helping the robot sense stability or slipperiness in real time. These added senses are also complemented by the inertial sensor that collects acceleration data to assess how much the robot is wobbling, pitching or rolling as it traverses uneven ground.
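To illustrate how step vibrations can carry terrain cues, the toy function below compares the root-mean-square energy of two synthetic contact-microphone snippets. The waveforms, threshold and labels are invented for this sketch and are not taken from the research.

```python
import math

def rms(signal):
    """Root-mean-square amplitude of a vibration snippet."""
    return math.sqrt(sum(s * s for s in signal) / len(signal))

# Synthetic examples: dry leaves produce sharper, higher-energy crunches
# than soft mud (purely illustrative waveforms).
dry_leaves = [0.8, -0.7, 0.9, -0.85, 0.75]
soft_mud = [0.05, -0.04, 0.06, -0.05, 0.04]

def terrain_guess(signal, threshold=0.3):
    """Crude one-feature classifier: high vibration energy -> 'crunchy'."""
    return "crunchy" if rms(signal) > threshold else "soft"
```

A real system would of course learn far richer acoustic features, but even this single energy statistic separates the two toy signals: `terrain_guess(dry_leaves)` returns `"crunchy"` while `terrain_guess(soft_mud)` returns `"soft"`.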

Each type of sensory data is then processed through specialized encoders and fused into a single, rich representation. At the heart of WildFusion is a deep learning model based on the idea of implicit neural representations. Unlike traditional methods that treat the environment as a collection of discrete points, this approach models complex surfaces and features continuously, allowing the robot to make smarter, more intuitive decisions about where to step, even when its vision is blocked or ambiguous.
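The idea of an implicit neural representation can be sketched as a small network that maps any continuous 3D coordinate to a predicted property of the environment, here a traversability score, rather than storing a fixed grid of discrete points. The tiny randomly initialized network below is a minimal stand-in for the concept, not the model from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP: 3D point -> 16 hidden units -> scalar score in (0, 1).
W1, b1 = rng.normal(size=(16, 3)), np.zeros(16)
W2, b2 = rng.normal(size=(1, 16)), np.zeros(1)

def traversability(point):
    """Query the continuous field at an arbitrary (x, y, z) coordinate."""
    h = np.tanh(W1 @ np.asarray(point) + b1)            # hidden features
    return float(1.0 / (1.0 + np.exp(-(W2 @ h + b2))))  # sigmoid score

# Because the field is a function, it can be evaluated at any resolution --
# there is no fixed set of stored points to interpolate between.
score = traversability((0.5, -0.2, 0.1))
```

In the trained system, a network like this would be conditioned on the fused sensory representation, so the same query mechanism yields terrain predictions informed by sight, sound and touch together.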

Looking ahead, the team plans to expand the system by incorporating additional sensors, such as thermal or humidity detectors, to further enhance a robot’s ability to understand and adapt to complex environments. With its flexible, modular design, WildFusion has potential applications beyond forest trails, including disaster response across unpredictable terrains, inspection of remote infrastructure and autonomous exploration.
