Robots can save us if they can see us: Heckman receives CAREER award
Autonomous robots could save human lives more easily if they could “see” and react better in adverse environmental conditions. By exploring millimeter wave radar for robotic perception, Christoffer Heckman is working to make this fundamental shift possible.
An associate professor of computer science at CU Boulder, Heckman will receive $600,000 over the next five years through the National Science Foundation's CAREER award for this research.
Currently, most robots use sensors based on the visible spectrum of light, like cameras or lasers. In environments with smoke, fog or dust, however, visible light scatters off the airborne particles, leaving the sensors effectively blind.
Robots, like humans, can't plan their movements accurately if they don't know where they are or what is around them.
"Humans operating in a visually degraded environment are in trouble. We cannot solve that problem, but incorporating millimeter wave radar could enable our robots to do things that even humans can't do," Heckman said.
This is because millimeter waves pass through smoke, fog and dust.
A new path
Traditionally, Heckman explained, radar has been viewed with skepticism for these kinds of tasks. The sensors have been too large and energy-intensive for agile robots, and radar's comparatively long wavelength produces complex, hard-to-interpret signals.
With the advent of new, smaller system-on-a-chip radar sensors, the traditional energy and size limitations have been removed. That leaves the remaining challenge: the complexity of the radar signal itself.
"This is a fascinating problem," Heckman explained. "People really understand how radar works, down to equations that have existed for almost a century, but radar can be difficult to precisely interpret in cluttered environments. It bounces around within an enclosed area, and can pass right through small objects."
Heckman's solution is to fuse the knowledge we have about electromagnetic waves with supervised machine learning.
Datasets from high-fidelity optical sensors are paired with low-fidelity radar signals of the same scene. Machine learning then cleans the radar signal to match the high-fidelity scene. This training then can be used to build clear radar reconstructions of environments where optical sensors are obscured.
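The pairing described above can be sketched in miniature. The toy example below (a hypothetical setup, not Heckman's actual pipeline) treats "cleaning" the radar signal as supervised regression: high-fidelity optical scans provide training targets for noisy radar returns of the same scenes, and the fitted model then reconstructs scenes from radar alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: each row of `clean` is a scene measured by
# high-fidelity optical sensors; `radar` is the matching low-fidelity,
# noisy radar return of the same scene (dimensions are arbitrary).
n_scenes, n_cells, n_radar_bins = 200, 32, 64
clean = rng.random((n_scenes, n_cells))              # optical "ground truth"
response = rng.normal(size=(n_cells, n_radar_bins))  # unknown radar response
radar = clean @ response + 0.1 * rng.normal(size=(n_scenes, n_radar_bins))

# Supervised "cleaning": fit a linear map from radar returns back to the
# optical scene by least squares (a real system would use a deep network).
W, *_ = np.linalg.lstsq(radar, clean, rcond=None)

# At deployment the optical sensors may be obscured by smoke or dust;
# the learned map reconstructs the scene from radar alone.
reconstructed = radar @ W
mse = float(np.mean((reconstructed - clean) ** 2))
```

The key design point is that supervision comes "for free" from the optical sensors during training, so no hand-labeled radar data is needed.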
This powerful synthesis of physics and computer science stands to dramatically improve the capability of radar as a perception sensor.
Beyond sensing
Heckman has further plans as well. He wants to use this advance to support quick and accurate actions and replanning for autonomous systems.
Robotic thinking has traditionally followed the saying "sense, plan, act." A robot understands a scene, plans its route according to its inputs, and acts on that plan. Segmenting these activities, however, can lead to slow movement and inability to react to changes.
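The sense-plan-act cycle can be sketched as a simple loop. In this hypothetical 1-D example (all names and the world model are illustrative, not from the research), the robot re-senses on every tick, so it recovers automatically when an obstacle clears mid-run; a robot that planned once up front would stay committed to a stale plan.

```python
def sense(world):
    """Read sensors; stands in for a fused radar/camera obstacle estimate."""
    return set(world["obstacles"])

def plan(position, goal, obstacles):
    """Greedy one-step planner: step toward the goal unless blocked."""
    step = 1 if goal > position else -1
    nxt = position + step
    return position if nxt in obstacles else nxt

# Tightly interleaving sense, plan, and act each tick lets the robot
# react to changes instead of executing a plan computed once.
world = {"robot": 0, "obstacles": {3}, "goal": 5}
for tick in range(10):
    if tick == 3:
        world["obstacles"].clear()  # environment changes mid-run
    obstacles = sense(world)
    world["robot"] = plan(world["robot"], world["goal"], obstacles)  # act
    if world["robot"] == world["goal"]:
        break
```

The robot stalls at cell 2 while cell 3 is blocked, then continues to the goal once the obstacle disappears.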
Heckman seeks to use radar in conjunction with optical and lidar sensors to improve navigation strategies as a robot is navigating a space, allowing it to respond more quickly to changes.
Robots that can plan for themselves better and can see into obscured spaces have a valuable role in search-and-rescue, firefighting and space missions.
Heckman's MARBLE team has used robots to explore dark caves through the DARPA Subterranean Challenge and as a firefighting assistant finding active embers. As the research advances made possible by this CAREER Award take shape, where will robots be able to see next?