Robust Perception

Besides securing autonomous system perception against deliberate attack, another imperative is to robustify perception so that navigation, collision avoidance, and timing remain reliable despite harsh sensing environments. Our work on this topic has two themes:
  1. precise vision-based sensing, and
  2. massive signal-of-opportunity exploitation.
Our work in vision-based sensing is a novel fusion of GPS carrier phase measurements with camera images at the level of the so-called bundle adjustment process that is central to robust visual simultaneous localization and mapping (SLAM). In future work, our technique will attempt joint visual SLAM and carrier integer ambiguity resolution. If we are successful, the result will be a tight fusion of GPS and visual sensing that will be highly trustworthy due to the extraordinary richness and strong keyframe-to-keyframe correlation of the visual data.

Our second approach to robust perception extracts navigation and timing information from a large set of heterogeneous terrestrial and satellite signals of opportunity, essentially a diversity approach to robustness. The cooperative opportunistic navigation concept is at its core an exercise in highly agile software-defined radio and in signal landscape SLAM, which differs from traditional landmark-based SLAM in that the landscape is dynamic. We have resolved fundamental questions of joint landscape and receiver state observability, and have built a sophisticated software-defined multi-system radio. This groundwork provides a launching point for a broader and deeper study of cooperative navigation and timing extraction based on signals of opportunity.
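To make the first theme concrete, one way such a fusion can be posed at the bundle adjustment level is as a joint nonlinear least-squares problem; the notation below is an illustrative sketch, not a description of our implementation. Camera poses $x_k$, landmark positions $p_j$, receiver clock biases $\delta t_k$, and carrier integer ambiguities $N^i$ are estimated jointly from image feature measurements $u_{kj}$ and carrier phase measurements $\phi_k^i$ (in cycles, wavelength $\lambda$), with satellite positions $s^i$ assumed known and atmospheric and satellite-clock terms assumed corrected:

  \min_{\{x_k\},\,\{p_j\},\,\{\delta t_k\},\,\{N^i\}} \;
  \sum_{k,j} \bigl\| u_{kj} - \pi(x_k, p_j) \bigr\|_{\Sigma_u}^{2}
  \;+\;
  \sum_{k,i} \frac{\bigl( \lambda\,\phi_k^{i} - \bigl\| s^{i} - r(x_k) \bigr\| - c\,\delta t_k - \lambda N^{i} \bigr)^{2}}{\sigma_\phi^{2}}

Here $\pi(\cdot)$ is the camera projection function and $r(x_k)$ maps the pose $x_k$ to the GPS antenna position. In this formulation, joint visual SLAM and carrier integer ambiguity resolution amounts to solving the problem with the $N^i$ constrained to integers, so that the visual residuals help fix the ambiguities and the fixed ambiguities in turn anchor the visual solution.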
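For the second theme, the sense in which the signal landscape is dynamic can be made explicit with a standard pseudorange-type measurement model and a two-state clock model for each transmitter; again, the notation is a generic sketch under common assumptions rather than a description of our radio. A measurement from signal-of-opportunity transmitter $i$ at time $t_k$ takes the form

  z_k^{i} = \bigl\| r_r(t_k) - r_s^{i} \bigr\| + c\,\bigl[ \delta t_r(t_k) - \delta t_s^{i}(t_k) \bigr] + v_k^{i}

while each oscillator's clock bias and drift evolve between measurements according to

  \begin{bmatrix} \delta t \\ \dot{\delta t} \end{bmatrix}_{k+1}
  = \begin{bmatrix} 1 & T \\ 0 & 1 \end{bmatrix}
  \begin{bmatrix} \delta t \\ \dot{\delta t} \end{bmatrix}_{k} + w_k,
  \qquad w_k \sim \mathcal{N}(0, Q_{\mathrm{clk}})

Unlike the static landmarks of traditional SLAM, the transmitter clock states $\delta t_s^{i}$ drift continually and must be estimated jointly with the receiver position $r_r$ and clock $\delta t_r$, which is why joint observability of the landscape and receiver states is a nontrivial question.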