Multimodal Sensor Fusion for UXO Classification and Remediation

Aaron Marburg | University of Washington

MR18-1440

Objective

Remotely-operated and semi-autonomous robotic platforms are a powerful tool for performing hazardous subsea Unexploded Ordnance (UXO) mapping, identification, and remediation, but their perceptual scope is limited by their constrained sensor suite. Skilled operators are able to mentally transform streaming Remotely-Operated Vehicle (ROV) data into spatial situational awareness, but this mental three-dimensional (3D) model is not easily shared, stored, or reviewed. This project will use complementary near-field acoustic and optical sensors to construct an accurate, detailed, and dynamic 3D map of subsea objects as observed by an ROV during UXO localization, identification, and remediation operations. This internally consistent and dynamically updated 3D reconstruction is a powerful tool for in situ UXO visualization and classification; it is a shareable artifact of the underwater scene for remediation planning and post-operation analysis, reducing the risks and costs associated with using divers for inspection and intervention; and it is an absolute prerequisite for effective semi- or fully-autonomous robotic remote manipulation of UXO for removal or remediation.


Technical Approach

This project assesses the feasibility of using concurrent video and high-frequency imaging sonar data, coupled with platform motion, to generate local-scale 3D reconstructions of the seafloor, including fully proud and partially buried UXO as well as the clutter objects that can confound the search for UXO. The resulting reconstruction will fuse structural information from the sonar and motion-induced visual parallax with color information from the video to produce a 3D representation of the UXO and its environment in situ.
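The fusion described above pairs range structure (from sonar) with per-pixel color (from video). As a minimal illustrative sketch of that pairing, and not the project's actual pipeline, the snippet below back-projects a sonar-derived depth map through an assumed pinhole camera model and attaches each pixel's color to the resulting 3D point. The camera intrinsics (`fx`, `fy`, `cx`, `cy`) and the function name are hypothetical.

```python
import numpy as np

def fuse_depth_and_color(depth, rgb, fx, fy, cx, cy):
    """Back-project a depth map through a pinhole camera model and
    attach per-pixel color, yielding an N x 6 array of colored points
    (x, y, z, r, g, b). Pixels with no depth estimate (0) are dropped."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0
    z = depth[valid]
    x = (u[valid] - cx) * z / fx   # pinhole back-projection in x
    y = (v[valid] - cy) * z / fy   # pinhole back-projection in y
    colors = rgb[valid].astype(np.float64)
    return np.column_stack([x, y, z, colors])
```

In a real system, the depth estimates would come from registering sonar returns into the camera frame, which is a substantially harder alignment problem than this sketch suggests.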

Researchers will do this by adapting the current state of the art in probabilistic multi-modal Simultaneous Localization and Mapping (SLAM) and Structure from Motion (SfM) reconstruction algorithms to include detailed predictive models of underwater light propagation and imaging sonar behavior, along with estimation of the optical and acoustic properties of the local environment. Researchers will focus on testing with real data collected under a spectrum of environmental circumstances and with controlled sensor motion, providing repeatable, ground-truthed sensor tracks and scene geometry. Researchers will also focus on robust, real-time methods that enhance ROV-based search, inspection, and classification, as well as providing an essential component for haptically-guided remediation of munitions.
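To give a concrete sense of what a predictive model of underwater light propagation looks like, the sketch below uses a simple wavelength-dependent image-formation model: the direct signal from a surface decays exponentially with range while backscatter adds a "veiling light" term. This is an illustrative assumption, not the project's model, and the attenuation coefficients are placeholder values; the project instead estimates the local optical properties in situ.

```python
import numpy as np

# Illustrative wavelength-dependent attenuation coefficients (1/m) for
# the red, green, and blue channels. Placeholder values only; in practice
# these vary with water conditions and would be estimated in situ.
ATTENUATION = np.array([0.40, 0.07, 0.02])

def predict_observed_color(true_rgb, range_m, veiling_rgb, coeff=ATTENUATION):
    """Forward model: observed = true * exp(-c*d) + veiling * (1 - exp(-c*d)),
    i.e. direct signal attenuates with range while backscatter fills in."""
    t = np.exp(-coeff * range_m)
    return np.asarray(true_rgb) * t + np.asarray(veiling_rgb) * (1.0 - t)

def recover_true_color(observed_rgb, range_m, veiling_rgb, coeff=ATTENUATION):
    """Invert the forward model to estimate surface color from an observation
    at a known range -- usable once the reconstruction supplies geometry."""
    t = np.exp(-coeff * range_m)
    return (np.asarray(observed_rgb) - np.asarray(veiling_rgb) * (1.0 - t)) / t
```

Embedding a forward model like this inside a SLAM/SfM estimator lets the system explain why distant objects appear blue-shifted and washed out, rather than treating that distortion as noise.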


Benefits

3D reconstructions provide a valuable, compact format for storing, sharing, and summarizing ROV video. They offer an information-dense, interactive synopsis of a Munitions and Explosives of Concern (MEC) item as it sits in its local environment, greatly simplifying MEC identification and classification tasks and reducing risks to divers in the water. Looking forward, active robotic situational awareness and perception is an essential component of semi- and fully-automatic behaviors, including automated mapping, classification, and computer-assisted remediation.


Points of Contact

Principal Investigator

Dr. Aaron Marburg

University of Washington Applied Physics Laboratory

Phone: 206-685-8461

Program Manager

Munitions Response

SERDP and ESTCP
