Thursday, August 3, 2017

Drones Reporting for Work at 45 MPH

Draper Equips UAVs with Vision for GPS-denied Navigation

CAMBRIDGE – When a firefighter, first responder or soldier operates a small, lightweight flight vehicle inside a building, in urban canyons, underground or under the forest canopy, the GPS-denied environment presents unique navigation challenges. In many cases, loss of GPS signal can render these vehicles inoperable or, in the worst case, unstable, potentially endangering operators, bystanders and property.

Attempts have been made to close this information gap and give UAVs alternative ways to navigate their environments without GPS. But those attempts have introduced gaps of their own, especially on UAVs whose speeds can outpace the capabilities of their onboard technologies. For instance, scanning LiDAR routinely fails to match its scans to a location accurately when the UAV is flying through environments that lack buildings, trees and other orienting structures.

To address these drawbacks, a team from Draper and MIT has developed advanced vision-aided navigation techniques for UAVs that do not rely on external infrastructure, such as GPS, detailed maps of the environment or motion capture systems. Working together under a contract with the Defense Advanced Research Projects Agency (DARPA) as part of the Fast Lightweight Autonomy (FLA) program, Draper and MIT created a UAV that can autonomously sense and maneuver through unknown environments without external communications or GPS. The team developed and implemented unique sensor and algorithm configurations, and has conducted time trials and performance evaluations in indoor and outdoor venues.

“The biggest challenge with unmanned aerial vehicles is balancing power, flight time and capability due to the weight of the technology required to power the UAVs,” said Robert Truax, Senior Member of Technical Staff at Draper. “What makes the Draper and MIT team’s approach so valuable is finding the sweet spot of a small size, weight and power for an air vehicle with limited onboard computing power to perform a complex mission completely autonomously.”

Draper and MIT’s sensor- and camera-loaded UAV was tested in environments ranging from cluttered warehouses to mixed open and tree-filled outdoor terrain, at speeds of up to 10 m/s in cluttered areas and 20 m/s in open areas. The UAV’s missions were composed of many challenging elements, including tree dodging followed by building entry and exit and long traverses to find a building entry point, all while maintaining precise position estimates.

“A faster, more agile and autonomous UAV means that you’re able to quickly navigate a labyrinth of rooms, stairways and corridors or other obstacle-filled environments without a remote pilot,” said Ted Steiner, Senior Member of Draper’s Technical Staff. “Our sensing and algorithm configurations and unique monocular camera with IMU-centric navigation gives the vehicle agile maneuvering and improved reliability and safety—the capabilities most in demand by first responders, commercial users, military personnel and anyone designing and building UAVs.” 

Draper’s contribution to the DARPA FLA program—documented in a recent research paper at the 2017 IEEE Aerospace Conference—was a novel approach to state estimation (the vehicle’s position, orientation and velocity) called SAMWISE: Smoothing And Mapping With Inertial State Estimation. SAMWISE is a fused vision and inertial navigation system that combines the advantages of both sensing approaches and accumulates error more slowly over time than either technique on its own, producing a full position, attitude and velocity state estimate throughout the vehicle trajectory. The result is a navigation solution that enables a UAV to retain all six degrees of freedom and fly autonomously, without GPS or any communication, at vehicle speeds of up to 45 miles per hour.
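The fusion idea behind this kind of system is that high-rate inertial dead reckoning drifts over time, while lower-rate vision measurements drift-correct it. As a rough one-dimensional illustration of that pattern—not Draper’s actual SAMWISE algorithm, which is a full smoothing-and-mapping estimator—the following hypothetical sketch propagates position and velocity from accelerometer data and pulls the state toward intermittent vision position fixes; all names, rates and gains here are invented for illustration.

```python
# Hypothetical 1-D sketch of fusing high-rate inertial dead reckoning
# with low-rate vision position fixes. Gains and rates are illustrative,
# not taken from SAMWISE.

DT = 0.01            # IMU step: 100 Hz
VISION_EVERY = 10    # one vision fix per 10 IMU steps: 10 Hz
K_POS, K_VEL = 0.2, 0.05  # correction gains for the vision update

def fuse(accels, vision_fixes):
    """Integrate accelerometer samples into position/velocity estimates,
    correcting drift whenever a vision position fix arrives."""
    pos, vel = 0.0, 0.0
    track = []
    for i, a in enumerate(accels):
        vel += a * DT        # inertial propagation (dead reckoning)
        pos += vel * DT
        if i % VISION_EVERY == 0 and i // VISION_EVERY < len(vision_fixes):
            z = vision_fixes[i // VISION_EVERY]
            r = z - pos      # innovation: vision fix minus prediction
            pos += K_POS * r # pull the estimate toward the measurement
            vel += K_VEL * r # and bleed off accumulated velocity error
        track.append(pos)
    return track
```

With a biased accelerometer and a stationary vehicle, pure integration drifts quadratically, while the fused estimate stays bounded near the vision fixes—the same qualitative benefit the article describes for combining the two sensing approaches.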

Smart quadcopters find their way without human help or GPS during flight tests for DARPA’s FLA program. Video credit: DARPA.

The team’s focus on the FLA program has been on UAVs, but advances made through the program could potentially be applied to ground, marine and underwater systems, which could be especially useful in GPS-degraded or denied environments. In developing the UAV, the team leveraged Draper and MIT’s expertise in autonomous path planning, machine vision, GPS-denied navigation and dynamic flight controls.

Draper’s SAMWISE sensor-fusion algorithm enables drones to fly 45 mph in unmapped, GPS-denied environments. A team from Draper and MIT equipped a UAV with vision for GPS-denied navigation.
Capabilities Used
Positioning, Navigation & Timing (PNT)

Draper develops novel PNT solutions by combining precision instrumentation, advanced hardware technology, comprehensive algorithm and software development skills, and unique infrastructure and test resources to deploy system solutions. The scope of these efforts generally focuses on guidance, navigation and control (GN&C)-related needs, ranging from highly accurate inertial solutions for intercontinental ballistic missiles (ICBMs) and inertial/stellar solutions for submarine-launched ballistic missiles (SLBMs), to integrated Inertial Navigation System (INS)/GPS solutions for gun-fired munitions, to multisensor configurations for soldier navigation in GPS-challenged environments. Emerging technologies under development that leverage and advance commercial technology offerings include celestial navigation (compact star cameras), inertial navigation (MEMS, cold atom sensors), precision time transfer (precision optics, chip-scale atomic clocks) and vision-based navigation (cell phone cameras, combinatorial signal processing algorithms).

Autonomous Systems

Draper combines mission planning, PNT, situational awareness, and novel GN&C designs to develop and deploy autonomous platforms for ground, air, sea and undersea needs. These systems range in complexity from human-in-the-loop to systems that operate without any human intervention. The design of these systems generally involves decomposing mission needs into sets of scenarios, then conducting trade studies that lead to an optimized solution with key performance requirements. Draper continues to advance the field of autonomy through research in the areas of mission planning, sensing and perception, mobility, learning, real-time performance evaluation and human trust in autonomous systems.

Precision Instrumentation

Draper develops precision instrumentation systems that exceed the state of the art in key parameters (input range, accuracy, stability, bandwidth, ruggedness, etc.) and that are designed specifically to operate in our sponsors’ most challenging environments (high shock, high temperature, radiation, etc.). Draper is a recognized leader in the development and application of precision instrumentation solutions for platforms ranging from missiles to people to micro-Unmanned Aerial Vehicles (UAVs). We find or develop state-of-the-art components (gyros, accelerometers, magnetometers, precision clocks, optical systems, etc.) that meet the demanding size, weight, power and cost needs of our sponsors, then apply extensive system design capabilities—modeling, mechanical and electrical design, packaging and development-level testing—to realize instrumentation solutions that meet these critical and demanding needs.

Image & Data Analytics

Draper combines specific domain expertise and knowledge of how to apply the latest analytics techniques to extract meaningful information from raw data to better understand complex, dynamic processes. Our system design approach encompasses effective organization and processing of large data sets, automated analysis using algorithms and exploitation of results. To facilitate user interaction with these processed data sets, Draper applies advanced techniques to automate understanding and correlation of patterns in the data. Draper’s expertise encompasses machine learning (including deep learning), information fusion from diverse and heterogeneous data sources, optimized coupling of data acquisition and analysis and novel methods for analysis of imagery and video data.

Media Contact

Media Relations

Contact Info:
Strategic Communication
P: 617-258-2464
C: 617-429-2883