Vibrating footwear, augmented reality glasses help make sense of the world
CAMBRIDGE, MA—Anybody who has ever stumbled and bumped into furniture while attempting to walk through a dark room knows how important vision is for navigating our surroundings. Most people can reliably count on their vision to avoid obstacles in the environment. But for the 285 million people around the world suffering from visual impairment, life is one big dark room with no light switch. The inability to see obstacles severely limits how these individuals can safely navigate their surroundings.
Change the setting to the rocky, arid terrain of Mars, and suddenly tripping and falling becomes life-threatening. For an astronaut on Mars, maintaining solid footing can be a real challenge: reduced gravity and the constraints of a bulky pressurized suit limit sensory feedback. The protective helmet further limits an astronaut’s peripheral vision, forcing space explorers to lean forward and look down to see tripping hazards. In this environment, a punctured suit or damaged life support system can be fatal.
A team of researchers at Draper studied this problem and developed a new approach for how astronauts see and feel the terrain around them. By equipping a special boot with built-in sensors and tiny haptic motors that vibrate, the research team aims to give astronauts the information they need to stay safe.
“We call them vibrotactile boots,” said Alison Gibson, a Draper Fellow and former graduate student in MIT’s Department of Aeronautics and Astronautics. “The boots have built-in sensors and vibration motors, all connected to a small microcontroller that processes the sensor data and determines which cue to send to the user.” The front of each boot contains an ultrasonic range finder, a proximity sensor and a six-degree-of-freedom inertial measurement unit (IMU). The vibratory feedback delivered to the feet is supplemented with an augmented reality visual display that also indicates the location and proximity of approaching obstacles.
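The article does not publish the boots’ firmware, but the cue-selection step Gibson describes — sensor readings in, a vibration cue out — can be sketched in a few lines. The following is an illustrative sketch only: the function name, distance thresholds and cue levels are hypothetical, not Draper’s actual design.

```python
# Illustrative sketch only: thresholds, cue levels and the function name
# are hypothetical, not taken from Draper's actual firmware.

def select_cue(range_m: float, max_range_m: float = 3.0) -> str:
    """Map an ultrasonic obstacle distance (meters) to a vibration cue."""
    if range_m <= 0 or range_m > max_range_m:
        return "off"      # no obstacle detected within sensing range
    if range_m < 0.5:
        return "strong"   # imminent obstacle: strong, continuous buzz
    if range_m < 1.5:
        return "medium"   # approaching obstacle: pulsed vibration
    return "weak"         # distant obstacle: gentle, occasional pulse

# An obstacle 1.2 m ahead would trigger the medium cue.
print(select_cue(1.2))  # medium
```

In a real system the microcontroller would fuse the range finder, proximity sensor and IMU readings before choosing a cue; this sketch shows only the simplest distance-to-intensity mapping.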
“When we tested the system, most participants found the visual-tactile and visual-only cues easier to use than the tactile-only or no-cue presentation styles,” Gibson said. She added that Draper’s and MIT’s research in this area could have applications in the design of navigation systems for the visually impaired, and serve as an added safety measure for first responders and firefighters as they navigate smoke-filled rooms.
The research into the space boots is part of Draper’s growing human-centered design portfolio. The portfolio includes a wearable technology called isaWear (Immersive Situational Awareness) that helps the wearer perceive more data from their surroundings and understand it faster. For NASA, Draper designed a spacesuit to keep astronauts healthy during long-duration space exploration missions and stabilize them while they work in microgravity.
Draper has designed and developed microelectronic components and systems since the mid-1980s. Our integrated, ultra-high density (iUHD) modules of heterogeneous components deliver system functionality in the smallest form factor possible by integrating commercial-off-the-shelf (COTS) technology with Draper-developed custom packaging and interconnect technology. Draper continues to pioneer custom Microelectromechanical Systems (MEMS), Application-Specific Integrated Circuits (ASICs) and custom radio frequency components for both commercial (microfluidic platforms, organ assist, drug development, etc.) and government (miniaturized data collection, new sensors, micro-sats, etc.) applications. Draper maintains a complete in-house iUHD and MEMS fabrication capability and has existing relationships with many other MEMS and microelectronics fabrication facilities.
Draper has continued to advance the understanding and application of human-centered engineering to optimize people’s ability to understand, assimilate and convey information for critical decisions and tasks. Through its Human-Centered Solutions capability, Draper enables accomplishment of users’ most critical missions by seamlessly integrating technology into a user’s workflow. This work leverages human-computer interaction through emerging findings in applied psychophysiology and cognitive neuroscience. Draper has deep skills in the design, development and deployment of systems to support cognition – for users seated at desks, on the move with mobile devices or maneuvering in the cockpit of vehicles – and collaboration across human-human and human-autonomous teams.
Draper continues to develop its expertise in designing, characterizing and processing materials at the macro-, micro- and nanoscales. Understanding the physical properties and behaviors of materials at these scales is vital to exploiting them successfully in designing components or systems. This enables the development and integration of biomaterials, 3D printing and additive manufacturing, wafer fabrication, chemical and electrochemical materials, and structural materials for application to the system-level solutions required by government and commercial sponsors.