December 13, 2004
Engineers develop assistive technologies
for the blind
By Tim Stephens
UCSC researchers are developing new assistive technologies
for the blind based on advances in computer vision that have
emerged from research in robotics. A "virtual white cane"
is one of several prototype tools for the visually impaired
developed by Roberto Manduchi, an assistant professor of computer
engineering, and his students.
The "virtual white cane" combines
a laser, a camera, and a computer processor to give a
blind person feedback about features such as stairs and
curbs. Photo: R. Manduchi
The traditional white cane is still the most common mobility
device for the blind. It is a simple and effective tool that
enables users to extend their sense of touch and "preview"
the area ahead of them as they walk. But the long, rigid cane
is not well-suited to all situations or all users.
Manduchi's high-tech alternative is a laser-based range-sensing
device about the size of a flashlight. A laser, much like the
one in an ordinary laser pointer, is combined with a digital
camera and a computer processor that analyzes and integrates
spatial information as the user moves the device back and forth
over a scene. The user receives feedback about the scene in
the form of audio signals, and an additional tactile interface
is being developed for future prototypes.
"In the audio signal, the pitch corresponds to distance,
and there are also special sounds to indicate features such
as a curb, step, or drop-off," Manduchi said.
Dan Yuan, a graduate student working with Manduchi on the virtual
white cane project, built the initial prototype. The UCSC researchers
are collaborating with the Smith-Kettlewell
Eye Research Institute, a nonprofit research institute in
San Francisco, on the virtual white cane and other projects.
"The people at Smith-Kettlewell are helping us to understand
the real needs of the blind, and they have blind engineers who
test the systems we develop," Manduchi said.
Roberto Manduchi. Photo: Tim Stephens
In another project, Manduchi is working with Smith-Kettlewell
scientist James Coughlan on a system that uses a compact device
with a camera to detect and gather information from small labels
or tags placed in key locations. For example, the tags might
help a blind person locate a doctor's office in a medical building.
The device would only work where tags have been placed in the
environment, but the tags--small colored labels with bar codes
on them--are very inexpensive and require no maintenance.
"A blind person staying at a hotel could put a sticker
on their door so they could easily find their way back to the
room," Manduchi said. "Or I could put tags here in
the Engineering 2 Building to help a blind visitor find my office."
The tags could be detected by a handheld computer with a simple
camera, or even a camera phone, he said. Michi Mutsuzaki, a
UCSC undergraduate working in Manduchi's lab, used a small handheld
computer with a camera to develop a prototype device that can
detect the colored targets.
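The article does not specify the detection algorithm; as a minimal sketch of the general approach, the code below scans an image for pixels whose color falls in a target range and returns their bounding box. The image representation and thresholds are assumptions for illustration, not the actual system:

```python
def find_color_target(image, lo, hi):
    """Find a colored target in an RGB image.

    `image` is a list of rows of (r, g, b) tuples; `lo` and `hi` are
    per-channel bounds for the target color. Returns the bounding box
    (x0, y0, x1, y1) of matching pixels, or None if none match.
    """
    box = None
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            # Does every channel fall inside the target color range?
            if all(l <= c <= h for c, l, h in zip((r, g, b), lo, hi)):
                if box is None:
                    box = [x, y, x, y]
                else:
                    box[0] = min(box[0], x)
                    box[1] = min(box[1], y)
                    box[2] = max(box[2], x)
                    box[3] = max(box[3], y)
    return tuple(box) if box else None
```

Once a candidate region is located this way, the device could zoom in on it to read the tag's bar code.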
A third collaboration with Smith-Kettlewell is a project Manduchi
refers to as "MapQuest for the blind," in reference
to the Internet map site MapQuest.com.
"The problem is how to enable a blind person to explore
a map," Manduchi said. "The current devices are braille
maps, but those require a special printer. We want to create
a feedback environment to enable a blind person to explore a
map on the computer."
The feedback would be provided by a "force-feedback mouse,"
which vibrates to produce a variety of physical sensations the
user can feel as the pointer moves across features on a computer
screen. These devices are readily available, so the project
involves creating software that will enable the blind to use
a force-feedback mouse to "feel" their way through
a map.
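The article leaves the software design open; one simple way such a program might work is to look up the map feature under the pointer and translate it into a vibration pattern. The feature types and effect values below are invented for illustration:

```python
# Hypothetical map: each grid cell labels what the pointer can "feel".
MAP_GRID = [
    ["water", "water", "road"],
    ["park",  "road",  "road"],
    ["park",  "park",  "building"],
]

# Assumed mapping from feature type to a vibration (frequency Hz, strength 0-1).
HAPTIC_EFFECTS = {
    "road":     (30, 0.2),   # faint, regular buzz
    "water":    (10, 0.5),   # slow, rolling pulse
    "park":     (0,  0.0),   # smooth, no vibration
    "building": (60, 0.9),   # strong buzz marks an obstacle
}

def haptic_feedback(x, y):
    """Return the (frequency, strength) pair for the map cell under the
    pointer, so the force-feedback mouse can render that feature as a
    distinct physical sensation."""
    feature = MAP_GRID[y][x]
    return HAPTIC_EFFECTS[feature]
```

As the pointer moves across the screen, calling this lookup for each new position would let the user trace roads and boundaries by touch alone.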
Michele Clarke, an undergraduate at St. Mary's University of
Minnesota, began working with Manduchi on this project last
summer as a participant in UCSC's Summer Undergraduate Research
Fellowship in Information Technology (SURF-IT) program, funded
by the National Science Foundation. She is continuing to work
on the project at St. Mary's during the current academic year.
Before coming to UC Santa Cruz in 2001, Manduchi worked for
several years at NASA's Jet Propulsion Laboratory, applying
computer vision technology to autonomous robotic systems.
"It is a natural evolution from helping a robot drive
around to helping a blind person navigate their environment,"
he said.