[optacon-l] Questions about tactile displays and assistive devices for the blind

  • From: Seth Teller <teller@xxxxxxxxxxxxx>
  • To: optacon-l@xxxxxxxxxxxxx
  • Date: Fri, 06 May 2011 05:53:37 -0400

Hello Optacon list members,

My lab at MIT develops assistive technology.  For some years we have
worked with people with multiple sclerosis (MS) to develop robotic
assistants.  We have started to collaborate with blind people to develop
a wearable device that will sense the wearer's surroundings (using color
or depth cameras), interpret the sensor data (as obstacles, objects,
hazards, text, people, etc., or their attributes), and relay the
interpreted data to the wearer via one or more channels (speech,
spatialized sound, braille, or tactile display), using whatever
channel(s) are appropriate to the current task, user context, and user
preferences.
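
To make that loop concrete, here is a minimal sketch of the
sense-interpret-relay pipeline we have in mind.  All of the names and
thresholds below are illustrative placeholders for this email, not an
existing implementation:

    # Minimal sketch of the sense -> interpret -> relay loop described above.
    # Class names, fields, and the urgency threshold are hypothetical.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Percept:
        kind: str         # e.g. "obstacle", "text", "person"
        description: str  # interpreted attribute, e.g. "doorway, 2 m ahead"
        urgency: float    # 0.0 (informational) .. 1.0 (hazard)

    def sense() -> dict:
        """Grab one frame of color/depth data from the wearable cameras (stub)."""
        return {"rgb": None, "depth": None}

    def interpret(frame: dict) -> List[Percept]:
        """Turn raw sensor data into obstacles, objects, hazards, text, people (stub)."""
        return [Percept("obstacle", "low railing, 1 m ahead", 0.8)]

    def relay(percepts: List[Percept], preferences: dict) -> None:
        """Route each percept to speech, spatialized sound, braille, or a
        tactile display, based on urgency and the user's stated preferences."""
        for p in percepts:
            channel = "tactile" if p.urgency > 0.5 else preferences.get("default", "speech")
            print(f"[{channel}] {p.kind}: {p.description}")

    if __name__ == "__main__":
        relay(interpret(sense()), preferences={"default": "speech"})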

We realize that it is critical to involve users in research from the
very start, in order to understand their needs and increase the chance
that any solutions we come up with will actually be useful and used.
Since this group includes many Optacon users, I'm writing to ask about
your real-world experiences with tactile displays and your desires for
assistive technology in general.  (As background, I found Harvey Lauer's
2003 essay on "The Reading Machine That Hasn't Been Built Yet," linked
at http://www.afb.org/afbpress/pub.asp?DocID=aw040204, quite powerful.)

Anyway, here are a few questions to start:

1. People use the Optacon to read text, math symbols, and engineering
plots.  What other sorts of pictorial representations do you use the
device for?  How well does this work for you?  Where or how does it fail?

2. The psychophysics literature cites spatial resolutions as fine as
40 microns (25 dots per millimeter) on the fingertips, but the spacing
on the Optacon is much coarser than that.  What spatial resolution do
you want or need?

3. Can you imagine other methods besides vibration (e.g., variation in
pin height) that would effectively convey information over an area?

4. Several blind people have asked us to develop a larger surface,
perhaps the size of a smartphone or even an iPad, which could be felt
with 4 or 8 (non-thumb) fingers simultaneously in order to "explore"
the user's surroundings.  If you had a larger device and wanted to use it
in a mobile context, where would you place it on your body?  On one of
your forearms, to be felt with the other hand?  On one hip?  What about
making small, light finger pads or a glove to keep a small part of the
display surface in contact with each finger regardless of how your hands
move?  This would get in the way of direct touch sensing of other
things (though in the long run, perhaps we could make the tactile array
a kind of "pass-through" device).  What do you think of these ideas?

5. In the long run, our goal is to support many activities in the home
and out in the world: navigation, object finding, text-spotting and
reading (e.g., from signs at a distance), people detection, reporting
facial expressions, shopping, taxi hailing, etc.  It's a much longer
conversation, but what capabilities would you want from such a device?

Best,

Seth Teller

Professor of Computer Science and Engineering
Computer Science and Artificial Intelligence Laboratory
Electrical Engineering and Computer Science Department
Massachusetts Institute of Technology

MIT Stata Center Room 32-333
32 Vassar Street
Cambridge, MA 02139
Email:  teller@xxxxxxx
Phone:  617-258-7885
http://people.csail.mit.edu/teller

