How blind people use their hearing to negotiate unfamiliar environments

Silvia Cirstea

As a principal lecturer in applied mathematics and media technology, Dr Silvia Cirstea has expertise spanning mathematical modelling and software development for emerging technologies and medical applications, acoustics, optics, signal processing, artificial intelligence, and inverse problems.

Dr Cirstea worked as a research scientist before joining Anglia Ruskin University.

Q: How do blind people use their hearing to negotiate unfamiliar environments?

A: By interpreting not just the sound level, but its reverberation as well

How did the project come about?

As part of my PhD, I looked at how to reconstruct space from two-dimensional images. Then in a job for the Medical Research Council, I studied the psychoacoustics of hearing and models of hearing. It became apparent that there were similarities between the ways we hear, see and map the environment; it’s just that the auditory cues are a lot weaker than the visual cues. The idea is that you can interpret the propagated sound and come up with a map of the environment. It’s a lot more difficult than when you look at a series of images. When my colleague, Prof Shahina Pardhan, started at the Vision and Eye Research Unit (VERU) at Anglia Ruskin, we talked about this idea and how it could be translated into meaningful outcomes for the blind and low-vision community. That’s how the project started. Our colleague Dr Andrew Kolarik joined VERU and we began theoretical and experimental work on this theme.

What were your research objectives?

Anecdotal evidence suggests that blind people are better than normal-sighted individuals at understanding environments, such as whether they are in a corridor, a large room or a small room, through interpreting sound reverberation alone. Through our research, we were able to confirm that this is indeed the case, regardless of whether the sound came from the front or the side. We also looked at how normal-sighted and blind people perceived absolute distance: could they tell, just by listening, whether a sound came from one metre or five metres away? Our other objective is to incorporate cues such as reverberation and level into mobility aids for the blind, and to develop an automatic process that can interpret the reverberation in an environment from ambient noise or other known sources. Such technology could then produce a map of the environment and guide the user to avoid obstacles and traffic, and to stay on the correct side of the road.
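One of the cues referred to here, and examined in the Journal of the Acoustical Society of America paper listed under Reading matter, is the direct-to-reverberant ratio (DRR): the energy of the sound arriving straight from the source compared with the energy of its reflections. As a minimal sketch of what that quantity is, the Python fragment below estimates DRR from a room impulse response; the 2.5 ms direct-sound window and the function name are illustrative choices, not the project's actual pipeline.

    import numpy as np

    def direct_to_reverberant_ratio(ir, fs, direct_window_ms=2.5):
        """Estimate the direct-to-reverberant ratio (DRR), in dB, from a
        room impulse response `ir` sampled at `fs` Hz. Energy within
        `direct_window_ms` of the direct-path peak counts as direct sound;
        everything after that window counts as reverberation.
        (Illustrative sketch; the window length is a common convention,
        not taken from the article.)"""
        ir = np.asarray(ir, dtype=float)
        onset = np.argmax(np.abs(ir))             # direct-path arrival
        half = int(fs * direct_window_ms / 1000)
        lo, hi = max(onset - half, 0), onset + half
        direct_energy = np.sum(ir[lo:hi] ** 2)
        reverb_energy = np.sum(ir[hi:] ** 2)
        return 10.0 * np.log10(direct_energy / reverb_energy)

    # Toy impulse response: a direct spike followed by a decaying noise tail.
    fs = 16000
    t = np.arange(int(0.4 * fs)) / fs
    ir = np.zeros_like(t)
    ir[100] = 1.0                                  # direct path
    ir[200:] = 0.05 * np.random.randn(t.size - 200) * np.exp(-3.0 * t[200:])
    print(f"DRR: {direct_to_reverberant_ratio(ir, fs):.1f} dB")

Because direct-path energy falls as a source moves away, while the reverberant tail in a given room stays roughly constant, DRR decreases with distance. This is why listeners can use it, together with overall level, to judge how far away a sound is.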

How is the new mobility aid progressing?

It is still at the software development stage. I have two final-year project students who are developing bracelets and headsets that can nudge a user in one direction or another, not through acoustic cues but via radio-frequency identification (RFID) tags placed in the environment. The system operates in a smart environment: the tags send information about dimensions and obstacles, such as corners and stairs, and the processor on the mobility aid interprets those signals.
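Purely for illustration, the sketch below shows one way a decoded tag reading could be turned into a steering cue for such a bracelet or headset. The payload fields, distance threshold and cue names are invented for this example; the article does not describe the students' actual protocol.

    from dataclasses import dataclass
    from enum import Enum

    class Obstacle(Enum):
        CORNER = "corner"
        STAIRS = "stairs"
        DOORWAY = "doorway"

    @dataclass
    class TagReading:
        """One decoded RFID tag payload (hypothetical format)."""
        obstacle: Obstacle
        bearing_deg: float    # hazard direction relative to the user; positive = right
        distance_m: float

    def haptic_cue(reading: TagReading) -> str:
        """Map a tag reading to a steering cue for the mobility aid
        (invented threshold and cue names)."""
        if reading.distance_m > 5.0:
            return "none"                  # too far away to act on yet
        # Steer away from the hazard: hazard on the right -> cue left.
        return "cue-left" if reading.bearing_deg >= 0 else "cue-right"

    reading = TagReading(Obstacle.STAIRS, bearing_deg=30.0, distance_m=2.0)
    print(haptic_cue(reading))             # stairs ahead-right -> "cue-left"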

Can you foresee other applications for this work?

Yes. My colleague Steven Harris is working on a related theme. Initially it was closely tied to this work, but it has now moved more towards the development of virtual and gaming environments for the blind. These could be used for training or entertainment, but could also serve as a simulation environment for mobility. We can also see the work being used in assisted living or smart home applications.

Research matters

What inspired you?

The desire to come up with better solutions for people who are visually impaired and face difficulty in their lives as a result. Unfortunately, existing mobility aids are still deficient, either in what they are able to detect, or in the way they convey what they detect. If I manage to solve this, it would be a great achievement.

Any surprises?

I was surprised that, unlike blind people, low-vision people do not interpret reverberation as well as or better than normal-sighted people. I wasn't expecting that.

Why does this research matter?

It matters to the visually impaired community. Once they are able to understand reverberation, blind and low-vision people can be trained to listen to it more effectively and to manage by themselves in unfamiliar environments. This work will also inform future technology development in mobility aids.

Research funding

The research was funded by Anglia Ruskin University.

Reading matter

A summary of research investigating echolocation abilities of blind and sighted humans

Hearing Research 310, pp. 60–68, 2014. A.J. Kolarik, S. Cirstea, S. Pardhan, B.C.J. Moore

Discrimination of virtual auditory distance using level and direct-to-reverberant ratio cues

Journal of the Acoustical Society of America 134(5), pp. 3395–3398, 2013. A.J. Kolarik, S. Cirstea, S. Pardhan

Sensory substitution information informs locomotor adjustments when walking through apertures

Experimental Brain Research 232(3), pp. 975–984, 2014. A.J. Kolarik, M.A. Timmis, S. Cirstea, S. Pardhan

Team members:

Prof Shahina Pardhan

Dr Andrew Kolarik

Dr Matthew Timmis

Steven Harris

Anglia Ruskin University

Collaborator:

Prof Brian Moore

University of Cambridge