Wayfinding solutions for people with low vision have yet to take full advantage of emerging technologies. This project looks at designing systems that enable better navigation of public spaces and buildings using near-future technology.
Most wayfinding solutions are geared to the needs of people with good eyesight. Where systems have been designed for low vision users, they are generally limited to audio loops, which can be expensive to install, or Braille, which only a small percentage of people can read. However, the rapid development of technology and the high uptake of personal electronic devices mean that there is an opportunity for change.
This project set out to look at how emerging technologies could be harnessed to enable new forms of navigation that rely less on sight and more on the other senses. The aim was to realise the inclusive potential of ‘sensory substitution’ through practical design. Wayfinding was defined from the start as comprising four essential components: orientation, route decision, route monitoring and destination recognition. A key objective was to develop concepts that would enable these tasks to be carried out effectively.
A user perspective
In order to understand the difficulties of wayfinding first-hand, field research was conducted at the Vassall Centre in Bristol, a building which houses a number of different disability organisations but has no reception and relies on visual signage for direction. It is a location with many navigation problems – and as visitors represent a range of age and ability, it provided an ideal test-site for the project.
People were filmed as they struggled to find their destination and then informally interviewed to gain further insight. Experts such as Dr John Gill from the Royal National Institute for the Blind were consulted to supplement the user research and provide further perspective on the project. Emerging technologies were painstakingly researched and evaluated.
The project proposes three different solutions. All aim to limit the amount of information presented, conveying the most important information first and allowing the user to access more detail should they wish. The first design concept develops a tactile map that combines a physical object with voice information to describe a building through hearing and touch. The materials that distinguish different sections of the model are also used in the real building, so users can run their fingers along a material strip in the corridor to their destination.
The next two ideas use different technologies to run a similar system. They build on the fact that most people carry electronic devices such as camera-equipped mobile phones or MP3 players, and the majority of these will soon have easy access to the internet. A building can therefore upload navigation information to the internet that can be accessed in ‘real time’ as a person walks through that space, giving ‘blow by blow’ directions. People can post their own directions and comments on a particular space to aid other users, and recipients can adjust the amount of information they want to hear. One system uses QR codes, two-dimensional barcodes that can be read by a mobile phone camera and interpreted into directions. The other uses RFID technology to allow a seamless interchange between a building and a personal device.
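The QR-code concept above can be sketched in a few lines of code. This is a minimal, hypothetical illustration only: the location IDs, data structure and function names are assumptions, not part of the project. It shows the core design principle of delivering essential steps first and extra detail only on request.

```python
# Hypothetical sketch: each QR code encodes a location ID, the building
# publishes direction steps online, and the user chooses how much detail
# to hear. All identifiers and data here are illustrative.

# Simulated navigation data a building might publish online.
# Each step is tagged "essential" or "detail" so verbosity can be adjusted.
BUILDING_DIRECTIONS = {
    "vassall/entrance": [
        ("essential", "Main corridor ahead; reception desk is not staffed."),
        ("detail", "Notice boards on the left wall list room numbers."),
        ("essential", "Turn right at the end of the corridor for rooms 1-10."),
    ],
}

def directions_for(qr_payload: str, include_detail: bool = False) -> list:
    """Return spoken-direction strings for a scanned QR payload.

    Essential steps always come first; detail steps are appended
    only when the user has asked for them.
    """
    steps = BUILDING_DIRECTIONS.get(qr_payload, [])
    essential = [text for level, text in steps if level == "essential"]
    detail = [text for level, text in steps if level == "detail"]
    return essential + (detail if include_detail else [])

# A low-verbosity user hears only the two essential steps:
print(directions_for("vassall/entrance"))
```

The same lookup would serve the RFID variant; only the trigger (a tag read rather than a camera scan) would change.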
The three solutions were prototyped and tested in situ in the Vassall Centre, giving the study a practical application and adding to the sum of knowledge on using new technology to aid wayfinding for both the low vision user and the fully sighted.