Article

An Outdoor Navigation System for Blind Pedestrians Using GPS and Tactile-Foot Feedback

by Ramiro Velázquez 1,*, Edwige Pissaloux 2, Pedro Rodrigo 1, Miguel Carrasco 3, Nicola Ivan Giannoccaro 4 and Aimé Lay-Ekuakille 4

1 Faculty of Engineering, Universidad Panamericana, Aguascalientes 20290, Mexico
2 Physics Department, Université de Rouen, Mont-Saint-Aignan 76821, France
3 Faculty of Engineering and Sciences, Universidad Adolfo Ibáñez, Peñalolén, Santiago 7941169, Chile
4 Department of Innovation Engineering, Università del Salento, Lecce 73100, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2018, 8(4), 578; https://doi.org/10.3390/app8040578
Submission received: 17 March 2018 / Revised: 2 April 2018 / Accepted: 4 April 2018 / Published: 7 April 2018

Abstract:
This paper presents a novel, wearable navigation system for visually impaired and blind pedestrians that combines a global positioning system (GPS) for user outdoor localization and tactile-foot stimulation for information presentation. Real-time GPS data provided by a smartphone are processed by dedicated navigation software to determine the directions to a destination. Navigational directions are then encoded as vibrations and conveyed to the user via a tactile display that inserts into the shoe. The experimental results showed that users were capable of recognizing with high accuracy the tactile feedback provided to their feet. The preliminary tests conducted in outdoor locations involved two blind users who were guided along 380–420 m predetermined pathways, while sharing the space with other pedestrians and facing typical urban obstacles. The subjects successfully reached the target destinations. The results suggest that the proposed system enhances independent, safe navigation of blind pedestrians and show the potential of tactile-foot stimulation in assistive devices.

1. Introduction

Navigation assistive technology for visually impaired and blind people has been an active subject of study for decades.
Two different processes of human mobility have been identified for navigation assistive system design: sensing of the immediate environment and orientation during travel [1]. While the former refers to the gathering of spatial information for obstacle (or any other travel obstruction) detection, the latter involves the update of the traveler’s location in a route and the continuous guidance to reach a destination.
Systems that address one or both navigation processes can be found in the literature.
Examples of obstacle detection systems can be traced back to the 1970s, when sonar technology was at its peak [2,3]. A short while later, sonar systems evolved into ultrasonic sensors, improving the measurement accuracy of the distance to an obstacle [4,5]. Laser telemeters have also been explored to detect obstacles ahead of a user [6,7]. More recently, video cameras and image sensors, together with computer vision techniques, have been employed to find obstacles and clear paths in images of the nearby space [8,9].
Examples of localization and guidance can be distinguished into two categories: outdoor and indoor navigation. Currently, there is no single technology that can offer a solution for both. Most outdoor navigation assistive systems use global positioning systems (GPS) for the traveler’s localization and have made outdoor navigation much easier for blind pedestrians. The main drawback of GPS is that satellite signals become significantly weaker in indoor environments as buildings block the line of sight between the satellite and the receiver. Solutions for indoor settings have explored beacon-based approaches that use cameras [10,11], infrared sensors [12,13], radio frequency identification (RFID) tags [14,15], wireless communication devices [16,17], computer-aided design (CAD) maps [18], among others. The main challenge of the beacon approach is to install a dedicated network that efficiently covers the traveler’s entire path.
Despite intensive research and the variety of technological approaches, navigation assistive devices have not yet become common, and user acceptance is, in fact, quite low. Several fundamental shortcomings can be found in both obstacle detection and localization systems [19]:
(1)
Obstacle detection technologies involve complex, continuous, and time-consuming operations: the user must actively scan the environment, and the gathered information must be analyzed before a decision can be made. This constant activity and cognitive effort reduces walking speed and quickly fatigues the user.
(2)
They provide acoustic feedback. Blind people rely on hearing environmental cues for key navigation tasks, such as awareness, orientation, mobility, and safety. A system providing continuous acoustic feedback might distract the user from the environment. While obstacle detection systems typically provide different frequency tones to indicate the distance to an object, localization systems provide spoken voice commands to guide the pedestrian along a path.
(3)
Long learning and training times are required to master most navigation assistive systems. Blind people must invest non-negligible time to get used to the device and fully understand the system’s feedback, which is certainly frustrating and discouraging.
(4)
They are still burdensome and conspicuous, whereas unobtrusive portability and wearability are essential needs for blind pedestrians.
These shortcomings limit navigation assistive devices from attaining significant and perceivable improvements in comparison with primary aids (e.g., white cane and guide dog).
The white cane and the guide dog are the most popular and longstanding navigational aids for blind pedestrians. The white cane is multifunctional, reliable, cheap, compact, and lightweight. The guide dog is a very capable and reliable travel aid for leading blind people around obstacles. It also becomes a friend and companion. Furthermore, both are icons of the blind pedestrian, which is useful for obtaining assistance from other pedestrians.
The white cane has a limited scanning range (1–2 m) and cannot detect overhanging obstacles, while the constant care and high cost of guide dogs may not suit every blind pedestrian. Unfortunately, both the white cane and the guide dog only address obstacle detection and still need to be complemented with another device that addresses the orientation process [20].
Decades of research and a wide assortment of academic prototypes and commercial products have shown that attempts to replace the primary aids in the obstacle detection task have failed. One may wonder whether it would be more pertinent to design systems complementary to the primary aids rather than to keep trying to replace them.
This paper introduces a novel outdoor navigation assistive system that complements the primary aids by entrusting them with sensing of the immediate environment. Localization relies on GPS data provided by a smartphone, which are processed against spatial databases to compute optimal routes of travel. Guidance is achieved by mechanotactile sensations that the system delivers to the foot to indicate the navigational directions to follow. This device addresses the four aforementioned major challenges of navigation assistive devices: (1) obstacle detection is handled through the traditional and successful haptic feedback provided by the white cane or guide dog; (2) tactile feedback is used to prevent user distraction from the environment; (3) such tactile feedback is simple, fast to understand, and provides small amounts of meaningful information, which reduces learning and training times; and (4) the system is fully wearable.
The rest of the paper is organized as follows: Section 2 reviews relevant related work on complete navigation assistive systems integrating GPS and audio and haptic feedback. Section 3 presents a technical overview of the proposed system, while Section 4 presents a set of experiments that demonstrate the effectiveness of the prototype and the initial feasibility of the approach. Finally, the Conclusion summarizes the main contributions and gives future work perspectives.

2. Related Work

The idea of exploiting GPS to assist the navigation of blind pedestrians was first introduced in the mid-1980s by Collins [21] and Loomis [22]. In these works, conceptual designs of GPS-based navigation assistive systems were independently proposed. In particular, the latter work established the functional modules of such systems (see Figure 1): a GPS receiver that calculates the traveler’s position (longitude/latitude/altitude); a geographic information system (GIS) comprising the system software that captures, stores, manipulates, and analyzes the GPS coordinates; and the interface for presenting the information to the user.
Brusnighan et al. conducted the first experimental evaluation of GPS for assisting the localization of blind pedestrians [23]. Guidance was provided via verbal instructions. However, poor positioning due to the nascent GPS technology inhibited any practical application.
Commercial products exploiting GPS and verbal instructions through speech synthesizers also appeared upon the development of GPS. The Strider (1994), the Atlas (1995), the GPS-talk (2000), and the BrailleNote (2001), all Sendero Group products, included in their GISs detailed digital maps of United States (US) cities with streets and points of interest and provided verbal instructions for traveling to desired destinations [24]. Improvement upon Sendero GPS technology has continued over the past decade. The most recent product, the Seeing Eye GPS, is a GPS/GIS application for mobile phones. A similar product from HumanWare Group is the Trekker Breeze [25], a handheld talking GPS/GIS, which includes information on street names, landmarks, points of interest, and public services around the pedestrian’s location.
Endeavors to make cities more accessible to blind pedestrians have also explored GPS and speech interfaces. The e-Adept project aims to implement in the city of Stockholm a digital network with information on paths, sidewalks, signs, stairs, and many other features to enhance the navigation of blind and disabled people [26]. In Finland, the NOPPA project uses GPS, a mobile phone, and an information server to provide door-to-door guidance for both blind and sighted users taking public transportation [27].
Navigation guidance using verbal instructions has been studied in several research projects [28,29,30,31,32], exploring different hardware architectures and GIS approaches. Most of these systems have been validated from a strictly technical point of view.
Tactile feedback has also been examined to assist the navigation and wayfinding of blind people.
Zelek presented in [33] a prototype of tactile gloves to augment “the white cane and guide dog” approach for assisting the navigation of blind pedestrians. Pielot et al. introduced in [34] the PocketNavigator, an Android application for smartphones that uses vibrotactile feedback to provide pedestrian navigational information. Similarly, Jacob et al. designed an Android-based application that uses web-service routing algorithms to provide navigational information through the vibrations of the smartphone [35]. Pielot’s Tactile Wayfinder consists of a torso belt display that provides directions through vibrotactile feedback [36]. Schirmer et al. introduced in [37] a device consisting of a pair of actuators that stimulate the ankle with vibrations encoding directions provided by a smartphone. Spiers and Dollar [38] compared GPS navigation with two haptic modalities, vibrotactile and shape-changing, using two novel interfaces, the Cricket and the Animotus. Both modalities successfully guided test users to the target destination; however, the information displayed by the shape-changing interface proved faster to understand. Meier et al. investigated in [39] vibrotactile feedback at five different body locations by reconstructing tactile display designs available in the literature: a sock bandage, a wristband, a belt, a foot-sole matrix display, and a side-wall shoe display. The last prototype was further evaluated in a smartphone-based GPS navigation test with normally sighted, non-blindfolded users.
This short review shows that GPS-haptics navigation has been considered for assisting blind pedestrians for quite some time. However, most of the haptic user interfaces reviewed require constant hand interaction, either for receiving the system’s feedback or for carrying the device. This has two shortcomings: (1) users quickly tire of holding a device while walking, and (2) busy hands limit the use of the primary aids. In this paper, we propose a novel, wearable user interface for the foot that addresses these issues by supporting hands-free interaction.

3. System Description

The assistive device reported in this paper was designed to facilitate the outdoor navigation of blind pedestrians and follows the three-module concept of Figure 1. Commercial hardware components and software modules were used to implement a fully operational prototype, with special attention to wearability, reliability, efficiency, and low cost.

3.1. Localization and Guidance

The operation principle and main components of the proposed system are shown in Figure 2a.
Modern smartphones containing GPS, a digital compass, accelerometers, inertial sensors, and connectivity capabilities have become ideal candidates for portable computing applications [40]. Nowadays, they are the simplest and most common solution for the GPS receiver module. In the current prototype, a smartphone (Samsung Galaxy S5 running Android 4.4.2) was used to acquire the user’s GPS coordinates and orientation.
Using the smartphone’s internet connectivity, the user’s geospatial information is transmitted to a cloud server as a data sentence. Upon reception, the server constructs a text file containing longitude, latitude, and orientation and discards other information, such as time, altitude, and number of satellites. The smartphone updates the server at 5 Hz, which amply covers any walking speed.
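As an illustration of this update loop, below is a minimal sketch of the smartphone-side upload. The HTTP endpoint and JSON field names are hypothetical, since the paper does not specify the transport format.

```python
import time
import requests  # third-party HTTP client

SERVER_URL = "https://example-cloud-server/position"  # hypothetical endpoint
UPDATE_PERIOD_S = 0.2  # 5 Hz; at a 1.4 m/s walking pace this is one sample roughly every 0.3 m

def read_geolocation():
    """Placeholder for the smartphone GPS/compass reading: (longitude, latitude, heading in degrees)."""
    return -102.2916, 21.8853, 90.0  # example coordinates in Aguascalientes

def upload_loop():
    while True:
        lon, lat, heading = read_geolocation()
        # Only longitude, latitude, and orientation are kept by the server;
        # time, altitude, satellite count, etc. are discarded.
        requests.post(SERVER_URL, json={"lon": lon, "lat": lat, "heading": heading}, timeout=1.0)
        time.sleep(UPDATE_PERIOD_S)
```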
A remote computer accesses the server’s text file via the internet and locates the user in a locally running spatial database. The GIS used is OpenStreetMap [41]. A dedicated script links to the YOURS application programming interface (API) [42], an open-source route planner, to compute the shortest pedestrian route to a previously chosen destination. The route waypoints returned by YOURS are placed and stored in the GIS. Directions to the destination are then processed locally; the GIS no longer needs to communicate with the route planner API unless a route recalculation is necessary.
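For reference, a route request to the public YOURS interface might look like the sketch below. The parameter names follow the publicly documented YOURS HTTP API, but the exact script used by the authors is not reproduced in the paper, so treat this as an approximation.

```python
import requests

YOURS_API = "http://www.yournavigation.org/api/1.0/gosmore.php"  # public YOURS route planner

def pedestrian_route(start, destination):
    """Request a walking route and return its waypoints as [lon, lat] pairs."""
    (flat, flon), (tlat, tlon) = start, destination
    params = {
        "format": "geojson",
        "flat": flat, "flon": flon,  # from: current user position
        "tlat": tlat, "tlon": tlon,  # to: chosen destination
        "v": "foot",                 # pedestrian routing profile
        "fast": 0,                   # 0 = shortest route
    }
    response = requests.get(YOURS_API, params=params, timeout=10)
    response.raise_for_status()
    return response.json()["coordinates"]

# The returned waypoints are stored in the GIS; the planner is contacted again
# only if a route recalculation becomes necessary.
```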
Directions are translated into actuator commands, which are transmitted via radiofrequency (RF) to an electronic module that the user carries. Finally, the signals are interpreted by a microcontroller, which drives the actuators in the tactile display.
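The command link between the computer and the wearable module is not detailed in the paper; the sketch below assumes a simple one-byte code per direction written to a serial-attached RF transmitter, purely for illustration.

```python
import serial  # pyserial; the RF transmitter is assumed to enumerate as a serial port

# Hypothetical single-byte command codes; the actual RF protocol is not specified.
DIRECTION_CODES = {"forward": 0x01, "backward": 0x02, "left": 0x03, "right": 0x04, "stop": 0x05}

def send_direction(port: str, direction: str) -> None:
    """Transmit one navigational direction to the wearable electronic module."""
    with serial.Serial(port, baudrate=9600, timeout=1) as rf_link:
        rf_link.write(bytes([DIRECTION_CODES[direction]]))

# Example: instruct the on-shoe display to signal a right turn.
# send_direction("/dev/ttyUSB0", "right")
```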
Figure 2b illustrates how a blind pedestrian would use the device. Though it would be possible to implement the full navigation software as an Android application running entirely on the smartphone, the cloud server and the remote station currently provide a more flexible option for participant monitoring, system debugging, and concept validation.

3.2. User Interface

A vibrotactile interface provides the user with navigational directions to reach a destination. The main novelty of this approach is that directions are conveyed via tactile-foot stimulation.
Based on the physiology of the plantar surface of the foot [43], a tactile interface consisting of a four-point array of actuators was conceived to stimulate the fast adapting type I (FAI) mechanoreceptors sensitive to low frequency vibrotactile stimulation on the medial, lateral, and tibial plantar areas (Figure 3a).
Figure 3b shows the prototype developed. This device integrates four vibrating actuators in a commercial, inexpensive foam shoe insole. The actuators provide axial forces of up to 13 mN and vibrating frequencies between 10 and 55 Hz. Each actuator can be independently controlled with a specific vibrating frequency command. Dots of epoxy paste cover the actuators’ entire upper surfaces and ensure a 133 mm² contact surface with the foot sole. Vibrations are correctly transmitted through the dots, while the absorption characteristics of the foam prevent them from spreading throughout the insole.
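A hedged sketch of such a per-actuator frequency command, using the 10–55 Hz range stated above; the real command interface of the insole is not documented, so the helper and its fields are hypothetical.

```python
MIN_FREQ_HZ, MAX_FREQ_HZ = 10, 55  # vibrating frequency range of the insole actuators

def frequency_command(actuator_id: int, freq_hz: float) -> dict:
    """Build a hypothetical command for one of the four insole actuators,
    clamping the requested frequency to the supported range."""
    if not 0 <= actuator_id <= 3:
        raise ValueError("the insole integrates four actuators, indexed 0-3")
    clamped = min(max(freq_hz, MIN_FREQ_HZ), MAX_FREQ_HZ)
    return {"actuator": actuator_id, "frequency_hz": clamped}

# Example: drive actuator 2 at the maximum frequency preferred by the test subjects.
# frequency_command(2, 55)
```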
The tactile display is meant to be worn on the right foot, and it is completely wearable (Figure 3c). The battery, RF transmission module, and control circuitry are all embedded in an electronic module that the user carries comfortably attached to the ankle. Experimental tests showed 6 h of continuous operation and a 200 m communication range with the controlling computer. Note that the tactile display is inserted into the shoe and the electronic module can be covered by the user’s clothing, making the assistive device inconspicuous and visually unnoticeable.
This device is the third version of tactile interfaces for the foot that we have developed. It integrates technological and tactile rendering improvements of the two previous prototypes [44,45].

4. Evaluation

Preliminary experiments were carried out to evaluate the system’s performance. Prior to testing outdoor navigation, the user interface was first verified as being able to transmit discernible tactile information. This section presents the two evaluation stages of the prototype.

4.1. Experiment I: Direction Recognition

The purpose of the first experiment was to determine whether a group of voluntary subjects could recognize navigational instructions displayed to their feet.

4.1.1. Study Participants and Experimental Procedure

A total of 20 undergraduate students (5 women and 15 men) at Panamericana University (Mexico) took part in the experiment. Their ages varied between 18 and 24 years with an average of 20.5. All participants were normally sighted. None of them had any known tactile sensory or cognitive deficit. All provided their informed consent before participating according to the university ethics guidelines.
During the experiment, the subjects were comfortably seated wearing the user interface on the right foot. For hygiene, all subjects wore socks. Prior to the experiment, they were totally unaware of the details and purpose of the test. General instructions about the task were then given. A short familiarization period with the user interface was granted, during which the participants tested different actuator vibration frequencies and chose their preferred one. All 20 subjects chose 55 Hz, the actuators’ maximum vibration frequency.
A laboratory assistant operated the remote computer and sent commands to the user interface. The smartphone, cloud server, and navigation software were not required for this experiment.

4.1.2. Method

Each contact pin of the user interface represents a navigational direction: go ‘forward’ (F), go ‘backward’ (B), turn ‘left’ (L), and turn ‘right’ (R). A navigational direction was displayed as follows (t1–t5): three consecutive vibrations in the correct contact pin, then one vibration in the opposite contact pin, and again a vibration in the correct contact pin.
Figure 4a shows, for example, the codification for ‘forward’. Note that the contact pin F vibrates three times, then pin B once, and again pin F. Figure 4b presents the corresponding vibrotactile pattern showing the set times for the contact pins.
A fifth pattern representing ‘stop’ (S) was displayed as the well-known SMS alert pattern of mobile phones: two consecutive vibrations, then a pause, and again two consecutive vibrations.
A set of 20 directions was presented to the subjects in one trial. In this set, each direction was randomly displayed four times. The subjects were asked to report the perceived direction with no time restriction. Upon request, they could have the vibrating pattern refreshed on the interface.
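The direction and stop encodings above can be summarized programmatically. The sketch below produces the sequence of contact pins to vibrate for each instruction; it only captures the order of vibrations, since the paper states the overall pattern duration (7 s, Figure 4) but not the individual on/off times, and it does not specify which pins are used for ‘stop’ (all four are assumed here).

```python
PINS = ("F", "B", "L", "R")                          # contact pins of the four-point display
OPPOSITE = {"F": "B", "B": "F", "L": "R", "R": "L"}

def direction_pattern(direction: str) -> list:
    """Ordered list of pins to vibrate for one navigational direction (t1-t5):
    three vibrations on the correct pin, one on the opposite pin, one on the correct pin."""
    return [direction] * 3 + [OPPOSITE[direction], direction]

def stop_pattern() -> list:
    """'Stop' mimics a common SMS alert: two vibrations, a pause (None), two vibrations."""
    return [PINS, PINS, None, PINS, PINS]

# Example: the 'forward' instruction of Figure 4 -> ['F', 'F', 'F', 'B', 'F']
# print(direction_pattern("F"))
```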

4.1.3. Results

The results obtained are presented in the confusion matrix in Table 1. The average recognition rates were 100%, 97.78%, 88.89%, 90%, and 100% for F, B, L, R, and S, respectively.
Note the high recognition rates. These are due to an optimized tactile rendering approach that tackles two perception issues identified in previous experiments: (1) Missed vibrations: the first three vibrations prevent users from missing a vibration (because of distraction, performing another activity, etc.) and thus failing to identify the direction; and (2) Inaccurate discrimination: the fourth vibration points to the opposite direction, providing a reliable reference for identifying the points of vibration when users cannot accurately distinguish which actuator is actually vibrating. The fifth vibration reconfirms the correct direction. Displaying both the correct and the opposite direction in the same tactile pattern actually eases their recognition [45].
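For completeness, the rates in Table 1 are simply row-normalized counts over the recorded (presented, answered) pairs; a minimal sketch of that bookkeeping follows (variable names are illustrative).

```python
from collections import Counter

DIRECTIONS = ("F", "B", "L", "R", "S")

def recognition_rates(trials):
    """trials: iterable of (presented, answered) pairs pooled over all subjects.
    Returns the percentage of correct answers for each presented direction."""
    presented = Counter(p for p, _ in trials)
    correct = Counter(p for p, a in trials if p == a)
    return {d: 100.0 * correct[d] / presented[d] for d in DIRECTIONS if presented[d]}

# With 20 subjects and 4 presentations per direction, each direction is presented 80 times;
# a rate of 100% for F therefore means all 80 'forward' patterns were identified correctly.
```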

4.2. Experiment II: Outdoor Navigation

The two main purposes of the second experiment were to evaluate the performance of the navigation system in outdoor environments and to determine whether blind pedestrians could successfully arrive at a destination with the assistance provided by the prototype.

4.2.1. Study Participants and Experimental Procedure

This experiment complies with Panamericana University ethics guidelines concerning human test subjects.
Two blind subjects participated in this experiment. Both were male adults, aged 31 and 35. The younger subject (subject A) was congenitally blind, while the older (subject B) developed retinitis pigmentosa in early infancy. Due to this severe loss of vision [46], subject B is considered legally blind. Neither of them reported any known impairment in tactile sensory or cognitive functions. Both are considered expert white cane users, having reported more than 10 years of experience.
Two urban environments in the city of Aguascalientes, Mexico were chosen for the experiment. For safety reasons, they were carefully selected for their low vehicle traffic. Static and dynamic obstacles caused by objects and other people were present in both locations. Starting and finishing points 380–420 m apart were fixed prior to the test.
During the test, the remote computer was placed inside a slow-moving vehicle, which kept a distance from the subject that ensured RF transmission at all times. A human assistant walked 10 m ahead of the subject with the sole purpose of anticipating and intervening in any unexpected risky situation.
All system components of Figure 2a were tested in this experiment.

4.2.2. Method

Prior to the test, both subjects completed experiment I in their own homes to get used to the navigational directions provided by the user interface. The results were considered satisfactory.
The subjects were then transferred by car from their respective homes to the chosen environments, with which they were completely unfamiliar. The subjects were asked to move according to the pattern felt and to use the white cane as they normally would during an ordinary walk. They were requested to raise their hand if they felt lost at some point or if they needed the navigational instruction refreshed on the interface; in that case, a route recalculation would be performed and new instructions generated. There was no time restriction to complete the test. The subjects were aware that an assistant was nearby and that they could terminate the test at any time if they felt uncomfortable.
For both subjects, GPS coordinates and navigation times were registered in the GIS.

4.2.3. Results

The proposed environments and the paths followed by the test subjects are shown in Figure 5. Both subjects successfully completed the task with no errors.
The task in environment 1 (E-1) consisted of guiding the subjects along a 380 m pathway from a random street point to a church (Figure 5a). A set of 11 directions was communicated for this purpose. The task in environment 2 (E-2) consisted of guiding the subjects along a 420 m pathway from a close road to a park’s corner (Figure 5b). A set of 17 directions was communicated regarding the route. Table 2 compares the navigation times of both subjects. Note that the difference in performance is negligible.
Figure 6 compares the average walking speeds of the subjects during the test. Speeds from 1.08 to 1.19 m/s were observed. The average walking speed of a normally sighted person is 1.4 m/s [47], which suggests that the navigation system does not significantly reduce walking pace.
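These speeds are consistent with the distances and times of Table 2: for subject A, 380 m / 320 s ≈ 1.19 m/s in E-1 and 420 m / 381 s ≈ 1.10 m/s in E-2; for subject B, 380 m / 334 s ≈ 1.14 m/s and 420 m / 389 s ≈ 1.08 m/s.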
The subjects navigated on sidewalks except when crossing streets. Typical urban obstacles, such as street poles, trees, trash bins, and other people, were encountered and overcome using the white cane. Natural, confident walking was observed during navigation, together with firm, unhesitating movements after each instruction.
An approximate 2–4 m resolution of the GPS longitude/latitude data provided by the smartphone was observed during the test. Its impact on navigation was quantified by calculating the path efficiency (defined as the ratio between the length of the path actually traversed and the length of the optimal path) for each path section bounded by two waypoints at which the user received directional instructions. Figure 5c,d show these waypoints for both outdoor environments, while Table 3 summarizes the calculated path efficiencies for all path sections.
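As an illustration of this metric, path efficiency can be computed from the logged GPS trace and the planned section between two waypoints using great-circle distances. This is a simplified sketch, not the authors’ exact GIS procedure, and efficiency is expressed here as optimal length over traversed length so that a perfectly followed section yields 100%, matching the values in Table 3.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000

def haversine_m(p, q):
    """Great-circle distance in meters between two (lat, lon) points given in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def path_length_m(points):
    """Total length of a polyline given as a list of (lat, lon) points."""
    return sum(haversine_m(points[i], points[i + 1]) for i in range(len(points) - 1))

def path_efficiency_percent(traversed, optimal):
    """Path efficiency of one section: planned (optimal) length over length actually walked."""
    return 100.0 * path_length_m(optimal) / path_length_m(traversed)
```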
Note that high path efficiency rates were obtained by the subjects. Again, similar performances were observed.
Despite these high scores, the GPS resolution could occasionally confuse users, mostly at street corners. Figure 7 illustrates this situation: when instructed to turn, users could encounter a wall, and turning would only be possible a few steps further on.
During the experiments, it was observed that the subjects were initially confused by the GPS imprecision but, with the aid of the white cane, quickly learned to manage the situation fairly well.
After the test, the subjects highlighted the intuitiveness of the information displayed and the low cognitive load demanded (navigational directions were displayed only when an action was required; for most of the journey, the user interface did not display any information). They also noted that there is no point in displaying ‘turn left’ or ‘turn right’ followed by ‘forward’, because it can be deduced that after turning one continues forward. This holds in all cases: the directions in E-1 could be reduced to 7, while the E-2 path could be completed with only 12 directions.
The results are undoubtedly encouraging. They confirm that the prototype and its modules are operational and suggest that it is feasible to exploit GPS and tactile-foot stimulation for assisting outdoor navigation of blind pedestrians.

5. Conclusions

Ubiquitous and wearable computing have ignited a new type of human-computer interaction with palpable applications in e-commerce, consumer electronics, business, education, and healthcare, among others. Assistive technology can benefit from these ubiquitous computing resources to improve the quality of life of people in need.
In this context, this paper has presented the design, implementation, and experimental evaluation of a novel navigation assistive device for blind pedestrians based on GPS and tactile feedback.
The device is intended to be used as a localization and guidance system complementary to the primary aids. Its approach relies on the haptic feedback provided by the white cane and the guide dog for detecting obstacles along a route.
For user localization in an outdoor environment, the approach exploits the GPS coordinates provided by a smartphone. Guidance to a destination is achieved by dedicated navigation software that processes the GPS data and computes the optimal route of travel. Directions to a destination are conveyed to the user via an on-shoe tactile display. The aim of this system is to enhance outdoor navigation of visually impaired and blind people with a truly wearable, inconspicuous, reliable, and low-cost device.
The prototype was evaluated in two stages. The first verified the interface’s ability to transmit tactile information to the user and the users’ comprehension of this feedback. The results showed very high recognition rates, which suggests that the information displayed is intuitive and fast to understand. The second stage evaluated both the system’s and the subjects’ performance in real outdoor urban environments. The results showed that the system is capable of guiding users from a starting point to a destination by providing the pertinent directional instructions.
The prototype reported in this paper was recently featured in the BBC Horizons series. Similar navigation experiments and blind users’ testimonials can be seen in this documentary program [48].
Future work will focus on the implementation of an Android application that runs entirely on the smartphone, integrates the GIS and the route planner, and generates the instructions for the tactile display. Figure 8 shows the proposed architecture. Note that a high-sensitivity external GPS chipset receiver will be used to increase the smartphone’s position accuracy to the one-meter or sub-meter level (GPS enhancement technologies, such as differential GPS and satellite-based augmentation systems (SBAS), will be evaluated as well). The RF module will be replaced by a Bluetooth one so that the smartphone can directly drive the actuators in the tactile display. An embedded system is expected to replace the microcontroller. The cloud server and the remote station will no longer be essential modules of the system; however, they are expected to continue operating, as they could be useful for the user’s relatives or caretakers. Remote monitoring undoubtedly increases the user’s safety and his/her family’s peace of mind.

Author Contributions

R.V. and E.P. conceived the system and designed the experiments. P.R. contributed to the tests with normally sighted and blind subjects. M.C. analyzed the data and advised on all aspects related to the GPS. N.I.G. and A.L.-E. validated the experimental results and reviewed the paper. R.V. wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Loomis, J.; Golledge, R.; Klatzky, R.; Speigle, J.; Tietz, J. Personal Guidance System for the Visually Impaired. In Proceedings of the Annual ACM Conference on Assistive Technologies, Marina del Rey, CA, USA, 31 October–1 November 1994; pp. 85–91. [Google Scholar] [CrossRef]
  2. Kay, L. A Sonar Aid to Enhance Spatial Perception of the Blind: Engineering Design and Evaluation. Radio Electron. Eng. 1974, 44, 605–627. [Google Scholar] [CrossRef]
  3. Pressey, N. Mowat Sensor. Focus 1977, 11, 35–39. [Google Scholar]
  4. Veelaert, P.; Bogaerts, W. Ultrasonic Potential Field Sensor for Obstacle Avoidance. IEEE Trans. Robot. Autom. 1990, 15, 774–779. [Google Scholar] [CrossRef]
  5. Hoyle, B.; Waters, D. Mobility AT: The Batcane (UltraCane). In Assistive Technology for Visually Impaired and Blind People; Hersh, M., Johnson, M., Eds.; Springer: London, UK, 2008; pp. 209–229. ISBN 978-1-84628-867-8. [Google Scholar] [CrossRef]
  6. Farcy, R.; Damaschini, R. Triangulating Laser Profilometer as a Threedimensional Space Perception System for the Blind. Appl. Opt. 1997, 36, 8227–8232. [Google Scholar] [CrossRef] [PubMed]
  7. Milios, E.; Kapralos, B.; Kopinska, A.; Stergiopoulos, S. Sonification of Range Information for 3-D Space Perception. IEEE Trans. Neural Syst. Rehabil. Eng. 2003, 11, 416–421. [Google Scholar] [CrossRef] [PubMed]
  8. Pissaloux, E.; Velazquez, R.; Maingreaud, F. A New Framework for Cognitive Mobility of Visually Impaired Users in Using Tactile Device. IEEE Trans. Hum.-Mach. Syst. 2017, 47, 1040–1051. [Google Scholar] [CrossRef]
  9. Rodriguez, A.; Yebes, J.; Alcantarilla, P.; Bergasa, L.; Almazan, J.; Cela, A. Assisting the Visually Impaired: Obstacle Detection and Warning System by Acoustic Feedback. Sensors 2012, 12, 17476–17496. [Google Scholar] [CrossRef] [PubMed]
  10. Lee, Y.; Medioni, G. RGB-D Camera Based Wearable Navigation System for the Visually Impaired. Comput. Vis. Image Understand. 2016, 149, 3–20. [Google Scholar] [CrossRef]
  11. Vlaminck, M.; Hiep, Q.; Hoang, V.; Vu, H.; Veelaert, P.; Philips, W. Indoor Assistance for Visually Impaired People Using a RGB-D Camera. In Proceedings of the IEEE Southwest Symposium on Image Analysis and Interpretation, Santa Fe, NM, USA, 6–8 March 2016; pp. 161–164. [Google Scholar] [CrossRef]
  12. Hesch, J.; Roumeliotis, S. An Indoor Localization Aid for the Visually Impaired. In Proceedings of the IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007; pp. 3545–3551. [Google Scholar] [CrossRef]
  13. Jain, D. Path-Guided Indoor Navigation for the Visually Impaired Using Minimal Building Retrofitting. In Proceedings of the International ACM SIGACCESS Conference on Computers & Accessibility, Rochester, NY, USA, 20–22 October 2014; pp. 225–232. [Google Scholar] [CrossRef]
  14. Kulyukin, V.; Gharpure, C.; Nicholson, J.; Pavithran, S. RFID in Robot-Assisted Indoor Navigation for the Visual Impaired. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Sendai, Japan, 28 September–2 October 2004; pp. 353–357. [Google Scholar] [CrossRef]
  15. Ivanov, R. Indoor Navigation System for the Visually Impaired. In Proceedings of the International Conference on Computer Systems and Technologies, Sofia, Bulgaria, 17–18 June 2010; pp. 143–149. [Google Scholar] [CrossRef]
  16. Ando, B.; Baglio, S.; Lombardo, C.O.; Marletta, V. Smart Multisensor Strategies for Indoor Localization. In Mobility of Visually Impaired People: Fundamentals and ICT Assistive Technologies; Pissaloux, E., Velazquez, R., Eds.; Springer International Publishing: Basel, Switzerland, 2018; pp. 585–595. ISBN 978-3-319-54444-1. [Google Scholar] [CrossRef]
  17. Alvarez, Y.; Las Heras, F. ZigBee-Based Sensor Network for Indoor Location and Tracking Applications. IEEE Lat. Am. Trans. 2016, 14, 3208–3214. [Google Scholar] [CrossRef]
  18. Muñoz, J.; Li, B.; Rong, X.; Xiao, J.; Tian, Y.; Arditi, A. Demo: Assisting Visually Impaired People Navigate Indoors. In Proceedings of the International Joint Conference on Artificial Intelligence, New York, NY, USA, 9–15 July 2016; pp. 4260–4261. [Google Scholar]
  19. Velazquez, R. Wearable Assistive Devices for the Blind. In Wearable and Autonomous Biomedical Devices and Systems for Smart Environment: Issues and Characterization; Lay-Ekuakille, A., Mukhopadhyay, S.C., Eds.; LNEE, 75; Springer: Berlin/Heidelberg, Germany, 2010; pp. 331–349. ISBN 978-3-642-15687-8. [Google Scholar] [CrossRef]
  20. Pissaloux, E.; Velazquez, R. On Spatial Cognition and Mobility Strategies. In Mobility of Visually Impaired People: Fundamentals and ICT Assistive Technologies; Pissaloux, E., Velazquez, R., Eds.; Springer International Publishing: Basel, Switzerland, 2018; pp. 137–166. ISBN 978-3-319-54444-1. [Google Scholar] [CrossRef]
  21. Collins, C. On Mobility Aids for the Blind. In Electronic Spatial Sensing for the Blind; Warren, D., Strelow, E., Eds.; Springer: Dordrecht, The Netherlands, 1985; pp. 35–64. ISBN 978-90-247-3238-8. [Google Scholar] [CrossRef]
  22. Loomis, J. Digital Map and Navigation System for the Visually Impaired; Unpublished Manuscript; Department of Psychology, University of California: Santa Barbara, CA, USA, 1985. [Google Scholar]
  23. Brusnighan, D.; Strauss, M.; Floyd, J.; Wheeler, B. Orientation Aid Implementing the Global Positioning System. In Proceedings of the IEEE Annual Northeast Bioengineering Conference, Boston, MA, USA, 27–28 March 1989; pp. 33–34. [Google Scholar] [CrossRef]
  24. Sendero Group LLC. Davis, CA, USA. Available online: http://www.senderogroup.com/ (accessed on 15 March 2018).
  25. Humanware Group. Drummondville, Quebec, Canada. Available online: www.humanware.com/ (accessed on 15 March 2018).
  26. Dawidson, E. Pedestrian Navigation in Stockholm, How Local Data Together with Advanced Positioning Techniques Can Be Used for Detailed Routing. In Proceedings of the ITS World Congress, Stockholm, Sweden, 21–25 September 2009; pp. 1–8. [Google Scholar]
  27. Koskinen, S.; Virtanen, A. Navigation System for the Visually Impaired Based on an Information Server Concept. In Proceedings of the Mobile Venue, Athens, Greece, 14–17 March 2004; pp. 1–6. [Google Scholar]
  28. Gaunet, F. Verbal Guidance Rules for a Localized Wayfinding Aid Intended for Blind-Pedestrians in Urban Areas. Univers. Access Inf. Soc. 2006, 4, 338–353. [Google Scholar] [CrossRef]
  29. Golledge, R.; Klatzky, R.; Loomis, J.; Speigle, J.; Tietz, J. A Geographical Information System for a GPS Based Personal Guidance System. Int. J. Geogr. Inf. Syst. 1998, 12, 727–749. [Google Scholar] [CrossRef]
  30. Matsuda, K.; Kondo, K. Towards a Navigation System for the Visually Impaired using 3D Audio. In Proceedings of the IEEE Global Conference on Consumer Electronics, Kyoto, Japan, 11–14 October 2016; pp. 1–2. [Google Scholar] [CrossRef]
  31. Holland, S.; Morse, D.; Gedenryd, H. AudioGPS: Spatial Audio Navigation in a Minimal Attention Interface. Pers. Ubiquitous Comput. 2002, 6, 253–259. [Google Scholar] [CrossRef]
  32. Guerrero, L.A.; Vasquez, F.; Ochoa, S. An Indoor Navigation System for the Visually Impaired. Sensors 2012, 12, 8236–8258. [Google Scholar] [CrossRef] [PubMed]
  33. Zelek, J. Seeing by Touch (Haptics) for Wayfinding. Int. Congr. Ser. 2005, 1282, 1108–1112. [Google Scholar] [CrossRef]
  34. Pielot, M.; Poppinga, B.; Heuten, W.; Boll, S. PocketNavigator: Studying Tactile Navigation Systems In-Situ. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 3131–3140. [Google Scholar] [CrossRef]
  35. Jacob, R.; Mooney, P.; Corcoran, P.; Winstanley, A. Integrating Haptic Feedback to Pedestrian Navigation Applications. In Proceedings of the GIS Research UK Annual Conference, Portsmouth, UK, 27–29 April 2011; pp. 20–210. [Google Scholar]
  36. Pielot, M.; Boll, S. Tactile Wayfinder: Comparison of Tactile Waypoint Navigation with Commercial Pedestrian Navigation Systems. In Pervasive Computing; Floréen, P., Krüger, A., Spasojevic, M., Eds.; LNCS, 6030; Springer: Berlin/Heidelberg, Germany, 2010; pp. 76–93. ISBN 978-3-642-12653-6. [Google Scholar] [CrossRef]
  37. Schirmer, M.; Hartmann, J.; Bertel, S.; Echtler, F. Shoe me the Way: A Shoe-Based Tactile Interface for Eyes-Free Urban Navigation. In Proceedings of the International Conference on Human-Computer Interaction with Mobile Devices and Services, Copenhagen, Denmark, 24–27 August 2015; pp. 327–336. [Google Scholar] [CrossRef]
  38. Spiers, A.; Dollar, A. Outdoor Pedestrian Navigation Assistance with a Shape Changing Haptic Interface and Comparison with a Vibrotactile Device. In Proceedings of the IEEE Haptics Symposium, Philadelphia, PA, USA, 8–11 April 2016; pp. 34–40. [Google Scholar] [CrossRef]
  39. Meier, A.; Matthies, D.; Urban, B.; Wettach, R. Exploring Vibrotactile Feedback on the Body and Foot for the Purpose of Pedestrian Navigation. In Proceedings of the International Workshop on Sensor-based Activity Recognition and Interaction, Rostock, Germany, 25–26 June 2015. Article 11. [Google Scholar] [CrossRef]
  40. Gutierrez-Martinez, J.M.; Castillo-Martinez, A.; Medina-Merodio, J.A.; Aguado-Delgado, J.; Martinez-Herraiz, J.J. Smartphones as a Light Measurement Tool: Case of Study. Appl. Sci. 2017, 7, 616. [Google Scholar] [CrossRef]
  41. OpenStreetMap Project. Available online: www.openstreetmap.org/ (accessed on 15 March 2018).
  42. YOURS map and router-finder. Available online: http://yournavigation.org/ (accessed on 15 March 2018).
  43. Kennedy, P.M.; Inglis, J.T. Distribution and Behaviour of Glabrous Cutaneous Receptors in the Human Foot Sole. J. Physiol. 2002, 538, 995–1002. [Google Scholar] [CrossRef] [PubMed]
  44. Velazquez, R.; Bazan, O.; Varona, J.; Delgado-Mata, C.; Gutierrez, C.A. Insights into the Capabilities of Tactile-Foot Perception. Int. J. Adv. Robot. Syst. 2012, 9, 1–11. [Google Scholar] [CrossRef]
  45. Velazquez, R.; Pissaloux, E.; Lay-Ekuakille, A. Tactile-Foot Stimulation Can Assist the Navigation of People with Visual Impairment. Appl. Bionics Biomech. 2015, 2015. [Google Scholar] [CrossRef] [PubMed]
  46. Velazquez, R.; Varona, J.; Rodrigo, P. Computer-Based System for Simulating Visual Impairments. IETE J. Res. 2016, 62, 833–841. [Google Scholar] [CrossRef]
  47. Browning, R.; Baker, E.; Herron, J.; Kram, R. Effects of Obesity and Sex on the Energetic Cost and Preferred Speed of Walking. J. Appl. Physiol. 2006, 100, 390–398. [Google Scholar] [CrossRef] [PubMed]
  48. BBC World News. Horizons Series—Sharper Senses. Episode 5 Clip 3: When in Doubt, Follow Your Feet. Available online: http://www.bbc.com/specialfeatures/horizonsbusiness/seriessix/sharper-senses/ (accessed on 15 March 2018).
Figure 1. Functional modules of a global positioning system (GPS)-based navigation assistive system.
Figure 2. The assistive navigation system: (a) main components and operation principle; (b) Depiction of a blind pedestrian using the device. White cane or guide dog can be used in conjunction with the device (RF: radiofrequency).
Figure 3. On-shoe tactile display: (a) Distribution of fast adapting type I (FAI) mechanoreceptors sensitive to vibrotactile stimulation in the foot sole, reproduced with permission from P.M. Kennedy, J.T. Inglis, Journal of Physiology, published by John Wiley and Sons, 2004; (b) Prototype; and (c) Fully wearable device with wireless connection.
Figure 4. (a) Schedule of activation of the vibrating actuators for the direction recognition task—example for ‘forward’; (b) the corresponding vibrotactile pattern: the whole pattern is 7 s.
Figure 5. Examples of subject performance in the outdoor navigation task: (a) Subject A in E-1; (b) Subject B in E-2. GPS data in these plots were downsampled by a factor of 10. Walking paths and the waypoints at which instructions were delivered: (c) E-1 and (d) E-2.
Figure 6. The participants’ average walking speeds observed during the tests.
Figure 7. An occasional situation caused by the GPS resolution.
Figure 8. Proposed architecture for the assistive navigation system. The smartphone incorporates the GIS and the route planner and sends the direction commands to the tactile display.
Table 1. Direction recognition rates with the user interface and the proposed tactile rendering approach.

                        Answered (%)
Presented      Forward   Backward   Left     Right    Stop
Forward        100       0          0        0        0
Backward       2.22      97.78      0        0        0
Left           0         0          88.89    11.11    0
Right          1.67      1.67       6.66     90       0
Stop           0         0          0        0        100
Table 2. Navigational times for the outdoor experiment 2.

             E-1 (380 m)   E-2 (420 m)
Subject A    320 s         381 s
Subject B    334 s         389 s
Difference   +4.4% (B)     +2.1% (B)
Table 3. Path efficiencies for the two navigation environments.

E-1          s-p1        p1-p2       p2-p3       p3-f
Subject A    97.1%       99.4%       99.1%       96.2%
Subject B    96.2%       98%         99.5%       97.5%
Difference   +0.9% (A)   +1.4% (A)   +0.4% (B)   +1.3% (B)

E-2          s-p1        p1-p2       p2-p3       p3-p4       p4-p5       p5-f
Subject A    95.2%       92.3%       95.2%       98.6%       93.5%       98%
Subject B    94.7%       96.7%       95.4%       99.8%       98.7%       98.3%
Difference   +0.5% (A)   +4.4% (B)   +0.2% (B)   +1.2% (B)   +5.2% (B)   +0.3% (B)
