Abstract
Most wheelchair interfaces, such as joysticks and sip-and-puff devices, are unsuitable for severely disabled patients. Furthermore, cost is a predominant limiting factor in the deployment of custom-engineered platforms. Here we describe and discuss two low-cost yet efficient gaze-based wheelchair control interfaces in detail. First, we superimposed gaze and depth data to achieve real-time 3D gaze estimation and floor detection. The accuracy of the 3D gaze data was assessed against a commercial IR-based tracking camera system as a control. We then recorded natural eye gaze during wheelchair navigation to produce a heat map of the most probable gaze-direction states. Information within these heat maps was subsequently used to encode gaze-contingent wheelchair interfaces. Various wheelchair navigation tasks were performed with these novel interfaces and compared against other currently available navigation techniques and modules. The results suggest that decoding intention from natural eye gaze can be a suitable alternative to other widely used wheelchair navigation techniques.
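The superimposition of gaze and depth data mentioned above can be illustrated with a minimal sketch: assuming a pinhole camera model with known intrinsics (the function name and parameter values here are illustrative assumptions, not the paper's implementation), a 2D gaze point is back-projected to 3D using the depth sampled at that pixel.

```python
# Illustrative sketch (assumed pinhole model, not the paper's implementation):
# back-project a 2D gaze point into 3D camera coordinates using the depth
# value sampled from a depth image at that pixel.

def gaze_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Map a gaze point (u, v) in pixels, plus the depth (metres) at that
    pixel, to a 3D point (x, y, z) in the camera coordinate frame."""
    x = (u - cx) * depth_m / fx  # horizontal offset scaled by depth
    y = (v - cy) * depth_m / fy  # vertical offset scaled by depth
    return (x, y, depth_m)

# Example with assumed intrinsics: gaze at the image centre, 1.5 m away,
# lands on the optical axis.
point = gaze_to_3d(320, 240, 1.5, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
```

Running this over every gaze sample in real time yields the stream of 3D fixation points from which floor regions can then be segmented.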
Footnotes
This work was supported in part by the EPSRC, UK.