RT Journal Article
SR Electronic
T1 Natural Gaze Data Driven Wheelchair
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 252684
DO 10.1101/252684
A1 Lou-Ann Raymond
A1 Margherita Piccini
A1 Mahendran Subramanian
A1 Pavel Orlov
A1 Aldo Faisal
YR 2018
UL http://biorxiv.org/content/early/2018/01/24/252684.abstract
AB Most wheelchair interfaces, such as joystick and sip-and-puff controls, are not suitable for severely disabled patients. Furthermore, cost is a predominant limiting factor in the deployment of custom-engineered platforms. Here we describe and discuss in detail two low-cost yet efficient gaze-based wheelchair control interfaces. First, we superimposed gaze and depth data to achieve real-time 3D gaze estimation and floor detection. The accuracy of the 3D gaze data was assessed against a commercial IR-based tracking camera system as a control. Natural eye gaze during wheelchair navigation was then recorded to produce a heat map of the most probable direction states. Information within the heat maps was subsequently used to encode gaze-contingent wheelchair interfaces. Various wheelchair navigation tasks were performed using these novel interfaces and compared against other currently available navigation techniques and modules. The results suggest that such intention decoding from natural eye gaze can be a suitable alternative to other widely used wheelchair navigation techniques.