Eye Movement Recognition Using Support Vector Machine

Authors

  • Computer Science Department, College of Education for Pure Sciences, University of Thi-Qar, Iraq.
  • Computer Science Department, College of Education for Pure Sciences, University of Thi-Qar, Iraq.

Keywords

EOG, Eye Movement, SVM

Abstract

People with disabilities often cannot communicate effectively with their surroundings, so Human-Computer Interaction (HCI) technologies are used to provide them with a means of communication. HCI is an emerging technology at the intersection of Artificial Intelligence and Biomedical Engineering. To control an external device, HCI systems rely on biosignals such as ECG, EMG, and EEG. Electrooculography (EOG) is a technique for measuring the potential difference between the cornea and the retina, i.e. between the front and back of the human eye, and its main application is determining the direction of eye movements. This study aims to assess eye movement for communication by persons with disabilities using electrooculogram (EOG) data. The Support Vector Machine (SVM) classification technique was applied with two types of features: statistical features and time-domain features. Classification accuracy was 90.7% and 93.9% when using SVM with statistical features and time-domain features, respectively.
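
As a rough illustration of the pipeline described in the abstract (not the authors' actual code), the sketch below extracts a few common time-domain descriptors from windowed EOG samples and trains an SVM classifier with scikit-learn. The specific feature set, the window/label arrangement, and the SVM parameters are assumptions made for the example.

    # Minimal sketch (assumed pipeline, not the authors' code): time-domain
    # features + SVM on windowed EOG samples.
    # Assumes `windows` has shape (n_windows, n_samples) of EOG amplitudes and
    # `labels` holds the corresponding eye-movement classes (e.g. "left", "right").
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def time_domain_features(window: np.ndarray) -> np.ndarray:
        """Common time-domain descriptors of one EOG window (assumed feature set)."""
        diff = np.diff(window)
        return np.array([
            np.mean(np.abs(window)),           # mean absolute value
            np.sqrt(np.mean(window ** 2)),     # root mean square
            np.sum(np.abs(diff)),              # waveform length
            np.sum(diff[:-1] * diff[1:] < 0),  # slope sign changes
        ])

    def classify_eog(windows: np.ndarray, labels: np.ndarray) -> float:
        """Train an SVM on the extracted features and return test accuracy."""
        X = np.vstack([time_domain_features(w) for w in windows])
        X_train, X_test, y_train, y_test = train_test_split(
            X, labels, test_size=0.3, random_state=0, stratify=labels)
        model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        model.fit(X_train, y_train)
        return model.score(X_test, y_test)

For the statistical-feature configuration mentioned in the abstract, descriptors such as the mean, variance, skewness, and kurtosis of each window could be substituted for the time-domain set above.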



Published

2023-02-19