Gaze-based Cursor Control Impairs Performance in Divided Attention

  • Róbert Adrian Rill ELTE Eötvös Loránd University, Budapest, Hungary. Faculty of Informatics, 3in Research Group, Martonvásár, Hungary. Faculty of Mathematics and Computer Science, Babeş-Bolyai University, Cluj-Napoca, Romania.
  • Kinga Bettina Faragó ELTE Eötvös Loránd University, Budapest, Hungary. Faculty of Informatics, 3in Research Group, Martonvásár, Hungary.
Keywords: gaze-based control, eye tracking, divided attention, human performance, cognitive load, Midas Touch, dwell time


In this work we investigate the effects of switching from mouse cursor control to gaze-based control in a computerized divided attention game. We conducted experiments with nine participants performing a task that requires continuous focused concentration and frequent shifts of attention. Despite careful control of experimental and design aspects, participants' performance was considerably impaired under gaze-based control. The participants were experienced users of the mouse-controlled version of the task; we adjusted the difficulty to the more demanding conditions and selected the gaze-input parameters based on previous research findings. Contrary to our assumptions, these experienced users could not adapt to gaze-based control within the number of experimental sessions we conducted. Additionally, we examined the strategies of users, i.e., their methods of problem solving, and found that it is possible to make progress in our task even within a short amount of practice. The results of this study provide evidence that the adoption of interfaces controlled by human eye gaze in cognitively demanding environments requires careful design, proper testing, and sufficient user training.
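The keywords above mention dwell time and the Midas Touch problem: with gaze as the only input channel, every glance risks being interpreted as a click, so gaze interfaces typically require the gaze to rest within a small region for a minimum dwell duration before triggering a selection. The sketch below is not the authors' implementation; it is a minimal illustration of generic dwell-time selection, with hypothetical parameter values (`dwell_ms`, `radius_px`) chosen only for demonstration.

```python
import math

class DwellSelector:
    """Illustrative dwell-time selector: a selection fires only after
    gaze stays within `radius_px` of a fixation point for `dwell_ms`
    milliseconds. Moving away resets the timer, which is the usual
    guard against the "Midas Touch" problem (everything looked at
    getting clicked)."""

    def __init__(self, dwell_ms=500, radius_px=40):
        self.dwell_ms = dwell_ms
        self.radius_px = radius_px
        self._anchor = None    # (x, y) where the current fixation started
        self._start_ms = None  # timestamp of that fixation's first sample

    def update(self, x, y, t_ms):
        """Feed one gaze sample; return the fixation point when a dwell
        completes, otherwise None."""
        if self._anchor is None or math.dist(self._anchor, (x, y)) > self.radius_px:
            # Gaze jumped outside the tolerance circle:
            # start timing a new candidate fixation.
            self._anchor = (x, y)
            self._start_ms = t_ms
            return None
        if t_ms - self._start_ms >= self.dwell_ms:
            selected = self._anchor
            self._anchor = None  # require a fresh fixation for the next selection
            return selected
        return None
```

Tuning these two parameters is a central design trade-off: a short dwell speeds up interaction but increases accidental selections, while a long dwell slows every action, which matters in a divided attention task where the same gaze must also gather information.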





How to Cite
Rill, R. A., & Faragó, K. B. (2018). Gaze-based Cursor Control Impairs Performance in Divided Attention. Acta Cybernetica, 23(4), 1071-1087.