Akinyelu, Andronicus A. and Blignaut, Pieter (2022) Convolutional Neural Network-Based Technique for Gaze Estimation on Mobile Devices. Frontiers in Artificial Intelligence, 4. ISSN 2624-8212
Abstract
Eye tracking is an increasingly popular and important technology, but many current eye tracking systems are expensive and available only to large corporations. Some also require explicit personal calibration, which makes them unsuitable for real-world or uncontrolled environments; explicit calibration is cumbersome and degrades the user experience. To address these issues, this study proposes a Convolutional Neural Network (CNN) based, calibration-free technique for improved gaze estimation in unconstrained environments. The proposed technique consists of two components: a face component and a 39-point facial landmark component. The face component extracts gaze estimation features from the eyes, while the 39-point facial landmark component encodes the shape and location of the eyes within the face. Adding this information helps the network learn gaze under free head and eye movements. A second CNN model, which accepts only face images as input, was designed primarily for comparison. Experiments show that the proposed technique outperforms this second model. Fine-tuning was also performed using the pre-trained VGG16 model; the fine-tuned proposed technique again outperforms the fine-tuned second model. Overall, the results show that 39-point facial landmarks can improve the performance of CNN-based gaze estimation models.
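To illustrate the two-component design described in the abstract, below is a minimal sketch of a dual-input gaze estimation CNN in Keras. The exact architecture, layer sizes, and input shapes are not given in the abstract, so everything here (image size, filter counts, dense widths, the 2D on-screen gaze output) is an assumption; only the idea of fusing a face branch with a 39-point landmark branch comes from the paper.

```python
# Hypothetical sketch of a two-input gaze-estimation CNN: a face branch
# extracts appearance features and a 39-point landmark branch encodes
# eye shape/location; the two are fused to regress a gaze point.
# All shapes and layer sizes below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_gaze_model(img_size=128, n_landmarks=39):
    # Face branch: convolutional feature extractor over the face image.
    face_in = layers.Input(shape=(img_size, img_size, 3), name="face")
    x = layers.Conv2D(32, 3, activation="relu")(face_in)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    x = layers.Flatten()(x)

    # Landmark branch: 39 (x, y) facial landmarks flattened to 78 values.
    lm_in = layers.Input(shape=(n_landmarks * 2,), name="landmarks")
    y = layers.Dense(64, activation="relu")(lm_in)

    # Fuse both branches and regress an on-screen (x, y) gaze location.
    z = layers.concatenate([x, y])
    z = layers.Dense(128, activation="relu")(z)
    out = layers.Dense(2, name="gaze_xy")(z)
    return Model(inputs=[face_in, lm_in], outputs=out)

model = build_gaze_model()
model.compile(optimizer="adam", loss="mse")  # regression to gaze coordinates
```

The comparison model from the abstract would correspond to the face branch alone, and the reported fine-tuning experiments could be reproduced in this framework by swapping the hand-rolled face branch for a pre-trained VGG16 backbone.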
| Item Type: | Article |
| --- | --- |
| Subjects: | Impact Archive > Multidisciplinary |
| Depositing User: | Managing Editor |
| Date Deposited: | 24 Mar 2023 05:23 |
| Last Modified: | 22 Jun 2024 07:57 |
| URI: | http://research.sdpublishers.net/id/eprint/903 |