Eye movement patterns are similar during accurate multiple-target tracking




Bagha, Kamyar; Kamkar, Shiva; Moghaddam, Hamid Abrishami; Oksama, Lauri; Li, Jie; Hyönä, Jukka

Conference series: IEEE International Conference on Cognitive Infocommunications

Year: 2024

Proceedings: 2024 IEEE 15th International Conference on Cognitive Infocommunications (CogInfoCom)

Conference number: 15

Pages: 000059–000064

ISBN: 979-8-3503-7825-2 / 979-8-3503-7824-5

ISSN: 2380-7350 / 2473-5671

DOI: https://doi.org/10.1109/CogInfoCom63007.2024.10894724



Understanding how the brain works is foundational to cognitive infocommunications. To this end, we focus on multiple target tracking (MTT), a key task that engages two important cognitive factors: attention and memory. Humans track multiple objects in daily life while facing various challenges, including occlusion and set size. Eye movement research has shown that there are within- and between-subject differences in scanpaths during MTT tasks. However, it is unclear whether there is a winning scan pattern that leads to successful tracking of targets. To answer this question, we used dynamic time warping to compare the similarity of subjects' scan patterns during an MTT task under different challenges. We examined the effect of set size, occlusion, and trial response on these similarities, and then applied a mixed-effects analysis to the output to test whether the findings were statistically significant. Results demonstrated that scan patterns were more similar when the MTT task was performed correctly. This suggests that viewers adopt a common tracking strategy that leads to a correct response. Decoding this strategy has numerous applications in fields including human-computer interaction, brain modeling, and cognitive infocommunications.
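The paper's exact implementation is not given here; as an illustration of the core comparison step, the following is a minimal sketch of classic dynamic time warping between two gaze scanpaths, assuming each scanpath is a sequence of (x, y) fixation coordinates and using Euclidean distance as the local cost.

```python
import math

def dtw_distance(path_a, path_b):
    """Dynamic time warping distance between two scanpaths.

    Each scanpath is a list of (x, y) gaze coordinates; the local cost
    between two points is their Euclidean distance. Returns the minimal
    cumulative cost of aligning the two sequences.
    """
    n, m = len(path_a), len(path_b)
    # cost[i][j] = minimal cost of aligning path_a[:i] with path_b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(path_a[i - 1], path_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # step in path_a only
                                 cost[i][j - 1],      # step in path_b only
                                 cost[i - 1][j - 1])  # step in both
    return cost[n][m]

# Two similar scanpaths sampled at different rates still align closely:
a = [(0, 0), (1, 1), (2, 2), (3, 3)]
b = [(0, 0), (1, 1), (1, 1), (2, 2), (3, 3)]
print(dtw_distance(a, b))  # 0.0 — the repeated fixation warps onto one point
```

Because DTW warps the time axis, it tolerates differences in fixation durations and sampling rates, which is what makes it suitable for comparing scan patterns across subjects.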



This work has been supported by the Center for International Scientific Studies & Collaborations (CISSC), Ministry of Science, Research and Technology of Iran, Grant No. A/1402/981, and National Natural Science Foundation of China, Grant No. 32071087.

