%0 Journal Article
%T EyeTrackUAV2: a Large-Scale Binocular Eye-Tracking Dataset for UAV Videos
%+ Computational Visual Perception and Applications (PERCEPT)
%+ Laboratoire des Sciences du Numérique de Nantes (LS2N)
%+ Institut d'Électronique et des Technologies du numéRique (IETR)
%+ Image Perception Interaction (LS2N - équipe IPI)
%A Perrin, Anne-Flore
%A Krassanakis, Vassilios
%A Zhang, Lu
%A Ricordel, Vincent
%A Perreira Da Silva, Matthieu
%A Le Meur, Olivier
%Z Currently under review in Drones (MDPI)
%< peer-reviewed
%J Drones
%I MDPI
%S Drones 2020
%V 4
%N 2
%8 2020-01-08
%D 2020
%K surveillance
%K eye tracking
%K Dataset
%K Visual attention
%K Unmanned Aerial Vehicles (UAV)
%K Videos
%K Visual salience
%K Salience
%Z Computer Science [cs]/Signal and Image Processing
%Z Computer Science [cs]/Multimedia [cs.MM]
%Z Computer Science [cs]/Databases [cs.DB]
%Z Computer Science [cs]/Image Processing [eess.IV]
%Z Cognitive science/Computer science
%X Unmanned Aerial Vehicles (UAVs) have gained considerable momentum through their rapid evolution over the last decade. UAV imagery is now used in a growing range of fields, such as military and civilian surveillance, delivery services, and wildlife monitoring. Combining UAV imagery with the study of dynamic salience further extends the number of future applications. Indeed, considerations of visual attention open the door to new compression, retargeting, and decision-making tools. To conduct such studies in this era of big data and deep learning, we identified the need for new large-scale eye-tracking datasets for visual salience in UAV content. To address this need, we introduce the EyeTrackUAV2 dataset, a collection of binocular gaze information recorded while participants viewed UAV videos under both free-viewing and task-based attention conditions. An analysis of the collected gaze positions provides recommendations for generating visual-salience ground truth. It also sheds light on how saliency biases in UAV videos differ from those in conventional content, especially regarding the center bias.
%G English
%2 https://univ-rennes.hal.science/hal-02391832v2/document
%2 https://univ-rennes.hal.science/hal-02391832v2/file/drones-672485.pdf
%L hal-02391832
%U https://univ-rennes.hal.science/hal-02391832