Dawei Du, Yifan Yang, Kaiwen Duan, Guorong Li, Qingming Huang, University of Chinese Academy of Sciences.
Yuankai Qi, Hongyang Yu, Weigang Zhang, Harbin Institute of Technology.
Qi Tian, The University of Texas at San Antonio.
A video dataset built in collaboration between researchers from the Chinese Academy of Sciences, Harbin Institute of Technology in China, and the University of Texas at San Antonio. The dataset contains 100 videos captured by drones over a variety of urban areas. The videos do not feature people but rather vehicles in various situations, which lets deepfake researchers examine the problem from a different angle and test whether existing deepfake detection methods also work on videos whose main subject is not a human.
Abstract:
With the advantage of high mobility, Unmanned Aerial Vehicles (UAVs) are used to fuel numerous important applications in computer vision, delivering more efficiency and convenience than surveillance cameras with fixed camera angle, scale, and view. However, only a limited number of UAV datasets have been proposed, and they focus on a specific task such as visual tracking or object detection in relatively constrained scenarios. Consequently, it is of great importance to develop an unconstrained UAV benchmark to boost related research. In this paper, we construct a new UAV benchmark focusing on complex scenarios with new levels of challenge. Selected from 10 hours of raw video, about 80,000 representative frames are fully annotated with bounding boxes as well as up to 14 kinds of attributes (e.g., weather condition, flying altitude, camera view, vehicle category, and occlusion) for three fundamental computer vision tasks: object detection, single object tracking, and multiple object tracking. Then, a detailed quantitative study is performed using the most recent state-of-the-art algorithms for each task. Experimental results show that current state-of-the-art methods perform relatively worse on our dataset, due to the new challenges that arise in UAV-based real scenes, e.g., high density, small objects, and camera motion. To our knowledge, our work is the first to explore such issues in unconstrained scenes comprehensively.
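As a rough illustration of the kind of per-frame annotation the abstract describes, the sketch below models one annotated object with a bounding box and the attribute kinds named above (weather condition, flying altitude, camera view, vehicle category, occlusion). It is a minimal sketch under assumed conventions: the field names, attribute vocabularies, and comma-separated line format are hypothetical and do not reflect the benchmark's actual annotation files.

```python
from dataclasses import dataclass

# Hypothetical attribute vocabularies; the benchmark's actual categories may differ.
WEATHER = ("daylight", "night", "fog")
ALTITUDE = ("low", "medium", "high")
CAMERA_VIEW = ("front-view", "side-view", "bird-view")

@dataclass
class ObjectAnnotation:
    """One annotated object in a single frame (illustrative schema only)."""
    frame_id: int
    target_id: int      # identity used for the two tracking tasks
    bbox: tuple         # (x, y, width, height) in pixels
    vehicle_category: str
    occluded: bool
    weather: str
    altitude: str
    camera_view: str

def parse_line(line: str) -> ObjectAnnotation:
    """Parse one hypothetical comma-separated annotation line."""
    f, t, x, y, w, h, cat, occ, wea, alt, view = line.strip().split(",")
    return ObjectAnnotation(int(f), int(t), (int(x), int(y), int(w), int(h)),
                            cat, occ == "1", wea, alt, view)

if __name__ == "__main__":
    # Example record: frame 1, target 3, a car box at (120, 80) of size 45x30.
    print(parse_line("1,3,120,80,45,30,car,0,daylight,low,bird-view"))
```

A real loader would read these records per sequence and group them by frame for detection or by target identity for tracking; the point here is only the shape of the information the annotations carry.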