Workshop program

Sponsor: Faculty of Computer and Information Science, University of Ljubljana.

  • 08.25 Welcome

  • 08.30-10.00 Oral session I, session chair Jiří Matas

    • 08.30 The VOT2017 and VOT-TIR2017 challenge results

    • 09.15 The VOT2017 winning tracker: TBA

    • 09.30 Spotlight presentations of the poster session

  • 10.00-11.00 Poster session, session chair Roman Pflugfelder

    • UCT: Learning Unified Convolutional Networks for Real-time Visual Tracking

      Zheng Zhu (Institute of Automation, Chinese Academy of Sciences), Guan Huang (Horizon Robotics Inc.), Wei Zou (Institute of Automation, Chinese Academy of Sciences), Dalong Du (Horizon Robotics, Inc.), Chang Huang (Horizon Robotics, Inc.)

    • The Benefits of Evaluating Tracker Performance using Pixel-wise Segmentations

      Tobias Böttger (MVTec Software GmbH), Patrick Follmann (MVTec Software GmbH)

    • Correlation Filters with Weighted Convolution Responses

      Zhiqun He (Beijing University of Posts and Telecommunications), Yingruo Fan (Beijing University of Posts and Telecommunications), JunFei Zhuang (Beijing University of Posts and Telecommunications)

    • Integrating Boundary and Center Correlation Filters for Visual Tracking with Aspect Ratio Variation

      Feng Li (Harbin Institute of Technology), Yingjie Yao (Harbin Institute of Technology), Peihua Li (Dalian University of Technology), D. Zhang (The Hong Kong Polytechnic University), Wangmeng Zuo (Harbin Institute of Technology), Ming-Hsuan Yang (University of California at Merced)

    • Recurrent Filter Learning for Visual Tracking

      Tianyu Yang (City University of Hong Kong), Antoni Chan (City University of Hong Kong)

  • 11.00-12.00 Oral session II, session chair Aleš Leonardis

    • 11.00 Invited keynote talk: Davide Scaramuzza, Robust Visual-Inertial State Estimation: from Frame to Event Cameras

      I will present the main algorithms for achieving robust, 6-DOF state estimation for mobile robots using passive sensing. Since cameras alone are not robust to high-speed motion and high-dynamic-range scenes, I will describe how IMUs and event-based cameras can be fused with visual information to achieve higher accuracy and robustness. I will then dig into the topic of event-based cameras, which are revolutionary sensors with a latency of microseconds, a very high dynamic range, and a measurement update rate almost a million times faster than that of standard cameras. Finally, I will show concrete applications of these methods in the autonomous navigation of vision-controlled drones.

    • 11.30 Presentation of the VOT-TIR2017 winning tracker: TBA

    • 11.45 The VOT-realtime challenge best-performing tracker talk: TBA

  • 12.00-12.30 Panel

    • Advantages & Challenges of Thermal Imaging, Adel Lablack, Stefan Schulte, FLIR

    • Discussion

  • 12.30 Closing Remarks