Paper ID | TEC-6.6
Paper Title | PixStabNet: Fast Multi-Scale Deep Online Video Stabilization with Pixel-Based Warping
Authors | Yu-Ta Chen, Kuan-Wei Tseng, National Taiwan University, Taiwan; Yao-Chih Lee, Academia Sinica, Taiwan; Chun-Yu Chen, Yi-Ping Hung, National Taiwan University, Taiwan
Session | TEC-6: Image and Video Processing 2
Location | Area G
Session Time | Monday, 20 September, 15:30 - 17:00
Presentation Time | Monday, 20 September, 15:30 - 17:00
Presentation | Poster
Topic | Image and Video Processing: Restoration and enhancement
Abstract | Online video stabilization is increasingly needed for real-time applications such as live streaming, drone remote control, and video communication. We propose a multi-scale convolutional neural network (PixStabNet) that stabilizes video in real time without using future frames. Instead of computing a global homography or multiple homographies, we estimate a pixel-based warping map that defines the transformation of each pixel, enabling more precise motion modelling. In addition, we propose carefully designed loss functions together with a two-stage training scheme to enhance the robustness of the network. Quantitative results show that our method outperforms other learning-based online methods in terms of stability while maintaining excellent geometric and temporal consistency. Moreover, to the best of our knowledge, the proposed algorithm is the most efficient approach to video stabilization. The models and results are available at: https://yu-ta-chen.github.io/PixStabNet.
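The abstract's central idea is to warp each pixel individually with an estimated warping map rather than applying a single global homography. The sketch below is a minimal, hypothetical illustration of such pixel-based warping using bilinear resampling in PyTorch; the warp-map convention (per-pixel offsets in normalized coordinates), tensor shapes, and the placeholder `stab_net` are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (assumption, not the authors' code): warping a frame with a
# per-pixel warp map via bilinear sampling, instead of one global homography.
import torch
import torch.nn.functional as F

def warp_with_pixel_map(frame, warp_map):
    """frame: (B, 3, H, W); warp_map: (B, 2, H, W) per-pixel offsets in
    normalized [-1, 1] coordinates (hypothetical convention)."""
    B, _, H, W = frame.shape
    # Base sampling grid covering the frame in normalized coordinates.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, H, device=frame.device),
        torch.linspace(-1, 1, W, device=frame.device),
        indexing="ij",
    )
    base_grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(B, -1, -1, -1)
    # Add the estimated per-pixel displacement and resample the frame.
    grid = base_grid + warp_map.permute(0, 2, 3, 1)
    return F.grid_sample(frame, grid, mode="bilinear",
                         padding_mode="border", align_corners=True)

# Usage (online setting, no future frames): a hypothetical network predicts the
# warp map from past and current frames, and the current frame is resampled.
# warp_map = stab_net(past_and_current_frames)      # (B, 2, H, W)
# stabilized = warp_with_pixel_map(current_frame, warp_map)
```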