Paper ID | MLSP-29.2 |
Paper Title | CONTINUOUS CNN FOR NONUNIFORM TIME SERIES |
Authors | Hui Shi, University of California, San Diego, United States; Yang Zhang, MIT-IBM Watson AI Lab, United States; Hao Wu, University of Illinois at Urbana-Champaign, United States; Shiyu Chang, MIT-IBM Watson AI Lab, United States; Kaizhi Qian, Mark Hasegawa-Johnson, University of Illinois at Urbana-Champaign, United States; Jishen Zhao, University of California, San Diego, United States |
Session | MLSP-29: Deep Learning for Time Series |
Location | Gather.Town |
Session Time | Thursday, 10 June, 14:00 - 14:45 |
Presentation Time | Thursday, 10 June, 14:00 - 14:45 |
Presentation | Poster |
Topic | Machine Learning for Signal Processing: [MLR-DEEP] Deep learning techniques |
Abstract | CNNs for time series data implicitly assume that the data are uniformly sampled, whereas many event-based and multi-modal data are nonuniform or have heterogeneous sampling rates. Directly applying a regular CNN to nonuniform time series is ungrounded, because it cannot recognize and extract common patterns from the nonuniform input signals. In this paper, we propose the Continuous CNN (CCNN), which estimates the inherent continuous input by interpolation and performs continuous convolution on the continuous input. The interpolation and convolution kernels are learned in an end-to-end manner and are able to learn useful patterns despite the nonuniform sampling rate. Results of several experiments verify that CCNN achieves better performance on nonuniform data and learns meaningful continuous kernels. |
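The abstract describes learning a continuous convolution over nonuniformly sampled signals end-to-end. As a rough illustration only (not the authors' CCNN formulation), the sketch below shows one common way to realize such an operation in PyTorch: the kernel is a small MLP evaluated at real-valued time offsets, so irregular timestamps can be handled directly. The names `ContinuousConv1d` and `kernel_net`, and the Riemann-style spacing weights, are assumptions introduced here for illustration.

```python
import torch
import torch.nn as nn

class ContinuousConv1d(nn.Module):
    """Minimal sketch of a continuous convolution for nonuniformly sampled
    1-D signals. The kernel is an MLP over scalar time offsets, so it can be
    evaluated at arbitrary (irregular) sample positions."""

    def __init__(self, in_channels, out_channels, hidden=32):
        super().__init__()
        # Maps a scalar time offset to an (out_channels x in_channels) weight.
        self.kernel_net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, out_channels * in_channels),
        )
        self.in_channels = in_channels
        self.out_channels = out_channels

    def forward(self, t, x):
        # t: (N,) sorted timestamps, x: (N, in_channels) sample values.
        n = t.numel()
        offsets = t[None, :] - t[:, None]                 # (N, N) pairwise offsets
        w = self.kernel_net(offsets.reshape(-1, 1))       # (N*N, out*in)
        w = w.view(n, n, self.out_channels, self.in_channels)
        # Weight each sample by its spacing so the sum approximates an integral.
        dt = torch.diff(t, prepend=t[:1]).clamp(min=1e-6)  # (N,)
        return torch.einsum('ijoc,jc,j->io', w, x, dt)     # (N, out_channels)

# Usage on an irregularly sampled 3-channel signal.
t = torch.sort(torch.rand(50)).values
x = torch.randn(50, 3)
layer = ContinuousConv1d(3, 8)
y = layer(t, x)  # (50, 8)
```

In this sketch the interpolation of the underlying continuous signal is implicit in the spacing-weighted sum; the paper's CCNN instead learns an explicit interpolation kernel jointly with the convolution kernel.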