Liquid time-constant networks

Hasani R, Lechner M, Amini A, Rus D, Grosu R. 2021. Liquid time-constant networks. Proceedings of the AAAI Conference on Artificial Intelligence. AAAI: Association for the Advancement of Artificial Intelligence, Technical Tracks, vol. 35, 7657–7666.

OA 16936-Article Text-20430-1-2-20210518 (1).pdf 4.30 MB [Published Version]
Conference Paper | Published | English
Author
Hasani, Ramin; Lechner, Mathias (ISTA); Amini, Alexander; Rus, Daniela; Grosu, Radu
Series Title
Technical Tracks
Abstract
We introduce a new class of time-continuous recurrent neural network models. Instead of declaring a learning system’s dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates. The resulting models represent dynamical systems with varying (i.e., liquid) time-constants coupled to their hidden state, with outputs being computed by numerical differential equation solvers. These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations, and give rise to improved performance on time-series prediction tasks. To demonstrate these properties, we first take a theoretical approach to find bounds over their dynamics, and compute their expressive power by the trajectory length measure in a latent trajectory space. We then conduct a series of time-series prediction experiments to manifest the approximation capability of Liquid Time-Constant Networks (LTCs) compared to classical and modern RNNs.
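The abstract describes hidden states governed by linear first-order dynamics whose effective time constant is modulated by a nonlinear gate and integrated by a numerical ODE solver. Below is a minimal NumPy sketch of one such update, assuming dynamics of the form dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A and a fused semi-implicit Euler step; the function and parameter names (ltc_step, W, U, b, tau, A, dt) are illustrative assumptions, not the authors' reference implementation.

# Minimal sketch of a liquid time-constant (LTC) update, under the
# assumptions stated above (not the authors' reference code).
import numpy as np

def ltc_step(x, I, W, U, b, tau, A, dt=0.1):
    """Advance the hidden state x by one solver step of size dt.

    x   : (hidden,)        current hidden state
    I   : (inputs,)        current input sample
    W   : (hidden, hidden) recurrent weights of the gating network f
    U   : (hidden, inputs) input weights of the gating network f
    b   : (hidden,)        bias of the gating network f
    tau : (hidden,)        base time constants
    A   : (hidden,)        bias vector toward which the gate drives the state
    """
    # Nonlinear gate: bounded, positive modulation of the time constant.
    f = 1.0 / (1.0 + np.exp(-(W @ x + U @ I + b)))  # sigmoid
    # Fused semi-implicit Euler step: the numerator applies the drive
    # toward A, the denominator applies the state-dependent (liquid) decay,
    # which keeps the update stable and the state bounded.
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Toy usage: unroll the cell over a short random input sequence.
rng = np.random.default_rng(0)
hidden, inputs, T = 8, 3, 20
x = np.zeros(hidden)
W = rng.normal(scale=0.3, size=(hidden, hidden))
U = rng.normal(scale=0.3, size=(hidden, inputs))
b = np.zeros(hidden)
tau = np.ones(hidden)
A = rng.normal(size=hidden)
for t in range(T):
    x = ltc_step(x, rng.normal(size=inputs), W, U, b, tau, A)
print(x)  # bounded hidden state after T steps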
Publishing Year
2021
Date Published
2021-05-28
Proceedings Title
Proceedings of the AAAI Conference on Artificial Intelligence
Publisher
AAAI Press
Acknowledgement
R.H. and D.R. are partially supported by Boeing. R.H. and R.G. were partially supported by the Horizon-2020 ECSEL Project grant No. 783163 (iDev40). M.L. was supported in part by the Austrian Science Fund (FWF) under grant Z211-N23 (Wittgenstein Award). A.A. is supported by the National Science Foundation (NSF) Graduate Research Fellowship Program. This research work is partially drawn from the PhD dissertation of R.H.
Volume
35
Issue
9
Page
7657-7666
Conference
AAAI: Association for the Advancement of Artificial Intelligence
Conference Location
Virtual
Conference Date
2021-02-02 – 2021-02-09

Cite this

Hasani R, Lechner M, Amini A, Rus D, Grosu R. Liquid time-constant networks. In: Proceedings of the AAAI Conference on Artificial Intelligence. Vol 35. AAAI Press; 2021:7657-7666.
Hasani, R., Lechner, M., Amini, A., Rus, D., & Grosu, R. (2021). Liquid time-constant networks. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 35, pp. 7657–7666). Virtual: AAAI Press.
Hasani, Ramin, Mathias Lechner, Alexander Amini, Daniela Rus, and Radu Grosu. “Liquid Time-Constant Networks.” In Proceedings of the AAAI Conference on Artificial Intelligence, 35:7657–66. AAAI Press, 2021.
R. Hasani, M. Lechner, A. Amini, D. Rus, and R. Grosu, “Liquid time-constant networks,” in Proceedings of the AAAI Conference on Artificial Intelligence, Virtual, 2021, vol. 35, no. 9, pp. 7657–7666.
Hasani R, Lechner M, Amini A, Rus D, Grosu R. 2021. Liquid time-constant networks. Proceedings of the AAAI Conference on Artificial Intelligence. AAAI: Association for the Advancement of Artificial Intelligence, Technical Tracks, vol. 35, 7657–7666.
Hasani, Ramin, et al. “Liquid Time-Constant Networks.” Proceedings of the AAAI Conference on Artificial Intelligence, vol. 35, no. 9, AAAI Press, 2021, pp. 7657–66.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Main File(s)
Access Level
OA Open Access
Date Uploaded
2022-01-26
MD5 Checksum
0f06995fba06dbcfa7ed965fc66027ff


Link(s) to Main File(s)
Access Level
OA Open Access

Sources

arXiv 2006.04439