---
_id: '12147'
abstract:
- lang: eng
  text: Continuous-time neural networks are a class of machine learning systems that
    can tackle representation learning on spatiotemporal decision-making tasks. These
    models are typically represented by continuous differential equations. However,
    their expressive power when they are deployed on computers is bottlenecked by
    numerical differential equation solvers. This limitation has notably slowed down
    the scaling and understanding of numerous natural physical phenomena such as the
    dynamics of nervous systems. Ideally, we would circumvent this bottleneck by solving
    the given dynamical system in closed form. This is known to be intractable in
    general. Here, we show that it is possible to closely approximate the interaction
    between neurons and synapses—the building blocks of natural and artificial neural
    networks—constructed by liquid time-constant networks efficiently in closed form.
    To this end, we compute a tightly bounded approximation of the solution of an
    integral appearing in liquid time-constant dynamics that has had no known closed-form
    solution so far. This closed-form solution impacts the design of continuous-time
    and continuous-depth neural models. For instance, since time appears explicitly
    in closed form, the formulation relaxes the need for complex numerical solvers.
    Consequently, we obtain models that are between one and five orders of magnitude
    faster in training and inference compared with differential equation-based counterparts.
    More importantly, in contrast to ordinary differential equation-based continuous
    networks, closed-form networks can scale remarkably well compared with other deep
    learning instances. Lastly, as these models are derived from liquid networks,
    they show good performance in time-series modelling compared with advanced recurrent
    neural network models.
acknowledgement: This research was supported in part by the AI2050 program at Schmidt
  Futures (grant G-22-63172), the Boeing Company, and the United States Air Force
  Research Laboratory and the United States Air Force Artificial Intelligence Accelerator
  and was accomplished under cooperative agreement number FA8750-19-2-1000. The views
  and conclusions contained in this document are those of the authors and should not
  be interpreted as representing the official policies, either expressed or implied,
  of the United States Air Force or the U.S. Government. The U.S. Government is authorized
  to reproduce and distribute reprints for Government purposes, notwithstanding any
  copyright notation herein. This work was further supported by The Boeing Company
  and Office of Naval Research grant N00014-18-1-2830. M.T. is supported by the Poul
  Due Jensen Foundation, grant 883901. M.L. was supported in part by the Austrian
  Science Fund under grant Z211-N23 (Wittgenstein Award). A.A. was supported by the
  National Science Foundation Graduate Research Fellowship Program. We thank T.-H.
  Wang, P. Kao, M. Chahine, W. Xiao, X. Li, L. Yin and Y. Ben for useful suggestions
  and for testing of CfC models to confirm the results across other domains.
article_processing_charge: No
article_type: original
arxiv: 1
author:
- first_name: Ramin
  full_name: Hasani, Ramin
  last_name: Hasani
- first_name: Mathias
  full_name: Lechner, Mathias
  id: 3DC22916-F248-11E8-B48F-1D18A9856A87
  last_name: Lechner
- first_name: Alexander
  full_name: Amini, Alexander
  last_name: Amini
- first_name: Lucas
  full_name: Liebenwein, Lucas
  last_name: Liebenwein
- first_name: Aaron
  full_name: Ray, Aaron
  last_name: Ray
- first_name: Max
  full_name: Tschaikowski, Max
  last_name: Tschaikowski
- first_name: Gerald
  full_name: Teschl, Gerald
  last_name: Teschl
- first_name: Daniela
  full_name: Rus, Daniela
  last_name: Rus
citation:
  ama: Hasani R, Lechner M, Amini A, et al. Closed-form continuous-time neural networks.
    <i>Nature Machine Intelligence</i>. 2022;4(11):992-1003. doi:<a href="https://doi.org/10.1038/s42256-022-00556-7">10.1038/s42256-022-00556-7</a>
  apa: Hasani, R., Lechner, M., Amini, A., Liebenwein, L., Ray, A., Tschaikowski,
    M., … Rus, D. (2022). Closed-form continuous-time neural networks. <i>Nature Machine
    Intelligence</i>. Springer Nature. <a href="https://doi.org/10.1038/s42256-022-00556-7">https://doi.org/10.1038/s42256-022-00556-7</a>
  chicago: Hasani, Ramin, Mathias Lechner, Alexander Amini, Lucas Liebenwein, Aaron
    Ray, Max Tschaikowski, Gerald Teschl, and Daniela Rus. “Closed-Form Continuous-Time
    Neural Networks.” <i>Nature Machine Intelligence</i>. Springer Nature, 2022. <a
    href="https://doi.org/10.1038/s42256-022-00556-7">https://doi.org/10.1038/s42256-022-00556-7</a>.
  ieee: R. Hasani <i>et al.</i>, “Closed-form continuous-time neural networks,” <i>Nature
    Machine Intelligence</i>, vol. 4, no. 11. Springer Nature, pp. 992–1003, 2022.
  ista: Hasani R, Lechner M, Amini A, Liebenwein L, Ray A, Tschaikowski M, Teschl
    G, Rus D. 2022. Closed-form continuous-time neural networks. Nature Machine Intelligence.
    4(11), 992–1003.
  mla: Hasani, Ramin, et al. “Closed-Form Continuous-Time Neural Networks.” <i>Nature
    Machine Intelligence</i>, vol. 4, no. 11, Springer Nature, 2022, pp. 992–1003,
    doi:<a href="https://doi.org/10.1038/s42256-022-00556-7">10.1038/s42256-022-00556-7</a>.
  short: R. Hasani, M. Lechner, A. Amini, L. Liebenwein, A. Ray, M. Tschaikowski,
    G. Teschl, D. Rus, Nature Machine Intelligence 4 (2022) 992–1003.
date_created: 2023-01-12T12:07:21Z
date_published: 2022-11-15T00:00:00Z
date_updated: 2023-08-04T09:00:10Z
day: '15'
ddc:
- '000'
department:
- _id: ToHe
doi: 10.1038/s42256-022-00556-7
external_id:
  arxiv:
  - '2106.13898'
  isi:
  - '000884215600003'
file:
- access_level: open_access
  checksum: b4789122ce04bfb4ac042390f59aaa8b
  content_type: application/pdf
  creator: dernst
  date_created: 2023-01-24T09:49:44Z
  date_updated: 2023-01-24T09:49:44Z
  file_id: '12355'
  file_name: 2022_NatureMachineIntelligence_Hasani.pdf
  file_size: 3259553
  relation: main_file
  success: 1
file_date_updated: 2023-01-24T09:49:44Z
has_accepted_license: '1'
intvolume: '4'
isi: 1
issue: '11'
keyword:
- Artificial Intelligence
- Computer Networks and Communications
- Computer Vision and Pattern Recognition
- Human-Computer Interaction
- Software
language:
- iso: eng
month: '11'
oa: 1
oa_version: Published Version
page: 992-1003
project:
- _id: 25F42A32-B435-11E9-9278-68D0E5697425
  call_identifier: FWF
  grant_number: Z211
  name: The Wittgenstein Prize
publication: Nature Machine Intelligence
publication_identifier:
  issn:
  - 2522-5839
publication_status: published
publisher: Springer Nature
quality_controlled: '1'
related_material:
  link:
  - relation: erratum
    url: https://doi.org/10.1038/s42256-022-00597-y
scopus_import: '1'
status: public
title: Closed-form continuous-time neural networks
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 4359f0d1-fa6c-11eb-b949-802e58b17ae8
volume: 4
year: '2022'
...
