---
_id: '14422'
abstract:
- lang: eng
  text: "Animals exhibit a remarkable ability to learn and remember new behaviors,
    skills, and associations throughout their lifetime. These capabilities are made
    possible by a variety of changes in the brain throughout adulthood, grouped
    under the term \"plasticity\". Certain cells in the brain, the neurons, and
    specifically changes in the connections between neurons, the synapses, were
    shown to be crucial for the formation, selection, and consolidation of memories
    from past experiences. These ongoing changes of synapses across time are called
    synaptic plasticity. Understanding how a myriad of biochemical processes
    operating at individual synapses can somehow work in concert to give rise to
    meaningful changes in behavior is a fascinating problem and an active area of
    research. However, the experimental search for the precise plasticity
    mechanisms at play in the brain is daunting, as it is difficult to control and
    observe synapses during learning. Theoretical approaches have thus been the
    default method to probe the plasticity-behavior connection. Such studies
    attempt to extract unifying principles across synapses and model all observed
    synaptic changes using plasticity rules: equations that govern the evolution
    of synaptic strengths across time in neuronal network models. These rules can
    use many relevant quantities to determine the magnitude of synaptic changes,
    such as the precise timings of pre- and postsynaptic action potentials, recent
    neuronal activity levels, and the state of neighboring synapses. However,
    analytical studies rely heavily on human intuition and are forced to make
    simplifying assumptions about plasticity rules. In this thesis, we aim to
    assist and augment human intuition in this search for plasticity rules. We
    explore whether a numerical approach could automatically discover the
    plasticity rules that elicit desired behaviors in large networks of
    interconnected neurons. This approach is dubbed meta-learning synaptic
    plasticity: learning plasticity rules which themselves will make neuronal
    networks learn how to solve a desired task. We first write all the potential
    plasticity mechanisms to consider as a single expression with adjustable
    parameters. We then optimize these plasticity parameters using evolutionary
    strategies or Bayesian inference on tasks known to involve synaptic
    plasticity, such as familiarity detection and network stabilization. We show
    that these automated approaches are powerful tools, able to complement
    established analytical methods. By comprehensively screening plasticity rules
    at all synapse types in realistic, spiking neuronal network models, we
    discover entire sets of degenerate plausible plasticity rules that reliably
    elicit memory-related behaviors. Our approaches allow for more robust
    experimental predictions, by abstracting out the idiosyncrasies of individual
    plasticity rules, and provide fresh insights into synaptic plasticity in
    spiking network models."
alternative_title:
- ISTA Thesis
article_processing_charge: No
author:
- first_name: Basile J
  full_name: Confavreux, Basile J
  id: C7610134-B532-11EA-BD9F-F5753DDC885E
  last_name: Confavreux
citation:
  ama: 'Confavreux BJ. Synapseek: Meta-learning synaptic plasticity rules. 2023. doi:<a
    href="https://doi.org/10.15479/at:ista:14422">10.15479/at:ista:14422</a>'
  apa: 'Confavreux, B. J. (2023). <i>Synapseek: Meta-learning synaptic plasticity
    rules</i>. Institute of Science and Technology Austria. <a href="https://doi.org/10.15479/at:ista:14422">https://doi.org/10.15479/at:ista:14422</a>'
  chicago: 'Confavreux, Basile J. “Synapseek: Meta-Learning Synaptic Plasticity Rules.”
    Institute of Science and Technology Austria, 2023. <a href="https://doi.org/10.15479/at:ista:14422">https://doi.org/10.15479/at:ista:14422</a>.'
  ieee: 'B. J. Confavreux, “Synapseek: Meta-learning synaptic plasticity rules,” Institute
    of Science and Technology Austria, 2023.'
  ista: 'Confavreux BJ. 2023. Synapseek: Meta-learning synaptic plasticity rules.
    Institute of Science and Technology Austria.'
  mla: 'Confavreux, Basile J. <i>Synapseek: Meta-Learning Synaptic Plasticity Rules</i>.
    Institute of Science and Technology Austria, 2023, doi:<a href="https://doi.org/10.15479/at:ista:14422">10.15479/at:ista:14422</a>.'
  short: 'B.J. Confavreux, Synapseek: Meta-Learning Synaptic Plasticity Rules, Institute
    of Science and Technology Austria, 2023.'
date_created: 2023-10-12T14:13:25Z
date_published: 2023-10-12T00:00:00Z
date_updated: 2023-10-18T09:20:56Z
day: '12'
ddc:
- '610'
degree_awarded: PhD
department:
- _id: GradSch
- _id: TiVo
doi: 10.15479/at:ista:14422
ec_funded: 1
file:
- access_level: closed
  checksum: 7f636555eae7803323df287672fd13ed
  content_type: application/pdf
  creator: cchlebak
  date_created: 2023-10-12T14:53:50Z
  date_updated: 2023-10-12T14:54:52Z
  embargo: 2024-10-12
  embargo_to: open_access
  file_id: '14424'
  file_name: Confavreux_Thesis_2A.pdf
  file_size: 30599717
  relation: main_file
- access_level: closed
  checksum: 725e85946db92290a4583a0de9779e1b
  content_type: application/x-zip-compressed
  creator: cchlebak
  date_created: 2023-10-18T07:38:34Z
  date_updated: 2023-10-18T07:56:08Z
  file_id: '14440'
  file_name: Confavreux Thesis.zip
  file_size: 68406739
  relation: source_file
file_date_updated: 2023-10-18T07:56:08Z
has_accepted_license: '1'
language:
- iso: eng
license: https://creativecommons.org/licenses/by-nc-sa/4.0/
month: '10'
oa_version: Published Version
page: '148'
project:
- _id: 0aacfa84-070f-11eb-9043-d7eb2c709234
  call_identifier: H2020
  grant_number: '819603'
  name: Learning the shape of synaptic plasticity rules for neuronal architectures
    and function through machine learning.
publication_identifier:
  issn:
  - 2663-337X
publication_status: published
publisher: Institute of Science and Technology Austria
related_material:
  record:
  - id: '9633'
    relation: part_of_dissertation
    status: public
status: public
supervisor:
- first_name: Tim P
  full_name: Vogels, Tim P
  id: CB6FF8D2-008F-11EA-8E08-2637E6697425
  last_name: Vogels
  orcid: 0000-0003-3295-6181
title: 'Synapseek: Meta-learning synaptic plasticity rules'
tmp:
  image: /images/cc_by_nc_sa.png
  legal_code_url: https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode
  name: Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC
    BY-NC-SA 4.0)
  short: CC BY-NC-SA (4.0)
type: dissertation
user_id: 8b945eb4-e2f2-11eb-945a-df72226e66a9
year: '2023'
...
---
_id: '10753'
abstract:
- lang: eng
  text: This is a comment on "Meta-learning synaptic plasticity and memory addressing
    for continual familiarity detection." Neuron. 2022 Feb 2;110(3):544-557.e8.
article_processing_charge: No
article_type: letter_note
author:
- first_name: Basile J
  full_name: Confavreux, Basile J
  id: C7610134-B532-11EA-BD9F-F5753DDC885E
  last_name: Confavreux
- first_name: Tim P
  full_name: Vogels, Tim P
  id: CB6FF8D2-008F-11EA-8E08-2637E6697425
  last_name: Vogels
  orcid: 0000-0003-3295-6181
citation:
  ama: 'Confavreux BJ, Vogels TP. A familiar thought: Machines that replace us? <i>Neuron</i>.
    2022;110(3):361-362. doi:<a href="https://doi.org/10.1016/j.neuron.2022.01.014">10.1016/j.neuron.2022.01.014</a>'
  apa: 'Confavreux, B. J., &#38; Vogels, T. P. (2022). A familiar thought: Machines
    that replace us? <i>Neuron</i>. Elsevier. <a href="https://doi.org/10.1016/j.neuron.2022.01.014">https://doi.org/10.1016/j.neuron.2022.01.014</a>'
  chicago: 'Confavreux, Basile J, and Tim P Vogels. “A Familiar Thought: Machines
    That Replace Us?” <i>Neuron</i>. Elsevier, 2022. <a href="https://doi.org/10.1016/j.neuron.2022.01.014">https://doi.org/10.1016/j.neuron.2022.01.014</a>.'
  ieee: 'B. J. Confavreux and T. P. Vogels, “A familiar thought: Machines that replace
    us?,” <i>Neuron</i>, vol. 110, no. 3. Elsevier, pp. 361–362, 2022.'
  ista: 'Confavreux BJ, Vogels TP. 2022. A familiar thought: Machines that replace
    us? Neuron. 110(3), 361–362.'
  mla: 'Confavreux, Basile J., and Tim P. Vogels. “A Familiar Thought: Machines That
    Replace Us?” <i>Neuron</i>, vol. 110, no. 3, Elsevier, 2022, pp. 361–62, doi:<a
    href="https://doi.org/10.1016/j.neuron.2022.01.014">10.1016/j.neuron.2022.01.014</a>.'
  short: B.J. Confavreux, T.P. Vogels, Neuron 110 (2022) 361–362.
date_created: 2022-02-13T23:01:34Z
date_published: 2022-02-02T00:00:00Z
date_updated: 2023-10-03T10:53:17Z
day: '02'
department:
- _id: TiVo
doi: 10.1016/j.neuron.2022.01.014
external_id:
  isi:
  - '000751819100005'
  pmid:
  - '35114107'
intvolume: '110'
isi: 1
issue: '3'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.1016/j.neuron.2022.01.014
month: '02'
oa: 1
oa_version: Published Version
page: 361-362
pmid: 1
publication: Neuron
publication_identifier:
  eissn:
  - 1097-4199
publication_status: published
publisher: Elsevier
quality_controlled: '1'
scopus_import: '1'
status: public
title: 'A familiar thought: Machines that replace us?'
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 110
year: '2022'
...
---
_id: '9633'
abstract:
- lang: eng
  text: The search for biologically faithful synaptic plasticity rules has resulted
    in a large body of models. They are usually inspired by – and fitted to – experimental
    data, but they rarely produce neural dynamics that serve complex functions. These
    failures suggest that current plasticity models are still under-constrained by
    existing data. Here, we present an alternative approach that uses meta-learning
    to discover plausible synaptic plasticity rules. Instead of experimental data,
    the rules are constrained by the functions they implement and the structure they
    are meant to produce. Briefly, we parameterize synaptic plasticity rules by a
    Volterra expansion and then use supervised learning methods (gradient descent
    or evolutionary strategies) to minimize a problem-dependent loss function that
    quantifies how effectively a candidate plasticity rule transforms an initially
    random network into one with the desired function. We first validate our approach
    by re-discovering previously described plasticity rules, starting at the single-neuron
    level and “Oja’s rule”, a simple Hebbian plasticity rule that captures the direction
    of most variability of inputs to a neuron (i.e., the first principal component).
    We expand the problem to the network level and ask the framework to find Oja’s
    rule together with an anti-Hebbian rule such that an initially random two-layer
    firing-rate network will recover several principal components of the input space
    after learning. Next, we move to networks of integrate-and-fire neurons with plastic
    inhibitory afferents. We train for rules that achieve a target firing rate by
    countering tuned excitation. Our algorithm discovers a specific subset of the
    manifold of rules that can solve this task. Our work is a proof of principle of
    an automated and unbiased approach to unveil synaptic plasticity rules that obey
    biological constraints and can solve complex functions.
acknowledgement: We would like to thank Chaitanya Chintaluri, Georgia Christodoulou,
  Bill Podlaski and Merima Šabanović for useful discussions and comments. This work
  was supported by a Wellcome Trust Senior Research Fellowship (214316/Z/18/Z),
  a BBSRC grant (BB/N019512/1), an ERC Consolidator Grant (SYNAPSEEK), a Leverhulme
  Trust Project Grant (RPG-2016-446), and funding from École Polytechnique, Paris.
article_processing_charge: No
author:
- first_name: Basile J
  full_name: Confavreux, Basile J
  id: C7610134-B532-11EA-BD9F-F5753DDC885E
  last_name: Confavreux
- first_name: Friedemann
  full_name: Zenke, Friedemann
  last_name: Zenke
- first_name: Everton J.
  full_name: Agnes, Everton J.
  last_name: Agnes
- first_name: Timothy
  full_name: Lillicrap, Timothy
  last_name: Lillicrap
- first_name: Tim P
  full_name: Vogels, Tim P
  id: CB6FF8D2-008F-11EA-8E08-2637E6697425
  last_name: Vogels
  orcid: 0000-0003-3295-6181
citation:
  ama: 'Confavreux BJ, Zenke F, Agnes EJ, Lillicrap T, Vogels TP. A meta-learning
    approach to (re)discover plasticity rules that carve a desired function into a
    neural network. In: <i>Advances in Neural Information Processing Systems</i>.
    Vol 33. ; 2020:16398-16408.'
  apa: Confavreux, B. J., Zenke, F., Agnes, E. J., Lillicrap, T., &#38; Vogels, T.
    P. (2020). A meta-learning approach to (re)discover plasticity rules that carve
    a desired function into a neural network. In <i>Advances in Neural Information
    Processing Systems</i> (Vol. 33, pp. 16398–16408). Vancouver, Canada.
  chicago: Confavreux, Basile J, Friedemann Zenke, Everton J. Agnes, Timothy Lillicrap,
    and Tim P Vogels. “A Meta-Learning Approach to (Re)Discover Plasticity Rules That
    Carve a Desired Function into a Neural Network.” In <i>Advances in Neural Information
    Processing Systems</i>, 33:16398–408, 2020.
  ieee: B. J. Confavreux, F. Zenke, E. J. Agnes, T. Lillicrap, and T. P. Vogels, “A
    meta-learning approach to (re)discover plasticity rules that carve a desired function
    into a neural network,” in <i>Advances in Neural Information Processing Systems</i>,
    Vancouver, Canada, 2020, vol. 33, pp. 16398–16408.
  ista: 'Confavreux BJ, Zenke F, Agnes EJ, Lillicrap T, Vogels TP. 2020. A meta-learning
    approach to (re)discover plasticity rules that carve a desired function into a
    neural network. Advances in Neural Information Processing Systems. NeurIPS: Conference
    on Neural Information Processing Systems vol. 33, 16398–16408.'
  mla: Confavreux, Basile J., et al. “A Meta-Learning Approach to (Re)Discover Plasticity
    Rules That Carve a Desired Function into a Neural Network.” <i>Advances in Neural
    Information Processing Systems</i>, vol. 33, 2020, pp. 16398–408.
  short: B.J. Confavreux, F. Zenke, E.J. Agnes, T. Lillicrap, T.P. Vogels, in:, Advances
    in Neural Information Processing Systems, 2020, pp. 16398–16408.
conference:
  end_date: 2020-12-12
  location: Vancouver, Canada
  name: 'NeurIPS: Conference on Neural Information Processing Systems'
  start_date: 2020-12-06
date_created: 2021-07-04T22:01:27Z
date_published: 2020-12-06T00:00:00Z
date_updated: 2023-10-18T09:20:55Z
day: '06'
department:
- _id: TiVo
ec_funded: 1
intvolume: '33'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://proceedings.neurips.cc/paper/2020/hash/bdbd5ebfde4934142c8a88e7a3796cd5-Abstract.html
month: '12'
oa: 1
oa_version: Published Version
page: 16398-16408
project:
- _id: c084a126-5a5b-11eb-8a69-d75314a70a87
  grant_number: 214316/Z/18/Z
  name: What’s in a memory? Spatiotemporal dynamics in strongly coupled recurrent
    neuronal networks.
- _id: 0aacfa84-070f-11eb-9043-d7eb2c709234
  call_identifier: H2020
  grant_number: '819603'
  name: Learning the shape of synaptic plasticity rules for neuronal architectures
    and function through machine learning.
publication: Advances in Neural Information Processing Systems
publication_identifier:
  issn:
  - 1049-5258
publication_status: published
quality_controlled: '1'
related_material:
  link:
  - relation: is_continued_by
    url: https://doi.org/10.1101/2020.10.24.353409
  record:
  - id: '14422'
    relation: dissertation_contains
    status: public
scopus_import: '1'
status: public
title: A meta-learning approach to (re)discover plasticity rules that carve a desired
  function into a neural network
type: conference
user_id: 6785fbc1-c503-11eb-8a32-93094b40e1cf
volume: 33
year: '2020'
...
