---
_id: '13146'
abstract:
- lang: eng
  text: 'A recent line of work has analyzed the theoretical properties of deep neural
    networks via the Neural Tangent Kernel (NTK). In particular, the smallest eigenvalue
    of the NTK has been related to the memorization capacity, the global convergence
    of gradient descent algorithms and the generalization of deep nets. However, existing
    results either provide bounds in the two-layer setting or assume that the spectrum
    of the NTK matrices is bounded away from 0 for multi-layer networks. In this paper,
    we provide tight bounds on the smallest eigenvalue of NTK matrices for deep ReLU
    nets, both in the limiting case of infinite widths and for finite widths. In the
    finite-width setting, the network architectures we consider are fairly general:
    we require the existence of a wide layer with roughly order of N neurons, N being
    the number of data samples; and the scaling of the remaining layer widths is arbitrary
    (up to logarithmic factors). To obtain our results, we analyze various quantities
    of independent interest: we give lower bounds on the smallest singular value of
    hidden feature matrices, and upper bounds on the Lipschitz constant of input-output
    feature maps.'
acknowledgement: The authors would like to thank the anonymous reviewers for their
  helpful comments. MM was partially supported by the 2019 Lopez-Loreta Prize. QN
  and GM acknowledge support from the European Research Council (ERC) under the European
  Union’s Horizon 2020 research and innovation programme (grant agreement no 757983).
article_processing_charge: No
arxiv: 1
author:
- first_name: Quynh
  full_name: Nguyen, Quynh
  last_name: Nguyen
- first_name: Marco
  full_name: Mondelli, Marco
  id: 27EB676C-8706-11E9-9510-7717E6697425
  last_name: Mondelli
  orcid: 0000-0002-3242-7020
- first_name: Guido
  full_name: Montufar, Guido
  last_name: Montufar
citation:
  ama: 'Nguyen Q, Mondelli M, Montufar G. Tight bounds on the smallest Eigenvalue
    of the neural tangent kernel for deep ReLU networks. In: <i>Proceedings of the
    38th International Conference on Machine Learning</i>. Vol 139. ML Research Press;
    2021:8119-8129.'
  apa: 'Nguyen, Q., Mondelli, M., &#38; Montufar, G. (2021). Tight bounds on the smallest
    Eigenvalue of the neural tangent kernel for deep ReLU networks. In <i>Proceedings
    of the 38th International Conference on Machine Learning</i> (Vol. 139, pp. 8119–8129).
    Virtual: ML Research Press.'
  chicago: Nguyen, Quynh, Marco Mondelli, and Guido Montufar. “Tight Bounds on the
    Smallest Eigenvalue of the Neural Tangent Kernel for Deep ReLU Networks.” In <i>Proceedings
    of the 38th International Conference on Machine Learning</i>, 139:8119–29. ML
    Research Press, 2021.
  ieee: Q. Nguyen, M. Mondelli, and G. Montufar, “Tight bounds on the smallest Eigenvalue
    of the neural tangent kernel for deep ReLU networks,” in <i>Proceedings of the
    38th International Conference on Machine Learning</i>, Virtual, 2021, vol. 139,
    pp. 8119–8129.
  ista: Nguyen Q, Mondelli M, Montufar G. 2021. Tight bounds on the smallest Eigenvalue
    of the neural tangent kernel for deep ReLU networks. Proceedings of the 38th International
    Conference on Machine Learning. International Conference on Machine Learning vol.
    139, 8119–8129.
  mla: Nguyen, Quynh, et al. “Tight Bounds on the Smallest Eigenvalue of the Neural
    Tangent Kernel for Deep ReLU Networks.” <i>Proceedings of the 38th International
    Conference on Machine Learning</i>, vol. 139, ML Research Press, 2021, pp. 8119–29.
  short: Q. Nguyen, M. Mondelli, G. Montufar, in:, Proceedings of the 38th International
    Conference on Machine Learning, ML Research Press, 2021, pp. 8119–8129.
conference:
  end_date: 2021-07-24
  location: Virtual
  name: International Conference on Machine Learning
  start_date: 2021-07-18
date_created: 2023-06-18T22:00:48Z
date_published: 2021-07-01T00:00:00Z
date_updated: 2024-09-10T13:03:17Z
day: '01'
ddc:
- '000'
department:
- _id: MaMo
external_id:
  arxiv:
  - '2012.11654'
file:
- access_level: open_access
  checksum: 19489cf5e16a0596b1f92e317d97c9b0
  content_type: application/pdf
  creator: dernst
  date_created: 2023-06-19T10:49:12Z
  date_updated: 2023-06-19T10:49:12Z
  file_id: '13155'
  file_name: 2021_PMLR_Nguyen.pdf
  file_size: 591332
  relation: main_file
  success: 1
file_date_updated: 2023-06-19T10:49:12Z
has_accepted_license: '1'
intvolume: '139'
language:
- iso: eng
month: '07'
oa: 1
oa_version: Published Version
page: 8119-8129
project:
- _id: 059876FA-7A3F-11EA-A408-12923DDC885E
  name: Prix Lopez-Loreta 2019 - Marco Mondelli
publication: Proceedings of the 38th International Conference on Machine Learning
publication_identifier:
  eissn:
  - 2640-3498
  isbn:
  - '9781713845065'
publication_status: published
publisher: ML Research Press
quality_controlled: '1'
scopus_import: '1'
status: public
title: Tight bounds on the smallest Eigenvalue of the neural tangent kernel for deep
  ReLU networks
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 139
year: '2021'
...
---
_id: '13147'
abstract:
- lang: eng
  text: "We investigate fast and communication-efficient algorithms for the classic
    problem of minimizing a sum of strongly convex and smooth functions that are distributed
    among n different nodes, which can communicate using a limited number of bits.
    Most previous communication-efficient approaches for this problem are limited
    to first-order optimization, and therefore have linear dependence on the
    condition number in their communication complexity. We show that this dependence
    is not inherent: communication-efficient methods can in fact have sublinear dependence
    on the condition number. For this, we design and analyze the first communication-efficient
    distributed variants of preconditioned gradient descent for Generalized Linear
    Models, and for Newton’s method. Our results rely on a new technique for quantizing
    both the preconditioner and the descent direction at each step of the algorithms,
    while controlling their convergence rate. We also validate our findings experimentally,
    showing faster convergence and reduced communication relative to previous methods."
acknowledgement: The authors would like to thank Janne Korhonen, Aurélien Lucchi,
  Celestine Mendler-Dünner and Antonio Orvieto for helpful discussions. FA and DA
  were supported during this work by the European Research Council (ERC) under the
  European Union’s Horizon 2020 research and innovation programme (grant agreement
  No 805223 ScaleML). PD was supported by the European Union’s Horizon 2020 programme
  under the Marie Skłodowska-Curie grant agreement No. 754411.
article_processing_charge: No
arxiv: 1
author:
- first_name: Foivos
  full_name: Alimisis, Foivos
  last_name: Alimisis
- first_name: Peter
  full_name: Davies, Peter
  id: 11396234-BB50-11E9-B24C-90FCE5697425
  last_name: Davies
  orcid: 0000-0002-5646-9524
- first_name: Dan-Adrian
  full_name: Alistarh, Dan-Adrian
  id: 4A899BFC-F248-11E8-B48F-1D18A9856A87
  last_name: Alistarh
  orcid: 0000-0003-3650-940X
citation:
  ama: 'Alimisis F, Davies P, Alistarh D-A. Communication-efficient distributed optimization
    with quantized preconditioners. In: <i>Proceedings of the 38th International Conference
    on Machine Learning</i>. Vol 139. ML Research Press; 2021:196-206.'
  apa: 'Alimisis, F., Davies, P., &#38; Alistarh, D.-A. (2021). Communication-efficient
    distributed optimization with quantized preconditioners. In <i>Proceedings of
    the 38th International Conference on Machine Learning</i> (Vol. 139, pp. 196–206).
    Virtual: ML Research Press.'
  chicago: Alimisis, Foivos, Peter Davies, and Dan-Adrian Alistarh. “Communication-Efficient
    Distributed Optimization with Quantized Preconditioners.” In <i>Proceedings of
    the 38th International Conference on Machine Learning</i>, 139:196–206. ML Research
    Press, 2021.
  ieee: F. Alimisis, P. Davies, and D.-A. Alistarh, “Communication-efficient distributed
    optimization with quantized preconditioners,” in <i>Proceedings of the 38th International
    Conference on Machine Learning</i>, Virtual, 2021, vol. 139, pp. 196–206.
  ista: Alimisis F, Davies P, Alistarh D-A. 2021. Communication-efficient distributed
    optimization with quantized preconditioners. Proceedings of the 38th International
    Conference on Machine Learning. International Conference on Machine Learning vol.
    139, 196–206.
  mla: Alimisis, Foivos, et al. “Communication-Efficient Distributed Optimization
    with Quantized Preconditioners.” <i>Proceedings of the 38th International Conference
    on Machine Learning</i>, vol. 139, ML Research Press, 2021, pp. 196–206.
  short: F. Alimisis, P. Davies, D.-A. Alistarh, in:, Proceedings of the 38th International
    Conference on Machine Learning, ML Research Press, 2021, pp. 196–206.
conference:
  end_date: 2021-07-24
  location: Virtual
  name: International Conference on Machine Learning
  start_date: 2021-07-18
date_created: 2023-06-18T22:00:48Z
date_published: 2021-07-01T00:00:00Z
date_updated: 2023-06-19T10:44:38Z
day: '01'
ddc:
- '000'
department:
- _id: DaAl
ec_funded: 1
external_id:
  arxiv:
  - '2102.07214'
file:
- access_level: open_access
  checksum: 7ec0d59bac268b49c76bf2e036dedd7a
  content_type: application/pdf
  creator: dernst
  date_created: 2023-06-19T10:41:05Z
  date_updated: 2023-06-19T10:41:05Z
  file_id: '13154'
  file_name: 2021_PMLR_Alimisis.pdf
  file_size: 429087
  relation: main_file
  success: 1
file_date_updated: 2023-06-19T10:41:05Z
has_accepted_license: '1'
intvolume: '139'
language:
- iso: eng
month: '07'
oa: 1
oa_version: Published Version
page: 196-206
project:
- _id: 268A44D6-B435-11E9-9278-68D0E5697425
  call_identifier: H2020
  grant_number: '805223'
  name: Elastic Coordination for Scalable Machine Learning
- _id: 260C2330-B435-11E9-9278-68D0E5697425
  call_identifier: H2020
  grant_number: '754411'
  name: ISTplus - Postdoctoral Fellowships
publication: Proceedings of the 38th International Conference on Machine Learning
publication_identifier:
  eissn:
  - 2640-3498
  isbn:
  - '9781713845065'
publication_status: published
publisher: ML Research Press
quality_controlled: '1'
scopus_import: '1'
status: public
title: Communication-efficient distributed optimization with quantized preconditioners
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 139
year: '2021'
...
---
_id: '14117'
abstract:
- lang: eng
  text: 'The two fields of machine learning and graphical causality arose and are
    developed separately. However, there is, now, cross-pollination and increasing
    interest in both fields to benefit from the advances of the other. In this article,
    we review fundamental concepts of causal inference and relate them to crucial
    open problems of machine learning, including transfer and generalization, thereby
    assaying how causality can contribute to modern machine learning research. This
    also applies in the opposite direction: we note that most work in causality starts
    from the premise that the causal variables are given. A central problem for AI
    and causality is, thus, causal representation learning, that is, the discovery
    of high-level causal variables from low-level observations. Finally, we delineate
    some implications of causality for machine learning and propose key research areas
    at the intersection of both communities.'
article_processing_charge: No
article_type: original
arxiv: 1
author:
- first_name: Bernhard
  full_name: Scholkopf, Bernhard
  last_name: Scholkopf
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
- first_name: Stefan
  full_name: Bauer, Stefan
  last_name: Bauer
- first_name: Nan Rosemary
  full_name: Ke, Nan Rosemary
  last_name: Ke
- first_name: Nal
  full_name: Kalchbrenner, Nal
  last_name: Kalchbrenner
- first_name: Anirudh
  full_name: Goyal, Anirudh
  last_name: Goyal
- first_name: Yoshua
  full_name: Bengio, Yoshua
  last_name: Bengio
citation:
  ama: Scholkopf B, Locatello F, Bauer S, et al. Toward causal representation learning.
    <i>Proceedings of the IEEE</i>. 2021;109(5):612-634. doi:<a href="https://doi.org/10.1109/jproc.2021.3058954">10.1109/jproc.2021.3058954</a>
  apa: Scholkopf, B., Locatello, F., Bauer, S., Ke, N. R., Kalchbrenner, N., Goyal,
    A., &#38; Bengio, Y. (2021). Toward causal representation learning. <i>Proceedings
    of the IEEE</i>. Institute of Electrical and Electronics Engineers. <a href="https://doi.org/10.1109/jproc.2021.3058954">https://doi.org/10.1109/jproc.2021.3058954</a>
  chicago: Scholkopf, Bernhard, Francesco Locatello, Stefan Bauer, Nan Rosemary Ke,
    Nal Kalchbrenner, Anirudh Goyal, and Yoshua Bengio. “Toward Causal Representation
    Learning.” <i>Proceedings of the IEEE</i>. Institute of Electrical and Electronics
    Engineers, 2021. <a href="https://doi.org/10.1109/jproc.2021.3058954">https://doi.org/10.1109/jproc.2021.3058954</a>.
  ieee: B. Scholkopf <i>et al.</i>, “Toward causal representation learning,” <i>Proceedings
    of the IEEE</i>, vol. 109, no. 5. Institute of Electrical and Electronics Engineers,
    pp. 612–634, 2021.
  ista: Scholkopf B, Locatello F, Bauer S, Ke NR, Kalchbrenner N, Goyal A, Bengio
    Y. 2021. Toward causal representation learning. Proceedings of the IEEE. 109(5),
    612–634.
  mla: Scholkopf, Bernhard, et al. “Toward Causal Representation Learning.” <i>Proceedings
    of the IEEE</i>, vol. 109, no. 5, Institute of Electrical and Electronics Engineers,
    2021, pp. 612–34, doi:<a href="https://doi.org/10.1109/jproc.2021.3058954">10.1109/jproc.2021.3058954</a>.
  short: B. Scholkopf, F. Locatello, S. Bauer, N.R. Ke, N. Kalchbrenner, A. Goyal,
    Y. Bengio, Proceedings of the IEEE 109 (2021) 612–634.
date_created: 2023-08-21T12:19:30Z
date_published: 2021-05-01T00:00:00Z
date_updated: 2023-09-11T11:43:35Z
day: '01'
department:
- _id: FrLo
doi: 10.1109/jproc.2021.3058954
extern: '1'
external_id:
  arxiv:
  - '2102.11107'
intvolume: '109'
issue: '5'
keyword:
- Electrical and Electronic Engineering
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.1109/JPROC.2021.3058954
month: '05'
oa: 1
oa_version: Published Version
page: 612-634
publication: Proceedings of the IEEE
publication_identifier:
  eissn:
  - 1558-2256
  issn:
  - 0018-9219
publication_status: published
publisher: Institute of Electrical and Electronics Engineers
quality_controlled: '1'
scopus_import: '1'
status: public
title: Toward causal representation learning
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 109
year: '2021'
...
---
_id: '14176'
abstract:
- lang: eng
  text: "Intensive care units (ICU) are increasingly looking towards machine learning
    for methods to provide online monitoring of critically ill patients. In machine
    learning, online monitoring is often formulated as a supervised learning problem.
    Recently, contrastive learning approaches have demonstrated promising improvements
    over competitive supervised benchmarks. These methods rely on well-understood
    data augmentation techniques developed for image data which do not apply to online
    monitoring. In this work, we overcome this limitation by supplementing time-series
    data augmentation techniques with a novel contrastive learning objective which
    we call neighborhood contrastive learning (NCL). Our objective explicitly groups
    together contiguous time segments from each patient while maintaining state-specific
    information. Our experiments demonstrate a marked improvement over existing work
    applying contrastive methods to medical time-series."
alternative_title:
- PMLR
article_processing_charge: No
arxiv: 1
author:
- first_name: Hugo
  full_name: Yèche, Hugo
  last_name: Yèche
- first_name: Gideon
  full_name: Dresdner, Gideon
  last_name: Dresdner
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
- first_name: Matthias
  full_name: Hüser, Matthias
  last_name: Hüser
- first_name: Gunnar
  full_name: Rätsch, Gunnar
  last_name: Rätsch
citation:
  ama: 'Yèche H, Dresdner G, Locatello F, Hüser M, Rätsch G. Neighborhood contrastive
    learning applied to online patient monitoring. In: <i>Proceedings of 38th International
    Conference on Machine Learning</i>. Vol 139. ML Research Press; 2021:11964-11974.'
  apa: 'Yèche, H., Dresdner, G., Locatello, F., Hüser, M., &#38; Rätsch, G. (2021).
    Neighborhood contrastive learning applied to online patient monitoring. In <i>Proceedings
    of 38th International Conference on Machine Learning</i> (Vol. 139, pp. 11964–11974).
    Virtual: ML Research Press.'
  chicago: Yèche, Hugo, Gideon Dresdner, Francesco Locatello, Matthias Hüser, and
    Gunnar Rätsch. “Neighborhood Contrastive Learning Applied to Online Patient Monitoring.”
    In <i>Proceedings of 38th International Conference on Machine Learning</i>, 139:11964–74.
    ML Research Press, 2021.
  ieee: H. Yèche, G. Dresdner, F. Locatello, M. Hüser, and G. Rätsch, “Neighborhood
    contrastive learning applied to online patient monitoring,” in <i>Proceedings
    of 38th International Conference on Machine Learning</i>, Virtual, 2021, vol.
    139, pp. 11964–11974.
  ista: Yèche H, Dresdner G, Locatello F, Hüser M, Rätsch G. 2021. Neighborhood contrastive
    learning applied to online patient monitoring. Proceedings of 38th International
    Conference on Machine Learning. International Conference on Machine Learning,
    PMLR, vol. 139, 11964–11974.
  mla: Yèche, Hugo, et al. “Neighborhood Contrastive Learning Applied to Online Patient
    Monitoring.” <i>Proceedings of 38th International Conference on Machine Learning</i>,
    vol. 139, ML Research Press, 2021, pp. 11964–74.
  short: H. Yèche, G. Dresdner, F. Locatello, M. Hüser, G. Rätsch, in:, Proceedings
    of 38th International Conference on Machine Learning, ML Research Press, 2021,
    pp. 11964–11974.
conference:
  end_date: 2021-07-24
  location: Virtual
  name: International Conference on Machine Learning
  start_date: 2021-07-18
date_created: 2023-08-22T14:03:04Z
date_published: 2021-08-01T00:00:00Z
date_updated: 2023-09-11T10:16:55Z
day: '01'
department:
- _id: FrLo
extern: '1'
external_id:
  arxiv:
  - '2106.05142'
intvolume: '139'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/abs/2106.05142
month: '08'
oa: 1
oa_version: Preprint
page: 11964-11974
publication: Proceedings of 38th International Conference on Machine Learning
publication_status: published
publisher: ML Research Press
quality_controlled: '1'
scopus_import: '1'
status: public
title: Neighborhood contrastive learning applied to online patient monitoring
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 139
year: '2021'
...
---
_id: '14177'
abstract:
- lang: eng
  text: "The focus of disentanglement approaches has been on identifying independent
    factors of variation in data. However, the causal variables underlying real-world
    observations are often not statistically independent. In this work, we bridge
    the gap to real-world scenarios by analyzing the behavior of the most prominent
    disentanglement approaches on correlated data in a large-scale empirical study
    (including 4260 models). We show and quantify that systematically induced correlations
    in the dataset are being learned and reflected in the latent representations,
    which has implications for downstream applications of disentanglement such as
    fairness. We also demonstrate how to resolve these latent correlations, either
    using weak supervision during training or by post-hoc correcting a pre-trained
    model with a small number of labels."
alternative_title:
- PMLR
article_processing_charge: No
arxiv: 1
author:
- first_name: Frederik
  full_name: Träuble, Frederik
  last_name: Träuble
- first_name: Elliot
  full_name: Creager, Elliot
  last_name: Creager
- first_name: Niki
  full_name: Kilbertus, Niki
  last_name: Kilbertus
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
- first_name: Andrea
  full_name: Dittadi, Andrea
  last_name: Dittadi
- first_name: Anirudh
  full_name: Goyal, Anirudh
  last_name: Goyal
- first_name: Bernhard
  full_name: Schölkopf, Bernhard
  last_name: Schölkopf
- first_name: Stefan
  full_name: Bauer, Stefan
  last_name: Bauer
citation:
  ama: 'Träuble F, Creager E, Kilbertus N, et al. On disentangled representations
    learned from correlated data. In: <i>Proceedings of the 38th International Conference
    on Machine Learning</i>. Vol 139. ML Research Press; 2021:10401-10412.'
  apa: 'Träuble, F., Creager, E., Kilbertus, N., Locatello, F., Dittadi, A., Goyal,
    A., … Bauer, S. (2021). On disentangled representations learned from correlated
    data. In <i>Proceedings of the 38th International Conference on Machine Learning</i>
    (Vol. 139, pp. 10401–10412). Virtual: ML Research Press.'
  chicago: Träuble, Frederik, Elliot Creager, Niki Kilbertus, Francesco Locatello,
    Andrea Dittadi, Anirudh Goyal, Bernhard Schölkopf, and Stefan Bauer. “On Disentangled
    Representations Learned from Correlated Data.” In <i>Proceedings of the 38th International
    Conference on Machine Learning</i>, 139:10401–12. ML Research Press, 2021.
  ieee: F. Träuble <i>et al.</i>, “On disentangled representations learned from correlated
    data,” in <i>Proceedings of the 38th International Conference on Machine Learning</i>,
    Virtual, 2021, vol. 139, pp. 10401–10412.
  ista: 'Träuble F, Creager E, Kilbertus N, Locatello F, Dittadi A, Goyal A, Schölkopf
    B, Bauer S. 2021. On disentangled representations learned from correlated data.
    Proceedings of the 38th International Conference on Machine Learning. ICML: International
    Conference on Machine Learning, PMLR, vol. 139, 10401–10412.'
  mla: Träuble, Frederik, et al. “On Disentangled Representations Learned from Correlated
    Data.” <i>Proceedings of the 38th International Conference on Machine Learning</i>,
    vol. 139, ML Research Press, 2021, pp. 10401–12.
  short: F. Träuble, E. Creager, N. Kilbertus, F. Locatello, A. Dittadi, A. Goyal,
    B. Schölkopf, S. Bauer, in:, Proceedings of the 38th International Conference
    on Machine Learning, ML Research Press, 2021, pp. 10401–10412.
conference:
  end_date: 2021-07-24
  location: Virtual
  name: 'ICML: International Conference on Machine Learning'
  start_date: 2021-07-18
date_created: 2023-08-22T14:03:47Z
date_published: 2021-08-01T00:00:00Z
date_updated: 2023-09-11T10:18:48Z
day: '01'
department:
- _id: FrLo
extern: '1'
external_id:
  arxiv:
  - '2006.07886'
intvolume: '139'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/abs/2006.07886
month: '08'
oa: 1
oa_version: Published Version
page: 10401-10412
publication: Proceedings of the 38th International Conference on Machine Learning
publication_status: published
publisher: ML Research Press
quality_controlled: '1'
scopus_import: '1'
status: public
title: On disentangled representations learned from correlated data
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 139
year: '2021'
...
---
_id: '14178'
abstract:
- lang: eng
  text: Learning meaningful representations that disentangle the underlying structure
    of the data generating process is considered to be of key importance in machine
    learning. While disentangled representations were found to be useful for diverse
    tasks such as abstract reasoning and fair classification, their scalability and
    real-world impact remain questionable. We introduce a new high-resolution dataset
    with 1M simulated images and over 1,800 annotated real-world images of the same
    setup. In contrast to previous work, this new dataset exhibits correlations, a
    complex underlying structure, and allows to evaluate transfer to unseen simulated
    and real-world settings where the encoder i) remains in distribution or ii) is
    out of distribution. We propose new architectures in order to scale disentangled
    representation learning to realistic high-resolution settings and conduct a large-scale
    empirical study of disentangled representations on this dataset. We observe that
    disentanglement is a good predictor for out-of-distribution (OOD) task performance.
article_processing_charge: No
arxiv: 1
author:
- first_name: Andrea
  full_name: Dittadi, Andrea
  last_name: Dittadi
- first_name: Frederik
  full_name: Träuble, Frederik
  last_name: Träuble
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
- first_name: Manuel
  full_name: Wüthrich, Manuel
  last_name: Wüthrich
- first_name: Vaibhav
  full_name: Agrawal, Vaibhav
  last_name: Agrawal
- first_name: Ole
  full_name: Winther, Ole
  last_name: Winther
- first_name: Stefan
  full_name: Bauer, Stefan
  last_name: Bauer
- first_name: Bernhard
  full_name: Schölkopf, Bernhard
  last_name: Schölkopf
citation:
  ama: 'Dittadi A, Träuble F, Locatello F, et al. On the transfer of disentangled
    representations in realistic settings. In: <i>The Ninth International Conference
    on Learning Representations</i>. ; 2021.'
  apa: Dittadi, A., Träuble, F., Locatello, F., Wüthrich, M., Agrawal, V., Winther,
    O., … Schölkopf, B. (2021). On the transfer of disentangled representations in
    realistic settings. In <i>The Ninth International Conference on Learning Representations</i>.
    Virtual.
  chicago: Dittadi, Andrea, Frederik Träuble, Francesco Locatello, Manuel Wüthrich,
    Vaibhav Agrawal, Ole Winther, Stefan Bauer, and Bernhard Schölkopf. “On the Transfer
    of Disentangled Representations in Realistic Settings.” In <i>The Ninth International
    Conference on Learning Representations</i>, 2021.
  ieee: A. Dittadi <i>et al.</i>, “On the transfer of disentangled representations
    in realistic settings,” in <i>The Ninth International Conference on Learning Representations</i>,
    Virtual, 2021.
  ista: 'Dittadi A, Träuble F, Locatello F, Wüthrich M, Agrawal V, Winther O, Bauer
    S, Schölkopf B. 2021. On the transfer of disentangled representations in realistic
    settings. The Ninth International Conference on Learning Representations. ICLR:
    International Conference on Learning Representations.'
  mla: Dittadi, Andrea, et al. “On the Transfer of Disentangled Representations in
    Realistic Settings.” <i>The Ninth International Conference on Learning Representations</i>,
    2021.
  short: A. Dittadi, F. Träuble, F. Locatello, M. Wüthrich, V. Agrawal, O. Winther,
    S. Bauer, B. Schölkopf, in:, The Ninth International Conference on Learning Representations,
    2021.
conference:
  end_date: 2021-05-07
  location: Virtual
  name: 'ICLR: International Conference on Learning Representations'
  start_date: 2021-05-03
date_created: 2023-08-22T14:04:16Z
date_published: 2021-05-04T00:00:00Z
date_updated: 2023-09-11T10:55:30Z
day: '04'
department:
- _id: FrLo
extern: '1'
external_id:
  arxiv:
  - '2010.14407'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/abs/2010.14407
month: '05'
oa: 1
oa_version: Preprint
publication: The Ninth International Conference on Learning Representations
publication_status: published
quality_controlled: '1'
status: public
title: On the transfer of disentangled representations in realistic settings
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '14179'
abstract:
- lang: eng
  text: Self-supervised representation learning has shown remarkable success in a
    number of domains. A common practice is to perform data augmentation via hand-crafted
    transformations intended to leave the semantics of the data invariant. We seek
    to understand the empirical success of this approach from a theoretical perspective.
    We formulate the augmentation process as a latent variable model by postulating
    a partition of the latent representation into a content component, which is assumed
    invariant to augmentation, and a style component, which is allowed to change.
    Unlike prior work on disentanglement and independent component analysis, we allow
    for both nontrivial statistical and causal dependencies in the latent space. We
    study the identifiability of the latent representation based on pairs of views
    of the observations and prove sufficient conditions that allow us to identify
    the invariant content partition up to an invertible mapping in both generative
    and discriminative settings. We find numerical simulations with dependent latent
    variables are consistent with our theory. Lastly, we introduce Causal3DIdent,
    a dataset of high-dimensional, visually complex images with rich causal dependencies,
    which we use to study the effect of data augmentations performed in practice.
article_processing_charge: No
arxiv: 1
author:
- first_name: Julius von
  full_name: Kügelgen, Julius von
  last_name: Kügelgen
- first_name: Yash
  full_name: Sharma, Yash
  last_name: Sharma
- first_name: Luigi
  full_name: Gresele, Luigi
  last_name: Gresele
- first_name: Wieland
  full_name: Brendel, Wieland
  last_name: Brendel
- first_name: Bernhard
  full_name: Schölkopf, Bernhard
  last_name: Schölkopf
- first_name: Michel
  full_name: Besserve, Michel
  last_name: Besserve
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
citation:
  ama: 'Kügelgen J von, Sharma Y, Gresele L, et al. Self-supervised learning with
    data augmentations provably isolates content from style. In: <i>Advances in Neural
    Information Processing Systems</i>. Vol 34. ; 2021:16451-16467.'
  apa: Kügelgen, J. von, Sharma, Y., Gresele, L., Brendel, W., Schölkopf, B., Besserve,
    M., &#38; Locatello, F. (2021). Self-supervised learning with data augmentations
    provably isolates content from style. In <i>Advances in Neural Information Processing
    Systems</i> (Vol. 34, pp. 16451–16467). Virtual.
  chicago: Kügelgen, Julius von, Yash Sharma, Luigi Gresele, Wieland Brendel, Bernhard
    Schölkopf, Michel Besserve, and Francesco Locatello. “Self-Supervised Learning
    with Data Augmentations Provably Isolates Content from Style.” In <i>Advances
    in Neural Information Processing Systems</i>, 34:16451–67, 2021.
  ieee: J. von Kügelgen <i>et al.</i>, “Self-supervised learning with data augmentations
    provably isolates content from style,” in <i>Advances in Neural Information Processing
    Systems</i>, Virtual, 2021, vol. 34, pp. 16451–16467.
  ista: 'Kügelgen J von, Sharma Y, Gresele L, Brendel W, Schölkopf B, Besserve M,
    Locatello F. 2021. Self-supervised learning with data augmentations provably isolates
    content from style. Advances in Neural Information Processing Systems. NeurIPS:
    Neural Information Processing Systems vol. 34, 16451–16467.'
  mla: Kügelgen, Julius von, et al. “Self-Supervised Learning with Data Augmentations
    Provably Isolates Content from Style.” <i>Advances in Neural Information Processing
    Systems</i>, vol. 34, 2021, pp. 16451–67.
  short: J. von Kügelgen, Y. Sharma, L. Gresele, W. Brendel, B. Schölkopf, M. Besserve,
    F. Locatello, in:, Advances in Neural Information Processing Systems, 2021, pp.
    16451–16467.
conference:
  end_date: 2021-12-10
  location: Virtual
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2021-12-07
date_created: 2023-08-22T14:04:36Z
date_published: 2021-06-08T00:00:00Z
date_updated: 2023-09-11T10:33:19Z
day: '08'
department:
- _id: FrLo
extern: '1'
external_id:
  arxiv:
  - '2106.04619'
intvolume: '34'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/abs/2106.04619
month: '06'
oa: 1
oa_version: Preprint
page: 16451-16467
publication: Advances in Neural Information Processing Systems
publication_identifier:
  isbn:
  - '9781713845393'
publication_status: published
quality_controlled: '1'
status: public
title: Self-supervised learning with data augmentations provably isolates content
  from style
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 34
year: '2021'
...
---
_id: '14180'
abstract:
- lang: eng
  text: 'Modern neural network architectures can leverage large amounts of data to
    generalize well within the training distribution. However, they are less capable
    of systematic generalization to data drawn from unseen but related distributions,
    a feat that is hypothesized to require compositional reasoning and reuse of knowledge.
    In this work, we present Neural Interpreters, an architecture that factorizes
    inference in a self-attention network as a system of modules, which we call \emph{functions}.
    Inputs to the model are routed through a sequence of functions in a way that is
    end-to-end learned. The proposed architecture can flexibly compose computation
    along width and depth, and lends itself well to capacity extension after training.
    To demonstrate the versatility of Neural Interpreters, we evaluate it in two distinct
    settings: image classification and visual abstract reasoning on Raven Progressive
    Matrices. In the former, we show that Neural Interpreters perform on par with
    the vision transformer using fewer parameters, while being transferrable to a
    new task in a sample efficient manner. In the latter, we find that Neural Interpreters
    are competitive with respect to the state-of-the-art in terms of systematic generalization.'
article_processing_charge: No
arxiv: 1
author:
- first_name: Nasim
  full_name: Rahaman, Nasim
  last_name: Rahaman
- first_name: Muhammad Waleed
  full_name: Gondal, Muhammad Waleed
  last_name: Gondal
- first_name: Shruti
  full_name: Joshi, Shruti
  last_name: Joshi
- first_name: Peter
  full_name: Gehler, Peter
  last_name: Gehler
- first_name: Yoshua
  full_name: Bengio, Yoshua
  last_name: Bengio
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
- first_name: Bernhard
  full_name: Schölkopf, Bernhard
  last_name: Schölkopf
citation:
  ama: 'Rahaman N, Gondal MW, Joshi S, et al. Dynamic inference with neural interpreters.
    In: <i>Advances in Neural Information Processing Systems</i>. Vol 34. ; 2021:10985-10998.'
  apa: Rahaman, N., Gondal, M. W., Joshi, S., Gehler, P., Bengio, Y., Locatello, F.,
    &#38; Schölkopf, B. (2021). Dynamic inference with neural interpreters. In <i>Advances
    in Neural Information Processing Systems</i> (Vol. 34, pp. 10985–10998). Virtual.
  chicago: Rahaman, Nasim, Muhammad Waleed Gondal, Shruti Joshi, Peter Gehler, Yoshua
    Bengio, Francesco Locatello, and Bernhard Schölkopf. “Dynamic Inference with Neural
    Interpreters.” In <i>Advances in Neural Information Processing Systems</i>, 34:10985–98,
    2021.
  ieee: N. Rahaman <i>et al.</i>, “Dynamic inference with neural interpreters,” in
    <i>Advances in Neural Information Processing Systems</i>, Virtual, 2021, vol.
    34, pp. 10985–10998.
  ista: 'Rahaman N, Gondal MW, Joshi S, Gehler P, Bengio Y, Locatello F, Schölkopf
    B. 2021. Dynamic inference with neural interpreters. Advances in Neural Information
    Processing Systems. NeurIPS: Neural Information Processing Systems vol. 34, 10985–10998.'
  mla: Rahaman, Nasim, et al. “Dynamic Inference with Neural Interpreters.” <i>Advances
    in Neural Information Processing Systems</i>, vol. 34, 2021, pp. 10985–98.
  short: N. Rahaman, M.W. Gondal, S. Joshi, P. Gehler, Y. Bengio, F. Locatello, B.
    Schölkopf, in:, Advances in Neural Information Processing Systems, 2021, pp. 10985–10998.
conference:
  end_date: 2021-12-10
  location: Virtual
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2021-12-07
date_created: 2023-08-22T14:04:55Z
date_published: 2021-10-12T00:00:00Z
date_updated: 2023-09-11T11:33:46Z
day: '12'
department:
- _id: FrLo
extern: '1'
external_id:
  arxiv:
  - '2110.06399'
intvolume: '34'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2110.06399
month: '10'
oa: 1
oa_version: Preprint
page: 10985-10998
publication: Advances in Neural Information Processing Systems
publication_identifier:
  isbn:
  - '9781713845393'
publication_status: published
quality_controlled: '1'
status: public
title: Dynamic inference with neural interpreters
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 34
year: '2021'
...
---
_id: '14181'
abstract:
- lang: eng
  text: Variational Inference makes a trade-off between the capacity of the variational
    family and the tractability of finding an approximate posterior distribution.
    Instead, Boosting Variational Inference allows practitioners to obtain increasingly
    good posterior approximations by spending more compute. The main obstacle to widespread
    adoption of Boosting Variational Inference is the amount of resources necessary
    to improve over a strong Variational Inference baseline. In our work, we trace
    this limitation back to the global curvature of the KL-divergence. We characterize
    how the global curvature impacts time and memory consumption, address the problem
    with the notion of local curvature, and provide a novel approximate backtracking
    algorithm for estimating local curvature. We give new theoretical convergence
    rates for our algorithms and provide experimental validation on synthetic and
    real-world datasets.
article_processing_charge: No
arxiv: 1
author:
- first_name: Gideon
  full_name: Dresdner, Gideon
  last_name: Dresdner
- first_name: Saurav
  full_name: Shekhar, Saurav
  last_name: Shekhar
- first_name: Fabian
  full_name: Pedregosa, Fabian
  last_name: Pedregosa
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
- first_name: Gunnar
  full_name: Rätsch, Gunnar
  last_name: Rätsch
citation:
  ama: 'Dresdner G, Shekhar S, Pedregosa F, Locatello F, Rätsch G. Boosting variational
    inference with locally adaptive step-sizes. In: <i>Proceedings of the Thirtieth
    International Joint Conference on Artificial Intelligence</i>. International Joint
    Conferences on Artificial Intelligence; 2021:2337-2343. doi:<a href="https://doi.org/10.24963/ijcai.2021/322">10.24963/ijcai.2021/322</a>'
  apa: 'Dresdner, G., Shekhar, S., Pedregosa, F., Locatello, F., &#38; Rätsch, G.
    (2021). Boosting variational inference with locally adaptive step-sizes. In <i>Proceedings
    of the Thirtieth International Joint Conference on Artificial Intelligence</i>
    (pp. 2337–2343). Montreal, Canada: International Joint Conferences on Artificial
    Intelligence. <a href="https://doi.org/10.24963/ijcai.2021/322">https://doi.org/10.24963/ijcai.2021/322</a>'
  chicago: Dresdner, Gideon, Saurav Shekhar, Fabian Pedregosa, Francesco Locatello,
    and Gunnar Rätsch. “Boosting Variational Inference with Locally Adaptive Step-Sizes.”
    In <i>Proceedings of the Thirtieth International Joint Conference on Artificial
    Intelligence</i>, 2337–43. International Joint Conferences on Artificial Intelligence,
    2021. <a href="https://doi.org/10.24963/ijcai.2021/322">https://doi.org/10.24963/ijcai.2021/322</a>.
  ieee: G. Dresdner, S. Shekhar, F. Pedregosa, F. Locatello, and G. Rätsch, “Boosting
    variational inference with locally adaptive step-sizes,” in <i>Proceedings of
    the Thirtieth International Joint Conference on Artificial Intelligence</i>, Montreal,
    Canada, 2021, pp. 2337–2343.
  ista: 'Dresdner G, Shekhar S, Pedregosa F, Locatello F, Rätsch G. 2021. Boosting
    variational inference with locally adaptive step-sizes. Proceedings of the Thirtieth
    International Joint Conference on Artificial Intelligence. IJCAI: International
    Joint Conference on Artificial Intelligence, 2337–2343.'
  mla: Dresdner, Gideon, et al. “Boosting Variational Inference with Locally Adaptive
    Step-Sizes.” <i>Proceedings of the Thirtieth International Joint Conference on
    Artificial Intelligence</i>, International Joint Conferences on Artificial Intelligence,
    2021, pp. 2337–43, doi:<a href="https://doi.org/10.24963/ijcai.2021/322">10.24963/ijcai.2021/322</a>.
  short: G. Dresdner, S. Shekhar, F. Pedregosa, F. Locatello, G. Rätsch, in:, Proceedings
    of the Thirtieth International Joint Conference on Artificial Intelligence, International
    Joint Conferences on Artificial Intelligence, 2021, pp. 2337–2343.
conference:
  end_date: 2021-08-27
  location: Montreal, Canada
  name: 'IJCAI: International Joint Conference on Artificial Intelligence'
  start_date: 2021-08-19
date_created: 2023-08-22T14:05:14Z
date_published: 2021-05-19T00:00:00Z
date_updated: 2023-09-11T11:14:30Z
day: '19'
department:
- _id: FrLo
doi: 10.24963/ijcai.2021/322
extern: '1'
external_id:
  arxiv:
  - '2105.09240'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2105.09240
month: '05'
oa: 1
oa_version: Published Version
page: 2337-2343
publication: Proceedings of the Thirtieth International Joint Conference on Artificial
  Intelligence
publication_identifier:
  eisbn:
  - '9780999241196'
publication_status: published
publisher: International Joint Conferences on Artificial Intelligence
quality_controlled: '1'
status: public
title: Boosting variational inference with locally adaptive step-sizes
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '14182'
abstract:
- lang: eng
  text: "When machine learning systems meet real world applications, accuracy is only
    one of several requirements. In this paper, we assay a complementary perspective
    originating from the increasing availability of pre-trained and regularly improving
    state-of-the-art models. While new improved models develop at a fast pace, downstream
    tasks vary more slowly or stay constant. Assume that we have a large unlabelled
    data set for which we want to maintain accurate predictions. Whenever a new and
    presumably better ML model becomes available, we encounter two problems: (i) given
    a limited budget, which data points should be re-evaluated using the new model?;
    and (ii) if the new predictions differ from the current ones, should we update?
    Problem (i) is about compute cost, which matters for very large data sets and
    models. Problem (ii) is about maintaining consistency of the predictions, which
    can be highly relevant for downstream applications; our demand is to avoid negative
    flips, i.e., changing correct to incorrect predictions. In this paper, we formalize
    the Prediction Update Problem and present an efficient probabilistic approach
    as answer to the above questions. In extensive experiments on standard classification
    benchmark data sets, we show that our method outperforms alternative strategies
    along key metrics for backward-compatible prediction updates."
article_processing_charge: No
arxiv: 1
author:
- first_name: Frederik
  full_name: Träuble, Frederik
  last_name: Träuble
- first_name: Julius von
  full_name: Kügelgen, Julius von
  last_name: Kügelgen
- first_name: Matthäus
  full_name: Kleindessner, Matthäus
  last_name: Kleindessner
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
- first_name: Bernhard
  full_name: Schölkopf, Bernhard
  last_name: Schölkopf
- first_name: Peter
  full_name: Gehler, Peter
  last_name: Gehler
citation:
  ama: 'Träuble F, Kügelgen J von, Kleindessner M, Locatello F, Schölkopf B, Gehler
    P. Backward-compatible prediction updates: A probabilistic approach. In: <i>35th
    Conference on Neural Information Processing Systems</i>. Vol 34. ; 2021:116-128.'
  apa: 'Träuble, F., Kügelgen, J. von, Kleindessner, M., Locatello, F., Schölkopf,
    B., &#38; Gehler, P. (2021). Backward-compatible prediction updates: A probabilistic
    approach. In <i>35th Conference on Neural Information Processing Systems</i> (Vol.
    34, pp. 116–128). Virtual.'
  chicago: 'Träuble, Frederik, Julius von Kügelgen, Matthäus Kleindessner, Francesco
    Locatello, Bernhard Schölkopf, and Peter Gehler. “Backward-Compatible Prediction
    Updates: A Probabilistic Approach.” In <i>35th Conference on Neural Information
    Processing Systems</i>, 34:116–28, 2021.'
  ieee: 'F. Träuble, J. von Kügelgen, M. Kleindessner, F. Locatello, B. Schölkopf,
    and P. Gehler, “Backward-compatible prediction updates: A probabilistic approach,”
    in <i>35th Conference on Neural Information Processing Systems</i>, Virtual, 2021,
    vol. 34, pp. 116–128.'
  ista: 'Träuble F, Kügelgen J von, Kleindessner M, Locatello F, Schölkopf B, Gehler
    P. 2021. Backward-compatible prediction updates: A probabilistic approach. 35th
    Conference on Neural Information Processing Systems. NeurIPS: Neural Information
    Processing Systems vol. 34, 116–128.'
  mla: 'Träuble, Frederik, et al. “Backward-Compatible Prediction Updates: A Probabilistic
    Approach.” <i>35th Conference on Neural Information Processing Systems</i>, vol.
    34, 2021, pp. 116–28.'
  short: F. Träuble, J. von Kügelgen, M. Kleindessner, F. Locatello, B. Schölkopf,
    P. Gehler, in:, 35th Conference on Neural Information Processing Systems, 2021,
    pp. 116–128.
conference:
  end_date: 2021-12-10
  location: Virtual
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2021-12-07
date_created: 2023-08-22T14:05:41Z
date_published: 2021-07-02T00:00:00Z
date_updated: 2023-09-11T11:31:59Z
day: '02'
department:
- _id: FrLo
extern: '1'
external_id:
  arxiv:
  - '2107.01057'
intvolume: '34'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/abs/2107.01057
month: '07'
oa: 1
oa_version: Preprint
page: 116-128
publication: 35th Conference on Neural Information Processing Systems
publication_identifier:
  isbn:
  - '9781713845393'
publication_status: published
quality_controlled: '1'
status: public
title: 'Backward-compatible prediction updates: A probabilistic approach'
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 34
year: '2021'
...
---
_id: '14221'
abstract:
- lang: eng
  text: 'The world is structured in countless ways. It may be prudent to enforce corresponding
    structural properties to a learning algorithm''s solution, such as incorporating
    prior beliefs, natural constraints, or causal structures. Doing so may translate
    to faster, more accurate, and more flexible models, which may directly relate
    to real-world impact. In this dissertation, we consider two different research
    areas that concern structuring a learning algorithm''s solution: when the structure
    is known and when it has to be discovered.'
article_number: '2111.13693'
article_processing_charge: No
arxiv: 1
author:
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
citation:
  ama: Locatello F. Enforcing and discovering structure in machine learning. <i>arXiv</i>.
    doi:<a href="https://doi.org/10.48550/arXiv.2111.13693">10.48550/arXiv.2111.13693</a>
  apa: Locatello, F. (n.d.). Enforcing and discovering structure in machine learning.
    <i>arXiv</i>. <a href="https://doi.org/10.48550/arXiv.2111.13693">https://doi.org/10.48550/arXiv.2111.13693</a>
  chicago: Locatello, Francesco. “Enforcing and Discovering Structure in Machine Learning.”
    <i>ArXiv</i>, n.d. <a href="https://doi.org/10.48550/arXiv.2111.13693">https://doi.org/10.48550/arXiv.2111.13693</a>.
  ieee: F. Locatello, “Enforcing and discovering structure in machine learning,” <i>arXiv</i>.
  ista: Locatello F. Enforcing and discovering structure in machine learning. arXiv,
    2111.13693.
  mla: Locatello, Francesco. “Enforcing and Discovering Structure in Machine Learning.”
    <i>ArXiv</i>, 2111.13693, doi:<a href="https://doi.org/10.48550/arXiv.2111.13693">10.48550/arXiv.2111.13693</a>.
  short: F. Locatello, ArXiv (n.d.).
date_created: 2023-08-22T14:23:35Z
date_published: 2021-11-26T00:00:00Z
date_updated: 2023-09-12T07:04:44Z
day: '26'
department:
- _id: FrLo
doi: 10.48550/arXiv.2111.13693
extern: '1'
external_id:
  arxiv:
  - '2111.13693'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2111.13693
month: '11'
oa: 1
oa_version: Preprint
publication: arXiv
publication_status: submitted
status: public
title: Enforcing and discovering structure in machine learning
type: preprint
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '14278'
abstract:
- lang: eng
  text: 'The Birkhoff conjecture says that the boundary of a strictly convex integrable
    billiard table is necessarily an ellipse. In this article, we consider a stronger
    notion of integrability, namely, integrability close to the boundary, and prove
    a local version of this conjecture: a small perturbation of almost every ellipse
    that preserves integrability near the boundary is itself an ellipse. We apply
    this result to study local spectral rigidity of ellipses using the connection
    between the wave trace of the Laplacian and the dynamics near the boundary and
    establish rigidity for almost all of them.'
article_number: '2111.12171'
article_processing_charge: No
arxiv: 1
author:
- first_name: Illya
  full_name: Koval, Illya
  id: 2eed1f3b-896a-11ed-bdf8-93c7c4bf159e
  last_name: Koval
citation:
  ama: Koval I. Local strong Birkhoff conjecture and local spectral rigidity of almost
    every ellipse. <i>arXiv</i>. doi:<a href="https://doi.org/10.48550/ARXIV.2111.12171">10.48550/ARXIV.2111.12171</a>
  apa: Koval, I. (n.d.). Local strong Birkhoff conjecture and local spectral rigidity
    of almost every ellipse. <i>arXiv</i>. <a href="https://doi.org/10.48550/ARXIV.2111.12171">https://doi.org/10.48550/ARXIV.2111.12171</a>
  chicago: Koval, Illya. “Local Strong Birkhoff Conjecture and Local Spectral Rigidity
    of Almost Every Ellipse.” <i>ArXiv</i>, n.d. <a href="https://doi.org/10.48550/ARXIV.2111.12171">https://doi.org/10.48550/ARXIV.2111.12171</a>.
  ieee: I. Koval, “Local strong Birkhoff conjecture and local spectral rigidity of
    almost every ellipse,” <i>arXiv</i>.
  ista: Koval I. Local strong Birkhoff conjecture and local spectral rigidity of almost
    every ellipse. arXiv, 2111.12171.
  mla: Koval, Illya. “Local Strong Birkhoff Conjecture and Local Spectral Rigidity
    of Almost Every Ellipse.” <i>ArXiv</i>, 2111.12171, doi:<a href="https://doi.org/10.48550/ARXIV.2111.12171">10.48550/ARXIV.2111.12171</a>.
  short: I. Koval, ArXiv (n.d.).
date_created: 2023-09-06T08:35:43Z
date_published: 2021-11-23T00:00:00Z
date_updated: 2023-09-15T06:44:00Z
day: '23'
department:
- _id: GradSch
doi: 10.48550/ARXIV.2111.12171
external_id:
  arxiv:
  - '2111.12171'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2111.12171
month: '11'
oa: 1
oa_version: Preprint
publication: arXiv
publication_status: submitted
status: public
title: Local strong Birkhoff conjecture and local spectral rigidity of almost every
  ellipse
type: preprint
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '9002'
abstract:
- lang: eng
  text: 'We prove that, for the binary erasure channel (BEC), the polar-coding paradigm
    gives rise to codes that not only approach the Shannon limit but do so under the
    best possible scaling of their block length as a function of the gap to capacity.
    This result exhibits the first known family of binary codes that attain both optimal
    scaling and quasi-linear complexity of encoding and decoding. Our proof is based
    on the construction and analysis of binary polar codes with large kernels. When
    communicating reliably at rates within ε>0 of capacity, the code length n often
    scales as O(1/ε^μ), where the constant μ is called the scaling exponent. It is
    known that the optimal scaling exponent is μ=2, and it is achieved by random linear
    codes. The scaling exponent of conventional polar codes (based on the 2×2 kernel)
    on the BEC is μ=3.63. This falls far short of the optimal scaling guaranteed by
    random codes. Our main contribution is a rigorous proof of the following result:
    for the BEC, there exist ℓ×ℓ binary kernels, such that polar codes constructed
    from these kernels achieve scaling exponent μ(ℓ) that tends to the optimal value
    of 2 as ℓ grows. We furthermore characterize precisely how large ℓ needs to be
    as a function of the gap between μ(ℓ) and 2. The resulting binary codes maintain
    the recursive structure of conventional polar codes, and thereby achieve construction
    complexity O(n) and encoding/decoding complexity O(n log n).'
article_processing_charge: No
article_type: original
arxiv: 1
author:
- first_name: Arman
  full_name: Fazeli, Arman
  last_name: Fazeli
- first_name: Hamed
  full_name: Hassani, Hamed
  last_name: Hassani
- first_name: Marco
  full_name: Mondelli, Marco
  id: 27EB676C-8706-11E9-9510-7717E6697425
  last_name: Mondelli
  orcid: 0000-0002-3242-7020
- first_name: Alexander
  full_name: Vardy, Alexander
  last_name: Vardy
citation:
  ama: 'Fazeli A, Hassani H, Mondelli M, Vardy A. Binary linear codes with optimal
    scaling: Polar codes with large kernels. <i>IEEE Transactions on Information Theory</i>.
    2021;67(9):5693-5710. doi:<a href="https://doi.org/10.1109/TIT.2020.3038806">10.1109/TIT.2020.3038806</a>'
  apa: 'Fazeli, A., Hassani, H., Mondelli, M., &#38; Vardy, A. (2021). Binary linear
    codes with optimal scaling: Polar codes with large kernels. <i>IEEE Transactions
    on Information Theory</i>. IEEE. <a href="https://doi.org/10.1109/TIT.2020.3038806">https://doi.org/10.1109/TIT.2020.3038806</a>'
  chicago: 'Fazeli, Arman, Hamed Hassani, Marco Mondelli, and Alexander Vardy. “Binary
    Linear Codes with Optimal Scaling: Polar Codes with Large Kernels.” <i>IEEE Transactions
    on Information Theory</i>. IEEE, 2021. <a href="https://doi.org/10.1109/TIT.2020.3038806">https://doi.org/10.1109/TIT.2020.3038806</a>.'
  ieee: 'A. Fazeli, H. Hassani, M. Mondelli, and A. Vardy, “Binary linear codes with
    optimal scaling: Polar codes with large kernels,” <i>IEEE Transactions on Information
    Theory</i>, vol. 67, no. 9. IEEE, pp. 5693–5710, 2021.'
  ista: 'Fazeli A, Hassani H, Mondelli M, Vardy A. 2021. Binary linear codes with
    optimal scaling: Polar codes with large kernels. IEEE Transactions on Information
    Theory. 67(9), 5693–5710.'
  mla: 'Fazeli, Arman, et al. “Binary Linear Codes with Optimal Scaling: Polar Codes
    with Large Kernels.” <i>IEEE Transactions on Information Theory</i>, vol. 67,
    no. 9, IEEE, 2021, pp. 5693–710, doi:<a href="https://doi.org/10.1109/TIT.2020.3038806">10.1109/TIT.2020.3038806</a>.'
  short: A. Fazeli, H. Hassani, M. Mondelli, A. Vardy, IEEE Transactions on Information
    Theory 67 (2021) 5693–5710.
date_created: 2021-01-10T23:01:18Z
date_published: 2021-09-01T00:00:00Z
date_updated: 2024-03-07T12:18:50Z
day: '01'
department:
- _id: MaMo
doi: 10.1109/TIT.2020.3038806
external_id:
  arxiv:
  - '1711.01339'
intvolume: '67'
issue: '9'
language:
- iso: eng
month: '09'
oa_version: Preprint
page: 5693-5710
publication: IEEE Transactions on Information Theory
publication_identifier:
  eissn:
  - 1557-9654
  issn:
  - 0018-9448
publication_status: published
publisher: IEEE
quality_controlled: '1'
related_material:
  record:
  - id: '6665'
    relation: earlier_version
    status: public
scopus_import: '1'
status: public
title: 'Binary linear codes with optimal scaling: Polar codes with large kernels'
type: journal_article
user_id: 3E5EF7F0-F248-11E8-B48F-1D18A9856A87
volume: 67
year: '2021'
...
---
_id: '9005'
abstract:
- lang: eng
  text: Studies on the experimental realization of two-dimensional anyons in terms
    of quasiparticles have been restricted, so far, to only anyons on the plane. It
    is known, however, that the geometry and topology of space can have significant
    effects on quantum statistics for particles moving on it. Here, we have undertaken
    the first step toward realizing the emerging fractional statistics for particles
    restricted to move on the sphere instead of on the plane. We show that such a
    model arises naturally in the context of quantum impurity problems. In particular,
    we demonstrate a setup in which the lowest-energy spectrum of two linear bosonic
    or fermionic molecules immersed in a quantum many-particle environment can coincide
    with the anyonic spectrum on the sphere. This paves the way toward the experimental
    realization of anyons on the sphere using molecular impurities. Furthermore, since
    a change in the alignment of the molecules corresponds to the exchange of the
    particles on the sphere, such a realization reveals a novel type of exclusion
    principle for molecular impurities, which could also be of use as a powerful technique
    to measure the statistics parameter. Finally, our approach opens up a simple numerical
    route to investigate the spectra of many anyons on the sphere. Accordingly, we
    present the spectrum of two anyons on the sphere in the presence of a Dirac monopole
    field.
acknowledgement: "We are grateful to A. Ghazaryan for valuable discussions and also
  thank the anonymous referees for comments. D.L. acknowledges financial support from
  the Göran Gustafsson Foundation (grant no. 1804) and LMU Munich. M.L. gratefully
  acknowledges financial support by the European Research Council (ERC) under the
  European Union’s Horizon 2020 research and innovation programme (grant agreement
  No 801770)."
article_number: '015301'
article_processing_charge: No
article_type: original
arxiv: 1
author:
- first_name: Morris
  full_name: Brooks, Morris
  id: B7ECF9FC-AA38-11E9-AC9A-0930E6697425
  last_name: Brooks
  orcid: 0000-0002-6249-0928
- first_name: Mikhail
  full_name: Lemeshko, Mikhail
  id: 37CB05FA-F248-11E8-B48F-1D18A9856A87
  last_name: Lemeshko
  orcid: 0000-0002-6990-7802
- first_name: D.
  full_name: Lundholm, D.
  last_name: Lundholm
- first_name: Enderalp
  full_name: Yakaboylu, Enderalp
  id: 38CB71F6-F248-11E8-B48F-1D18A9856A87
  last_name: Yakaboylu
  orcid: 0000-0001-5973-0874
citation:
  ama: Brooks M, Lemeshko M, Lundholm D, Yakaboylu E. Molecular impurities as a realization
    of anyons on the two-sphere. <i>Physical Review Letters</i>. 2021;126(1). doi:<a
    href="https://doi.org/10.1103/PhysRevLett.126.015301">10.1103/PhysRevLett.126.015301</a>
  apa: Brooks, M., Lemeshko, M., Lundholm, D., &#38; Yakaboylu, E. (2021). Molecular
    impurities as a realization of anyons on the two-sphere. <i>Physical Review Letters</i>.
    American Physical Society. <a href="https://doi.org/10.1103/PhysRevLett.126.015301">https://doi.org/10.1103/PhysRevLett.126.015301</a>
  chicago: Brooks, Morris, Mikhail Lemeshko, D. Lundholm, and Enderalp Yakaboylu.
    “Molecular Impurities as a Realization of Anyons on the Two-Sphere.” <i>Physical
    Review Letters</i>. American Physical Society, 2021. <a href="https://doi.org/10.1103/PhysRevLett.126.015301">https://doi.org/10.1103/PhysRevLett.126.015301</a>.
  ieee: M. Brooks, M. Lemeshko, D. Lundholm, and E. Yakaboylu, “Molecular impurities
    as a realization of anyons on the two-sphere,” <i>Physical Review Letters</i>,
    vol. 126, no. 1. American Physical Society, 2021.
  ista: Brooks M, Lemeshko M, Lundholm D, Yakaboylu E. 2021. Molecular impurities
    as a realization of anyons on the two-sphere. Physical Review Letters. 126(1),
    015301.
  mla: Brooks, Morris, et al. “Molecular Impurities as a Realization of Anyons on
    the Two-Sphere.” <i>Physical Review Letters</i>, vol. 126, no. 1, 015301, American
    Physical Society, 2021, doi:<a href="https://doi.org/10.1103/PhysRevLett.126.015301">10.1103/PhysRevLett.126.015301</a>.
  short: M. Brooks, M. Lemeshko, D. Lundholm, E. Yakaboylu, Physical Review Letters
    126 (2021).
date_created: 2021-01-17T23:01:10Z
date_published: 2021-01-08T00:00:00Z
date_updated: 2023-08-07T13:32:10Z
day: '08'
department:
- _id: MiLe
- _id: RoSe
doi: 10.1103/PhysRevLett.126.015301
ec_funded: 1
external_id:
  arxiv:
  - '2009.05948'
  isi:
  - '000606325000003'
intvolume: '126'
isi: 1
issue: '1'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/abs/2009.05948
month: '01'
oa: 1
oa_version: Preprint
project:
- _id: 2688CF98-B435-11E9-9278-68D0E5697425
  call_identifier: H2020
  grant_number: '801770'
  name: 'Angulon: physics and applications of a new quasiparticle'
publication: Physical Review Letters
publication_identifier:
  eissn:
  - '10797114'
  issn:
  - '00319007'
publication_status: published
publisher: American Physical Society
quality_controlled: '1'
related_material:
  link:
  - description: News on IST Homepage
    relation: press_release
    url: https://ist.ac.at/en/news/dancing-molecules-and-two-dimensional-particles/
  record:
  - id: '12390'
    relation: dissertation_contains
    status: public
scopus_import: '1'
status: public
title: Molecular impurities as a realization of anyons on the two-sphere
type: journal_article
user_id: 4359f0d1-fa6c-11eb-b949-802e58b17ae8
volume: 126
year: '2021'
...
---
_id: '9006'
abstract:
- lang: eng
  text: Cytoplasm is a gel-like crowded environment composed of various macromolecules,
    organelles, cytoskeletal networks, and cytosol. The structure of the cytoplasm
    is highly organized and heterogeneous due to the crowding of its constituents
    and their effective compartmentalization. In such an environment, the diffusive
    dynamics of the molecules are restricted, an effect that is further amplified
    by clustering and anchoring of molecules. Despite the crowded nature of the cytoplasm
    at the microscopic scale, large-scale reorganization of the cytoplasm is essential
    for important cellular functions, such as cell division and polarization. How
    such mesoscale reorganization of the cytoplasm is achieved, especially for large
    cells such as oocytes or syncytial tissues that can span hundreds of micrometers
    in size, is only beginning to be understood. In this review, we will discuss recent
    advances in elucidating the molecular, cellular, and biophysical mechanisms by
    which the cytoskeleton drives cytoplasmic reorganization across different scales,
    structures, and species.
acknowledgement: We would like to thank Justine Renno for illustrations and Edouard
  Hannezo and members of the Heisenberg group for their comments on previous versions
  of the manuscript.
article_processing_charge: No
article_type: original
author:
- first_name: Shayan
  full_name: Shamipour, Shayan
  id: 40B34FE2-F248-11E8-B48F-1D18A9856A87
  last_name: Shamipour
- first_name: Silvia
  full_name: Caballero Mancebo, Silvia
  id: 2F1E1758-F248-11E8-B48F-1D18A9856A87
  last_name: Caballero Mancebo
  orcid: 0000-0002-5223-3346
- first_name: Carl-Philipp J
  full_name: Heisenberg, Carl-Philipp J
  id: 39427864-F248-11E8-B48F-1D18A9856A87
  last_name: Heisenberg
  orcid: 0000-0002-0912-4566
citation:
  ama: Shamipour S, Caballero Mancebo S, Heisenberg C-PJ. Cytoplasm’s got moves. <i>Developmental
    Cell</i>. 2021;56(2):213-226. doi:<a href="https://doi.org/10.1016/j.devcel.2020.12.002">10.1016/j.devcel.2020.12.002</a>
  apa: Shamipour, S., Caballero Mancebo, S., &#38; Heisenberg, C.-P. J. (2021). Cytoplasm’s
    got moves. <i>Developmental Cell</i>. Elsevier. <a href="https://doi.org/10.1016/j.devcel.2020.12.002">https://doi.org/10.1016/j.devcel.2020.12.002</a>
  chicago: Shamipour, Shayan, Silvia Caballero Mancebo, and Carl-Philipp J Heisenberg.
    “Cytoplasm’s Got Moves.” <i>Developmental Cell</i>. Elsevier, 2021. <a href="https://doi.org/10.1016/j.devcel.2020.12.002">https://doi.org/10.1016/j.devcel.2020.12.002</a>.
  ieee: S. Shamipour, S. Caballero Mancebo, and C.-P. J. Heisenberg, “Cytoplasm’s
    got moves,” <i>Developmental Cell</i>, vol. 56, no. 2. Elsevier, pp. P213-226,
    2021.
  ista: Shamipour S, Caballero Mancebo S, Heisenberg C-PJ. 2021. Cytoplasm’s got moves.
    Developmental Cell. 56(2), 213-226.
  mla: Shamipour, Shayan, et al. “Cytoplasm’s Got Moves.” <i>Developmental Cell</i>,
    vol. 56, no. 2, Elsevier, 2021, pp. 213-226, doi:<a href="https://doi.org/10.1016/j.devcel.2020.12.002">10.1016/j.devcel.2020.12.002</a>.
  short: S. Shamipour, S. Caballero Mancebo, C.-P.J. Heisenberg, Developmental Cell
    56 (2021) 213-226.
date_created: 2021-01-17T23:01:10Z
date_published: 2021-01-25T00:00:00Z
date_updated: 2024-03-25T23:30:10Z
day: '25'
department:
- _id: CaHe
doi: 10.1016/j.devcel.2020.12.002
external_id:
  isi:
  - '000613273900009'
  pmid:
  - '33321104'
intvolume: '56'
isi: 1
issue: '2'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.1016/j.devcel.2020.12.002
month: '01'
oa: 1
oa_version: Published Version
page: 213-226
pmid: 1
publication: Developmental Cell
publication_identifier:
  eissn:
  - '18781551'
  issn:
  - '15345807'
publication_status: published
publisher: Elsevier
quality_controlled: '1'
related_material:
  record:
  - id: '9623'
    relation: dissertation_contains
    status: public
scopus_import: '1'
status: public
title: Cytoplasm's got moves
type: journal_article
user_id: 4359f0d1-fa6c-11eb-b949-802e58b17ae8
volume: 56
year: '2021'
...
---
_id: '9009'
abstract:
- lang: eng
  text: Recent advancements in live cell imaging technologies have identified the
    phenomenon of intracellular propagation of late apoptotic events, such as cytochrome
    c release and caspase activation. The mechanism, prevalence, and speed of apoptosis
    propagation remain unclear. Additionally, no studies have demonstrated propagation
    of the pro-apoptotic protein, BAX. To evaluate the role of BAX in intracellular
    apoptotic propagation, we used high-speed live-cell imaging to visualize fluorescently
    tagged BAX recruitment to mitochondria in four immortalized cell lines. We show
    that propagation of mitochondrial BAX recruitment occurs in parallel to cytochrome
    c and SMAC/Diablo release and is affected by cellular morphology, such that cells
    with processes are more likely to exhibit propagation. The initiation of propagation
    events is most prevalent in the distal tips of processes, while the rate of propagation
    is influenced by the 2-dimensional width of the process. Propagation was rarely
    observed in the cell soma, which exhibited near synchronous recruitment of BAX.
    Propagation velocity is not affected by mitochondrial volume in segments of processes,
    but is negatively affected by mitochondrial density. There was no evidence of
    a propagating wave of increased levels of intracellular calcium ions. Alternatively,
    we did observe a uniform increase in superoxide build-up in cellular mitochondria,
    which was released as a propagating wave simultaneously with the propagating recruitment
    of BAX to the mitochondrial outer membrane.
acknowledgement: This work was supported by National Institute of Health grants R01
  EY030123, P30 EY016665, and T32 GM081061, an unrestricted research grant from Research
  to Prevent Blindness, Inc., and the Frederick A. Davis Endowment from the Department
  of Ophthalmology and Visual Sciences at the University of Wisconsin-Madison.
article_processing_charge: No
article_type: original
author:
- first_name: Joshua A.
  full_name: Grosser, Joshua A.
  last_name: Grosser
- first_name: Margaret E
  full_name: Maes, Margaret E
  id: 3838F452-F248-11E8-B48F-1D18A9856A87
  last_name: Maes
  orcid: 0000-0001-9642-1085
- first_name: Robert W.
  full_name: Nickells, Robert W.
  last_name: Nickells
citation:
  ama: Grosser JA, Maes ME, Nickells RW. Characteristics of intracellular propagation
    of mitochondrial BAX recruitment during apoptosis. <i>Apoptosis</i>. 2021;26(2):132-145.
    doi:<a href="https://doi.org/10.1007/s10495-020-01654-w">10.1007/s10495-020-01654-w</a>
  apa: Grosser, J. A., Maes, M. E., &#38; Nickells, R. W. (2021). Characteristics
    of intracellular propagation of mitochondrial BAX recruitment during apoptosis.
    <i>Apoptosis</i>. Springer Nature. <a href="https://doi.org/10.1007/s10495-020-01654-w">https://doi.org/10.1007/s10495-020-01654-w</a>
  chicago: Grosser, Joshua A., Margaret E Maes, and Robert W. Nickells. “Characteristics
    of Intracellular Propagation of Mitochondrial BAX Recruitment during Apoptosis.”
    <i>Apoptosis</i>. Springer Nature, 2021. <a href="https://doi.org/10.1007/s10495-020-01654-w">https://doi.org/10.1007/s10495-020-01654-w</a>.
  ieee: J. A. Grosser, M. E. Maes, and R. W. Nickells, “Characteristics of intracellular
    propagation of mitochondrial BAX recruitment during apoptosis,” <i>Apoptosis</i>,
    vol. 26, no. 2. Springer Nature, pp. 132–145, 2021.
  ista: Grosser JA, Maes ME, Nickells RW. 2021. Characteristics of intracellular propagation
    of mitochondrial BAX recruitment during apoptosis. Apoptosis. 26(2), 132–145.
  mla: Grosser, Joshua A., et al. “Characteristics of Intracellular Propagation of
    Mitochondrial BAX Recruitment during Apoptosis.” <i>Apoptosis</i>, vol. 26, no.
    2, Springer Nature, 2021, pp. 132–45, doi:<a href="https://doi.org/10.1007/s10495-020-01654-w">10.1007/s10495-020-01654-w</a>.
  short: J.A. Grosser, M.E. Maes, R.W. Nickells, Apoptosis 26 (2021) 132–145.
date_created: 2021-01-17T23:01:11Z
date_published: 2021-02-01T00:00:00Z
date_updated: 2023-08-07T13:32:40Z
day: '01'
department:
- _id: SaSi
doi: 10.1007/s10495-020-01654-w
external_id:
  isi:
  - '000606722600001'
  pmid:
  - '33426618'
intvolume: '26'
isi: 1
issue: '2'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8082518/
month: '02'
oa: 1
oa_version: Submitted Version
page: 132-145
pmid: 1
publication: Apoptosis
publication_identifier:
  eissn:
  - 1573-675X
  issn:
  - 1360-8185
publication_status: published
publisher: Springer Nature
quality_controlled: '1'
scopus_import: '1'
status: public
title: Characteristics of intracellular propagation of mitochondrial BAX recruitment
  during apoptosis
type: journal_article
user_id: 4359f0d1-fa6c-11eb-b949-802e58b17ae8
volume: 26
year: '2021'
...
---
_id: '9010'
abstract:
- lang: eng
  text: Availability of the essential macronutrient nitrogen in soil plays a critical
    role in plant growth and development, and impacts agricultural productivity. Plants
    have evolved different strategies for sensing and responding to heterogeneous
    nitrogen distribution. Modulation of root system architecture, including primary
    root growth and branching, is among the most essential plant adaptations to ensure
    adequate nitrogen acquisition. However, the immediate molecular pathways coordinating
    the adjustment of root growth in response to distinct nitrogen sources, such as
    nitrate or ammonium, are poorly understood. Here, we show that growth, as manifested
    by cell division and elongation, is synchronized by coordinated auxin flux between
    two adjacent outer tissue layers of the root. This coordination is achieved by
    nitrate‐dependent dephosphorylation of the PIN2 auxin efflux carrier at a previously
    uncharacterized phosphorylation site, leading to subsequent PIN2 lateralization
    and thereby regulating auxin flow between adjacent tissues. A dynamic computer
    model based on our experimental data successfully recapitulates experimental observations.
    Our study provides mechanistic insights broadening our understanding of root growth
    mechanisms in dynamic environments.
acknowledged_ssus:
- _id: Bio
acknowledgement: 'We acknowledge Gergely Molnar for critical reading of the manuscript,
  Alexander Johnson for language editing and Yulija Salanenka for technical assistance.
  Work in the Benkova laboratory was supported by the Austrian Science Fund (FWF01_I1774S)
  to KO, RA and EB, and by the DOC Fellowship Programme of the Austrian Academy of
  Sciences (25008) to C.A. Work in the Wabnik laboratory was supported by the Programa
  de Atraccion de Talento 2017 (Comunidad de Madrid, 2017-T1/BIO-5654 to K.W.), the
  Severo Ochoa Programme for Centres of Excellence in R&D from the Agencia Estatal
  de Investigacion of Spain (grant SEV-2016-0672 (2017-2021) to K.W. via the CBGP)
  and the Programa Estatal de Generacion del Conocimiento y Fortalecimiento Científico
  y Tecnologico del Sistema de I+D+I 2019 (PGC2018-093387-A-I00) from MICIU (to K.W.).
  M.M. was supported by a postdoctoral contract associated with SEV-2016-0672. We
  acknowledge the Bioimaging Facility at IST Austria and the Advanced Microscopy Facility
  of the Vienna Bio Center Core Facilities, member of the Vienna Bio Center Austria,
  for use of the OMX v4 3D SIM microscope. AJ was supported by the Austrian Science
  Fund (FWF): I03630 to J.F.'
article_number: e106862
article_processing_charge: Yes (via OA deal)
article_type: original
author:
- first_name: Krisztina
  full_name: Ötvös, Krisztina
  id: 29B901B0-F248-11E8-B48F-1D18A9856A87
  last_name: Ötvös
  orcid: 0000-0002-5503-4983
- first_name: Marco
  full_name: Marconi, Marco
  last_name: Marconi
- first_name: Andrea
  full_name: Vega, Andrea
  last_name: Vega
- first_name: Jose
  full_name: O’Brien, Jose
  last_name: O’Brien
- first_name: Alexander J
  full_name: Johnson, Alexander J
  id: 46A62C3A-F248-11E8-B48F-1D18A9856A87
  last_name: Johnson
  orcid: 0000-0002-2739-8843
- first_name: Rashed
  full_name: Abualia, Rashed
  id: 4827E134-F248-11E8-B48F-1D18A9856A87
  last_name: Abualia
  orcid: 0000-0002-9357-9415
- first_name: Livio
  full_name: Antonielli, Livio
  last_name: Antonielli
- first_name: Juan C
  full_name: Montesinos López, Juan C
  id: 310A8E3E-F248-11E8-B48F-1D18A9856A87
  last_name: Montesinos López
  orcid: 0000-0001-9179-6099
- first_name: Yuzhou
  full_name: Zhang, Yuzhou
  id: 3B6137F2-F248-11E8-B48F-1D18A9856A87
  last_name: Zhang
  orcid: 0000-0003-2627-6956
- first_name: Shutang
  full_name: Tan, Shutang
  id: 2DE75584-F248-11E8-B48F-1D18A9856A87
  last_name: Tan
  orcid: 0000-0002-0471-8285
- first_name: Candela
  full_name: Cuesta, Candela
  id: 33A3C818-F248-11E8-B48F-1D18A9856A87
  last_name: Cuesta
  orcid: 0000-0003-1923-2410
- first_name: Christina
  full_name: Artner, Christina
  id: 45DF286A-F248-11E8-B48F-1D18A9856A87
  last_name: Artner
- first_name: Eleonore
  full_name: Bouguyon, Eleonore
  last_name: Bouguyon
- first_name: Alain
  full_name: Gojon, Alain
  last_name: Gojon
- first_name: Jiří
  full_name: Friml, Jiří
  id: 4159519E-F248-11E8-B48F-1D18A9856A87
  last_name: Friml
  orcid: 0000-0002-8302-7596
- first_name: Rodrigo A.
  full_name: Gutiérrez, Rodrigo A.
  last_name: Gutiérrez
- first_name: Krzysztof T
  full_name: Wabnik, Krzysztof T
  id: 4DE369A4-F248-11E8-B48F-1D18A9856A87
  last_name: Wabnik
  orcid: 0000-0001-7263-0560
- first_name: Eva
  full_name: Benková, Eva
  id: 38F4F166-F248-11E8-B48F-1D18A9856A87
  last_name: Benková
  orcid: 0000-0002-8510-9739
citation:
  ama: Ötvös K, Marconi M, Vega A, et al. Modulation of plant root growth by nitrogen
    source-defined regulation of polar auxin transport. <i>EMBO Journal</i>. 2021;40(3).
    doi:<a href="https://doi.org/10.15252/embj.2020106862">10.15252/embj.2020106862</a>
  apa: Ötvös, K., Marconi, M., Vega, A., O’Brien, J., Johnson, A. J., Abualia, R.,
    … Benková, E. (2021). Modulation of plant root growth by nitrogen source-defined
    regulation of polar auxin transport. <i>EMBO Journal</i>. Embo Press. <a href="https://doi.org/10.15252/embj.2020106862">https://doi.org/10.15252/embj.2020106862</a>
  chicago: Ötvös, Krisztina, Marco Marconi, Andrea Vega, Jose O’Brien, Alexander J
    Johnson, Rashed Abualia, Livio Antonielli, et al. “Modulation of Plant Root Growth
    by Nitrogen Source-Defined Regulation of Polar Auxin Transport.” <i>EMBO Journal</i>.
    Embo Press, 2021. <a href="https://doi.org/10.15252/embj.2020106862">https://doi.org/10.15252/embj.2020106862</a>.
  ieee: K. Ötvös <i>et al.</i>, “Modulation of plant root growth by nitrogen source-defined
    regulation of polar auxin transport,” <i>EMBO Journal</i>, vol. 40, no. 3. Embo
    Press, 2021.
  ista: Ötvös K, Marconi M, Vega A, O’Brien J, Johnson AJ, Abualia R, Antonielli L,
    Montesinos López JC, Zhang Y, Tan S, Cuesta C, Artner C, Bouguyon E, Gojon A,
    Friml J, Gutiérrez RA, Wabnik KT, Benková E. 2021. Modulation of plant root growth
    by nitrogen source-defined regulation of polar auxin transport. EMBO Journal.
    40(3), e106862.
  mla: Ötvös, Krisztina, et al. “Modulation of Plant Root Growth by Nitrogen Source-Defined
    Regulation of Polar Auxin Transport.” <i>EMBO Journal</i>, vol. 40, no. 3, e106862,
    Embo Press, 2021, doi:<a href="https://doi.org/10.15252/embj.2020106862">10.15252/embj.2020106862</a>.
  short: K. Ötvös, M. Marconi, A. Vega, J. O’Brien, A.J. Johnson, R. Abualia, L. Antonielli,
    J.C. Montesinos López, Y. Zhang, S. Tan, C. Cuesta, C. Artner, E. Bouguyon, A.
    Gojon, J. Friml, R.A. Gutiérrez, K.T. Wabnik, E. Benková, EMBO Journal 40 (2021).
date_created: 2021-01-17T23:01:12Z
date_published: 2021-02-01T00:00:00Z
date_updated: 2024-03-25T23:30:22Z
day: '01'
ddc:
- '580'
department:
- _id: JiFr
- _id: EvBe
doi: 10.15252/embj.2020106862
external_id:
  isi:
  - '000604645600001'
  pmid:
  - '33399250'
file:
- access_level: open_access
  checksum: dc55c900f3b061d6c2790b8813d759a3
  content_type: application/pdf
  creator: dernst
  date_created: 2021-02-11T12:28:29Z
  date_updated: 2021-02-11T12:28:29Z
  file_id: '9110'
  file_name: 2021_Embo_Otvos.pdf
  file_size: 2358617
  relation: main_file
  success: 1
file_date_updated: 2021-02-11T12:28:29Z
has_accepted_license: '1'
intvolume: '40'
isi: 1
issue: '3'
language:
- iso: eng
month: '02'
oa: 1
oa_version: Published Version
pmid: 1
project:
- _id: 2542D156-B435-11E9-9278-68D0E5697425
  call_identifier: FWF
  grant_number: I 1774-B16
  name: Hormone cross-talk drives nutrient dependent plant development
- _id: 2685A872-B435-11E9-9278-68D0E5697425
  name: Hormonal regulation of plant adaptive responses to environmental signals
- _id: 26538374-B435-11E9-9278-68D0E5697425
  call_identifier: FWF
  grant_number: I03630
  name: Molecular mechanisms of endocytic cargo recognition in plants
publication: EMBO Journal
publication_identifier:
  eissn:
  - '14602075'
  issn:
  - '02614189'
publication_status: published
publisher: Embo Press
quality_controlled: '1'
related_material:
  link:
  - description: News on IST Homepage
    relation: press_release
    url: https://ist.ac.at/en/news/a-plants-way-to-its-favorite-food/
  record:
  - id: '10303'
    relation: dissertation_contains
    status: public
scopus_import: '1'
status: public
title: Modulation of plant root growth by nitrogen source-defined regulation of polar
  auxin transport
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 4359f0d1-fa6c-11eb-b949-802e58b17ae8
volume: 40
year: '2021'
...
---
_id: '9020'
abstract:
- lang: eng
  text: 'We study dynamics and thermodynamics of ion transport in narrow, water-filled
    channels, considered as effective 1D Coulomb systems. The long-range nature of
    the inter-ion interactions comes about due to the mismatch of dielectric constants
    between the water and the surrounding medium, confining the electric field to
    stay mostly within the water-filled channel. Statistical mechanics of such Coulomb
    systems is dominated by entropic effects which may be accurately accounted for
    by mapping onto an effective quantum mechanics. In the presence of multivalent
    ions the corresponding quantum mechanics appears to be non-Hermitian. In this
    review we discuss a framework for semiclassical calculations for the effective
    non-Hermitian Hamiltonians. Non-Hermiticity elevates WKB action integrals from
    the real line to closed cycles on a complex Riemann surface where direct calculations
    are not attainable. We circumvent this issue by applying tools from algebraic
    topology, such as the Picard-Fuchs equation. We discuss how its solutions relate
    to the thermodynamics and correlation functions of multivalent solutions within
    narrow, water-filled channels.'
acknowledgement: "A.K. was supported by NSF grant DMR-2037654. T.G. acknowledges
  funding from the Institute of Science and Technology (IST) Austria, and from the
  European Union’s Horizon 2020 research and innovation program under the Marie Skłodowska-Curie
  Grant Agreement No. 754411.\r\nWe are indebted to Boris Shklovskii for introducing
  us to the problem, and Alexander Gorsky and Peter Koroteev for introducing us to
  the Picard-Fuchs methods. A very special thanks goes to Michael Janas for several
  years of excellent collaboration on these topics. TG thanks Michael Kreshchuk for
  introduction to the exact WKB method and great collaboration on related projects.
  Figure 3 and Figure 4 are reproduced from Reference [25] with friendly permission
  by the Russian Academy of Sciences. Figure 2, Figure 4, Figure 5, Figure 6, and
  Figure 8 are reproduced from Reference [26] with friendly permission by IOP Publishing."
article_number: e23010125
article_processing_charge: Yes
article_type: original
arxiv: 1
author:
- first_name: Tobias
  full_name: Gulden, Tobias
  id: 1083E038-9F73-11E9-A4B5-532AE6697425
  last_name: Gulden
  orcid: 0000-0001-6814-7541
- first_name: Alex
  full_name: Kamenev, Alex
  last_name: Kamenev
citation:
  ama: Gulden T, Kamenev A. Dynamics of ion channels via non-hermitian quantum mechanics.
    <i>Entropy</i>. 2021;23(1). doi:<a href="https://doi.org/10.3390/e23010125">10.3390/e23010125</a>
  apa: Gulden, T., &#38; Kamenev, A. (2021). Dynamics of ion channels via non-hermitian
    quantum mechanics. <i>Entropy</i>. MDPI. <a href="https://doi.org/10.3390/e23010125">https://doi.org/10.3390/e23010125</a>
  chicago: Gulden, Tobias, and Alex Kamenev. “Dynamics of Ion Channels via Non-Hermitian
    Quantum Mechanics.” <i>Entropy</i>. MDPI, 2021. <a href="https://doi.org/10.3390/e23010125">https://doi.org/10.3390/e23010125</a>.
  ieee: T. Gulden and A. Kamenev, “Dynamics of ion channels via non-hermitian quantum
    mechanics,” <i>Entropy</i>, vol. 23, no. 1. MDPI, 2021.
  ista: Gulden T, Kamenev A. 2021. Dynamics of ion channels via non-hermitian quantum
    mechanics. Entropy. 23(1), e23010125.
  mla: Gulden, Tobias, and Alex Kamenev. “Dynamics of Ion Channels via Non-Hermitian
    Quantum Mechanics.” <i>Entropy</i>, vol. 23, no. 1, e23010125, MDPI, 2021, doi:<a
    href="https://doi.org/10.3390/e23010125">10.3390/e23010125</a>.
  short: T. Gulden, A. Kamenev, Entropy 23 (2021).
date_created: 2021-01-19T11:12:06Z
date_published: 2021-01-19T00:00:00Z
date_updated: 2023-08-07T13:34:18Z
day: '19'
ddc:
- '530'
department:
- _id: MaSe
doi: 10.3390/e23010125
ec_funded: 1
external_id:
  arxiv:
  - '2012.01390'
  isi:
  - '000610122000001'
file:
- access_level: open_access
  checksum: 6cd0e706156827c45c740534bd32c179
  content_type: application/pdf
  creator: tgulden
  date_created: 2021-01-19T11:11:14Z
  date_updated: 2021-01-19T11:11:14Z
  file_id: '9021'
  file_name: Final published paper.pdf
  file_size: 981285
  relation: main_file
file_date_updated: 2021-01-19T11:11:14Z
has_accepted_license: '1'
intvolume: '23'
isi: 1
issue: '1'
language:
- iso: eng
month: '01'
oa: 1
oa_version: Published Version
project:
- _id: 260C2330-B435-11E9-9278-68D0E5697425
  call_identifier: H2020
  grant_number: '754411'
  name: ISTplus - Postdoctoral Fellowships
publication: Entropy
publication_identifier:
  eissn:
  - 1099-4300
publication_status: published
publisher: MDPI
quality_controlled: '1'
status: public
title: Dynamics of ion channels via non-hermitian quantum mechanics
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 4359f0d1-fa6c-11eb-b949-802e58b17ae8
volume: 23
year: '2021'
...
---
_id: '9022'
abstract:
- lang: eng
  text: "In the first part of the thesis we consider Hermitian random matrices. Firstly,
    we consider sample covariance matrices XX∗ with X having independent identically
    distributed (i.i.d.) centred entries. We prove a Central Limit Theorem for differences
    of linear statistics of XX∗ and its minor after removing the first column of X.
    Secondly, we consider Wigner-type matrices and prove that the eigenvalue statistics
    near cusp singularities of the limiting density of states are universal and that
    they form a Pearcey process. Since the limiting eigenvalue distribution admits
    only square root (edge) and cubic root (cusp) singularities, this concludes the
    third and last remaining case of the Wigner-Dyson-Mehta universality conjecture.
    The main technical ingredients are an optimal local law at the cusp, and the proof
    of the fast relaxation to equilibrium of the Dyson Brownian motion in the cusp
    regime.\r\nIn the second part we consider non-Hermitian matrices X with centred
    i.i.d. entries. We normalise the entries of X to have variance N^{-1}. It is well
    known that the empirical eigenvalue density converges to the uniform distribution
    on the unit disk (circular law). In the first project, we prove universality of
    the local eigenvalue statistics close to the edge of the spectrum. This is the
    non-Hermitian analogue of the Tracy-Widom universality at the Hermitian edge. Technically
    we analyse the evolution of the spectral distribution of X along the Ornstein-Uhlenbeck
    flow for a very long time\r\n(up to t = +∞). In the second project, we consider
    linear statistics of eigenvalues for macroscopic test functions f in the Sobolev
    space H^{2+ϵ} and prove their convergence to the projection of the Gaussian Free
    Field on the unit disk. We prove this result for non-Hermitian matrices with real
    or complex entries. The main technical ingredients are: (i) local law for products
    of two resolvents at different spectral parameters, (ii) analysis of correlated
    Dyson Brownian motions.\r\nIn the third and final part we discuss the mathematically
    rigorous application of supersymmetric techniques (SUSY) to give a lower tail
    estimate of the lowest singular value of X − z, with z ∈ C. More precisely, we
    use the superbosonisation formula to give an integral representation of the resolvent
    of (X − z)(X − z)∗ which reduces to two and three contour integrals in the complex
    and real case, respectively. The rigorous analysis of these integrals is quite
    challenging since simple saddle point analysis cannot be applied (the main contribution
    comes from a non-trivial manifold). Our result\r\nimproves classical smoothing
    inequalities in the regime |z| ≈ 1; this result is essential to prove edge universality
    for i.i.d. non-Hermitian matrices."
acknowledgement: I gratefully acknowledge the financial support from the European
  Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie
  Grant Agreement No. 665385 and my advisor’s ERC Advanced Grant No. 338804.
alternative_title:
- ISTA Thesis
article_processing_charge: No
author:
- first_name: Giorgio
  full_name: Cipolloni, Giorgio
  id: 42198EFA-F248-11E8-B48F-1D18A9856A87
  last_name: Cipolloni
  orcid: 0000-0002-4901-7992
citation:
  ama: Cipolloni G. Fluctuations in the spectrum of random matrices. 2021. doi:<a
    href="https://doi.org/10.15479/AT:ISTA:9022">10.15479/AT:ISTA:9022</a>
  apa: Cipolloni, G. (2021). <i>Fluctuations in the spectrum of random matrices</i>.
    Institute of Science and Technology Austria. <a href="https://doi.org/10.15479/AT:ISTA:9022">https://doi.org/10.15479/AT:ISTA:9022</a>
  chicago: Cipolloni, Giorgio. “Fluctuations in the Spectrum of Random Matrices.”
    Institute of Science and Technology Austria, 2021. <a href="https://doi.org/10.15479/AT:ISTA:9022">https://doi.org/10.15479/AT:ISTA:9022</a>.
  ieee: G. Cipolloni, “Fluctuations in the spectrum of random matrices,” Institute
    of Science and Technology Austria, 2021.
  ista: Cipolloni G. 2021. Fluctuations in the spectrum of random matrices. Institute
    of Science and Technology Austria.
  mla: Cipolloni, Giorgio. <i>Fluctuations in the Spectrum of Random Matrices</i>.
    Institute of Science and Technology Austria, 2021, doi:<a href="https://doi.org/10.15479/AT:ISTA:9022">10.15479/AT:ISTA:9022</a>.
  short: G. Cipolloni, Fluctuations in the Spectrum of Random Matrices, Institute
    of Science and Technology Austria, 2021.
date_created: 2021-01-21T18:16:54Z
date_published: 2021-01-25T00:00:00Z
date_updated: 2023-09-07T13:29:32Z
day: '25'
ddc:
- '510'
degree_awarded: PhD
department:
- _id: GradSch
- _id: LaEr
doi: 10.15479/AT:ISTA:9022
ec_funded: 1
file:
- access_level: open_access
  checksum: 5a93658a5f19478372523ee232887e2b
  content_type: application/pdf
  creator: gcipollo
  date_created: 2021-01-25T14:19:03Z
  date_updated: 2021-01-25T14:19:03Z
  file_id: '9043'
  file_name: thesis.pdf
  file_size: 4127796
  relation: main_file
  success: 1
- access_level: closed
  checksum: e8270eddfe6a988e92a53c88d1d19b8c
  content_type: application/zip
  creator: gcipollo
  date_created: 2021-01-25T14:19:10Z
  date_updated: 2021-01-25T14:19:10Z
  file_id: '9044'
  file_name: Thesis_files.zip
  file_size: 12775206
  relation: source_file
file_date_updated: 2021-01-25T14:19:10Z
has_accepted_license: '1'
language:
- iso: eng
month: '01'
oa: 1
oa_version: Published Version
page: '380'
project:
- _id: 2564DBCA-B435-11E9-9278-68D0E5697425
  call_identifier: H2020
  grant_number: '665385'
  name: International IST Doctoral Program
- _id: 258DCDE6-B435-11E9-9278-68D0E5697425
  call_identifier: FP7
  grant_number: '338804'
  name: Random matrices, universality and disordered quantum systems
publication_identifier:
  issn:
  - 2663-337X
publication_status: published
publisher: Institute of Science and Technology Austria
status: public
supervisor:
- first_name: László
  full_name: Erdös, László
  id: 4DBD5372-F248-11E8-B48F-1D18A9856A87
  last_name: Erdös
  orcid: 0000-0001-5366-9603
title: Fluctuations in the spectrum of random matrices
type: dissertation
user_id: c635000d-4b10-11ee-a964-aac5a93f6ac1
year: '2021'
...
---
_id: '9036'
abstract:
- lang: eng
  text: In this short note, we prove that the square root of the quantum Jensen-Shannon
    divergence is a true metric on the cone of positive matrices, and hence in particular
    on the quantum state space.
acknowledgement: D. Virosztek was supported by the European Union's Horizon 2020 research
  and innovation programme under the Marie Skłodowska-Curie Grant Agreement No. 846294,
  and partially supported by the Hungarian National Research, Development and Innovation
  Office (NKFIH) via grants no. K124152, and no. KH129601.
article_number: '107595'
article_processing_charge: No
article_type: original
arxiv: 1
author:
- first_name: Daniel
  full_name: Virosztek, Daniel
  id: 48DB45DA-F248-11E8-B48F-1D18A9856A87
  last_name: Virosztek
  orcid: 0000-0003-1109-5511
citation:
  ama: Virosztek D. The metric property of the quantum Jensen-Shannon divergence.
    <i>Advances in Mathematics</i>. 2021;380(3). doi:<a href="https://doi.org/10.1016/j.aim.2021.107595">10.1016/j.aim.2021.107595</a>
  apa: Virosztek, D. (2021). The metric property of the quantum Jensen-Shannon divergence.
    <i>Advances in Mathematics</i>. Elsevier. <a href="https://doi.org/10.1016/j.aim.2021.107595">https://doi.org/10.1016/j.aim.2021.107595</a>
  chicago: Virosztek, Daniel. “The Metric Property of the Quantum Jensen-Shannon Divergence.”
    <i>Advances in Mathematics</i>. Elsevier, 2021. <a href="https://doi.org/10.1016/j.aim.2021.107595">https://doi.org/10.1016/j.aim.2021.107595</a>.
  ieee: D. Virosztek, “The metric property of the quantum Jensen-Shannon divergence,”
    <i>Advances in Mathematics</i>, vol. 380, no. 3. Elsevier, 2021.
  ista: Virosztek D. 2021. The metric property of the quantum Jensen-Shannon divergence.
    Advances in Mathematics. 380(3), 107595.
  mla: Virosztek, Daniel. “The Metric Property of the Quantum Jensen-Shannon Divergence.”
    <i>Advances in Mathematics</i>, vol. 380, no. 3, 107595, Elsevier, 2021, doi:<a
    href="https://doi.org/10.1016/j.aim.2021.107595">10.1016/j.aim.2021.107595</a>.
  short: D. Virosztek, Advances in Mathematics 380 (2021).
date_created: 2021-01-22T17:55:17Z
date_published: 2021-03-26T00:00:00Z
date_updated: 2023-08-07T13:34:48Z
day: '26'
department:
- _id: LaEr
doi: 10.1016/j.aim.2021.107595
ec_funded: 1
external_id:
  arxiv:
  - '1910.10447'
  isi:
  - '000619676100035'
intvolume: '380'
isi: 1
issue: '3'
keyword:
- General Mathematics
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://arxiv.org/abs/1910.10447
month: '03'
oa: 1
oa_version: Preprint
project:
- _id: 26A455A6-B435-11E9-9278-68D0E5697425
  call_identifier: H2020
  grant_number: '846294'
  name: Geometric study of Wasserstein spaces and free probability
publication: Advances in Mathematics
publication_identifier:
  issn:
  - 0001-8708
publication_status: published
publisher: Elsevier
quality_controlled: '1'
status: public
title: The metric property of the quantum Jensen-Shannon divergence
type: journal_article
user_id: 4359f0d1-fa6c-11eb-b949-802e58b17ae8
volume: 380
year: '2021'
...
