{"year":"2023","project":[{"_id":"059876FA-7A3F-11EA-A408-12923DDC885E","name":"Prix Lopez-Loretta 2019 - Marco Mondelli"}],"acknowledgement":"Marco Mondelli was partially supported by the 2019 Lopez-Loreta prize.","isi":1,"publication_identifier":{"isbn":["9798350301496"],"eissn":["2475-4218"]},"external_id":{"isi":["001031733100053"],"arxiv":["2212.01572"]},"page":"294-298","doi":"10.1109/ITW55543.2023.10160238","publication":"2023 IEEE Information Theory Workshop","author":[{"full_name":"Xu, Yizhou","last_name":"Xu","first_name":"Yizhou"},{"full_name":"Hou, Tian Qi","last_name":"Hou","first_name":"Tian Qi"},{"first_name":"Shan Suo","last_name":"Liang","full_name":"Liang, Shan Suo"},{"full_name":"Mondelli, Marco","id":"27EB676C-8706-11E9-9510-7717E6697425","first_name":"Marco","last_name":"Mondelli","orcid":"0000-0002-3242-7020"}],"language":[{"iso":"eng"}],"_id":"13321","conference":{"start_date":"2023-04-23","name":"ITW: Information Theory Workshop","location":"Saint-Malo, France","end_date":"2023-04-28"},"citation":{"ama":"Xu Y, Hou TQ, Liang SS, Mondelli M. Approximate message passing for multi-layer estimation in rotationally invariant models. In: 2023 IEEE Information Theory Workshop. Institute of Electrical and Electronics Engineers; 2023:294-298. doi:10.1109/ITW55543.2023.10160238","ista":"Xu Y, Hou TQ, Liang SS, Mondelli M. 2023. Approximate message passing for multi-layer estimation in rotationally invariant models. 2023 IEEE Information Theory Workshop. ITW: Information Theory Workshop, 294–298.","apa":"Xu, Y., Hou, T. Q., Liang, S. S., & Mondelli, M. (2023). Approximate message passing for multi-layer estimation in rotationally invariant models. In 2023 IEEE Information Theory Workshop (pp. 294–298). Saint-Malo, France: Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/ITW55543.2023.10160238","chicago":"Xu, Yizhou, Tian Qi Hou, Shan Suo Liang, and Marco Mondelli. “Approximate Message Passing for Multi-Layer Estimation in Rotationally Invariant Models.” In 2023 IEEE Information Theory Workshop, 294–98. Institute of Electrical and Electronics Engineers, 2023. https://doi.org/10.1109/ITW55543.2023.10160238.","short":"Y. Xu, T.Q. Hou, S.S. Liang, M. Mondelli, in:, 2023 IEEE Information Theory Workshop, Institute of Electrical and Electronics Engineers, 2023, pp. 294–298.","mla":"Xu, Yizhou, et al. “Approximate Message Passing for Multi-Layer Estimation in Rotationally Invariant Models.” 2023 IEEE Information Theory Workshop, Institute of Electrical and Electronics Engineers, 2023, pp. 294–98, doi:10.1109/ITW55543.2023.10160238.","ieee":"Y. Xu, T. Q. Hou, S. S. Liang, and M. Mondelli, “Approximate message passing for multi-layer estimation in rotationally invariant models,” in 2023 IEEE Information Theory Workshop, Saint-Malo, France, 2023, pp. 
294–298."},"scopus_import":"1","publisher":"Institute of Electrical and Electronics Engineers","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","date_created":"2023-07-30T22:01:04Z","oa":1,"oa_version":"Preprint","status":"public","title":"Approximate message passing for multi-layer estimation in rotationally invariant models","quality_controlled":"1","date_updated":"2024-09-10T13:03:19Z","article_processing_charge":"No","month":"05","day":"01","type":"conference","main_file_link":[{"open_access":"1","url":"https://doi.org/10.48550/arXiv.2212.01572"}],"abstract":[{"lang":"eng","text":"We consider the problem of reconstructing the signal and the hidden variables from observations coming from a multi-layer network with rotationally invariant weight matrices. The multi-layer structure models inference from deep generative priors, and the rotational invariance imposed on the weights generalizes the i.i.d. Gaussian assumption by allowing for a complex correlation structure, which is typical in applications. In this work, we present a new class of approximate message passing (AMP) algorithms and give a state evolution recursion which precisely characterizes their performance in the large system limit. In contrast with the existing multi-layer VAMP (ML-VAMP) approach, our proposed AMP – dubbed multilayer rotationally invariant generalized AMP (ML-RI-GAMP) – provides a natural generalization beyond Gaussian designs, in the sense that it recovers the existing Gaussian AMP as a special case. Furthermore, ML-RI-GAMP exhibits a significantly lower complexity than ML-VAMP, as the computationally intensive singular value decomposition is replaced by an estimation of the moments of the design matrices. Finally, our numerical results show that this complexity gain comes at little to no cost in the performance of the algorithm."}],"department":[{"_id":"MaMo"}],"date_published":"2023-05-01T00:00:00Z","publication_status":"published"}