Elena-Alexandra Peste
Graduate School
Alistarh Group
Lampert Group
6 Publications
2023 | Accepted | Conference Paper | IST-REx-ID: 13053

Peste E-A, Vladu A, Kurtic E, Lampert C, Alistarh D-A. CrAM: A Compression-Aware Minimizer. In: 11th International Conference on Learning Representations; 2023.
[Preprint]
2023 | Published | Thesis | IST-REx-ID: 13074

Peste E-A. Efficiency and generalization of sparse neural networks. 2023. doi:10.15479/at:ista:13074
[Published Version]
2023 | Published | Conference Paper | IST-REx-ID: 14771

Iofinova EB, Peste E-A, Alistarh D-A. Bias in pruned vision models: In-depth analysis and countermeasures. In: 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition. IEEE; 2023:24364-24373. doi:10.1109/cvpr52729.2023.02334
[Preprint]
2022 | Published | Conference Paper | IST-REx-ID: 12299

Iofinova EB, Peste E-A, Kurtz M, Alistarh D-A. How well do sparse ImageNet models transfer? In: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition. IEEE; 2022:12256-12266. doi:10.1109/cvpr52688.2022.01195
[Preprint]
2021 | Published | Conference Paper | IST-REx-ID: 11458

Peste E-A, Iofinova EB, Vladu A, Alistarh D-A. AC/DC: Alternating Compressed/DeCompressed training of deep neural networks. In: 35th Conference on Neural Information Processing Systems. Vol 34. Curran Associates; 2021:8557-8570.
[Published Version]
2021 | Published | Journal Article | IST-REx-ID: 10180

Hoefler T, Alistarh D-A, Ben-Nun T, Dryden N, Peste E-A. Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks. Journal of Machine Learning Research. 2021;22(241):1-124.
[Published Version]