Nicolas Zucchet

ETH Zürich
Nicolas Zucchet
Institut für Theoretische Informatik
OAT Z11
Andreasstrasse 5
8092 Zürich

Email: nzucchet@ethz.ch

I am a second-year PhD student in the group of Prof. Dr. Angelika Steger, under the supervision of João Sacramento. Before that, I received my Master's degree in computer science from ETH Zürich in 2021 and my Bachelor's degree in applied mathematics from École polytechnique (Palaiseau, France) in 2019. My research interests lie at the intersection of artificial intelligence and neuroscience: I try to understand how artificial and biological neural networks can learn to adapt quickly to new challenges.

Publications

The least-control principle for learning at equilibrium
A. Meulemans*, N. Zucchet*, S. Kobayashi*, J. von Oswald and J. Sacramento
Selected for an oral presentation, 36th Conference on Neural Information Processing Systems (NeurIPS), 2022.
Poster: Cosyne, Montreal (Canada), March 2023.
Talk: Swiss Computational Neuroscience Retreat, Crans Montana (Switzerland), March 2023.

A contrastive rule for meta-learning
N. Zucchet*, S. Schug*, J. von Oswald*, D. Zhao and J. Sacramento
36th Conference on Neural Information Processing Systems (NeurIPS), 2022.
Poster: Champalimaud Research Symposium, Lisbon (Portugal), October 2021.
Poster: MLSS^N, Krakow (Poland), July 2022.
Poster: Cosyne, Montreal (Canada), March 2023.
Talk: Swiss Computational Neuroscience Retreat, Crans Montana (Switzerland), March 2022.

Beyond backpropagation: bilevel optimization through implicit differentiation and equilibrium propagation
N. Zucchet and J. Sacramento
Neural Computation 34 (12), 2022.

Learning where to learn: Gradient sparsity in meta and continual learning
J. von Oswald*, D. Zhao*, S. Kobayashi, S. Schug, M. Caccia, N. Zucchet and J. Sacramento
35th Conference on Neural Information Processing Systems (NeurIPS), 2021.

Workshop papers

Random initialisations performing above chance and how to find them
F. Benzing, S. Schug, R. Meier, J. von Oswald, Y. Akram, N. Zucchet, L. Aitchison*, A. Steger*
OPT2022: 14th Annual Workshop on Optimization for Machine Learning (NeurIPS), 2022.

See Google Scholar for a complete and up-to-date publication list.