Nicolas Zucchet
ETH Zürich, Institut für Theoretische Informatik
OAT Z11, Andreasstrasse 5
8092 Zürich
The least-control principle for learning at equilibrium
A. Meulemans*, N. Zucchet*, S. Kobayashi*, J. von Oswald and J. Sacramento
36th Conference on Neural Information Processing Systems (NeurIPS), 2022 (selected for an oral presentation).
Poster: Cosyne, Montreal (Canada), March 2023.
Talk: Swiss Computational Neuroscience Retreat, Crans Montana (Switzerland), March 2023.
A contrastive rule for meta-learning
N. Zucchet*, S. Schug*, J. von Oswald*, D. Zhao and J. Sacramento
36th Conference on Neural Information Processing Systems (NeurIPS), 2022.
Poster: Champalimaud Research Symposium, Lisbon (Portugal), October 2021; MLSS^N, Krakow (Poland), July 2022; Cosyne, Montreal (Canada), March 2023.
Talk: Swiss Computational Neuroscience Retreat, Crans Montana (Switzerland), March 2022.
Beyond backpropagation: bilevel optimization through implicit differentiation and equilibrium propagation
N. Zucchet and J. Sacramento
Neural Computation 34 (12), 2022.
Learning where to learn: Gradient sparsity in meta and continual learning
J. von Oswald*, D. Zhao*, S. Kobayashi, S. Schug, M. Caccia, N. Zucchet and J. Sacramento
35th Conference on Neural Information Processing Systems (NeurIPS), 2021.
Random initialisations performing above chance and how to find them
F. Benzing, S. Schug, R. Meier, J. von Oswald, Y. Akram, N. Zucchet, L. Aitchison* and A. Steger*
OPT2022: 14th Annual Workshop on Optimization for Machine Learning (NeurIPS), 2022.