Neural Tangent Kernel: Convergence and Generalization in Neural Networks (Invited Paper)
2021
Abstract
The Neural Tangent Kernel offers a new way to understand gradient descent in deep neural networks, connecting them with kernel methods. In this talk, I will introduce this formalism, present a number of results on the Neural Tangent Kernel, and explain how they give us insight into the dynamics of neural networks during training and into their generalization properties.
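To make the kernel connection concrete, the empirical (finite-width) NTK is the inner product of parameter gradients, Theta(x, x') = <∂f(x)/∂θ, ∂f(x')/∂θ>. The following is a minimal sketch computing it for a tiny one-hidden-layer network via finite differences; all names and the architecture are illustrative, not taken from the paper.

```python
import numpy as np

# Empirical NTK sketch: Theta(x, x') = <df(x)/dtheta, df(x')/dtheta>.
# The network, sizes, and helper names below are illustrative assumptions.

rng = np.random.default_rng(0)

D_IN, WIDTH = 3, 16  # toy dimensions

def init_params():
    # NTK-style 1/sqrt(fan_in) scaling of random weights.
    return {
        "W1": rng.normal(size=(WIDTH, D_IN)) / np.sqrt(D_IN),
        "W2": rng.normal(size=(WIDTH,)) / np.sqrt(WIDTH),
    }

def forward(params, x):
    # Scalar-output network f(x; theta) with a tanh hidden layer.
    return params["W2"] @ np.tanh(params["W1"] @ x)

def flat(params):
    return np.concatenate([params["W1"].ravel(), params["W2"].ravel()])

def unflat(vec):
    return {"W1": vec[:WIDTH * D_IN].reshape(WIDTH, D_IN),
            "W2": vec[WIDTH * D_IN:]}

def grad_theta(params, x, eps=1e-5):
    # Central finite-difference gradient of f with respect to all parameters.
    v = flat(params)
    g = np.zeros_like(v)
    for i in range(v.size):
        vp, vm = v.copy(), v.copy()
        vp[i] += eps
        vm[i] -= eps
        g[i] = (forward(unflat(vp), x) - forward(unflat(vm), x)) / (2 * eps)
    return g

def ntk(params, x1, x2):
    # Empirical NTK: inner product of parameter gradients at x1 and x2.
    return grad_theta(params, x1) @ grad_theta(params, x2)

params = init_params()
x, y = rng.normal(size=D_IN), rng.normal(size=D_IN)
print(ntk(params, x, y))  # a kernel value Theta(x, y)
```

At finite width this kernel depends on the random initialization; the paper's result is that in the infinite-width limit it converges to a deterministic kernel that stays constant during training, so gradient descent behaves like kernel regression.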
Details
Title
Neural Tangent Kernel: Convergence and Generalization in Neural Networks (Invited Paper)
Author(s)
Jacot, Arthur; Gabriel, Franck; Hongler, Clément
Published in
STOC '21: Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing
Series
Annual ACM Symposium on Theory of Computing
Pages
6-6
Conference
53rd Annual ACM SIGACT Symposium on Theory of Computing (STOC), June 21-25, 2021, held online
Date
2021-01-01
Publisher
New York: Association for Computing Machinery
ISSN
0737-8017
ISBN
978-1-4503-8053-9
Keywords
Other identifier(s)
Laboratories
CSFT
Record Appears in
Scientific production and competences > SB - School of Basic Sciences > MATH - Institute of Mathematics > CSFT - Chair of Statistical Field Theory
Peer-reviewed publications
Conference Papers
Work produced at EPFL
Published
Record creation date
2022-07-18