Nadav Cohen, Ph.D.
School of Mathematics
Institute for Advanced Study
Princeton, New Jersey
E-mail: cohennadav@ias.edu
Office: MOS-106, 1 Einstein Dr, Princeton, NJ

Research

My research focuses on the theoretical and algorithmic foundations of deep learning. In particular, I am interested in mathematically analyzing aspects of expressiveness, optimization, and generalization, with the goal of deriving theoretically grounded procedures and algorithms that improve practical performance.

Selected Talks

  • ICML 2018 (Stockholm, Sweden): [slides]

  • ICLR 2018 (Vancouver, Canada): [video|slides]

  • Symposium on the Mathematical Theory of Deep Neural Networks 2018 (Princeton, NJ, USA): [video|slides]

  • Symposium on Physics and Machine Learning 2017 (New York City, NY, USA): [slides]

  • Mathematics of Deep Learning Workshop 2017 (Berlin, Germany): [slides]

  • AAAI Spring Symposium Series 2017 (Palo Alto, CA, USA): [slides]

  • CVPR 2017 (Honolulu, HI, USA): [slides]

  • GAMM 2017 (Weimar, Germany): [slides]

  • NIPS 2016 (Barcelona, Spain): [slides]

  • ICML 2016 (New York City, NY, USA): [video|slides]

  • COLT 2016 (New York City, NY, USA): [video|slides]

Publications (see also Google Scholar)

A Convergence Analysis of Gradient Descent for Deep Linear Neural Networks
Sanjeev Arora, Nadav Cohen, Noah Golowich and Wei Hu (alphabetical order). Oct’18.

Bridging Many-Body Quantum Physics and Deep Learning via Tensor Networks
Yoav Levine, Or Sharir, Nadav Cohen and Amnon Shashua. Mar’18.

On the Optimization of Deep Networks: Implicit Acceleration by Overparameterization
Sanjeev Arora, Nadav Cohen and Elad Hazan (alphabetical order). Feb’18.
International Conference on Machine Learning (ICML) 2018.

Boosting Dilated Convolutional Networks with Mixed Tensor Decompositions (extended arXiv version)
Nadav Cohen, Ronen Tamari and Amnon Shashua. Apr’17.
International Conference on Learning Representations (ICLR) 2018.

Deep Learning and Quantum Entanglement: Fundamental Connections with Implications to Network Design (extended arXiv version)
Yoav Levine, David Yakira, Nadav Cohen and Amnon Shashua. Apr’17.
International Conference on Learning Representations (ICLR) 2018.

Analysis and Design of Convolutional Networks via Hierarchical Tensor Decompositions
Nadav Cohen, Or Sharir, Yoav Levine, Ronen Tamari, David Yakira and Amnon Shashua. Jun’17.
Intel Collaborative Research Institute for Computational Intelligence (ICRI-CI) Special Issue on Deep Learning Theory.

Inductive Bias of Deep Convolutional Networks through Pooling Geometry
Nadav Cohen and Amnon Shashua. May’16 (v1), Nov’16 (v2).
International Conference on Learning Representations (ICLR) 2017.

Tensorial Mixture Models
Or Sharir, Ronen Tamari, Nadav Cohen and Amnon Shashua. Oct’16.

Convolutional Rectifier Networks as Generalized Tensor Decompositions (extended arXiv version)
Nadav Cohen and Amnon Shashua. Mar’16.
International Conference on Machine Learning (ICML) 2016.

On the Expressive Power of Deep Learning: A Tensor Analysis
Nadav Cohen, Or Sharir and Amnon Shashua. Sep’15 (v1), Feb’16 (v2).
Conference on Learning Theory (COLT) 2016.

Deep SimNets
Nadav Cohen, Or Sharir and Amnon Shashua. Jun’15 (v1), Nov’15 (v2).
Conference on Computer Vision and Pattern Recognition (CVPR) 2016.

SimNets: A Generalization of Convolutional Networks
Nadav Cohen and Amnon Shashua. Oct’14 (v1), Dec’14 (v2).
Conference on Neural Information Processing Systems (NIPS) 2014, Deep Learning Workshop.