I’m a PhD candidate in the School of Computer Science at Tel Aviv University, fortunate to be advised by Nadav Cohen.

My research interests broadly include the theoretical foundations and applications of machine learning. More specifically, I focus on mathematically analyzing aspects of deep learning (such as expressiveness, optimization, and generalization), with the goal of establishing theoretically backed practices.

My research is generously supported by the Apple Scholars in AI/ML and the Tel Aviv University Center for AI & Data Science PhD fellowships.

**Email:** noamrazin (at) mail.tau.ac.il

- **May 22**: Implicit Regularization in Hierarchical Tensor Factorization and Deep Convolutional Neural Networks accepted to ICML 2022.
- **Mar 22**: Honored to receive the 2022 Apple Scholars in AI/ML PhD fellowship.
- **Jan 22**: New paper analyzes implicit regularization in hierarchical tensor factorizations, which are equivalent to certain deep convolutional neural networks. It turns out this leads to a bias towards locality that can be countered using dedicated *explicit* regularization!
- **Oct 21**: Honored to receive the Tel Aviv University Center for AI & Data Science excellence fellowship.
- **May 21**: Implicit Regularization in Tensor Factorization accepted to ICML 2021. Check out this blog post for an overview.
- **Sep 20**: Implicit Regularization in Deep Learning May Not Be Explainable by Norms accepted to NeurIPS 2020.
- **Nov 19**: Scalable Attentive Sentence-Pair Modeling via Distilled Sentence Embedding accepted to AAAI 2020.

* denotes equal contribution.

- Teaching Assistant for Foundations of Deep Learning (course #0368-3080), Tel Aviv University, 2021–present