Pierfrancesco Beneventano


Hi, I'm Pier! I am a PhD student at Princeton University, part of the ORFE department and the ML theory group. My interests lie in machine learning theory and the mathematics needed to develop it. I'm fortunate to be advised by Prof. Boris Hanin and Prof. Jason D. Lee, and to collaborate with Prof. Tomaso Poggio (MIT).

In Summer '22, I was at INRIA Paris hosted by Dr. Blake Woodworth in the group of Prof. Francis Bach. In Fall '22, I was at AWS AI Labs as a research intern. Previously, I got an MSc in math at ETH Zurich, where I worked with Prof. Arnulf Jentzen. For more, check my CV.

  pierb at princeton dot edu  



Research

I am a mathematician working on the theory of machine learning. My work is motivated by enabling the deployment of machine learning techniques in safety-critical contexts. I believe that developing a theory for deep learning is key to assessing when, and to what extent, its use is harmful. My long-term objective is thus to understand which machine learning model works in which setting, how well, and why. Currently, I am working towards this goal by studying the optimization problem behind neural networks, to gain a theoretical understanding of which networks training finds. More precisely, I am investigating the implicit regularization effect of SGD, a fundamental step towards understanding the generalization capabilities of neural networks.

Check out my new papers: On the Trajectories of SGD Without Replacement and How Neural Networks Learn the Support is an Implicit Regularization Effect of SGD.

Preprints
How Neural Networks Learn the Support is an Implicit Regularization Effect of SGD.
Pierfrancesco Beneventano, Andrea Pinto, and Prof. Tomaso Poggio.

We find that neural networks often implicitly identify, in the first layer, which input variables are relevant. We prove that this is the case when training with SGD (faster for smaller batch sizes or larger step sizes), but not when training with vanilla GD.
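To fix ideas on the two optimizers being contrasted, here is a minimal sketch of full-batch GD versus mini-batch SGD on a toy least-squares problem where the label ignores one coordinate. This is purely illustrative (names, data, and hyperparameters are my own, not the paper's experiments), and a linear model does not exhibit the first-layer phenomenon the paper studies.

```python
import random

random.seed(0)

# Toy data: the label depends only on x[0]; x[1] is an irrelevant input.
data = []
for _ in range(32):
    x = (random.uniform(-1, 1), random.uniform(-1, 1))
    data.append((x, 3.0 * x[0]))

def grad(w, batch):
    """Gradient of the mean squared error of a linear model over a batch."""
    g = [0.0, 0.0]
    for x, y in batch:
        err = w[0] * x[0] + w[1] * x[1] - y
        g[0] += 2.0 * err * x[0] / len(batch)
        g[1] += 2.0 * err * x[1] / len(batch)
    return g

def step(w, g, lr=0.1):
    return [w[0] - lr * g[0], w[1] - lr * g[1]]

# Full-batch gradient descent: one update per pass over all the data.
w_gd = [0.0, 0.0]
for _ in range(300):
    w_gd = step(w_gd, grad(w_gd, data))

# Mini-batch SGD without replacement: shuffle, then several noisy updates.
w_sgd = [0.0, 0.0]
for _ in range(300):
    random.shuffle(data)
    for i in range(0, len(data), 8):
        w_sgd = step(w_sgd, grad(w_sgd, data[i:i + 8]))
```

On this convex problem both optimizers recover the true weights; the paper's point is that in neural networks the two differ in how the first layer treats the irrelevant coordinate.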
On the Trajectories of SGD Without Replacement.
Pierfrancesco Beneventano.

We characterize the trajectories taken by the most commonly used training algorithm and explain why some frequently observed empirical phenomena occur.
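As a reminder of the setup: "SGD without replacement" (shuffled SGD) draws a fresh random permutation of the dataset each epoch and visits every sample exactly once, unlike the with-replacement idealization that samples i.i.d. A minimal sketch of the two sampling schemes (illustrative only, function names are my own):

```python
import random

def epochs_without_replacement(n, num_epochs, seed=0):
    """Each epoch is a fresh random permutation: every sample index
    appears exactly once per epoch (shuffled SGD)."""
    rng = random.Random(seed)
    epochs = []
    for _ in range(num_epochs):
        perm = list(range(n))
        rng.shuffle(perm)
        epochs.append(perm)
    return epochs

def epochs_with_replacement(n, num_epochs, seed=0):
    """Each 'epoch' is n i.i.d. draws: indices may repeat or be skipped
    (the idealization most theoretical analyses assume)."""
    rng = random.Random(seed)
    return [[rng.randrange(n) for _ in range(n)] for _ in range(num_epochs)]

shuffled = epochs_without_replacement(8, 3)
iid = epochs_with_replacement(8, 3)
```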
Deep neural network approximation theory for high-dimensional functions.
Pierfrancesco Beneventano, Prof. Patrick Cheridito, Robin Graeber, Prof. Arnulf Jentzen, and Benno Kuckuck.

We study the capacity of neural networks to approximate high-dimensional functions without suffering from the curse of dimensionality. We prove that they can, on any compact domain, for a vast new class of functions.
High-dimensional approximation spaces of artificial neural networks and applications to partial differential equations.
Pierfrancesco Beneventano, Prof. Patrick Cheridito, Prof. Arnulf Jentzen, and Philippe von Wurstemberger.

We develop new machinery to study the capacity of neural networks to approximate high-dimensional functions without suffering from the curse of dimensionality. We prove that this is the case, for example, for the solutions of a certain family of PDEs.
Besides that...

Lately, I have become increasingly interested in the ethical aspects of my job. I am helping organize several events to raise the community's awareness of the responsibilities that modelers and statisticians bear for the high-stakes decisions and policies based on their work. See the CEST-UCL seminar series on responsible modelling and check out our conference.

I'm on the committee of the Princeton AI Club (PAIC)! Follow us on Twitter! We'll host a lot of exciting talks featuring Yoshua Bengio, Max Welling, Chelsea Finn, Tamara Broderick, and others.

Talks, Visits, and Random News


[Jun '24]    Andrea Pinto, Prof. Tomaso Poggio, and I posted our new article How Neural Networks Learn the Support is an Implicit Regularization Effect of SGD.
[14 Jun '24]    I'm giving a talk at ETH Zurich. Thanks a lot to Niao He and Zebang Shen for inviting me!
[1 Apr '24]    I'm giving a talk at Brown. Thanks a lot to Govind Menon for inviting me!
[18 Mar '24]    I'm giving a talk at MIT. Thanks a lot to Prof. Tomaso Poggio for inviting me!
[24 Jan '24]    I'm giving a talk at Sapienza in Rome. Thanks a lot to Matteo Negri for inviting me!
[11 Jan '24]    I'm giving a talk hosted by the group MaLGa at University of Genoa. Thanks a lot to Lorenzo Rosasco, Andrea Della Vecchia, and all the fantastic people of the group for hosting me!
[Dec '23]    I finally posted on arXiv the article on the project I've been working on for 2+ years: On the Trajectories of SGD Without Replacement. Please check it out and let me know if you have any feedback, questions, or thoughts for a future project!
[Aug '23]    I am in Cargese, FR, for the conference Statistical Physics and Machine Learning Back Together Again. Thanks to the organizers!
[Dec '22 and May '23]    I was interviewed in Italian at Zapping on Rai Radio 1, an Italian public radio channel, on AI, ChatGPT, and the related privacy and legal issues.
[Sept '22 - Jan '23]    I'm at AWS AI Labs in Santa Clara, CA! Please reach out if you are located in the Bay Area or you're a colleague at AWS!
[May - Oct '22]    I'm a committee member of the Princeton AI Club (PAIC)! Please follow us on Twitter! We'll host a lot of exciting talks featuring Yoshua Bengio, Max Welling, Chelsea Finn, Tamara Broderick, and others.
[Apr '22]    I passed my general examination and obtained my incidental MSc in Operations Research and Financial Engineering from Princeton! I am super thankful to my advisers Prof. Boris Hanin and Prof. Jason D. Lee. Also, a big thank you to the rest of my committee: Prof. Jianqing Fan and Prof. Bartolomeo Stellato!
[28 Mar - 2 Apr '22]    I will visit Berkeley for a week!
[15 Dec '21, 5pm UK]    I am chairing the event with Prof. Bin Yu, organized within the CEST-UCL Seminar series on responsible modelling.
[1 Dec '21, 5pm UK]    I will be a panelist at the event with Prof. Mary Morgan on the importance of the narrative in mathematical modelling. Organized within the CEST-UCL Seminar series on responsible modelling.
[3 Nov '21, 5pm UK]    I will be a panelist at the event with Prof. Cynthia Rudin on the relationship between the assumptions of a model and its interpretability. Organized within the CEST-UCL Seminar series on responsible modelling.
[Oct '21 - Apr '22]    With a group of friends at CEST, I'm organizing a series of seminars on "Responsible modelling in uncertain times: ethics of quantification in action", supported by INET and hosted by UCL IIPP. Please check out the website!
[Jul '21]    I'm attending the Deep Learning Theory Summer School at Princeton organized by Prof. Boris Hanin.
[Jun '21]    With a group of friends at CEST, I'm organizing a conference on "Forecasting the future for sustainable development: New Approaches to Modelling and the Science of Prediction", supported by INET and hosted by the OECD. Please check out the website!
I will chair the session on Explainable and Interpretable ML, and I'm glad to announce that Prof. Cynthia Rudin will give a keynote lecture on the topic. Moreover, IBM's AI Ethics team will organize a workshop on XAI. We will also have many other amazing guests and interesting sessions, so please check it out!


Teaching
Optimization, Princeton University, Spring 2024.

Audience: Undergraduate students from various programs.

Tasks: Teach precepts, hold office hours, grade homework and exams, course organization.

Contents: An introduction to several fundamental and practically-relevant areas of modern optimization and numerical computing. Topics include computational linear algebra, first and second order descent methods, convex sets and functions, basics of linear and semidefinite programming, optimization for statistical regression and classification, and techniques for dealing with uncertainty and intractability in optimization problems. Extensive hands-on experience with high-level optimization software. Applications drawn from operations research, statistics and machine learning, economics, control theory, and engineering.
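As a taste of the first-order methods in the syllabus, gradient descent on a simple quadratic can be sketched in a few lines (a minimal illustration of the idea, not actual course material):

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Basic first-order descent: repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 2)^2; its gradient is 2 * (x - 2), minimum at x = 2.
x_star = gradient_descent(lambda x: 2.0 * (x - 2.0), x0=0.0)
```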

Computing and Optimization for the Physical and Social Sciences, Princeton University, Fall 2023.

Audience: Undergraduate students from various programs.

Tasks: Hold office hours, grade homework and exams, course organization.

Contents: An introduction to several fundamental and practically-relevant areas of modern optimization and numerical computing. Topics include computational linear algebra, first and second order descent methods, convex sets and functions, basics of linear and semidefinite programming, optimization for statistical regression and classification, and techniques for dealing with uncertainty and intractability in optimization problems. Extensive hands-on experience with high-level optimization software. Applications drawn from operations research, statistics and machine learning, economics, control theory, and engineering.

Analysis of Big Data, Princeton University, Spring 2023.

Audience: Undergraduate students from various programs.

Tasks: Head TA. Teach precepts, hold office hours, grade homework and exams.

Contents: This course is a theoretically oriented introduction to the statistical tools that underpin modern machine learning, whose hallmarks are large datasets and/or complex models. Topics include a rigorous analysis of dimensionality reduction, a survey of models ranging from regression to neural networks, and an analysis of learning algorithms.

Analysis of Big Data, Princeton University, Spring 2022.

Audience: Undergraduate students from various programs.

Tasks: Teach precepts, hold office hours, grade homework and exams.

Contents: This course is a theoretically oriented introduction to the statistical tools that underpin modern machine learning, whose hallmarks are large datasets and/or complex models. Topics include a rigorous analysis of dimensionality reduction, a survey of models ranging from regression to neural networks, and an analysis of learning algorithms.

Energy and Commodities Markets, Princeton University, Fall 2021.

Audience: Master in Finance students and BSc students in Operations Research and Financial Engineering.

Tasks: Teach precepts, hold office hours, grade homework and exams (Python, Excel, and theory).

Contents: This course is an introduction to commodities markets (oil, gas, metals, electricity, etc.), and quantitative approaches to capturing uncertainties in their demand and supply. We start from a financial perspective, and traditional models of commodity spot prices and forward curves. Then we cover modern topics: game theoretic models of energy production (OPEC vs. fracking vs. renewables); quantifying the risk of intermittency of solar and wind output on the reliability of the electric grid (mitigating the duck curve); financialization of commodity markets; carbon emissions markets; cryptocurrencies as commodities. We also discuss economic and policy implications.

Numerical Methods for Partial Differential Equations, ETH Zurich, Spring 2020.

Audience: Master's students in Physics, Data Science, and Computational Biology; Bachelor's students in Computational Science and Engineering.

Tasks: Teach precepts and grade homework (C++ and theory).

Contents: Derivation, properties, and implementation of fundamental numerical methods for a few key partial differential equations: convection-diffusion, heat equation, wave equation, conservation laws. Implementation in C++ based on a finite element library.
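The simplest scheme in this family, an explicit finite-difference step for the 1-D heat equation, can be sketched as follows (in Python rather than the course's C++; a minimal illustration, not course material):

```python
def heat_step(u, dt, dx, kappa=1.0):
    """One explicit finite-difference (forward Euler) step for the 1-D heat
    equation u_t = kappa * u_xx with zero Dirichlet boundary values.
    Stable when kappa * dt / dx**2 <= 1/2."""
    new = u[:]
    for i in range(1, len(u) - 1):
        new[i] = u[i] + kappa * dt / dx**2 * (u[i - 1] - 2.0 * u[i] + u[i + 1])
    return new

# A heat spike in the middle of a rod diffuses outwards and decays.
u = [0.0] * 11
u[5] = 1.0
for _ in range(50):
    u = heat_step(u, dt=0.004, dx=0.1)
```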

Computational Methods for Engineering Applications, ETH Zurich, Fall 2019.
Audience: Bachelor's students in Mechanical Engineering.

Tasks: Teach precepts and grade homework (C++ and theory).

Contents: An introduction to the numerical methods for solving the ordinary and partial differential equations that play a central role in engineering applications. Both the basic theoretical concepts and the implementation techniques necessary to understand and master the methods are addressed.










Last modified on Jun 20th 2024.

Template credits to Jon Barron!