---
layout: default
title: Homepage of Antonio (Tony) SILVETI-FALLS
---
<img style='object-fit: contain; float: right; margin-left: 10px; margin-bottom: 5px; border-radius: 15%;'
src="images/presentation_SPARS.jpg"
alt="SPARS 2019 Conference Presentation" />
<h2>News</h2>
<ul class="news">
<li>
<span style="color: green;">HIRING:</span> I have openings for a PhD student and a master's student to join our team studying implicit differentiation and its applications in machine learning. Please contact me for more information.
</li>
<li>
New preprint: <a href="https://hal.science/hal-05312218">"Adaptive Conditional Gradient Descent"</a>, in which we develop an improved backtracking scheme for adaptive Frank-Wolfe and steepest descent!
</li>
<li>
Our paper <a href="https://arxiv.org/abs/2506.01913">"Generalized Gradient Norm Clipping &amp; Non-Euclidean (L0,L1)-Smoothness"</a> has been accepted at NeurIPS 2025 as an oral presentation!
</li>
<li>
The code for <a href="https://github.com/LIONS-EPFL/scion">Scion</a> is out! Try our state-of-the-art optimizer for neural networks on your problem.
</li>
<li>
Our paper <a href="https://arxiv.org/abs/2502.07529">"Training Large Neural Networks with Norm-Constrained Linear Minimization Oracles"</a> has been accepted at ICML 2025 as a spotlight!
</li>
</ul>
<hr>
<h2>About Me</h2>
<p>My name is Antonio Silveti-Falls, but I mostly go by Tony. I am an associate professor (<a href="https://www.galaxie.enseignementsup-recherche.gouv.fr/ensup/pdf/EC_pays_etrangers/Tableau_comparaison_au_26_septembre_2012.pdf">maître de conférences</a> in the French system) in France doing research in nonsmooth optimization. Since September 2022 I have been at <a href="https://www.centralesupelec.fr/">CentraleSupélec</a>/<a href="https://www.universite-paris-saclay.fr/en">University of Paris-Saclay</a> in the <a href="https://cvn.centralesupelec.fr/">Centre pour la Vision Numérique laboratory</a>. I am also part of the <a href="https://opis-inria.eu/members/">INRIA team OPIS (OPtImization for large Scale biomedical data)</a>. Last but not least, I am a proud member of the <a href="https://fd-math.pages.centralesupelec.fr/">Fédération de Mathématiques</a> of CentraleSupélec.
</p>
<hr>
<h2>Research</h2>
<p>I work at the intersection of optimization theory and machine learning, with a focus on developing scalable algorithms for large-scale problems. Much of my research concerns conditional gradient (Frank-Wolfe) methods and their applications, for instance to deep learning and training large neural networks. I have broad interests in {nonsmooth, stochastic, non-Euclidean} optimization, such as using Bregman divergences or relaxed notions of smoothness, spanning both convex and nonconvex settings. I am also very interested in developing the theory of path differentiable functions and conservative calculus to study automatic differentiation, especially for implicitly defined functions.</p>