
Mathieu Dagréou

@Mat_Dag

Ph.D. student at @Inria_Saclay working on Optimization and Machine Learning. @matdag.bsky.social

Joined July 2019
Similar Users

Mathurin Massias @mathusmassias
Pierre Ablin @PierreAblin
Moreau Thomas @tomamoral
Anna Korba @Korba_Anna
Jérôme Bolte @jerome_bolte
Quentin Bertrand @Qu3ntinB
Franck Iutzeler @FranckIutzeler
Samuel Vaiter @vaiter
Bénédicte Colnet @BenedicteColnet
Rémi Gribonval @RemiGribonval
Michael Eli Sander @m_e_sander
Linus Bleistein @bleistein_linus
Robert M. Gower 🇺🇦 @gowerrobert
Samuel Hurault @HuraultSamuel
Nicolas Dobigeon @NicolasDobigeon

Pinned

📣📣 Preprint alert 📣📣 « A Lower Bound and a Near-Optimal Algorithm for Bilevel Empirical Risk Minimization » w. @tomamoral, @vaiter & @PierreAblin arxiv.org/abs/2302.08766 1/3


Mathieu Dagréou Reposted

Can deep learning finally compete with boosted trees on tabular data? 🌲 In our NeurIPS 2024 paper, we introduce RealMLP, a NN with improvements in all areas and meta-learned default parameters. Some insights about RealMLP and other models on large benchmarks (>200 datasets): 🧵


Mathieu Dagréou Reposted

New blog post: the Hutchinson trace estimator, or how to evaluate divergence/Jacobian trace cheaply. Fundamental for Continuous Normalizing Flows mathurinm.github.io/hutchinson/

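The idea behind the Hutchinson estimator can be sketched in a few lines of NumPy (an illustrative sketch, not the blog post's code): tr(A) is the expectation of zᵀAz over random Rademacher probes z, so it can be estimated using only matrix-vector products.

```python
import numpy as np

rng = np.random.default_rng(0)

def hutchinson_trace(matvec, dim, n_samples=1000):
    """Estimate tr(A) using only matrix-vector products z -> A @ z."""
    estimates = []
    for _ in range(n_samples):
        # Rademacher probe: entries are +1 or -1, so E[z z^T] = I
        # and hence E[z^T A z] = tr(A)
        z = rng.choice([-1.0, 1.0], size=dim)
        estimates.append(z @ matvec(z))
    return np.mean(estimates)

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
est = hutchinson_trace(lambda z: A @ z, dim=2)
# est is close to tr(A) = 5 up to Monte Carlo error
```

This is what makes the trick cheap for Jacobian traces: `matvec` can be a Jacobian-vector product from autodiff, and the full matrix is never materialized.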

Mathieu Dagréou Reposted

📣 Job alerts in Toulouse (Maths department) 📣 There are multiple job offers, from Master's internships to Assistant Professor positions, in the mathematics of data science, optimization, statistical fairness and robustness. I will try to gather them in this thread 🧵


Mathieu Dagréou Reposted

We also released Pixtral Large, a new SOTA vision model. mistral.ai/news/pixtral-l…


Mathieu Dagréou Reposted

🔒Image watermarking is promising for digital content protection. But images often undergo many modifications—spliced or altered by AI. Today at @AIatMeta, we released Watermark Anything that answers not only "where does the image come from," but "what part comes from where." 🧵


Mathieu Dagréou Reposted

I have multiple openings for M2 internship / PhD / postdoc in Nice (France) on topics related to bilevel optimization, automatic differentiation and safe machine learning. More details on my webpage samuelvaiter.com Contact me by email, and feel free to forward/RT :)


Mathieu Dagréou Reposted

Convergence of iterates does not imply convergence of their derivatives. Nevertheless, Gilbert (1994) proved a limit-derivative interchange theorem under strong assumptions on the spectrum of the derivatives. who.rocq.inria.fr/Jean-Charles.G…


Mathieu Dagréou Reposted

Leader-follower games, also known as Stackelberg games, are models in game theory where one player (the leader) makes a decision first, and the other player (the follower) responds, taking the leader's action into account. This is one of the first instances of bilevel optimization.

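The structure is easy to see on a toy quadratic game (a hypothetical example, not from the tweet): the follower best-responds to the leader's action x, and the leader optimizes knowing that response — exactly a bilevel problem.

```python
import numpy as np

def follower_best_response(x):
    # Follower minimizes g(x, y) = y**2 - x * y
    # dg/dy = 2*y - x = 0  =>  y*(x) = x / 2
    return x / 2

def leader_objective(x):
    # Leader anticipates the follower's response y*(x)
    y = follower_best_response(x)
    return (x - 1.0) ** 2 + y ** 2

# Leader solves the outer problem (simple grid search for illustration)
xs = np.linspace(-2.0, 2.0, 100001)
x_star = xs[np.argmin(leader_objective(xs))]
y_star = follower_best_response(x_star)
# Closed form for this game: x* = 0.8, y* = 0.4
```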

Mathieu Dagréou Reposted

Looking forward to talking about gradient clipping and (geometric) medians next Monday 🖇️

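For readers unfamiliar with the first topic, gradient norm clipping is a one-liner: rescale the gradient whenever its norm exceeds a threshold, preserving its direction (a generic sketch, not the talk's material).

```python
import numpy as np

def clip_gradient(grad, max_norm=1.0):
    """Rescale grad so its Euclidean norm never exceeds max_norm."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        # keep the direction, shrink the magnitude to max_norm
        return grad * (max_norm / norm)
    return grad

g = np.array([3.0, 4.0])              # norm 5
clipped = clip_gradient(g, max_norm=1.0)
# clipped has norm 1 and the same direction as g
```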

Mathieu Dagréou Reposted

The Łojasiewicz inequality provides a way to control how close a point is to the zero set of a real analytic function in terms of the value of the function itself. Extensions of this result to semialgebraic or o-minimal functions exist. matwbn.icm.edu.pl/ksiazki/sm/sm1…

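For reference, the classical statement can be written as follows (a sketch of the standard form, not transcribed from the linked paper):

```latex
% Łojasiewicz inequality: let f be real analytic on an open set
% U \subset \mathbb{R}^n, let Z = f^{-1}(0), and let K \subset U be compact.
% Then there exist C > 0 and \alpha > 0 such that
\operatorname{dist}(x, Z)^{\alpha} \le C \, |f(x)|
\qquad \text{for all } x \in K.
```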

Mathieu Dagréou Reposted

#ERCStG 🏆 | Congratulations to @TaylorAdrien, research scientist at @inria (@inria_paris centre) and member of the joint project team @Sierra_ML_Lab (@CNRS @CNRSinformatics @ENS_ULM), awarded an ERC Starting Grant 👏 Discover his portrait and his project 👉 buff.ly/3YlTBU6


Mathieu Dagréou Reposted

The proximal operator generalizes projection in convex optimization. It converts minimisers to fixed points. It is at the core of nonsmooth splitting methods and was first introduced by Jean-Jacques Moreau in 1965. numdam.org/article/BSMF_1…

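Two classic closed forms illustrate both claims in the tweet (a sketch of standard results, not the tweet's content): the prox of the L1 norm is soft-thresholding, and the prox of a set's indicator function is exactly the projection onto that set.

```python
import numpy as np

# prox_{lam * f}(v) = argmin_x  f(x) + (1 / (2 * lam)) * ||x - v||^2

def prox_l1(v, lam):
    """Soft-thresholding: the prox of lam * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_box_indicator(v, lo=-1.0, hi=1.0):
    """Prox of the indicator of [lo, hi]^n is the projection onto the box,
    showing how the prox generalizes projection."""
    return np.clip(v, lo, hi)

v = np.array([-2.0, 0.3, 1.5])
st = prox_l1(v, lam=0.5)          # [-1.5, 0. , 1. ]
pr = prox_box_indicator(v)        # [-1. , 0.3, 1. ]
```

The fixed-point property mentioned in the tweet is that x* minimizes f if and only if x* = prox_{lam f}(x*), which is what splitting methods iterate on.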

Mathieu Dagréou Reposted

Generating images via diffusion is cool, but _not_ generating certain images is vital. E.g., an external set of protected images, or already generated images, to increase diversity. Our new diffusion post-hoc intervention SPELL does that. 🧵 1/6 📖 arxiv.org/abs/2410.06025


Mathieu Dagréou Reposted

Bilevel optimization problems with multiple inner solutions typically come in two flavors: optimistic and pessimistic. Optimistic assumes the inner problem selects the best solution for the outer objective, while pessimistic assumes the worst-case solution is chosen.

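A tiny toy problem makes the gap between the two flavors concrete (an illustrative construction, not from the tweet): when the inner problem has two minimizers, the outer value depends entirely on which one is "selected".

```python
# Inner problem g(x, y) = (y**2 - x**2)**2 has two minimizers, y = +x and
# y = -x, so the inner solution set is S(x) = {+x, -x}.

def outer(x, y):
    return (y - 1.0) ** 2 + 0.1 * x ** 2

def optimistic_value(x):
    # inner selects the minimizer most favorable to the outer objective
    return min(outer(x, x), outer(x, -x))

def pessimistic_value(x):
    # inner selects the worst-case minimizer for the outer objective
    return max(outer(x, x), outer(x, -x))

# At x = 1: optimistic picks y = +1 (value 0.1),
#           pessimistic picks y = -1 (value 4.1)
```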

Mathieu Dagréou Reposted

🏆Didn't get the Physics Nobel Prize this year, but really excited to share that I've been named one of the #FWIS2024 @FondationLOreal-@UNESCO French Young Talents alongside 34 amazing young researchers! This award recognizes my research on deep learning theory #WomenInScience 👩‍💻


#FWIS2024 🎖️ @SibylleMarcotte, PhD student in the #mathematics and applications department of ENS @psl_univ, is among the laureates of the 2024 France Young Talents Prize @FondationLOreal @UNESCO #ForWomenInScience @AcadSciences @4womeninscience Congratulations to her!!! 👏



Thanks @vaiter, @tomamoral and @PierreAblin for your great support during this journey

Congratulations to Dr. Dagréou @Mat_Dag for a brilliant PhD defense! @tomamoral, @PierreAblin and I were lucky to have you as a student.



Mathieu Dagréou Reposted

🎉✨ Our paper "Geodesic Optimization for Predictive Shift Adaptation on EEG data" has been accepted as a Spotlight at @NeurIPSConf! #NeurIPS2024 arxiv.org/abs/2407.03878 @AntoineCollas @sylvcheva @agramfort @dngman 1/7


Mathieu Dagréou Reposted

How fast is gradient descent, *for real*? Some (partial) answers in this new blog post on scaling laws for optimization. francisbach.com/scaling-laws-o…


Mathieu Dagréou Reposted

Our paper on Functional Bilevel Optimization (FBO) is accepted as a spotlight at #NeurIPS2024! TLDR: FBO shifts focus from parameters to the prediction function they represent, offering new algorithms that bridge bilevel optimization and deep learning. arxiv.org/abs/2403.20233


Mathieu Dagréou Reposted

We are thrilled to say that NeurIPS@Paris is back for a 4th edition on the 4th and 5th of December 2024 at @Sorbonne_Univ_ A great occasion to meet + discuss recent advances in ML in central Paris! More info: neuripsinparis.github.io/neurips2024par… Registration: forms.gle/Fc4nbeW6ubahYb…

