BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//UM//UM*Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/Detroit
TZURL:http://tzurl.org/zoneinfo/America/Detroit
X-LIC-LOCATION:America/Detroit
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20070311T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20071104T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20260224T133525Z
DTSTART;TZID=America/Detroit:20260408T160000
DTEND;TZID=America/Detroit:20260408T170000
SUMMARY:Workshop / Seminar: Robust and Risk-Sensitive Acceleration in Gradient Methods
DESCRIPTION:First-order methods such as gradient descent (GD) are foundational in optimization. In unconstrained problems with exact gradients\, momentum-based methods—most notably Nesterov’s accelerated gradient descent (AGD) and Polyak’s heavy-ball (HB) method—achieve faster convergence by improving dependence on the condition number. However\, this acceleration comes at a cost: momentum amplifies gradient noise\, making these methods less robust than GD under standard parameter choices and requiring more accurate gradient estimates to attain comparable accuracy. Similar challenges arise in convex and nonconvex min–max optimization.\nMotivated by applications in machine learning\, this talk studies unconstrained and min–max optimization under deterministic\, unbiased stochastic\, and biased stochastic gradient noise. I will present new algorithms that achieve optimal robustness against different noise types\, using control-theoretic tools such as the H_2 norm\, the H_∞ norm\, and the risk-sensitivity index\, together with coherent risk measures. I will also discuss worst-case noise constructions and high-probability convergence guarantees. This perspective builds a bridge between optimization and robust control theory and enables the design of noise-robust and risk-sensitive accelerated methods.\nRepresentative Publications:\nM. Gürbüzbalaban\, Y. Syed\, N. S. Aybat\, Accelerated gradient methods with biased gradient estimates: Risk sensitivity\, high-probability guarantees\, and large deviation bounds\, Journal of Nonlinear and Variational Analysis\, 2026 (Special Issue). https://jnva.biemdas.com/archives/2927\nM. Gürbüzbalaban\, Robustly Stable Accelerated Momentum Methods with a Near-Optimal L_2 Gain and H_∞ Performance\, Mathematics of Operations Research\, 2025.\nhttps://pubsonline.informs.org/doi/abs/10.1287/moor.2023.0321\nB. Can and M. Gürbüzbalaban\, Entropic risk-averse generalized momentum methods\, Optimization Methods and Software\, 2025.\nhttps://www.tandfonline.com/doi/abs/10.1080/10556788.2025.2549356
UID:141373-21888712@events.umich.edu
URL:https://events.umich.edu/event/141373
CLASS:PUBLIC
STATUS:CONFIRMED
CATEGORIES:Mathematics
LOCATION:East Hall - 1360
END:VEVENT
END:VCALENDAR