BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//UM//UM*Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/Detroit
TZURL:http://tzurl.org/zoneinfo/America/Detroit
X-LIC-LOCATION:America/Detroit
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20070311T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20071104T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20241002T152047Z
DTSTART;TZID=America/Detroit:20241002T160000
DTEND;TZID=America/Detroit:20241002T170000
SUMMARY:Workshop / Seminar:Sociology Senior Thesis + Honors Program Info Session
DESCRIPTION:Interested in writing an honors thesis in Sociology? Join us for an information session with the Honors Coordinator\, Professor Karin Martin\, and the Undergraduate Program Coordinator\, Lauren Eddelbuettel. Explore the program and application process\, hear from current honors students\, and ask any questions you may have.
UID:126790-21857913@events.umich.edu
URL:https://events.umich.edu/event/126790
CLASS:PUBLIC
STATUS:CONFIRMED
CATEGORIES:Sessions
LOCATION:
CONTACT:
END:VEVENT
BEGIN:VEVENT
DTSTAMP:20240929T164001Z
DTSTART;TZID=America/Detroit:20241002T160000
DTEND;TZID=America/Detroit:20241002T170000
SUMMARY:Workshop / Seminar:Student ML Seminar: Lightning Talks on Attention-based Architectures\, Diffusion Models and State Space Models
DESCRIPTION:The event will be divided into 3 lightning talks with the following abstracts.\n\nAttention-based Architectures: We will quickly discuss the basic architecture of transformers and intuitions behind some of its components. As time permits\, we will also look through Andrej Karpathy’s “build-nanogpt” repository to precisely understand some implementation details. \n\nDiffusion Models: Diffusion models\, in essence\, work by learning to denoise data that had been noised with a prescribed forward process. While much work has been done on different mathematical formulations of the process\, the architectural part that actually does the learning is often divorced from the former and relegated to a footnote in the articles. In this lightning talk\, I will illustrate the development and workings of the main backbone architectures used in diffusion models\, from the ubiquitous U-Net to approaches utilizing Transformers.\n\nState Space Models: Much recent excitement has centered on sequential data modeling\, which has been recently dominated by the popular Transformer architecture. Transformers\, however\, suffer from a quadratic memory requirement in sequence length\, limiting its use for the ever-increasing context window demands. State-space models (SSMs)\, in particular Mamba\, have arisen as a potential replacement for Transformers\, having linear memory scaling and comparable predictive performance. In this talk\, we discuss the fundamentals of SSMs from the perspective of linear dynamical systems and the core of the Mamba architecture.
UID:127101-21858419@events.umich.edu
URL:https://events.umich.edu/event/127101
CLASS:PUBLIC
STATUS:CONFIRMED
CATEGORIES:Mathematics
LOCATION:East Hall - B737
CONTACT:
END:VEVENT
END:VCALENDAR