BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IDSS STAGE - ECPv6.15.11//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://idss-stage.mit.edu
X-WR-CALDESC:Events for IDSS STAGE
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20180311T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20181104T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20190310T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20191103T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20200308T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20201101T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20190918T160000
DTEND;TZID=America/New_York:20190918T170000
DTSTAMP:20260406T115113Z
CREATED:20190916T194901Z
LAST-MODIFIED:20190916T194901Z
UID:10702-1568822400-1568826000@idss-stage.mit.edu
SUMMARY:Probabilistic Modeling meets Deep Learning using TensorFlow Probability
DESCRIPTION:IDS.190 – Topics in Bayesian Modeling and Computation \nSpeaker: \nBrian Patton (Google AI) \nAbstract: \nTensorFlow Probability provides a toolkit to enable researchers and practitioners to integrate uncertainty with gradient-based deep learning on modern accelerators. In this talk we’ll walk through some practical problems addressed using TFP; discuss the high-level interfaces\, goals\, and principles of the library; and describe some recent innovations in expressing probabilistic graphical models. Time permitting\, we may touch on a couple of areas of research interest for the team.\n\n–\n\n**Taking IDS.190 satisfies the seminar requirement for students in MIT’s Interdisciplinary Doctoral Program in Statistics (IDPS)\, but formal registration is open to any graduate student who can register for MIT classes. For more information and an up-to-date schedule\, please see https://stellar.mit.edu/S/course/IDS/fa19/IDS.190/\n\n**Meetings are open to any interested researcher.
URL:https://stat.mit.edu/calendar/probabilistic-modeling-meets-deep-learning-using-tensorflow-probability/
LOCATION:E18-304\, United States
CATEGORIES:IDS.190 - Topics in Bayesian Modeling and Computation
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20190911T160000
DTEND;TZID=America/New_York:20190911T170000
DTSTAMP:20260406T115113Z
CREATED:20190910T184518Z
LAST-MODIFIED:20190910T190807Z
UID:10666-1568217600-1568221200@idss-stage.mit.edu
SUMMARY:Automated Data Summarization for Scalability in Bayesian Inference
DESCRIPTION:IDS.190 – Topics in Bayesian Modeling and Computation \nAbstract: \nMany algorithms take prohibitively long to run on modern\, large datasets. But even in complex datasets\, many data points may be at least partially redundant for some task of interest. So one might instead construct and use a weighted subset of the data (called a “coreset”) that is much smaller than the original dataset. Running algorithms on a much smaller dataset typically takes much less computing time\, but it remains to understand whether the output can be widely useful. (1) In particular\, can running an analysis on a smaller coreset yield answers close to those from running on the full dataset? (2) And can useful coresets be constructed automatically for new analyses\, with minimal extra work from the user? We answer in the affirmative for a wide variety of problems in Bayesian inference. We demonstrate how to construct “Bayesian coresets” as an automatic\, practical pre-processing step. We prove that our method provides geometric decay in relevant approximation error as a function of coreset size. Empirical analysis shows that our method reduces approximation error by orders of magnitude relative to uniform random subsampling of data. Though we focus on Bayesian methods here\, we also show that our construction can be applied in other domains. \nBiography: \nTamara Broderick is an Associate Professor in EECS at MIT. \n**Meetings are open to any interested researcher. \n**Taking IDS.190 satisfies the seminar requirement for students in MIT’s Interdisciplinary Doctoral Program in Statistics (IDPS)\, but formal registration is open to any graduate student who can register for MIT classes. For more information and an up-to-date schedule\, please see https://stellar.mit.edu/S/course/IDS/fa19/IDS.190/
URL:https://stat.mit.edu/calendar/automated-data-summarization-for-scalability-in-bayesian-inference/
LOCATION:E18-304\, United States
CATEGORIES:IDS.190 - Topics in Bayesian Modeling and Computation
END:VEVENT
END:VCALENDAR