BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IDSS STAGE - ECPv6.15.11//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:IDSS STAGE
X-ORIGINAL-URL:https://idss-stage.mit.edu
X-WR-CALDESC:Events for IDSS STAGE
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20150308T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20151101T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20160313T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20161106T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20170312T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20171105T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20180311T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20181104T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20190310T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20191103T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:UTC
BEGIN:STANDARD
TZOFFSETFROM:+0000
TZOFFSETTO:+0000
TZNAME:UTC
DTSTART:20160101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20180223T110000
DTEND;TZID=America/New_York:20180223T120000
DTSTAMP:20260517T091030Z
CREATED:20171215T164243Z
LAST-MODIFIED:20180123T190050Z
UID:7152-1519383600-1519387200@idss-stage.mit.edu
SUMMARY:Optimization's Implicit Gift to Learning: Understanding Optimization Bias as a Key to Generalization
DESCRIPTION:Abstract: \nIt is becoming increasingly clear that the implicit regularization afforded by optimization algorithms plays a central role in machine learning\, especially when using large\, deep neural networks. We have a good understanding of the implicit regularization afforded by stochastic approximation algorithms\, such as SGD\, and as I will review\, we understand and can characterize the implicit bias of different algorithms\, and can design algorithms with specific biases. But in this talk I will focus on the implicit biases of deterministic algorithms on underdetermined problems. In an effort to uncover the implicit biases of gradient-based optimization of neural networks\, which hold the key to their empirical success\, I will discuss recent work on implicit regularization for matrix factorization and for linearly separable problems with monotone decreasing loss functions. \nBio: \nProfessor Nati Srebro obtained his PhD at the Massachusetts Institute of Technology (MIT) in 2004\, held a post-doctoral fellowship with the Machine Learning Group at the University of Toronto\, and was a Visiting Scientist at IBM Haifa Research Labs. Since January 2006\, he has been on the faculty of the Toyota Technological Institute at Chicago (TTIC) and the University of Chicago\, and has also served as the first Director of Graduate Studies at TTIC. From 2013 to 2014 he was an associate professor at the Technion-Israel Institute of Technology. Prof. Srebro’s research encompasses methodological\, statistical and computational aspects of Machine Learning\, as well as related problems in Optimization. Some of Prof. Srebro’s significant contributions include work on learning “wider” Markov networks\, introducing the use of the nuclear norm for machine learning and matrix reconstruction\, work on fast optimization techniques for machine learning\, and work on the relationship between learning and optimization.
URL:https://idss-stage.mit.edu/calendar/stochastics-and-statistics-seminar-4/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20180216T110000
DTEND;TZID=America/New_York:20180216T120000
DTSTAMP:20260517T091030Z
CREATED:20171207T154519Z
LAST-MODIFIED:20180118T181839Z
UID:7109-1518778800-1518782400@idss-stage.mit.edu
SUMMARY:User-friendly guarantees for the Langevin Monte Carlo
DESCRIPTION:Abstract: \nIn this talk\, I will revisit the recently established theoretical guarantees for the convergence of the Langevin Monte Carlo algorithm for sampling from a smooth and (strongly) log-concave density. I will discuss the existing results when the accuracy of sampling is measured in the Wasserstein distance\, and provide further insights on the relations between Langevin Monte Carlo for sampling\, on the one hand\, and gradient descent for optimization\, on the other. I will also present non-asymptotic guarantees for the accuracy of a version of the Langevin Monte Carlo algorithm that is based on inaccurate evaluations of the gradient. Finally\, I will propose a variable-step version of the Langevin Monte Carlo algorithm that has two advantages. First\, its step-sizes are independent of the target accuracy and\, second\, its rate provides a logarithmic improvement over the constant-step Langevin Monte Carlo algorithm.\nThis is joint work with A. Karagulyan.
URL:https://idss-stage.mit.edu/calendar/stochastics-and-statistics-seminar-arnak-dalalyan-enseacrest/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20180209T110000
DTEND;TZID=America/New_York:20180209T120000
DTSTAMP:20260517T091030Z
CREATED:20171207T154146Z
LAST-MODIFIED:20180119T204343Z
UID:7106-1518174000-1518177600@idss-stage.mit.edu
SUMMARY:Variable selection using presence-only data with applications to biochemistry
DESCRIPTION:Abstract: \nIn a number of problems\, we are presented with positive and unlabelled data\, referred to as presence-only responses. The application I present today involves studying the relationship between protein sequence and function\; presence-only data arises since for many experiments it is impossible to obtain a large set of negative (non-functional) sequences. Furthermore\, if the number of variables is large and the goal is variable selection (as in this case)\, a number of statistical and computational challenges arise due to the non-convexity of the objective. In this talk\, I present an algorithm (PUlasso) with provable guarantees for variable selection and classification with presence-only data. Our algorithm uses the majorization-minimization (MM) framework\, a generalization of the well-known expectation-maximization (EM) algorithm. In particular\, to make our algorithm scalable\, it incorporates two computational speed-ups over the standard EM algorithm. I provide theoretical guarantees: we first show that our algorithm is guaranteed to converge to a stationary point\, and then prove that any stationary point achieves the minimax optimal mean-squared error of s log p / n\, where s is the sparsity of the true parameter. I also demonstrate through simulations that our algorithm outperforms state-of-the-art algorithms in moderate-p settings in terms of classification performance. Finally\, I demonstrate that our PUlasso algorithm performs well on a biochemistry example.
URL:https://idss-stage.mit.edu/calendar/stochastic-and-statistics-seminar-garvesh-raskutti-univ-of-wisconsin/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20180202T110000
DTEND;TZID=America/New_York:20180202T120000
DTSTAMP:20260517T091030Z
CREATED:20171228T200551Z
LAST-MODIFIED:20180123T191117Z
UID:7195-1517569200-1517572800@idss-stage.mit.edu
SUMMARY:Connections between structured estimation and weak submodularity
DESCRIPTION:Abstract: Many modern statistical estimation problems rely on imposing additional structure in order to reduce the statistical complexity and provide interpretability. Unfortunately\, these structures are often combinatorial in nature and result in computationally challenging problems. In parallel\, the combinatorial optimization community has devoted significant effort to developing algorithms that can approximately solve such optimization problems in a computationally efficient manner. The focus of this talk is to expand upon ideas that arise in combinatorial optimization and connect those algorithms and ideas to statistical questions. We will discuss three main vignettes: cardinality-constrained optimization\; low-rank matrix estimation problems\; and greedy estimation of sparse Fourier components. \nBio: Professor Negahban is currently an Assistant Professor in the Department of Statistics at Yale University. Prior to that\, he worked with Professor Devavrat Shah at MIT as a postdoc and with Prof. Martin J. Wainwright at UC Berkeley as a graduate student.
URL:https://idss-stage.mit.edu/calendar/stochastics-and-statistics-seminar-7/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20171201T110000
DTEND;TZID=America/New_York:20171201T120000
DTSTAMP:20260517T091030Z
CREATED:20171120T201126Z
LAST-MODIFIED:20180801T185333Z
UID:7017-1512126000-1512129600@idss-stage.mit.edu
SUMMARY:Challenges in Developing Learning Algorithms to Personalize Treatment in Real Time
DESCRIPTION:Abstract: \nA formidable challenge in designing sequential treatments is to determine when and in which context it is best to deliver treatments. Consider treatment for individuals struggling with chronic health conditions. Operationally\, designing the sequential treatments involves the construction of decision rules that input the current context of an individual and output a recommended treatment. That is\, the treatment is adapted to the individual’s context\; the context may include current health status\, current level of social support and current level of adherence\, for example. Data sets on individuals with records of time-varying context and treatment delivery can be used to inform the construction of the decision rules. There is much interest in personalizing the decision rules\, particularly in real time as the individual experiences sequences of treatment. Here we discuss our work in designing online “bandit” learning algorithms for use in personalizing mobile health interventions. \nBiography: \nSusan A. Murphy is Professor of Statistics\, Radcliffe Alumnae Professor at the Radcliffe Institute and Professor of Computer Science at the Harvard John A. Paulson School of Engineering and Applied Sciences\, all at Harvard University. Her lab focuses on improving sequential\, individualized decision making in health\, in particular on clinical trial design and data analysis to inform the development of just-in-time adaptive interventions in mobile health. The lab’s work is funded by the National Institute on Drug Abuse\, the National Institute on Alcohol Abuse and Alcoholism\, the National Heart\, Lung\, and Blood Institute\, and the National Institute of Biomedical Imaging and Bioengineering.\nSusan is a Fellow of the Institute of Mathematical Statistics\, a Fellow of the College on Problems of Drug Dependence\, a former editor of the Annals of Statistics\, a member of the US National Academy of Sciences\, a member of the US National Academy of Medicine and a 2013 MacArthur Fellow.
URL:https://idss-stage.mit.edu/calendar/challenges-in-developing-learning-algorithms-to-personalize-treatment-in-real-time/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20171117T110000
DTEND;TZID=America/New_York:20171117T120000
DTSTAMP:20260517T091030Z
CREATED:20171120T205246Z
LAST-MODIFIED:20180801T184930Z
UID:7021-1510916400-1510920000@idss-stage.mit.edu
SUMMARY:Generative Models and Compressed Sensing
DESCRIPTION:Abstract: \nThe goal of compressed sensing is to estimate a vector from an under-determined system of noisy linear measurements\, by making use of prior knowledge in the relevant domain. For most results in the literature\, the structure is represented by sparsity in a well-chosen basis. We show how to achieve guarantees similar to standard compressed sensing without employing sparsity at all. Instead\, we assume that the unknown vectors lie near the range of a generative model\, e.g. a GAN or a VAE. We show how the problems of image inpainting and super-resolution are special cases of our general framework. \nWe show how to generalize the RIP condition to generative models\, and that random Gaussian measurement matrices have this property with high probability. A Lipschitz condition on the generative neural network is the key technical issue for our results. \nTime permitting\, we will discuss follow-up work on how GANs can model causal structure in high-dimensional probability distributions. (Based on joint work with Ashish Bora\, Ajil Jalal\, Murat Kocaoglu\, Christopher Snyder and Eric Price) \nCode: https://github.com/AshishBora/csgm \nHomepage: users.ece.utexas.edu/~dimakis \nBiography: \nAlex Dimakis is an Associate Professor in the ECE department at the University of Texas at Austin. He received his Ph.D. in 2008 from UC Berkeley\, working with Martin Wainwright and Kannan Ramchandran. He received an NSF CAREER award\, a Google faculty research award and the Eli Jury dissertation award. He is the co-recipient of several best paper awards\, including the joint Information Theory and Communications Society Best Paper Award in 2012. He is currently serving as an associate editor for IEEE Transactions on Information Theory. His research interests include information theory\, coding theory and machine learning.
URL:https://idss-stage.mit.edu/calendar/generative-models-and-compressed-sensing/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20171103T110000
DTEND;TZID=America/New_York:20171103T120000
DTSTAMP:20260517T091030Z
CREATED:20171120T200525Z
LAST-MODIFIED:20171120T200525Z
UID:7014-1509706800-1509710400@idss-stage.mit.edu
SUMMARY:Statistics\, Computation and Learning with Graph Neural Networks
DESCRIPTION:Abstract: \nDeep Learning\, thanks mostly to convolutional architectures\, has recently transformed computer vision and speech recognition. The ability of these architectures to encode geometric stability priors\, while offering enough expressive power\, is at the core of their success. In such settings\, geometric stability is expressed in terms of local deformations\, and it is enforced thanks to localized convolutional operators that separate the estimation into scales. \nMany problems across the applied sciences\, from particle physics to recommender systems\, are formulated in terms of signals defined over non-Euclidean geometries\, and also come with strong geometric stability priors. In this talk\, I will present techniques that exploit geometric stability in general geometries with appropriate graph neural network architectures. We will show that these techniques can all be framed in terms of local graph generators such as the graph Laplacian. We will present some stability certificates\, as well as applications to computer graphics\, particle physics and graph estimation problems. In particular\, we will describe how graph neural networks can be used to reach statistical detection thresholds in community detection on random graph families\, and to attack hard combinatorial optimization problems\, such as the Quadratic Assignment Problem. \nBiography: \nJoan Bruna graduated from Universitat Politecnica de Catalunya (Barcelona\, Spain) in both Mathematics and Electrical Engineering. He obtained an M.Sc. in applied mathematics from ENS Cachan (France). He then became a research engineer at an image processing startup\, developing real-time video processing algorithms. He obtained his PhD in Applied Mathematics at Ecole Polytechnique (France). He was a postdoctoral researcher at the Courant Institute\, NYU\, New York\, and a fellow at Facebook AI Research.\nIn 2015\, he became Assistant Professor in the Statistics Department at UC Berkeley\, and starting Fall 2016 he joined the Courant Institute (NYU\, New York) as Assistant Professor in Computer Science\, Data Science and Mathematics (affiliated). His research interests include invariant signal representations\, high-dimensional statistics and stochastic processes\, deep learning and its applications to signal processing.
URL:https://idss-stage.mit.edu/calendar/statistics-computation-and-learning-with-graph-neural-networks/
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20171101T110000
DTEND;TZID=America/New_York:20171101T120000
DTSTAMP:20260517T091030Z
CREATED:20171120T181051Z
LAST-MODIFIED:20171120T192008Z
UID:7007-1509534000-1509537600@idss-stage.mit.edu
SUMMARY:Unbiased Markov chain Monte Carlo with couplings
DESCRIPTION:Abstract: Markov chain Monte Carlo methods provide consistent approximations of integrals as the number of iterations goes to infinity. However\, these estimators are generally biased after any fixed number of iterations\, which complicates parallel computation. In this talk I will explain how to remove this burn-in bias by using couplings of Markov chains and a telescopic sum argument\, inspired by Glynn & Rhee (2014). The resulting unbiased estimators can be computed independently in parallel\, and averaged. I will present coupling constructions for Metropolis-Hastings\, Gibbs and Hamiltonian Monte Carlo. The proposed methodology will be illustrated on various examples. If time permits\, I will describe how the proposed estimators can approximate the “cut” distribution that arises in Bayesian inference for misspecified models made of sub-models. \nThis is joint work with John O’Leary\, Yves F. Atchade and Jeremy Heng\, available at arxiv.org/abs/1708.03625 and arxiv.org/abs/1709.00404. \nBiography: Pierre Jacob has been an Assistant Professor of Statistics at Harvard University since 2015. Before that\, he was a postdoctoral research fellow at the University of Oxford and the National University of Singapore. He received his Ph.D. from Université Paris-Dauphine\, on computational methods for Bayesian inference. His current research is on algorithms amenable to parallel computing for Bayesian inference and model comparison\, with a focus on time series models. \nPierre E. Jacob\nAssistant Professor of Statistics\, Harvard University\npersonal website: sites.google.com/site/pierrejacob/\nblog: statisfaction.wordpress.com/
URL:https://idss-stage.mit.edu/calendar/unbiased-markov-chain-monte-carlo-with-couplings/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20171027T110000
DTEND;TZID=America/New_York:20171027T120000
DTSTAMP:20260517T091030Z
CREATED:20171002T194208Z
LAST-MODIFIED:20171120T180403Z
UID:6559-1509102000-1509105600@idss-stage.mit.edu
SUMMARY:Stochastics and Statistics Seminar - Amit Daniely (Google)
DESCRIPTION:Abstract: \nCan learning theory\, as we know it today\, form a theoretical basis for neural networks? I will try to discuss this question in light of two new results — one positive and one negative. \nBased on joint work with Roy Frostig\, Vineet Gupta and Yoram Singer\, and with Vitaly Feldman. \nBiography: \nAmit Daniely is an Assistant Professor at the Hebrew University of Jerusalem\, and a research scientist at Google Research\, Tel Aviv. Prior to that\, he was a research scientist at Google Research\, Mountain View. Before that\, he was a Ph.D. student at the Hebrew University of Jerusalem\, Israel\, supervised by Nati Linial and Shai Shalev-Shwartz. His main research interest is machine learning theory.
URL:https://idss-stage.mit.edu/calendar/stochastic-and-statistics-seminar-amit-daniely-google/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20171020T110000
DTEND;TZID=America/New_York:20171020T120000
DTSTAMP:20260517T091030Z
CREATED:20171002T193921Z
LAST-MODIFIED:20171006T202431Z
UID:6555-1508497200-1508500800@idss-stage.mit.edu
SUMMARY:Inference in dynamical systems and the geometry of learning group actions - Sayan Mukherjee (Duke)
DESCRIPTION:
URL:https://idss-stage.mit.edu/calendar/inference-in-dynamical-systems-and-the-geometry-of-learning-group-actions-sayan-mukherjee-duke/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=UTC:20171019T163000
DTEND;TZID=UTC:20171019T173000
DTSTAMP:20260517T091030Z
CREATED:20170831T230110Z
LAST-MODIFIED:20171002T193958Z
UID:6078-1508430600-1508434200@idss-stage.mit.edu
SUMMARY:Special Stochastics and Statistics Seminar - John Cunningham (Columbia)
DESCRIPTION:
URL:https://idss-stage.mit.edu/calendar/special-stochastics-and-statistics-seminar-john-cunningham-columbia/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20171013T110000
DTEND;TZID=America/New_York:20171013T120000
DTSTAMP:20260517T091030Z
CREATED:20171002T182143Z
LAST-MODIFIED:20171006T201516Z
UID:6549-1507892400-1507896000@idss-stage.mit.edu
SUMMARY:Additivity of Information in Deep Generative Network:  The I-MMSE Transform Method - Galen Reeves (Duke University)
DESCRIPTION:
URL:https://idss-stage.mit.edu/calendar/additivity-of-information-in-deep-generative-network-the-i-mmse-transform-method-galen-reeves-duke-university/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20171006T110000
DTEND;TZID=America/New_York:20171006T120000
DTSTAMP:20260517T091030Z
CREATED:20170929T210606Z
LAST-MODIFIED:20171002T162240Z
UID:6516-1507287600-1507291200@idss-stage.mit.edu
SUMMARY:Transport maps for Bayesian computation - Youssef Marzouk (MIT)
DESCRIPTION:
URL:https://idss-stage.mit.edu/calendar/transport-maps-for-bayesian-computation/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=UTC:20170908T110000
DTEND;TZID=UTC:20170908T120000
DTSTAMP:20260517T091030Z
CREATED:20170831T225546Z
LAST-MODIFIED:20170908T172603Z
UID:6074-1504868400-1504872000@idss-stage.mit.edu
SUMMARY:New Provable Techniques for Learning and Inference in Probabilistic Graphical Models
DESCRIPTION:Speaker: Andrej Risteski (Princeton University)\nA common theme in machine learning is succinct modeling of distributions over large domains. Probabilistic graphical models are one of the most expressive frameworks for doing this. The two major tasks involving graphical models are learning and inference. Learning is the task of calculating the “best fit” model parameters from raw data\, while inference is the task of answering probabilistic queries for a model with known parameters (e.g. what is the marginal distribution of a subset of variables\, after conditioning on the values of some other variables). Learning can be thought of as finding a graphical model that “explains” the raw data\, while the inference queries extract the “knowledge” the graphical model contains. \nI will focus on a few vignettes from my research which give new provable techniques for these tasks:\n– In the context of learning\, I will talk about method-of-moments techniques for learning noisy-or Bayesian networks\, which are used for modeling the causal structure of diseases and symptoms.\n– In the context of inference\, I will talk about a new understanding of a class of algorithms for calculating partition functions\, called variational methods\, through the lens of convex programming hierarchies. Time permitting\, I will also speak about MCMC methods for sampling from highly multimodal distributions using simulated tempering. \nThe talk will assume no background\, and is meant as a “meet and greet” talk surveying various questions I’ve worked on and am interested in. \nBiography:\nI work at the intersection of machine learning and theoretical computer science\, with the primary goal of designing provable and practical algorithms for problems arising in machine learning. Broadly\, this includes tasks like clustering\, maximum likelihood estimation\, inference\, and learning generative models. All of these tend to be non-convex in nature and intractable in general. However\, in practice\, a plethora of heuristics like gradient descent\, alternating minimization\, convex relaxations and variational methods work reasonably well. In my research\, I endeavor to understand realistic conditions under which we can guarantee the performance of these algorithms\, as well as to devise new\, more efficient methods. \nI was a PhD student in the Computer Science Department at Princeton University\, working under the advisement of Sanjeev Arora. Starting September 2017\, I will hold a joint position in the Institute for Data\, Systems\, and Society (IDSS) and the Applied Mathematics department at MIT\, as a Norbert Wiener Fellow and applied mathematics instructor respectively.
URL:https://idss-stage.mit.edu/calendar/new-provable-techniques-for-learning-and-inference-in-probabilistic-graphical-models/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170519T110000
DTEND;TZID=America/New_York:20170519T120000
DTSTAMP:20260517T091030Z
CREATED:20190627T212122Z
LAST-MODIFIED:20190627T212122Z
UID:10084-1495191600-1495191600@idss-stage.mit.edu
SUMMARY:Fast Rates for Bandit Optimization with Upper-Confidence Frank-Wolfe
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science\, including probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs and graduate students from different departments affiliated with IDSS. Every week\, this audience is supplemented by attendees from across MIT\, often more than doubling the audience\, reflecting the interdisciplinary nature of the seminar.
URL:https://idss-stage.mit.edu/calendar/fast-rates-for-bandit-optimization-with-upper-confidence-frank-wolfe-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170512T110000
DTEND;TZID=America/New_York:20170512T120000
DTSTAMP:20260517T091030Z
CREATED:20190627T212123Z
LAST-MODIFIED:20190627T212123Z
UID:10086-1494586800-1494586800@idss-stage.mit.edu
SUMMARY:Invariance and Causality
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science\, including probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs and graduate students from different departments affiliated with IDSS. Every week\, this audience is supplemented by attendees from across MIT\, often more than doubling the audience\, reflecting the interdisciplinary nature of the seminar.
URL:https://idss-stage.mit.edu/calendar/invariance-and-causality-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170505T110000
DTEND;TZID=America/New_York:20170505T120000
DTSTAMP:20260517T091030Z
CREATED:20190627T212126Z
LAST-MODIFIED:20190627T212126Z
UID:10088-1493982000-1493982000@idss-stage.mit.edu
SUMMARY:Some related phase transitions in phylogenetics and social network analysis 
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science\, including probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs and graduate students from different departments affiliated with IDSS. Every week\, this audience is supplemented by attendees from across MIT\, often more than doubling the audience\, reflecting the interdisciplinary nature of the seminar.
URL:https://idss-stage.mit.edu/calendar/some-related-phase-transitions-in-phylogenetics-and-social-network-analysis-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170428T110000
DTEND;TZID=America/New_York:20170428T120000
DTSTAMP:20260517T091030Z
CREATED:20190627T212126Z
LAST-MODIFIED:20190627T212126Z
UID:10089-1493377200-1493377200@idss-stage.mit.edu
SUMMARY:Testing properties of distributions over big domains
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science\, including probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs and graduate students from different departments affiliated with IDSS. Every week\, this audience is supplemented by attendees from across MIT\, often more than doubling the audience\, reflecting the interdisciplinary nature of the seminar.
URL:https://idss-stage.mit.edu/calendar/testing-properties-of-distributions-over-big-domains-2/
LOCATION:32-141\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170414T110000
DTEND;TZID=America/New_York:20170414T120000
DTSTAMP:20260517T091030
CREATED:20190627T212126Z
LAST-MODIFIED:20190627T212126Z
UID:10092-1492167600-1492167600@idss-stage.mit.edu
SUMMARY:Active learning with seed examples and search queries
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/active-learning-with-seed-examples-and-search-queries-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170407T110000
DTEND;TZID=America/New_York:20170407T120000
DTSTAMP:20260517T091030
CREATED:20190627T212127Z
LAST-MODIFIED:20190627T212127Z
UID:10096-1491562800-1491562800@idss-stage.mit.edu
SUMMARY:Sample-optimal inference\, computational thresholds\, and the methods of moments 
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/sample-optimal-inference-computational-thresholds-and-the-methods-of-moments-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170324T110000
DTEND;TZID=America/New_York:20170324T120000
DTSTAMP:20260517T091030
CREATED:20190627T212132Z
LAST-MODIFIED:20190627T212132Z
UID:10099-1490353200-1490353200@idss-stage.mit.edu
SUMMARY:Jagers-Nerman stable age distribution theory\, change point detection and power of two choices in evolving networks
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/jagers-nerman-stable-age-distribution-theory-change-point-detection-and-power-of-two-choices-in-evolving-networks-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170317T110000
DTEND;TZID=America/New_York:20170317T120000
DTSTAMP:20260517T091030
CREATED:20190627T212133Z
LAST-MODIFIED:20190627T212133Z
UID:10101-1489748400-1489748400@idss-stage.mit.edu
SUMMARY:Probabilistic factorizations of big tables and networks
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/probabilistic-factorizations-of-big-tables-and-networks-2/
LOCATION:32-141\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170310T110000
DTEND;TZID=America/New_York:20170310T120000
DTSTAMP:20260517T091030
CREATED:20190627T212133Z
LAST-MODIFIED:20190627T212133Z
UID:10102-1489143600-1489143600@idss-stage.mit.edu
SUMMARY:Robust Statistics\, Revisited
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/robust-statistics-revisited-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170303T110000
DTEND;TZID=America/New_York:20170303T120000
DTSTAMP:20260517T091030
CREATED:20190627T212134Z
LAST-MODIFIED:20190627T212134Z
UID:10106-1488538800-1488538800@idss-stage.mit.edu
SUMMARY:Computing partition functions by interpolation
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/computing-partition-functions-by-interpolation-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170224T110000
DTEND;TZID=America/New_York:20170224T120000
DTSTAMP:20260517T091030
CREATED:20190627T212135Z
LAST-MODIFIED:20190627T212135Z
UID:10108-1487934000-1487934000@idss-stage.mit.edu
SUMMARY:Estimating the number of connected components of large graphs based on subgraph sampling
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/estimating-the-number-of-connected-components-of-large-graphs-based-on-subgraph-sampling-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170217T110000
DTEND;TZID=America/New_York:20170217T120000
DTSTAMP:20260517T091030
CREATED:20190627T212136Z
LAST-MODIFIED:20190627T212136Z
UID:10111-1487329200-1487329200@idss-stage.mit.edu
SUMMARY:Causal Discovery in Systems with Feedback Cycles
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/causal-discovery-in-systems-with-feedback-cycles-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170210T110000
DTEND;TZID=America/New_York:20170210T120000
DTSTAMP:20260517T091030
CREATED:20190627T212137Z
LAST-MODIFIED:20190627T212137Z
UID:10114-1486724400-1486724400@idss-stage.mit.edu
SUMMARY:Slope meets Lasso in sparse linear regression
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/slope-meets-lasso-in-sparse-linear-regression-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20170203T110000
DTEND;TZID=America/New_York:20170203T120000
DTSTAMP:20260517T091030
CREATED:20190627T212137Z
LAST-MODIFIED:20190627T212137Z
UID:10115-1486119600-1486119600@idss-stage.mit.edu
SUMMARY:Non-classical Berry-Esseen inequality and accuracy of the weighted bootstrap
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/non-classical-berry-esseen-inequality-and-accuracy-of-the-weighted-bootstrap-2/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20161202T110000
DTEND;TZID=America/New_York:20161202T120000
DTSTAMP:20260517T091030
CREATED:20190627T212143Z
LAST-MODIFIED:20190627T212143Z
UID:10119-1480676400-1480676400@idss-stage.mit.edu
SUMMARY:Shotgun Assembly of Graphs
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/shotgun-assembly-of-graphs-2/
LOCATION:25-111\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20161118T110000
DTEND;TZID=America/New_York:20161118T120000
DTSTAMP:20260517T091030
CREATED:20190627T212143Z
LAST-MODIFIED:20190627T212143Z
UID:10122-1479466800-1479466800@idss-stage.mit.edu
SUMMARY:Interpretable prediction models for network-linked data
DESCRIPTION:The Stochastics and Statistics Seminar is a weekly meeting organized by the Statistics and Data Science Center (SDSC). It consists of a series of one-hour presentations by worldwide leaders making cutting-edge contributions to methodological and theoretical advances in data science. These fields include probability\, statistics\, optimization\, and applied mathematics. The seminar also regularly features experts in application domains such as biology or engineering. This intellectual diversity has contributed to the organic assembly of a dynamic and diverse audience built around a core group of faculty\, postdocs\, and graduate students from different departments and affiliated with IDSS. Every week\, this audience is supplemented\, and often more than doubled\, by attendees from across MIT\, reflecting the interdisciplinary nature of the Stochastics and Statistics Seminar. 
URL:https://idss-stage.mit.edu/calendar/interpretable-prediction-models-for-network-linked-data-2/
LOCATION:32-141\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
END:VEVENT
END:VCALENDAR