Collective Decision Making: Theory and Experiments

32-155

Abstract: Ranging from jury decisions to political elections, situations in which groups of individuals determine a collective outcome are ubiquitous. There are two important observations that pertain to almost all collective processes observed in reality. First, decisions are commonly preceded by some form of communication among individual decision makers, such as jury deliberations, or election…

Medical Image Imputation

E18-304

Abstract: We present an algorithm for creating high resolution anatomically plausible images that are consistent with acquired clinical brain MRI scans with large inter-slice spacing. Although large databases of clinical images contain a wealth of information, medical acquisition constraints result in sparse scans that miss much of the anatomy. These characteristics often render computational analysis…

TAP free energy, spin glasses, and variational inference

E18-304

Abstract: We consider the Sherrington-Kirkpatrick model of spin glasses with ferromagnetically biased couplings. For a specific choice of the couplings' mean, the resulting Gibbs measure is equivalent to the Bayesian posterior for a high-dimensional estimation problem known as "Z2 synchronization". Statistical physics suggests computing the expectation with respect to this Gibbs measure (the posterior mean…
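
For context, the setup behind this abstract can be written down explicitly. The sketch below uses textbook conventions (inverse temperature β, bias λ, standard Gaussian couplings), which need not match the normalization used in the talk.

```latex
% SK Gibbs measure with ferromagnetically biased couplings
% (textbook conventions; not necessarily the talk's normalization):
G(\sigma) \;\propto\; \exp\!\Bigg( \frac{\beta}{\sqrt{n}} \sum_{i<j} g_{ij}\,\sigma_i\sigma_j
        \;+\; \frac{\lambda}{n} \sum_{i<j} \sigma_i\sigma_j \Bigg),
\qquad \sigma \in \{-1,+1\}^n,\quad g_{ij} \stackrel{\mathrm{iid}}{\sim} \mathcal{N}(0,1).
% For a matched choice of the bias and temperature, this is (up to a gauge
% transformation) the Bayes posterior in Z2 synchronization, where one observes
% Y = (\lambda/n)\, x x^{\top} + W with x \in \{-1,+1\}^n and W a GOE matrix.
```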

Safeguarding Privacy in Dynamic Decision-Making Problems

32-155

The increasing ubiquity of large-scale infrastructures for surveillance and data analysis has made understanding the impact of privacy a pressing priority in many domains. We propose a framework for studying a fundamental cost vs. privacy tradeoff in dynamic decision-making problems. The central question is: how can a decision maker take actions that are efficient for…

Capacity lower bound for the Ising perceptron

E18-304

Abstract: The perceptron is a toy model of a simple neural network that stores a collection of given patterns. Its analysis reduces to a simple problem in high-dimensional geometry, namely, understanding the intersection of the cube (or sphere) with a collection of random half-spaces. Despite the simplicity of this model, its high-dimensional asymptotics are not…
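
Concretely, the geometric problem described in the abstract can be stated as follows; this is the standard formulation, with Gaussian patterns and a margin parameter κ assumed here rather than taken from the talk.

```latex
% Solution set of the Ising perceptron with m = \alpha n i.i.d. Gaussian patterns
% g_1,\dots,g_m \sim \mathcal{N}(0, I_n) and margin \kappa:
S_n(\kappa) \;=\; \Bigl\{\, \sigma \in \{-1,+1\}^n \;:\;
      \tfrac{1}{\sqrt{n}} \langle g_a, \sigma \rangle \,\ge\, \kappa
      \ \text{ for all } a = 1,\dots,m \,\Bigr\},
% i.e. the hypercube intersected with m random half-spaces. The capacity is the
% largest \alpha at which S_n(\kappa) remains nonempty with high probability.
```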

Coded Computing: A Transformative Framework for Resilient, Secure, and Private Distributed Learning

32-155

This talk introduces "Coded Computing", a new framework that brings concepts and tools from information theory and coding into distributed computing to mitigate several performance bottlenecks that arise in large-scale distributed computing and machine learning, such as resiliency to stragglers and the bandwidth bottleneck. Furthermore, coded computing can enable (information-theoretically) secure and private learning over untrusted…
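
As a toy illustration of the straggler-mitigation idea (not the speaker's system), the sketch below uses a polynomial code to spread a matrix-vector product across n workers so that the outputs of any k of them suffice to recover A·x. The block sizes, evaluation points, and straggler pattern are all made up for the example, and plain floating-point interpolation stands in for finite-field decoding.

```python
import numpy as np

def encode_blocks(A, k, alphas):
    """Split A row-wise into k blocks and encode one coded block per worker.
    Coded block j is sum_i A_i * alphas[j]**i, i.e. a degree-(k-1) polynomial
    in alphas[j], so any k worker results determine all k uncoded products."""
    blocks = np.split(A, k, axis=0)                       # A_0, ..., A_{k-1}
    return [sum(B * a**i for i, B in enumerate(blocks)) for a in alphas]

def decode(results, alphas_used, k):
    """Recover the k products A_i @ x from any k worker outputs by solving
    the Vandermonde system that the polynomial evaluations define."""
    V = np.vander(alphas_used, N=k, increasing=True)      # k x k, invertible
    Y = np.stack(results)                                 # one coded result per row
    C = np.linalg.solve(V, Y)                             # row i is A_i @ x
    return C.reshape(-1)

# toy example: k = 3 data blocks, n = 5 workers, so up to 2 stragglers tolerated
rng = np.random.default_rng(0)
k, n = 3, 5
A = rng.standard_normal((6, 4))                           # 6 rows -> 3 blocks of 2
x = rng.standard_normal(4)
alphas = np.linspace(-1.0, 1.0, n)                        # distinct evaluation points

coded = encode_blocks(A, k, alphas)
worker_outputs = [E @ x for E in coded]                   # computed in parallel in practice

finished = [0, 2, 4]                                      # say workers 1 and 3 straggle
recovered = decode([worker_outputs[j] for j in finished], alphas[finished], k)
assert np.allclose(recovered, A @ x)
```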

Why Aren’t Network Statistics Accompanied By Uncertainty Statements?

E18-304

Abstract: Over 500K scientific articles have been published since 1999 with the word "network" in the title, and the vast majority of these report network summary statistics of one type or another. However, these numbers are rarely accompanied by any quantification of uncertainty. Yet any error inherent in the measurements underlying the construction of the…

Women in Data Science (WiDS) – Cambridge, MA

This one-day technical conference brings together local academic leaders, industry professionals, and students to hear about the latest data science-related research in a number of domains, to learn how leading-edge companies are leveraging data science for success, and to connect with potential mentors, collaborators, and others in the field. Watch WiDS Cambridge on YouTube.

A Theory for Representation Learning via Contrastive Objectives

32-155

Abstract: Representation learning seeks to represent complicated data (images, text, etc.) using a vector embedding, which can then be used to solve new classification tasks with simple methods like a linear classifier. Learning such embeddings is an important form of unsupervised learning (learning from unlabeled data) today. Several recent methods leverage pairs of "semantically…
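
As a small illustration of the kind of objective involved, here is a minimal NumPy sketch of a logistic contrastive loss on (anchor, positive, negative) triples; the linear encoder, the batch size, and the exact loss form are assumptions for the example, not the method analyzed in the talk.

```python
import numpy as np

def contrastive_logistic_loss(f_x, f_pos, f_neg):
    """Logistic contrastive loss on embedded triples: push the anchor/positive
    similarity <f(x), f(x+)> above the anchor/negative similarity <f(x), f(x-)>.
    Each argument is an array of embeddings with shape (batch, dim)."""
    margin = (np.einsum('bd,bd->b', f_x, f_pos)
              - np.einsum('bd,bd->b', f_x, f_neg))
    return np.mean(np.log1p(np.exp(-margin)))             # mean of log(1 + e^{-margin})

# toy usage with a random linear "encoder" W (hypothetical stand-in for f)
rng = np.random.default_rng(1)
W = rng.standard_normal((8, 16)) / 4.0                    # embedding dim 8, input dim 16
x, x_pos, x_neg = (rng.standard_normal((32, 16)) for _ in range(3))
loss = contrastive_logistic_loss(x @ W.T, x_pos @ W.T, x_neg @ W.T)
print(float(loss))
```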

Univariate total variation denoising, trend filtering and multivariate Hardy-Krause variation denoising

E18-304

Abstract: Total variation denoising (TVD) is a popular technique for nonparametric function estimation. I will first present a theoretical optimality result for univariate TVD for estimating piecewise constant functions. I will then present related results for various extensions of univariate TVD including adaptive risk bounds for higher-order TVD (also known as trend filtering) as well…
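
For reference, univariate TVD is the following convex program (fused-lasso form); the tuning parameter λ is the usual one and nothing here is specific to the talk. Order-k trend filtering replaces the first differences below with (k+1)-th order discrete differences, so k = 0 recovers TVD.

```latex
% Univariate total variation denoising of observations y_1,\dots,y_n
% (fused-lasso form with tuning parameter \lambda):
\hat{\theta} \;=\; \arg\min_{\theta \in \mathbb{R}^n}
   \ \frac{1}{2} \sum_{i=1}^{n} (y_i - \theta_i)^2
   \;+\; \lambda \sum_{i=1}^{n-1} \bigl| \theta_{i+1} - \theta_i \bigr| .
```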

