Stochastics and Statistics Seminar Series

Dejan Slepcev

The MIT Statistics and Data Science Center hosts guest lecturers from around the world in this weekly seminar.

An Information-Geometric View of Learning in High Dimensions

32-155, United States

Abstract: We consider the problem of data feature selection prior to inference task specification, which is central to high-dimensional learning. Introducing natural notions of universality for such problems, we show a local equivalence among them. Our analysis is naturally expressed via information geometry, and represents a conceptually and practically useful learning methodology. The development reveals…

Boaz Nadler

The MIT Statistics and Data Science Center hosts guest lecturers from around the world in this weekly seminar.

Jingbo Liu

E18-304, United States

Abstract: Concentration of measure refers to a collection of tools and results from analysis and probability theory that have been used in many areas of pure and applied mathematics. Arguably, the first data science application of measure concentration (under the name "blowing-up lemma") is the proof of strong converses in multiuser information theory by Ahlswede,…

Efficient Algorithms for the Graph Matching Problem in Correlated Random Graphs

Abstract: The Graph Matching problem is a robust version of the Graph Isomorphism problem: given two not-necessarily-isomorphic graphs, the goal is to find a permutation of the vertices which maximizes the number of common edges. We study a popular average-case variant; we deviate from the common heuristic strategy and give the first quasi-polynomial time algorithm,…
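
As a concrete illustration of the objective described in the abstract, the sketch below counts the edges two graphs share after permuting the vertices of one of them. The function names and the brute-force search over permutations are illustrative assumptions for tiny graphs, not the quasi-polynomial-time algorithm from the talk.

```python
# Sketch of the graph-matching objective: maximize common edges over permutations.
# Brute force is only feasible for very small graphs; it illustrates the objective.
import itertools
import numpy as np

def common_edges(A, B, perm):
    """Number of edges shared by A and B after relabeling B's vertices by perm."""
    P = np.eye(len(perm))[perm]               # permutation matrix
    return int(np.sum(A * (P @ B @ P.T)) // 2)  # each undirected edge counted twice

def brute_force_matching(A, B):
    """Exhaustive search over all vertex permutations (toy sizes only)."""
    n = A.shape[0]
    best = max(itertools.permutations(range(n)),
               key=lambda p: common_edges(A, B, list(p)))
    return list(best), common_edges(A, B, list(best))

# Toy example: a 4-cycle matched against a smaller relabeled graph.
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]])
B = np.array([[0, 0, 1, 1], [0, 0, 0, 1], [1, 0, 0, 0], [1, 1, 0, 0]])
perm, score = brute_force_matching(A, B)
print(perm, score)
```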

Locally private estimation, learning, inference, and optimality

Abstract: In this talk, we investigate statistical learning and estimation under local privacy constraints, where data providers do not trust the collector of the data and so privatize their data before it is even collected. We identify fundamental tradeoffs between statistical utility and privacy in such local models of privacy, providing instance-specific bounds for private…
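
For readers unfamiliar with the local model, the sketch below shows randomized response, a textbook locally private mechanism for binary data, together with a de-biased mean estimate. This is a generic illustration of the privacy/utility trade-off in the local model, not the instance-specific procedures analyzed in the talk; the parameter names are assumptions.

```python
# Randomized response: each provider privatizes their own bit before reporting it.
import numpy as np

def randomized_response(bit, epsilon, rng):
    """Report the true bit with probability e^eps / (1 + e^eps), otherwise flip it."""
    p_true = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return bit if rng.random() < p_true else 1 - bit

def debiased_mean(reports, epsilon):
    """Unbiased estimate of the population mean from the privatized reports."""
    p = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return (np.mean(reports) - (1 - p)) / (2 * p - 1)

rng = np.random.default_rng(0)
data = rng.binomial(1, 0.3, size=10_000)                       # true mean 0.3
reports = [randomized_response(b, epsilon=1.0, rng=rng) for b in data]
print(debiased_mean(reports, epsilon=1.0))                     # close to 0.3, with extra variance
```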

Algorithmic thresholds for tensor principal component analysis

Abstract: Consider the problem of recovering a rank-1 tensor of order k that has been subject to Gaussian noise. The log-likelihood for this problem is highly non-convex. It is information-theoretically possible to recover the tensor with a finite number of samples via maximum likelihood estimation; however, it is expected that one needs a…
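
A minimal sketch of the model described above, assuming the standard "spiked tensor" formulation Y = λ v⊗k + G with i.i.d. Gaussian noise G: it builds the observation and evaluates the non-convex maximum-likelihood objective ⟨Y, u⊗k⟩ at a candidate unit vector u. The parameter names and signal strength are illustrative choices, not values from the talk.

```python
# Spiked tensor model and the non-convex MLE objective evaluated at candidate directions.
import numpy as np

def spiked_tensor(v, lmbda, k, rng):
    """Y = lmbda * v^{(x)k} + G, with i.i.d. standard Gaussian noise G."""
    signal = v
    for _ in range(k - 1):
        signal = np.tensordot(signal, v, axes=0)   # outer product raises the order by 1
    return lmbda * signal + rng.standard_normal(signal.shape)

def mle_objective(Y, u, k):
    """<Y, u^{(x)k}>: the log-likelihood up to affine terms; non-convex in u."""
    val = Y
    for _ in range(k):
        val = np.tensordot(val, u, axes=([0], [0]))  # contract one index at a time
    return float(val)

rng = np.random.default_rng(1)
n, k, lmbda = 20, 3, 5.0
v = rng.standard_normal(n); v /= np.linalg.norm(v)   # planted unit vector
u = rng.standard_normal(n); u /= np.linalg.norm(u)   # random unit vector
Y = spiked_tensor(v, lmbda, k, rng)
print(mle_objective(Y, v, k))    # large at the planted direction
print(mle_objective(Y, u, k))    # typically much smaller at a random direction
```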

Alan Frieze

The MIT Statistics and Data Science Center hosts guest lecturers from around the world in this weekly seminar.

Joint estimation of parameters in Ising Model

E18-304, United States

Abstract: Inference in the framework of Ising models has received significant attention in Statistics and Machine Learning in recent years. In this talk we study joint estimation of the inverse temperature parameter β and the magnetization parameter B, given one realization from the Ising model, under the assumption that the underlying graph of the Ising…
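
For context, one standard route to such joint estimation from a single configuration is maximum pseudo-likelihood, which maximizes the product of each spin's conditional distribution given its neighbors; the sketch below implements that idea for a given graph and spin assignment. Whether this coincides with the estimator analyzed in the talk is an assumption, and the toy data are not a genuine Ising draw.

```python
# Joint pseudo-likelihood estimation of (beta, B) from one Ising configuration.
import numpy as np
from scipy.optimize import minimize

def neg_pseudo_loglik(params, A, sigma):
    """Negative log pseudo-likelihood of spins sigma in {-1,+1} on adjacency matrix A."""
    beta, B = params
    m = beta * (A @ sigma) + B            # local field m_i = beta * sum_j A_ij sigma_j + B
    # conditional law: P(sigma_i | rest) = exp(sigma_i * m_i) / (2 * cosh(m_i))
    return float(np.sum(np.logaddexp(m, -m) - sigma * m))

def fit_ising(A, sigma):
    """Jointly estimate (beta, B) by minimizing the negative pseudo-likelihood."""
    res = minimize(neg_pseudo_loglik, x0=np.zeros(2), args=(A, sigma))
    return res.x

# Toy usage: random graph and an arbitrary +/-1 configuration, just to show the call.
rng = np.random.default_rng(0)
n = 50
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.triu(A, 1); A = A + A.T
sigma = rng.choice([-1.0, 1.0], size=n)
print(fit_ising(A, sigma))                # estimated (beta, B); not meaningful on toy data
```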

Optimal hypothesis testing for stochastic block models with growing degrees

Abstract: In this talk, we discuss optimal hypothesis testing for distinguishing a stochastic block model from an Erdős–Rényi random graph when the average degree grows to infinity with the graph size. We show that linear spectral statistics based on Chebyshev polynomials of the adjacency matrix can approximate signed cycles of growing lengths when the graph…
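
As a rough illustration of a linear spectral statistic of this kind, the sketch below evaluates the trace of a Chebyshev polynomial of a centered, rescaled adjacency matrix. The specific centering and scaling convention here is an assumption made for the example and may differ from the statistics developed in the talk.

```python
# Trace of a Chebyshev polynomial of a centered, rescaled adjacency matrix.
import numpy as np
from numpy.polynomial import chebyshev as C

def chebyshev_spectral_statistic(A, p, k):
    """tr T_k(M) for M = (A - p*(J - I)) / sqrt((n-1)*p*(1-p)), via eigenvalues."""
    n = A.shape[0]
    M = (A - p * (np.ones((n, n)) - np.eye(n))) / np.sqrt((n - 1) * p * (1 - p))
    eigvals = np.linalg.eigvalsh(M)
    coeffs = np.zeros(k + 1); coeffs[k] = 1.0     # coefficient vector of T_k
    return float(np.sum(C.chebval(eigvals, coeffs)))

# Toy usage: an Erdos-Renyi graph with edge probability p (the null model).
rng = np.random.default_rng(2)
n, p, k = 200, 0.1, 3
upper = rng.random((n, n)) < p
A = np.triu(upper, 1).astype(float); A = A + A.T
print(chebyshev_spectral_statistic(A, p, k))
```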

