Stochastics and Statistics Seminar Series


One and two sided composite-composite tests in Gaussian mixture models

MIT Building E18, Room 304 Ford Building (E18), 50 Ames Street, Cambridge, MA, United States

Abstract: Finding an efficient test for a testing problem is often linked to the problem of estimating a given function of the data. When this function is not smooth, it must be approximated cleverly in order to build good tests. In this talk, we will discuss two specific testing problems in Gaussian mixture models.…

Statistical estimation under group actions: The Sample Complexity of Multi-Reference Alignment

MIT Building E18, Room 304 Ford Building (E18), 50 Ames Street, Cambridge, MA, United States

Abstract: Many problems in signal and image processing and computer vision amount to estimating a signal, image, or three-dimensional structure/scene from corrupted measurements. A particularly challenging form of measurement corruption is a latent transformation of the underlying signal to be recovered. Many such transformations can be described as a group acting on the object to be recovered. Examples…

When Inference is Tractable

MIT Building E18, Room 304 Ford Building (E18), 50 Ames Street, Cambridge, MA, United States

Abstract: A key capability of artificial intelligence will be the ability to reason about abstract concepts and draw inferences. Where data is limited, probabilistic inference in graphical models provides a powerful framework for performing such reasoning, and can even be used as modules within deep architectures. But, when is probabilistic inference computationally tractable? I will…

Statistical theory for deep neural networks with ReLU activation function

Abstract: The universal approximation theorem states that neural networks can approximate any continuous function up to a small error that depends on the size of the network. The expressive power of a network does not, however, guarantee that deep networks perform well on data. For that, control of the statistical estimation risk is…
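As a toy illustration of the approximation property this abstract invokes (an editorial sketch, not material from the talk): a one-hidden-layer ReLU network with hand-picked weights represents the non-smooth function |x| exactly, since |x| = relu(x) + relu(−x). Deeper networks stack such piecewise-linear units to approximate arbitrary continuous functions.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# One-hidden-layer ReLU network: f(x) = sum_j c_j * relu(w_j * x + b_j).
# With weights w = (1, -1), biases b = (0, 0), output weights c = (1, 1)
# it computes relu(x) + relu(-x) = |x| exactly.
w = np.array([1.0, -1.0])
b = np.array([0.0, 0.0])
c = np.array([1.0, 1.0])

def net(x):
    # x: 1-D array of inputs; returns the network's outputs.
    return relu(np.outer(x, w) + b) @ c

xs = np.linspace(-2.0, 2.0, 9)
print(np.allclose(net(xs), np.abs(xs)))  # True
```

The same construction underlies quantitative approximation results: any piecewise-linear function is a shallow ReLU network, and continuous functions are approximated by piecewise-linear ones.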

Optimality of Spectral Methods for Ranking, Community Detection and Beyond

E18-304, United States

Abstract: Spectral methods have been widely used for a large class of challenging problems, ranging from top-K ranking via pairwise comparisons to community detection and factor analysis, among others. Analyses of these spectral methods require sup-norm perturbation analysis of top eigenvectors. This allows us to uniformly approximate elements of the eigenvectors by linear functions of the observed random…
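As a self-contained toy sketch of the spectral idea mentioned here (an editorial example, not the speaker's method): for a two-community stochastic block model, the sign pattern of the eigenvector for the second-largest eigenvalue of the expected adjacency matrix recovers the community labels. Entrywise (sup-norm) perturbation analysis is what makes such sign-based recovery robust once noise is added.

```python
import numpy as np

# Expected adjacency of a two-community block model with n = 6 nodes:
# within-community probability p, across-community probability q (p > q).
p, q = 0.8, 0.2
labels = np.array([0, 0, 0, 1, 1, 1])
A = np.where(labels[:, None] == labels[None, :], p, q)

# Spectral step: the top eigenvector is roughly constant (uninformative);
# the eigenvector for the second-largest eigenvalue is constant on each
# community with opposite signs, so its signs reveal the partition.
eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
v = eigvecs[:, -2]                     # second-largest eigenvalue's vector
recovered = (v > 0).astype(int)

# Exact recovery up to a global label flip on this noiseless input.
flip = recovered if recovered[0] == labels[0] else 1 - recovered
print(np.array_equal(flip, labels))  # True
```

On this noiseless expected matrix the second eigenvector is exactly the membership vector; with a random adjacency matrix the same estimator succeeds when the eigengap dominates the perturbation, which is the regime the entrywise analysis quantifies.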

Testing degree corrections in Stochastic Block Models

MIT Building E18, Room 304 Ford Building (E18), 50 Ames Street, Cambridge, MA, United States

Abstract: The community detection problem has attracted significant attention in recent years, and it has been studied extensively under the framework of a Stochastic Block Model (SBM). However, it is well known that SBMs fit real data very poorly, and various extensions have been suggested to replicate characteristics of real data. The recovered community…

Inference, Computation, and Visualization for Convex Clustering and Biclustering

MIT Building E18, Room 304 Ford Building (E18), 50 Ames Street, Cambridge, MA, United States

Abstract: Hierarchical clustering enjoys wide popularity because of its fast computation, ease of interpretation, and appealing visualizations via the dendrogram and cluster heatmap. Recently, several authors have proposed and studied convex clustering and biclustering, which, similar in spirit to hierarchical clustering, achieve cluster merges via convex fusion penalties. While these techniques enjoy superior statistical performance, they…

Size-Independent Sample Complexity of Neural Networks

MIT Building E18, Room 304 Ford Building (E18), 50 Ames Street, Cambridge, MA, United States

The MIT Statistics and Data Science Center hosts guest lecturers from around the world in this weekly seminar.

Fitting a putative manifold to noisy data

E18-304, United States

Abstract: We give a solution to the following question from manifold learning. Suppose data in a high-dimensional Euclidean space are drawn i.i.d. from a measure supported on a low-dimensional, twice-differentiable, compact embedded manifold M, and are corrupted by a small amount of i.i.d. Gaussian noise. How can we produce…


© MIT Institute for Data, Systems, and Society | 77 Massachusetts Avenue | Cambridge, MA 02139-4307 | 617-253-1764