LIDS Seminar Series


The Maps Inside Your Head

MIT Building 32, Room 141, The Stata Center (32-141), 32 Vassar Street, Cambridge, MA, United States

How do our brains make sense of a complex and unpredictable world? In this talk, I will discuss an information theory approach to the neural topography of information processing in the brain. First I will review the brain's architecture, and how neural circuits map out the sensory and cognitive worlds. Then I will describe how…

Regularized Nonlinear Acceleration

MIT Building 32, Room 141, The Stata Center (32-141), 32 Vassar Street, Cambridge, MA, United States

We describe a convergence acceleration technique for generic optimization problems. Our scheme computes estimates of the optimum from a nonlinear average of the iterates produced by any optimization method. The weights in this average are computed via a simple linear system, whose solution can be updated online. This acceleration scheme runs in parallel to the…

Structure, Randomness and Universality

32-G449 (Kiva), United States

What is the minimum possible number of vertices of a graph that contains every k-vertex graph as an induced subgraph? What is the minimum possible number of edges in a graph that contains every k-vertex graph with maximum degree 3 as a subgraph? These questions and related ones were initiated by Rado in the 1960s,…

Quantum Limits on the Information Carried by Electromagnetic Radiation

MIT Building 32, Room 141, The Stata Center (32-141), 32 Vassar Street, Cambridge, MA, United States

In many practical applications information is conveyed by means of electromagnetic radiation and a natural question concerns the fundamental limits of this process. Identifying information with entropy, one can ask about the maximum amount of entropy associated to the propagating wave. The standard statistical physics approach to compute entropy is to take the logarithm of…

Comparison Lemmas, Non-Smooth Convex Optimization and Structured Signal Recovery

MIT Building 32, Room 141, The Stata Center (32-141), 32 Vassar Street, Cambridge, MA, United States

In the past couple of decades, non-smooth convex optimization has emerged as a powerful tool for the recovery of structured signals (sparse, low rank, finite constellation, etc.) from possibly noisy measurements in a variety of applications in statistics, signal processing and machine learning. While the algorithms (basis pursuit, LASSO, etc.) are often fairly well established, rigorous…
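As a concrete instance of the recovery problems the abstract mentions, here is a minimal LASSO solver via iterative soft-thresholding (ISTA) on a synthetic sparse-recovery problem. The problem sizes, noise level, and `lam` are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def ista(A, y, lam, steps=500):
    """Iterative soft-thresholding for the LASSO:
    minimize 0.5 * ||Ax - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = A.T @ (A @ x - y)                  # gradient of the smooth part
        z = x - g / L                          # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(1)
n, d, k = 50, 100, 5                           # measurements, dimension, sparsity
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
x_true[rng.choice(d, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(n)

x_hat = ista(A, y, lam=0.01)
print(np.linalg.norm(x_hat - x_true))
```

Even with fewer measurements than unknowns (n = 50 < d = 100), the l1 penalty lets the solver recover the sparse signal, which is the phenomenon the rigorous analyses in this line of work aim to quantify.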

Submodular Optimization: From Discrete to Continuous and Back

34-101

Many procedures in statistics and artificial intelligence require solving non-convex problems. Historically, the focus has been on convexifying the non-convex objectives. In recent years, however, there has been significant progress in optimizing non-convex functions directly. This direct approach has led to provably good guarantees for specific problem instances such as latent variable models, non-negative…

Safe Learning in Robotics

32-141, United States

A great deal of research in recent years has focused on robot learning. In many applications, guarantees that specifications are satisfied throughout the learning process are paramount. For the safety specification, we present a controller synthesis technique based on the computation of reachable sets using optimal control. We show recent results in system decomposition…

The Power of Multiple Samples in Generative Adversarial Networks

32-141, United States

We bring tools from Blackwell’s seminal 1953 result on comparing two stochastic experiments to shine new light on a modern application of great interest: Generative Adversarial Networks (GANs). Binary hypothesis testing is at the center of training GANs, where a trained neural network (called a critic) determines whether a given sample…


© MIT Institute for Data, Systems, and Society | 77 Massachusetts Avenue | Cambridge, MA 02139-4307 | 617-253-1764