  • Frontiers of Efficient Neural-Network Learnability

E18-304, United States

Abstract: What are the most expressive classes of neural networks that can be learned, provably, in polynomial time in a distribution-free setting? In this talk we give the first efficient algorithm for learning neural networks with two nonlinear layers using tools for solving isotonic regression, a nonconvex (but tractable) optimization problem. If we further assume the…
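Isotonic regression, the tractable subproblem the abstract leans on, fits the best non-decreasing sequence to data in least squares. A generic pool-adjacent-violators sketch, not the speaker's algorithm:

```python
def isotonic_regression(y):
    """Pool Adjacent Violators: least-squares fit of a
    non-decreasing sequence to y."""
    blocks = []  # each block is [sum_of_values, count]
    for v in y:
        blocks.append([v, 1])
        # merge while the previous block's mean exceeds the last block's mean
        while len(blocks) > 1 and \
                blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]:
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    fitted = []
    for total, count in blocks:
        fitted.extend([total / count] * count)  # each block fits its mean
    return fitted
```

For example, `isotonic_regression([1, 3, 2])` pools the violating pair into its mean, returning `[1.0, 2.5, 2.5]`.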

  • Data Science and Big Data Analytics: Making Data-Driven Decisions

    online

    Developed by 11 MIT faculty members at IDSS, this seven-week course is specially designed for data scientists, business analysts, engineers and technical managers looking to learn strategies to harness data. Offered by MIT xPRO. Course begins September 30, 2019.

  • Selection and Endogenous Bias in Studies of Health Behaviors

E18-304, United States

Abstract: Studies of health behaviors using observational data are prone to bias from selection in behavior choices. How important are these biases? Are they dynamic? That is, are they influenced by the recommendations we make? Are there formal assumptions under which we can use information we have about selection on observed variables to learn…

  • Data-driven Coordination of Distributed Energy Resources

    32-155

The integration of distributed energy resources (DERs), e.g., rooftop photovoltaic installations, electric energy storage devices, and flexible loads, is becoming prevalent. This integration poses numerous operational challenges for the lower-voltage systems to which the DERs are connected, but also creates new opportunities for the provision of grid services. In the first part of the talk,…

  • Behavior of the Gibbs Sampler in the Imbalanced Case/Bias Correction from Daily Min and Max Temperature Measurements

E18-304, United States

IDS.190 – Topics in Bayesian Modeling and Computation. Note: the speaker this week will give two shorter talks within the usual session. Title: Behavior of the Gibbs sampler in the imbalanced case. Abstract: Many modern applications collect highly imbalanced categorical data, with some categories relatively rare. Bayesian hierarchical models combat data sparsity by borrowing information, while also…
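As background, a Gibbs sampler alternates draws from each variable's full conditional distribution. A minimal illustration on a toy bivariate normal target (unrelated to the talk's imbalanced-data setting; it only shows the sampler's mechanics):

```python
import random

def gibbs_bivariate_normal(rho, n_iter, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.
    Each full conditional is N(rho * other, 1 - rho**2)."""
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)  # draw x | y
        y = rng.gauss(rho * x, sd)  # draw y | x
        samples.append((x, y))
    return samples
```

With enough iterations, the empirical correlation of the samples approaches `rho`.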

  • Theoretical Foundations of Active Machine Learning

E18-304, United States

Abstract: The field of Machine Learning (ML) has advanced considerably in recent years, but mostly in well-defined domains using huge amounts of human-labeled training data. Machines can recognize objects in images and translate text, but they must be trained with more images and text than a person can…
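A textbook illustration of the label savings active learning can offer (not taken from the talk): learning a one-dimensional threshold classifier actively needs only O(log 1/ε) queried labels via binary search, versus O(1/ε) randomly drawn labels passively.

```python
def active_learn_threshold(label, lo=0.0, hi=1.0, eps=1e-3):
    """Locate an unknown threshold t in [lo, hi] to within eps by
    actively querying labels: label(x) is True iff x >= t."""
    queries = 0
    while hi - lo > eps:
        mid = (lo + hi) / 2
        queries += 1          # one label query per halving step
        if label(mid):
            hi = mid          # threshold is at or below mid
        else:
            lo = mid          # threshold is above mid
    return (lo + hi) / 2, queries
```

Starting from an interval of width 1, reaching precision `eps = 1e-3` takes exactly 10 queries, since each query halves the interval.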

  • Probabilistic Programming and Artificial Intelligence

E18-304, United States

IDS.190 – Topics in Bayesian Modeling and Computation. Abstract: Probabilistic programming is an emerging field at the intersection of programming languages, probability theory, and artificial intelligence. This talk will show how to use recently developed probabilistic programming languages to build systems for robust 3D computer vision, without requiring any labeled training data; for automatic modeling…
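As background, a probabilistic program is ordinary code that makes random choices, and inference is carried out by a generic algorithm rather than model-specific derivations. A hand-rolled sketch using likelihood weighting to infer a coin's bias (real probabilistic programming languages automate and greatly generalize this):

```python
import random

def model(rng):
    """The 'program': a single latent random choice."""
    return rng.random()  # coin bias p ~ Uniform(0, 1)

def likelihood(p, data):
    """Probability of the observed flips (1 = heads) given bias p."""
    w = 1.0
    for flip in data:
        w *= p if flip else (1 - p)
    return w

def posterior_mean(data, n=50000, seed=0):
    """Generic inference: weight prior samples by the likelihood."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        p = model(rng)
        w = likelihood(p, data)
        num += w * p
        den += w
    return num / den
```

With data `[1, 1, 1, 0]` the exact posterior is Beta(4, 2), whose mean is 4/6, and the weighted estimate converges to it.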

  • The Planted Matching Problem

E18-304, United States

Abstract: What happens when an optimization problem has a good solution built into it, but partly obscured by randomness? Here we revisit a classic polynomial-time problem, the minimum perfect matching problem on bipartite graphs. If the edges have random weights in [0, 1], Mézard and Parisi (and then Aldous, rigorously) showed that…
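For concreteness, the minimum perfect matching objective on a weighted bipartite graph can be stated in a few lines. The brute force below is exponential and only suitable for tiny instances; polynomial-time algorithms such as the Hungarian method solve the problem in general, which is why the abstract calls it classic.

```python
from itertools import permutations

def min_perfect_matching(weights):
    """Brute-force minimum-weight perfect matching on a complete
    bipartite graph; weights[i][j] is the weight of edge (i, j).
    A matching is a permutation assigning each left vertex i to
    a distinct right vertex perm[i]."""
    n = len(weights)
    best_cost, best_match = float("inf"), None
    for perm in permutations(range(n)):
        cost = sum(weights[i][perm[i]] for i in range(n))
        if cost < best_cost:
            best_cost, best_match = cost, list(perm)
    return best_cost, best_match
```

For example, `min_perfect_matching([[4, 1], [2, 3]])` returns `(3, [1, 0])`: pairing 0–1 and 1–0 costs 1 + 2 = 3.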

  • SES PhD Admissions Webinar

    Learn about admission to the Social and Engineering Systems Doctoral Program. Webinars are led by a member of the IDSS faculty who introduces the program and answers your questions. Please register in advance.

  • Markov Chain Monte Carlo Methods and Some Attempts at Parallelizing Them

E18-304, United States

IDS.190 – Topics in Bayesian Modeling and Computation. Abstract: MCMC methods yield approximations that converge to quantities of interest only in the limit of infinitely many iterations. This iterative asymptotic justification is not ideal: it stands at odds with current trends in computing hardware. Namely, it would often be computationally preferable to run many short…
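The tension the abstract describes can be made concrete: instead of one long chain, run many short independent chains, which parallelize trivially, and keep only their endpoints. A toy random-walk Metropolis sketch targeting a standard normal (illustrative only, not the speaker's method):

```python
import math
import random

def metropolis_chain(n_steps, seed, step=1.0):
    """Random-walk Metropolis chain targeting a standard normal;
    returns the chain's final state."""
    rng = random.Random(seed)
    log_p = lambda v: -0.5 * v * v  # log-density up to a constant
    x = 0.0
    for _ in range(n_steps):
        prop = x + rng.uniform(-step, step)
        # accept with probability min(1, p(prop) / p(x))
        if rng.random() < math.exp(min(0.0, log_p(prop) - log_p(x))):
            x = prop
    return x

def many_short_chains(n_chains, n_steps):
    """The 'many short chains' strategy: independent chains
    (embarrassingly parallel), one endpoint draw per chain."""
    return [metropolis_chain(n_steps, seed=s) for s in range(n_chains)]
```

If each short chain runs long enough to forget its start, the endpoints behave like independent draws from the target, here with mean 0 and variance 1.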


© MIT Institute for Data, Systems, and Society | 77 Massachusetts Avenue | Cambridge, MA 02139-4307 | 617-253-1764