Past Events

May 2017

Invariance and Causality

May 12 @ 11:00 am

Why are we interested in the causal structure of a process? In classical prediction tasks, for example, it seems that no causal knowledge is required. In many situations, however, we are interested in a system's behavior after parts of this system have been changed. Here, causal models become important because they are usually considered invariant under those changes. A causal prediction (which uses only direct causes of the target variable as predictors) remains valid even if we intervene on predictor…
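
A minimal simulation of this invariance statement (the structural equation and coefficients below are my own illustrative assumptions, not from the talk): the regression of an effect Y on its direct cause X keeps the same slope after an intervention that changes the distribution of X, whereas the reverse regression of X on Y does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def slopes(x_std):
    # Structural equation: X -> Y. "Intervening on X" here means changing its law.
    x = rng.normal(0.0, x_std, n)
    y = 2.0 * x + rng.normal(0.0, 1.0, n)
    causal = np.cov(x, y)[0, 1] / np.var(x)       # slope of the regression Y ~ X
    anticausal = np.cov(x, y)[0, 1] / np.var(y)   # slope of the regression X ~ Y
    return round(causal, 2), round(anticausal, 2)

print(slopes(x_std=1.0))  # roughly (2.0, 0.4)
print(slopes(x_std=3.0))  # roughly (2.0, 0.49): only the causal slope is invariant
```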

September 2017

New provable techniques for learning and inference in probabilistic graphical models

Andrej Risteski (Princeton University)

September 8 @ 11:00 am - 12:00 pm
MIT

Abstract: A common theme in machine learning is succinct modeling of distributions over large domains. Probabilistic graphical models are one of the most expressive frameworks for doing this. The two major tasks involving graphical models are learning and inference. Learning is the task of calculating the "best fit" model parameters from raw data, while inference is the task of answering probabilistic queries for a model with known parameters (e.g. what is the marginal distribution of a subset of variables, after…
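
For concreteness, here is a toy instance of the inference task (the chain model and potentials are hypothetical, chosen only for illustration): computing a marginal in a tiny pairwise model by brute-force enumeration, which is exactly the exponential sum that practical inference algorithms, and provable guarantees for them, aim to avoid.

```python
import itertools
import math

# Chain model x1 - x2 - x3 over binary variables with pairwise "agreement" potentials.
edges = [(0, 1), (1, 2)]
coupling = 0.8  # strength of the agreement potential

def weight(assignment):
    score = sum(coupling * (1 if assignment[i] == assignment[j] else -1)
                for i, j in edges)
    return math.exp(score)

states = list(itertools.product([0, 1], repeat=3))
Z = sum(weight(s) for s in states)                      # partition function
marginal_x0 = sum(weight(s) for s in states if s[0] == 1) / Z
print(f"P(x1 = 1) = {marginal_x0:.3f}")                 # 0.5 by symmetry
```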


Sample complexity of population recovery

Yury Polyanskiy (MIT)

September 15 @ 11:00 am - 12:00 pm
MIT

Abstract: In this talk we will first consider the general question of estimating a linear functional of a distribution based on noisy samples from it. We discover that the (two-point) Le Cam lower bound is in fact achievable by optimizing the bias-variance tradeoff of an empirical-mean type of estimator. Next, we apply this general framework to the specific problem of population recovery. Namely, consider a random poll of sample size n conducted on a population of individuals, where each pollee is asked to…
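
A hedged sketch of the flavor of the problem (the bit-flip channel and parameters are my own stand-ins, not the talk's setting): estimating the fraction of 1-answers in a poll when each response is observed through a known noisy channel, by debiasing an empirical-mean estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
p_true, delta, n = 0.3, 0.2, 50_000   # true fraction, flip probability, poll size

answers = rng.random(n) < p_true      # true 0/1 answers
flips = rng.random(n) < delta         # each answer flipped independently w.p. delta
observed = answers ^ flips            # noisy responses we actually see

q_hat = observed.mean()                       # E[q_hat] = p(1 - delta) + (1 - p) delta
p_hat = (q_hat - delta) / (1 - 2 * delta)     # unbiased after inverting the channel
print(f"estimate {p_hat:.3f} vs truth {p_true}")
```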


Walking within growing domains: recurrence versus transience

Amir Dembo (Stanford University)

September 22 @ 11:00 am - 12:00 pm
4-163

Abstract: When is a simple random walk on d-dimensional domains that grow in time recurrent? For domain growth that is independent of the walk, we review recent progress and related universality conjectures about a sharp recurrence-versus-transience criterion in terms of the growth rate. We compare this with the question of recurrence/transience for time-varying conductance models, where Gaussian heat kernel estimates and evolving sets play an important role. We also briefly contrast such expected universality with examples of the rich…
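
A rough simulation sketch (the clamped 3-d walk and the growth exponents below are my own toy choices, not the speaker's model) of how the growth rate of the domain affects how often the walk revisits the origin:

```python
import numpy as np

rng = np.random.default_rng(2)

def visits_to_origin(steps, growth_exponent):
    # Simple random walk on Z^3, clamped to the box [-r(n), r(n)]^3 with
    # r(n) ~ n**growth_exponent (clamping stands in for a reflecting boundary).
    pos = np.zeros(3, dtype=int)
    visits = 0
    for n in range(1, steps + 1):
        axis = rng.integers(3)
        pos[axis] += rng.choice((-1, 1))
        radius = int(n ** growth_exponent) + 1
        np.clip(pos, -radius, radius, out=pos)
        if not pos.any():
            visits += 1
    return visits

for a in (0.1, 0.9):
    print(f"domain radius ~ n^{a}: {visits_to_origin(100_000, a)} visits to the origin")
```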


Optimal lower bounds for universal relation, and for samplers and finding duplicates in streams

Jelani Nelson (Harvard University)

September 29 @ 11:00 am - 12:00 pm
E18-304

Abstract: Consider the following problem: we monitor a sequence of edge insertions and deletions in a graph on n vertices, so there are N = (n choose 2) possible edges (e.g. monitoring a stream of friend accepts/removals on Facebook). At any point someone may say "query()", at which point we must output a random edge that exists in the graph at that time from a distribution that is statistically close to uniform. More specifically, with probability p our edge should come from a distribution close to uniform,…
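
As a baseline for the space question, here is a sampler (my own naive sketch, not the paper's construction) that handles insertions and deletions exactly but stores the entire edge set; the talk's lower bounds concern how much less memory any such sampler can get away with.

```python
import random

class NaiveEdgeSampler:
    def __init__(self):
        self.edges = set()

    def insert(self, u, v):
        self.edges.add((min(u, v), max(u, v)))

    def delete(self, u, v):
        self.edges.discard((min(u, v), max(u, v)))

    def query(self):
        # Exactly uniform over the edges currently present; Theta(N) space.
        return random.choice(tuple(self.edges)) if self.edges else None

s = NaiveEdgeSampler()
s.insert(1, 2); s.insert(2, 3); s.delete(1, 2)
print(s.query())  # always (2, 3) here
```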

October 2017

2017 Charles River Lectures on Probability and Related Topics

October 2 @ 9:30 am - 5:30 pm

The Charles River Lectures on Probability and Related Topics will be hosted by Harvard University on Monday, October 2, 2017 in Cambridge, MA. The lectures are jointly organized by Harvard University, Massachusetts Institute of Technology and Microsoft Research New England for the benefit of the greater Boston area mathematics community. The event features five lectures by distinguished researchers in the areas of probability and related topics. This year's lectures will be delivered by: For questions regarding the event, please email…


Transport maps for Bayesian computation

Youssef Marzouk (MIT)

October 6 @ 11:00 am - 12:00 pm
E18-304

Abstract: Integration against an intractable probability measure is among the fundamental challenges of Bayesian inference. A useful approach to this problem seeks a deterministic coupling of the measure of interest with a tractable "reference" measure (e.g., a standard Gaussian). This coupling is induced by a transport map, and enables direct simulation from the desired measure simply by evaluating the transport map at samples from the reference. Approximate transports can also be used to "precondition" standard Monte Carlo schemes. Yet characterizing a…
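
A one-dimensional sketch of the coupling idea (my own toy example with a Gamma target, not the speaker's method): composing the target's inverse CDF with the Gaussian CDF gives a transport map, and evaluating it at samples from the Gaussian reference produces samples from the target.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
target = stats.gamma(a=3.0)          # stand-in for the "intractable" target measure

reference_samples = rng.standard_normal(100_000)
transport = lambda z: target.ppf(stats.norm.cdf(z))   # T = F_target^{-1} o Phi
pushed = transport(reference_samples)

print(pushed.mean(), target.mean())  # both close to 3.0
print(pushed.var(), target.var())    # both close to 3.0
```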


Additivity of Information in Deep Generative Networks: The I-MMSE Transform Method

Galen Reeves (Duke University)

October 13 @ 11:00 am - 12:00 pm

Abstract:  Deep generative networks are powerful probabilistic models that consist of multiple stages of linear transformations (described by matrices) and non-linear, possibly random, functions (described generally by information channels). These models have gained great popularity due to their ability to characterize complex probabilistic relationships arising in a wide variety of inference problems. In this talk, we introduce a new method for analyzing the fundamental limits of statistical inference in settings where the model is known. The validity of our method…
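
The title's transform method builds on the scalar I-MMSE relation dI/dsnr = mmse(snr)/2; a quick numerical check of that identity for a standard Gaussian input (my own sanity check, not the talk's multi-layer analysis) looks like this:

```python
import numpy as np

# Channel Y = sqrt(snr) * X + N with X, N standard Gaussian.
snr = np.linspace(0.1, 10.0, 200)
mutual_info = 0.5 * np.log1p(snr)      # I(snr) in nats
mmse = 1.0 / (1.0 + snr)               # minimum mean-square error of estimating X

dI_dsnr = np.gradient(mutual_info, snr)
print(np.max(np.abs(dI_dsnr - mmse / 2)))   # small numerical-differentiation error
```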


Structure in multi-index tensor data: a trivial byproduct of simpler phenomena?

John Cunningham

October 19 @ 4:30 pm - 5:30 pm

Abstract: As large tensor-variate data become increasingly common across applied machine learning and statistics, complex analysis methods for these data similarly increase in prevalence.  Such a trend offers the opportunity to understand subtler and more meaningful features of the data that, ostensibly, could not be studied with simpler datasets or simpler methodologies.  While promising, these advances are also perilous: novel analysis techniques do not always consider the possibility that their results are in fact an expected consequence of some simpler,…
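
In that spirit, a small illustration (the data and null model are my own assumptions): the leading singular value of a matrix with a weak planted component looks notable until it is compared against a surrogate obtained by shuffling each column independently, which preserves the marginals but destroys the joint structure.

```python
import numpy as np

rng = np.random.default_rng(4)
data = rng.normal(size=(200, 50)) + 0.5 * np.outer(rng.normal(size=200),
                                                   rng.normal(size=50))

def leading_singular_value(m):
    return np.linalg.svd(m, compute_uv=False)[0]

# Null model: shuffle each column independently to break cross-column structure.
null = np.column_stack([rng.permutation(col) for col in data.T])
print("data:         ", leading_singular_value(data))
print("shuffled null:", leading_singular_value(null))
```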


Inference in dynamical systems and the geometry of learning group actions

Sayan Mukherjee (Duke)

October 20 @ 11:00 am - 12:00 pm

Abstract: We examine consistency of the Gibbs posterior for dynamical systems using a classical idea in dynamical systems, the thermodynamic formalism, for tracking the system. We state a variational formulation under which there is a unique posterior distribution of parameters as well as hidden states, using classic ideas from dynamical systems such as pressure and joinings. We use consistency of hidden Markov models with infinite lags as an example application of our theory. We develop a geometric framework that…
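
A generic Gibbs-posterior sketch (my own minimal setup with a flat prior and squared-error risk, not the talk's dynamical-systems construction): each parameter's posterior weight is proportional to the prior times exp(-n times the empirical risk), with no likelihood required.

```python
import numpy as np

rng = np.random.default_rng(5)
data = rng.normal(1.5, 1.0, size=200)

theta_grid = np.linspace(-3.0, 3.0, 601)
prior = np.ones_like(theta_grid) / theta_grid.size                 # flat prior
risk = np.array([np.mean((data - t) ** 2) for t in theta_grid])    # empirical risk

# Gibbs posterior: prior * exp(-temperature * n * risk); the temperature 1/2 is
# an arbitrary choice for this sketch.
log_post = np.log(prior) - 0.5 * data.size * risk
gibbs_posterior = np.exp(log_post - log_post.max())
gibbs_posterior /= gibbs_posterior.sum()

print(theta_grid[gibbs_posterior.argmax()])   # concentrates near the true mean 1.5
```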
