On low-dimensional structure in transport and inference
Transportation of measure underlies many powerful tools for Bayesian inference, density estimation, and generative modeling. The central idea is to deterministically couple a probability measure of interest with a tractable "reference" measure (e.g., a standard Gaussian). Such couplings are induced by transport maps and enable direct simulation from the desired measure simply by evaluating the transport map at samples from the reference.
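As a minimal illustration of this idea (not drawn from the talk itself), the sketch below pushes standard-Gaussian reference samples through an affine transport map T(z) = mu + Lz; since L is a Cholesky factor of Sigma, the pushforward is exactly N(mu, Sigma), so sampling the target reduces to evaluating T at reference draws:

```python
# Hedged sketch: sampling a target measure by evaluating a transport map
# at samples from a standard-Gaussian reference. The affine map below is
# the simplest possible transport; the target, mean, and covariance are
# illustrative choices, not quantities from the talk.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])                    # target mean (illustrative)
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])                # target covariance (illustrative)
L = np.linalg.cholesky(Sigma)                 # T(z) = mu + L z pushes N(0, I) to N(mu, Sigma)

z = rng.standard_normal((10_000, 2))          # samples from the reference measure
x = z @ L.T + mu                              # direct simulation: evaluate the map

# Empirical moments of x should be close to (mu, Sigma).
print(x.mean(axis=0))
print(np.cov(x.T))
```

The same pattern applies with nonlinear maps (e.g., triangular or flow-based parameterizations); only the evaluation of T changes.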
While an enormous variety of representations and constructive algorithms for transport maps have been proposed in recent years, it is often advantageous to exploit the potential for low-dimensional structure in the associated probability measures. I will discuss two such notions of low-dimensional structure, and their interplay with transport-driven methods for sampling and inference. The first seeks to approximate a high-dimensional target measure as a low-dimensional update of a dominating reference measure. The second is low-rank conditional structure, where the goal is to replace conditioning variables with low-dimensional projections or summaries. In both cases, under appropriate assumptions on the reference or target measures, one can derive gradient-based upper bounds on the associated approximation error and minimize these bounds to identify good subspaces for approximation. The associated subspaces then dictate specific structural ansätze for transport maps that represent the target of interest.
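One common gradient-based recipe in this spirit (a sketch under stated assumptions, not the talk's specific construction) forms the matrix H = E[g(x) g(x)^T] of outer products of log-density-ratio gradients and takes its leading eigenvectors as the approximation subspace; the quadratic log-ratio and rank-2 matrix A below are synthetic stand-ins chosen so the informative subspace is known:

```python
# Hedged sketch: identifying a low-dimensional subspace from gradients of a
# log-density ratio, via the eigendecomposition of H = E[g(x) g(x)^T].
# The function grad_log_ratio and the matrix A are synthetic illustrations:
# f(x) = -0.5 * ||A x||^2 depends on x only through the rank-2 projection A x,
# so the informative subspace is span of the columns of A^T.
import numpy as np

rng = np.random.default_rng(1)
d, r, n = 10, 2, 5_000

A = rng.standard_normal((r, d))               # hidden low-rank structure

def grad_log_ratio(x):
    # gradient of f(x) = -0.5 * ||A x||^2, i.e., -A^T A x
    return -(A.T @ (A @ x))

X = rng.standard_normal((n, d))               # samples from the reference
G = np.array([grad_log_ratio(x) for x in X])  # gradient evaluations
H = G.T @ G / n                               # Monte Carlo estimate of E[g g^T]

eigvals, eigvecs = np.linalg.eigh(H)          # ascending eigenvalues
U = eigvecs[:, ::-1][:, :r]                   # top-r eigenvectors: the subspace
```

The columns of U then parameterize the structural ansatz: a transport map that acts nontrivially only on the coordinates U^T x and leaves the complementary directions at their reference values.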
I will showcase several algorithmic instantiations of this idea, with examples drawn from Bayesian inverse problems, data assimilation, and simulation-based inference.