When: January 14, 2019 @ 12:30pm
Where: PAB, eScience Data Studio, 6th floor
Maximizing LSST science with probabilistic data products
LSST will produce massive catalogs of objects detected at unprecedentedly low signal-to-noise ratios, opening the door to a new space of potential discoveries, from illuminating the dark energy accelerating the expansion of the universe to revealing the physical processes underlying transients and variable stars. The anticipated deluge of uncertainty-dominated data, however, demands an unprecedented degree of statistical rigor. Posterior probabilities that quantify complex uncertainties are the appropriate successors to the conventional point estimates of physical parameters, which suffice only for more informative data. Yet, in contrast with the mature science analysis pipelines built around point estimates and Gaussian errors, inferential infrastructure compatible with probabilistic data products remains underdeveloped. I present mathematically self-consistent techniques for validating, storing, and using such probabilities in the context of photometric redshifts, with applications in cosmology. A statistically principled propagation of information will enable us to use every part of the animal and do the best science possible with LSST.