Presentation #409.06 in the session “Supernovae 2”.
While conventional Type Ia supernova (SN Ia) cosmology analyses rely primarily on rest-frame optical light curves to determine distances, SNe Ia are excellent standard candles in near-infrared (NIR) light, which is significantly less sensitive to dust extinction. A SN Ia spectral energy distribution (SED) model capable of fitting rest-frame NIR observations is necessary to fully leverage current and future SN Ia datasets from ground- and space-based telescopes, including HST, LSST, JWST, and RST. We have constructed a hierarchical Bayesian model for SN Ia SEDs, continuous over time and wavelength, from the optical to the NIR (B through H, or 0.35 − 1.8 μm). We model the SED as a combination of physically distinct host-galaxy dust and intrinsic spectral components. The distribution of intrinsic SEDs over time and wavelength is modelled with probabilistic functional principal components and the covariance of residual functions. We train the model on a nearby sample of 79 SNe Ia with joint optical and NIR light curves by sampling the global posterior distribution over dust and intrinsic latent variables, SED components, and population hyperparameters. The photometric distances of SNe Ia with NIR data near maximum light achieve a total RMS error of 0.10 mag with our BayeSN model, compared to 0.14 mag with SNooPy and SALT2 for the same sample. Jointly fitting the optical and NIR data of the full sample for a global host dust law, we find R_V = 2.9 ± 0.2 for hosts with E(B-V) < 0.4, consistent with the Milky Way average. [arXiv:2008.07538]
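The SED decomposition described above — an intrinsic spectral component built from functional principal components with per-SN latent scores, multiplied by a host-galaxy dust extinction factor — can be sketched in a toy form. This is only an illustrative example under loudly stated assumptions: the mean SED, the principal-component shapes, and the dust-law form below are all made-up placeholders, not the trained BayeSN components or a real extinction curve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy rest-frame wavelength grid spanning the model's range (0.35-1.8 micron).
wave = np.linspace(0.35, 1.8, 100)

# Hypothetical intrinsic SED: a mean function plus a few functional
# principal components weighted by per-SN latent scores.
# (The real components are learned from the training sample; these are toys.)
mean_sed = np.exp(-wave)                                   # placeholder mean flux shape
n_comp = 3
fpcs = np.array([np.sin((k + 1) * np.pi * wave / 1.8)     # placeholder FPC shapes
                 for k in range(n_comp)])
scores = rng.normal(0.0, 0.1, size=n_comp)                 # latent variables for one SN
intrinsic_flux = mean_sed + scores @ fpcs

# Toy host-galaxy dust: a crude 1/lambda extinction curve normalized near
# the V band (0.55 micron), parameterized by R_V and E(B-V).
# This functional form is a hypothetical stand-in, not a standard dust law.
R_V, EBV = 2.9, 0.1
A_lambda = EBV * (R_V + 1.0 / wave - 1.0 / 0.55)
observed_flux = intrinsic_flux * 10.0 ** (-0.4 * A_lambda)
```

Note how the toy dust term reflects the point made in the abstract: extinction `A_lambda` shrinks toward longer wavelengths, so the NIR end of the SED is far less affected by dust than the optical end.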