Presentation #302.19 in the session Computation, Data Handling, Image Analysis — iPoster Session.
Stars are useful tracers of galaxy formation and evolution because they carry information about the conditions in which they were born. High-precision measurements of stellar properties (such as line-of-sight velocity, distance, effective temperature, surface gravity, and chemical abundances) require properly reduced spectral observations, which are produced by spectral reduction pipelines (SRPs). In this poster, we present a comparison of SRP outputs for 600ZD Keck II/DEIMOS spectra of Milky Way halo stars from the HALO7D survey (R ~ 2000, wavelength coverage ~5000-10000 Å); the sample includes ~230 stars spanning magnitudes 18 < G < 21 and colors 0.3 < B-R < 3.8, with spectral signal-to-noise ratios (SNRs) of 10-100 per pixel (mean of 43.9 per pixel). We compare the output 1D spectra from the IDL-based, industry-standard DEEP2 spec2d pipeline and the newer, Python-based PypeIt pipeline. The comparisons focus on wavelength solutions, continuity across the chip gap, sky-line subtraction, extracted SNR, vignetting, strength of spectral features, and the breadth of quality-assessment outputs. In general, our tests show that PypeIt performs as well as, or better than, spec2d for our DEIMOS data: the wavelength solutions agree to within ~1 pixel, and PypeIt delivers better sky-line subtraction, more continuous spectra, less vignetting, and higher extracted SNR. PypeIt is also less of a black box than spec2d, which makes diagnosing and fixing problems easier, and its long list of supported spectrographs lets users reduce data from many instruments without learning a new pipeline. As PypeIt continues to improve with each release, we expect these differences to become more pronounced, and our suite of tests can be used to quantify the improvement in each version.
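Two of the comparison metrics above, wavelength-solution agreement in pixels and extracted per-pixel SNR, can be sketched with plain NumPy. This is a minimal illustration, not the HALO7D analysis code: the function names are hypothetical, the dispersion value is only roughly that of the DEIMOS 600ZD grating, and the input arrays stand in for the wavelength and flux vectors that spec2d and PypeIt actually write out.

```python
import numpy as np

def wavelength_offset_pixels(wave_a, wave_b, dispersion):
    """Median offset between two pipelines' wavelength solutions,
    expressed in pixels given the dispersion in Angstrom/pixel."""
    return np.median(wave_a - wave_b) / dispersion

def extracted_snr(flux, ivar):
    """Median per-pixel SNR from extracted flux and inverse variance."""
    good = ivar > 0  # ignore masked pixels (ivar = 0)
    return np.median(flux[good] * np.sqrt(ivar[good]))

# Synthetic demo: two wavelength grids that differ by half a pixel.
dispersion = 0.65  # Angstrom/pixel, roughly the DEIMOS 600ZD scale
wave_spec2d = np.linspace(5000.0, 10000.0, 8192)
wave_pypeit = wave_spec2d + 0.5 * dispersion

offset = wavelength_offset_pixels(wave_pypeit, wave_spec2d, dispersion)
print(f"wavelength offset: {offset:.2f} pixels")  # 0.50 pixels

# Synthetic demo: constant flux of 100 with noise sigma = 2 per pixel.
flux = np.full(8192, 100.0)
ivar = np.full(8192, 1.0 / 2.0**2)
print(f"median SNR: {extracted_snr(flux, ivar):.1f} per pixel")  # 50.0
```

In the real comparison the same per-star statistics would be computed from each pipeline's extracted 1D spectra and then examined across the full ~230-star sample.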