Presentation #536.01 in the session “Community Alert Brokers: Lessons Learned”.
Deep learning has recently been shown to be competitive with the previous state of the art, feature engineering followed by random forest classification, for periodic variable stars. Although previous work with neural networks has exploited periodicity by period-folding the multi-cycle time series into a single cycle (mapping from time space to phase space), none of it took advantage of the fact that network predictions should be invariant to the initial phase of the period-folded sequence, which is experimentally determined and irrelevant for classification. Here we present cyclic-permutation invariant networks, a new type of neural network for which invariance to phase shifts is guaranteed through polar-coordinate convolutions, which we implement by means of "Symmetry Padding." Across three datasets of variable star light curves, we show that two implementations of the cyclic-permutation invariant network, the iTCN and the iResNet, consistently outperform both non-invariant versions of the same networks and previous state-of-the-art results obtained with recurrent neural networks and random forests. The methodology introduced here is also applicable to a wide range of science domains where periodic data abound due to physical symmetries.
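The core idea, treating the phase-folded sequence as a ring so that the arbitrary starting phase cannot affect the output, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names are illustrative, and a real iTCN/iResNet would use learned multi-channel convolutions. It shows only the invariance mechanism: wrap-around ("symmetry") padding turns an ordinary convolution into a circular one, so a cyclic shift of the input merely cyclically shifts the feature map, and a global pooling over that map is then exactly phase-shift invariant.

```python
import numpy as np

def cyclic_pad(x, pad):
    """Wrap-around padding: append the first `pad` samples to the end,
    treating the period-folded light curve as a closed ring."""
    return np.concatenate([x, x[:pad]])

def conv1d_valid(x, w):
    """Plain 'valid' 1-D cross-correlation with kernel w."""
    k = len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(len(x) - k + 1)])

def invariant_feature(x, w):
    """Circular convolution via symmetry padding, then a global max pool.
    The pooled scalar is unchanged under any cyclic shift of x."""
    y = conv1d_valid(cyclic_pad(x, len(w) - 1), w)  # same length as x
    return y.max()

# A cyclic shift of the folded sequence (a different initial phase)
# leaves the pooled feature unchanged:
rng = np.random.default_rng(0)
x = rng.normal(size=16)          # toy folded light curve
w = np.array([1.0, -2.0, 0.5])   # toy convolution kernel
base = invariant_feature(x, w)
assert all(np.isclose(invariant_feature(np.roll(x, s), w), base)
           for s in range(16))
```

Stacking such circularly padded convolutions preserves the equivariance layer by layer, so invariance only needs to be enforced once, by the final global pooling.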