Presentation #117.10 in the session Time-Domain Astrophysics.
AGN show strong and variable emission across multiple wavelengths. The UV emission from an AGN is believed to be dominated by thermal emission from the accretion disc. The X-ray emission, which originates from a more compact region, is often found to vary faster than the emission at longer wavelengths. Correlations between the variability in these two energy bands have been detected in some individual sources but not in others. In sources where a correlation is found, lags related to the light travel time between the two emission regions are frequently observed. These lags are often on timescales of days, much longer than the hundreds of seconds expected from the standard thin disc model.
One observational challenge in detecting these lags with current AGN monitoring programs, e.g. from Swift, is that the light curves are not continuously sampled, so standard Fourier techniques cannot be applied directly. This uneven sampling is imposed by limited telescope time. Traditional cross-correlation studies in the time domain may introduce systematic uncertainties by integrating variability over different timescales. I will present a machine-learning approach, based on Gaussian Processes (GPs), to search for X-ray and UV lags in the Swift monitoring data of the NLS1 Mrk 335. GPs provide a Bayesian non-parametric framework for modelling general time series and have proven effective in tasks such as spectral density estimation in astronomy.
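To illustrate the idea (this is a generic sketch, not the authors' pipeline): a GP can be fitted to an unevenly sampled light curve and used to infer the flux, with uncertainties, on a regular grid, after which standard cross-correlation or Fourier techniques become applicable. The kernel choice and hyperparameters below are illustrative assumptions.

```python
# Sketch: Gaussian Process regression on an unevenly sampled light curve.
# Assumptions: synthetic data, an RBF kernel with a white-noise term, and
# scikit-learn's GP implementation; a real analysis would motivate the
# kernel physically (e.g. a damped random walk) and fit real photometry.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Unevenly sampled observation times (days), mimicking monitoring gaps
t_obs = np.sort(rng.uniform(0, 100, 40))

# Synthetic variability: a smooth signal plus measurement noise
flux = np.sin(2 * np.pi * t_obs / 30) + 0.1 * rng.standard_normal(t_obs.size)

# GP with a squared-exponential kernel plus a white-noise component
kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t_obs[:, None], flux)

# Predict the mean light curve and its uncertainty on a regular grid
t_grid = np.linspace(0, 100, 500)
mean, std = gp.predict(t_grid[:, None], return_std=True)
```

With both bands regularised onto the same grid, a lag estimate can be obtained from the peak of their cross-correlation function, and the GP posterior can be sampled to propagate the interpolation uncertainty into the lag.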