We present a two-stage approach to wide-field wavefront sensing and demonstrate its ability to estimate and enhance image quality for the upcoming Rubin Observatory. The first stage makes sparse local wavefront estimates with a convolutional neural network; the second stage uses linear regression to interpolate the global wavefront. The Rubin Observatory will have a 3.5-degree field of view, a highly degenerate optical system, and a curvature wavefront sensing system, making it an ideal test case. We trained our model on 600,000 simulated Rubin Observatory intra- and extra-focal star images (donuts). It learns to estimate the optics contribution to the wavefront and to separate it from a myriad of other contributions. This computationally efficient approach can process 100 times as many donuts as proposed alternatives. This significant increase in bandwidth leads to a richer and more accurate characterization of the evolution of the telescope optics.
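The second stage described above can be sketched as a least-squares fit of each wavefront coefficient's dependence on field position. The sketch below is illustrative only: the array sizes, the synthetic stand-in for the CNN outputs, and the quadratic field basis are assumptions, not the paper's actual implementation.

```python
import numpy as np

# Illustrative stage-two sketch: interpolate sparse per-donut wavefront
# estimates across the field with linear least squares. Stage one (the
# CNN, not shown) is assumed to output one coefficient vector per donut.

rng = np.random.default_rng(0)

n_donuts, n_zernikes = 50, 19                            # illustrative sizes
field_xy = rng.uniform(-1.75, 1.75, size=(n_donuts, 2))  # field angles (deg)
z_local = rng.normal(size=(n_donuts, n_zernikes))        # synthetic CNN outputs

# Design matrix: low-order polynomial in field position (1, x, y, x^2, xy, y^2).
x, y = field_xy[:, 0], field_xy[:, 1]
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])

# Fit every coefficient's field dependence in a single least-squares solve.
coeffs, *_ = np.linalg.lstsq(A, z_local, rcond=None)

def wavefront_at(fx, fy):
    """Evaluate the fitted global wavefront model at a field position."""
    basis = np.array([1.0, fx, fy, fx**2, fx * fy, fy**2])
    return basis @ coeffs  # (n_zernikes,) interpolated coefficients

print(wavefront_at(0.0, 0.0).shape)  # (19,)
```

Because the fit is linear, adding more donuts only grows the design matrix, which is one reason this stage remains cheap at high donut counts.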