
Curating, Calibrating and Distributing DKIST Data

Presentation #313.11 in the session “Solar Physics Division (SPD): Instrumentation and Active Regions”.

Published on Jun 18, 2021

The Daniel K. Inouye Solar Telescope (DKIST) Data Center (DC) provides the crucial link between the acquisition of science data and associated metadata at the telescope and their eventual use in achieving DKIST's scientific goals.

DKIST is expected to deliver 3 PB of raw data per year, with calibrated data volumes of similar magnitude. These data, as well as higher-level data products, must be curated over the lifetime of the observatory (two full Hale cycles). The complexity and operational flexibility of the five instruments on the Coudé platform add to the data-handling challenge. In contrast to previous national ground-based solar facilities, the DKIST DC program will provide calibrated data to the community. Implementing automated calibration pipelines that remove instrumental and some atmospheric seeing effects constitutes a significant challenge, but by making calibrated and, eventually, higher-level data products broadly available, the scientific utility and impact of DKIST are greatly enhanced. Furthermore, distributing these large data sets efficiently and reliably to a geographically dispersed user base, and providing the required user support, constitute additional technological challenges and operational effort.
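A back-of-the-envelope sketch of what these figures imply for total curated volume. The ~22-year Hale cycle length and the assumption that calibrated products roughly double the raw volume are illustrative inferences from the text above, not DKIST DC specifications:

```python
# Rough estimate of curated data volume over the observatory lifetime.
# Assumptions (not DC specifications): one Hale cycle ~= 22 years, and
# calibrated data ("of similar magnitude") roughly doubles the raw volume.

RAW_PB_PER_YEAR = 3                     # raw data rate quoted in the text
CALIBRATED_FACTOR = 2                   # raw + calibrated of similar size
HALE_CYCLE_YEARS = 22                   # approximate Hale cycle length
LIFETIME_YEARS = 2 * HALE_CYCLE_YEARS   # "two full Hale cycles"

yearly_pb = RAW_PB_PER_YEAR * CALIBRATED_FACTOR
lifetime_pb = yearly_pb * LIFETIME_YEARS

print(f"~{yearly_pb} PB/year, ~{lifetime_pb} PB over {LIFETIME_YEARS} years")
# → ~6 PB/year, ~264 PB over 44 years
```

Even under these rough assumptions, the archive reaches hundreds of petabytes, which motivates the automated pipelines and distribution infrastructure described above.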

The DKIST DC will enable open, searchable, and documented access to data and metadata for a broad user base, as well as provide user tools that enhance this access and support scientific interaction with the data.
