Data Processing Challenges for the ngVLA

Presentation #535.03 in the session “New Views of Galaxy Formation and Evolution”.

Published on Jan 11, 2021

The Next Generation Very Large Array (ngVLA) will provide observing capabilities at millimeter and centimeter wavelengths well beyond those of existing or already planned radio telescopes. These capabilities imply data rates roughly three orders of magnitude higher than those of the VLA, yet roughly three orders of magnitude lower than those of the Square Kilometre Array, making it feasible to store the raw visibilities and to generate images and spectral cubes with currently available computing technology.
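To make the scaling concrete, the raw visibility data rate grows with the number of baselines, channels, and polarization products. The sketch below computes this back-of-envelope rate for a VLA-like and an ngVLA-like configuration; all parameter values are illustrative assumptions, not project specifications.

```python
# Back-of-envelope visibility data rate. All parameter values below are
# illustrative assumptions, not ngVLA project specifications.

def visibility_data_rate(n_antennas, n_channels, n_polarizations,
                         integration_time_s, bytes_per_visibility=8):
    """Approximate raw visibility data rate in bytes per second.

    rate ~ N_baselines * N_channels * N_pols * bytes / t_int,
    with N_baselines = N_ant * (N_ant - 1) / 2.
    """
    n_baselines = n_antennas * (n_antennas - 1) // 2
    return (n_baselines * n_channels * n_polarizations
            * bytes_per_visibility / integration_time_s)

# Assumed configurations: a VLA-like array vs. an ngVLA-like array.
vla_rate = visibility_data_rate(27, 16_384, 4, 1.0)
ngvla_rate = visibility_data_rate(244, 65_536, 4, 0.5)

print(f"VLA-like rate:   {vla_rate / 1e9:.2f} GB/s")
print(f"ngVLA-like rate: {ngvla_rate / 1e9:.2f} GB/s")
print(f"ratio: ~{ngvla_rate / vla_rate:.0f}x")
```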

This poster discusses scalability, throughput, and performance results obtained by executing imaging algorithms (CLEAN) on simulated datasets, and extrapolates these results to estimate the computing requirements for the Reference Observing Program (ROP), an observing program based on the project’s Key Science Goals, and the Envelope Observing Program (EOP), a notional prediction of how the community might use the ngVLA during a typical year of full science operations. The EOP adopts values for the availability of science time and antennas that are more demanding than the project’s goals, and therefore represents an upper envelope on what will actually be asked of the facility.
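The extrapolation itself can be sketched as simple scaling arithmetic: measure CLEAN throughput on a benchmark dataset, assume it scales roughly linearly with visibility volume, and divide a program’s yearly data volume by that throughput. The numbers below are hypothetical placeholders, not the poster’s measured values.

```python
# Extrapolating benchmark results to a full observing program: a minimal
# sketch. Throughput and program volumes are hypothetical placeholders.

def required_cores(program_volume_tb, benchmark_throughput_gb_per_hr_per_core,
                   wallclock_budget_hr, parallel_efficiency=0.8):
    """Estimate the sustained core count needed to process a program's data
    within a wall-clock budget, assuming near-linear scaling of CLEAN with
    data volume, degraded by a parallel-efficiency factor."""
    total_gb = program_volume_tb * 1024
    core_hours = total_gb / benchmark_throughput_gb_per_hr_per_core
    return core_hours / (wallclock_budget_hr * parallel_efficiency)

# Hypothetical inputs: a 200 PB/yr program, a measured benchmark throughput
# of 2 GB/hr/core, and one year (8760 hr) of processing time.
cores = required_cores(program_volume_tb=200_000,
                       benchmark_throughput_gb_per_hr_per_core=2.0,
                       wallclock_budget_hr=8760)
print(f"~{cores:,.0f} cores sustained over the year")
```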

These results show that the system will require massive algorithmic parallelization at a level not yet supported by radio astronomy data processing systems. The poster explores this requirement further and discusses recent technological advances that could be relevant to implementing a cost-effective solution to this problem.
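One natural axis for such parallelization is the frequency dimension of a spectral cube, where channels can be deconvolved independently. The sketch below illustrates this structure with a toy Högbom CLEAN run channel-by-channel across worker processes; real imaging pipelines (e.g., CASA’s tclean) are far more involved, so this shows only the embarrassingly parallel pattern being discussed.

```python
# Channel-parallel deconvolution: a minimal sketch of data parallelism
# across the frequency axis, using a toy Hogbom CLEAN on a synthetic cube.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def hogbom_clean(dirty, psf, gain=0.1, niter=100, threshold=1e-3):
    """Toy Hogbom CLEAN on a single channel image."""
    residual = dirty.copy()
    model = np.zeros_like(dirty)
    peak_psf = psf[psf.shape[0] // 2, psf.shape[1] // 2]
    for _ in range(niter):
        y, x = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[y, x]
        if abs(peak) < threshold:
            break
        model[y, x] += gain * peak
        # Subtract a scaled PSF centered on the peak (toy: rolled PSF).
        shifted = np.roll(np.roll(psf, y - psf.shape[0] // 2, axis=0),
                          x - psf.shape[1] // 2, axis=1)
        residual -= gain * peak * shifted / peak_psf
    return model

def clean_channel(args):
    channel_image, psf = args
    return hogbom_clean(channel_image, psf)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_chan, size = 8, 128
    cube = rng.normal(0, 0.01, (n_chan, size, size))
    cube[:, 64, 64] += 1.0                       # point source in each channel
    psf = np.zeros((size, size))
    psf[64, 64] = 1.0                            # idealized delta-function PSF
    with ProcessPoolExecutor() as pool:          # one channel per worker
        models = list(pool.map(clean_channel, [(ch, psf) for ch in cube]))
    print(f"cleaned {len(models)} channels in parallel")
```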
