The James Webb Space Telescope (JWST) will be launched in 2021 and will provide multispectral images (with low spectral resolution) over wide fields of view (with high spatial resolution) and hyperspectral images (with high spectral resolution) over small fields of view (with low spatial resolution). This contribution aims to develop a fusion method that combines these images to reconstruct the astrophysical scene at high spatial and spectral resolutions. The fused product will enable an enhanced scientific interpretation of the data: for instance, it will make it possible to derive high spatial resolution maps of physical tracers and conditions in the interstellar medium (e.g., gas density, dust temperature) that would otherwise be inaccessible. Hyperspectral images provided by the JWST will comprise approximately 3000 spectral bands over 30×30 pixels for a 3×3 arcsec² field of view, while multispectral images will contain up to 29 spectral bands (corresponding to narrow-to-wide filters) over 2040×2040 pixels for a 64×64 arcsec² field of view.

The general fusion problem has been extensively investigated for Earth observation. The most powerful methods are based on an inverse problem formulation, which consists of minimizing a data fidelity term complemented by a regularization term. The data fidelity term is derived from a forward model of the observation instruments, while the regularization term encodes prior information on the fused image. In the particular context of JWST astronomical imaging, the main challenges stem from the very large scale of the fused data, considerably larger than the typical size of data encountered in remote sensing, as well as from the complexity of both instruments.

In this work, we propose an astronomical data fusion method to recover a datacube at high spatial and spectral resolution. The proposed inverse problem accounts for the specificities of astronomical instruments, such as spectrally variant blurs.
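The inverse-problem formulation described above can be illustrated on a toy example: two linear observations of the same scene (standing in for the multispectral and hyperspectral channels) are fused by minimizing the sum of two quadratic data fidelity terms plus a quadratic (Tikhonov) regularizer. All operators, sizes, and the regularization weight below are illustrative assumptions, not the actual JWST forward models.

```python
# Minimal sketch of variational data fusion with two observations.
# The forward operators A_m and A_h are random stand-ins for the real
# instrument models (spatial/spectral blurs, filter integration, subsampling).
import numpy as np

rng = np.random.default_rng(0)

n = 40                            # unknowns in the (flattened) fused scene
A_m = rng.normal(size=(25, n))    # "multispectral" forward model (stand-in)
A_h = rng.normal(size=(30, n))    # "hyperspectral" forward model (stand-in)
x_true = rng.normal(size=n)
y_m = A_m @ x_true                # noiseless toy observations
y_h = A_h @ x_true

mu = 1e-2                         # regularization weight (assumed)
# Minimize ||y_m - A_m x||^2 + ||y_h - A_h x||^2 + mu ||x||^2.
# Normal equations: (A_m^T A_m + A_h^T A_h + mu I) x = A_m^T y_m + A_h^T y_h
lhs = A_m.T @ A_m + A_h.T @ A_h + mu * np.eye(n)
rhs = A_m.T @ y_m + A_h.T @ y_h
x_hat = np.linalg.solve(lhs, rhs)

print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

In practice the fused cube is far too large for a direct solve; iterative schemes (e.g. conjugate gradient) operating on the forward models, as in the implementation strategy discussed below, replace the explicit normal equations.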
We provide a fast implementation by solving the problem in the frequency domain and in a low-dimensional subspace, which efficiently handles the convolution operators as well as the high dimensionality of the data. We conduct experiments on a realistic synthetic dataset of simulated observations of the Orion Bar and show that our fusion algorithm outperforms state-of-the-art methods commonly used in remote sensing for Earth observation.

References:
J. Gardner et al., Space Science Reviews, 2006.
L. Loncan et al., IEEE Geoscience and Remote Sensing Magazine, 2015.
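The frequency-domain strategy mentioned above can be sketched as follows: under a circulant (periodic) assumption, each band's spatial blur is diagonalized by the 2-D FFT, so a spectrally variant blur reduces to independent per-frequency divisions. Sizes, kernels, and the regularization weight are toy assumptions for illustration, not the JWST point spread functions.

```python
# Minimal sketch of frequency-domain inversion of a spectrally variant blur:
# one circulant kernel per spectral band, diagonalized by the 2-D FFT.
import numpy as np

rng = np.random.default_rng(1)
n_bands, side = 4, 16                        # tiny cube: 4 bands of 16x16 pixels
x = rng.normal(size=(n_bands, side, side))   # ground-truth scene (toy)

# Band-dependent kernels; an identity component keeps each blur invertible.
kernels = np.zeros((n_bands, side, side))
for b in range(n_bands):
    kernels[b, 0, 0] = 1.0
    kernels[b, 0, 1] = kernels[b, 1, 0] = 0.1 * (b + 1)  # stronger blur per band
    kernels[b] /= kernels[b].sum()

H = np.fft.fft2(kernels)                     # per-band transfer functions
y = np.fft.ifft2(H * np.fft.fft2(x)).real    # blurred observation, band by band

mu = 1e-6                                    # Tikhonov weight (assumed)
# Per-frequency closed form: X_hat = conj(H) Y / (|H|^2 + mu)
X_hat = np.conj(H) * np.fft.fft2(y) / (np.abs(H) ** 2 + mu)
x_hat = np.fft.ifft2(X_hat).real

print(np.max(np.abs(x_hat - x)))
```

Each frequency is solved independently, so the cost is dominated by FFTs rather than by large linear solves; projecting the spectral dimension onto a low-dimensional subspace shrinks the number of bands to process in the same way.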