Presentation #107.37 in the session “ISM/Galaxies/Clusters (Poster)”.
In the outskirts of galaxy clusters, entropy profiles measured from X-ray observations of the hot intracluster medium (ICM) drop off unexpectedly. One possible explanation for this effect is gas clumping, in which pockets of cooler, denser gas are embedded in the ICM. No current observatory is sensitive enough to directly detect these predicted gas clumps. In part to address this, a telescope called STAR-X has been proposed in response to NASA's latest Medium-Class Explorer (MIDEX) call. Thanks to its low-Earth orbit, which yields a low and stable instrumental background, and to its spatial resolving power, STAR-X should be sensitive enough to directly detect and characterize these gas clumps, if they exist. The aim of this work is to simulate observations of clumping in clusters to determine how well STAR-X would detect clumps, and which clumping properties reproduce observed entropy profiles. This is achieved by using yt, pyXSIM, SOXS, and other tools to inject clumps into three-dimensional reconstructions of observed clusters, built from profiles measured by other X-ray missions. Radial temperature and surface brightness profiles are then extracted from the mock observations using concentric annuli centered on the central emission peak. We find that in simulated STAR-X observations, gas clumps can be successfully identified with wavdetect and subtracted out, eliminating the observed drop in entropy. This demonstrates that STAR-X would be capable of detecting substructure in the outskirts of nearby clusters and of constraining its properties.
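The entropy diagnostic underlying this analysis is the commonly used X-ray proxy K = kT / n_e^(2/3). A minimal sketch of how unresolved clumping biases it low, and why excising clumps restores the profile: because X-ray emission measure scales as density squared, unresolved clumps with clumping factor C = <n^2>/<n>^2 bias the inferred density high by sqrt(C), and hence the inferred entropy low by C^(1/3). All numerical values below are illustrative placeholders, not results from the simulations described in the abstract.

```python
import numpy as np

# Illustrative radial profiles (made-up numbers for demonstration only;
# real profiles come from mock-observation fits as described above).
r = np.array([0.2, 0.4, 0.6, 0.8, 1.0])           # radius [r200]
kT = np.array([6.0, 5.5, 4.8, 4.0, 3.2])          # temperature [keV]
n_e = np.array([1e-3, 4e-4, 2e-4, 1.2e-4, 8e-5])  # electron density [cm^-3]

# Standard X-ray entropy proxy: K = kT / n_e^(2/3), in keV cm^2.
K = kT / n_e ** (2.0 / 3.0)

# Emission measure goes as n^2, so unresolved clumping C biases the
# inferred density high by sqrt(C) and the inferred entropy low by C^(1/3).
C = np.array([1.0, 1.05, 1.2, 1.5, 2.0])          # illustrative clumping profile
K_biased = K / C ** (1.0 / 3.0)

# The true profile rises monotonically outward; the clumping-biased one
# turns over in the outskirts, mimicking the observed entropy drop.
print("true K:  ", np.round(K, 1))
print("biased K:", np.round(K_biased, 1))
```

Identifying and subtracting the clumps (as with wavdetect in the mock observations) corresponds to recovering `K` from `K_biased`, which is what removes the apparent flattening at large radii.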