SKATR: A self-supervised summary transformer for SKA
Ayodele Ore, Caroline Heneka, Tilman Plehn
SciPost Phys. 18, 155 (2025) · published 14 May 2025
- doi: 10.21468/SciPostPhys.18.5.155
Abstract
The Square Kilometer Array will initiate a new era of radio astronomy by allowing 3D imaging of the Universe during Cosmic Dawn and Reionization. Modern machine learning is crucial to analyze the highly structured and complex signal. However, accurate training data is expensive to simulate, and supervised learning may not generalize. We introduce a self-supervised vision transformer, SKATR, whose learned encoding can be cheaply adapted for downstream tasks on 21cm maps. Focusing on regression and generative inference of astrophysical and cosmological parameters, we demonstrate that SKATR representations are maximally informative and that SKATR generalizes out-of-domain to differently-simulated, noised, and higher-resolution datasets.
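The core idea of adapting a frozen self-supervised encoding for downstream parameter regression can be illustrated with a minimal sketch. Everything below is hypothetical: a fixed random projection stands in for the pretrained SKATR encoder, synthetic Gaussian data stands in for 21cm maps, and a closed-form ridge regression plays the role of a lightweight downstream head.

```python
import numpy as np

# Hypothetical stand-in for a pretrained summary encoder (e.g. SKATR's
# vision transformer): a fixed random projection whose weights stay frozen.
rng = np.random.default_rng(0)
D_IN, D_REP, N = 64, 16, 200          # input dim, representation dim, samples
W_enc = rng.normal(size=(D_IN, D_REP))  # "pretrained" weights, never updated

def encode(x):
    """Map raw inputs to summary representations; the encoder is frozen."""
    return np.tanh(x @ W_enc)

# Toy downstream task: regress a scalar parameter from the frozen summaries.
X = rng.normal(size=(N, D_IN))          # synthetic inputs (not real 21cm data)
theta = X @ rng.normal(size=D_IN)       # synthetic target "parameter"

Z = encode(X)
# Cheap adaptation: fit only a small head (here, closed-form ridge regression)
# on top of the frozen representation, leaving the encoder untouched.
lam = 1e-3
w_head = np.linalg.solve(Z.T @ Z + lam * np.eye(D_REP), Z.T @ theta)
pred = Z @ w_head
```

The design point this sketch makes is that once a good summary network exists, downstream tasks only require training the small head, which is far cheaper than end-to-end supervised training on simulations.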
Authors / Affiliations
- 1 Ayodele Ore
- 1 Caroline Heneka
- 1 Tilman Plehn
Funders
- Bundesministerium für Bildung und Forschung / Federal Ministry of Education and Research [BMBF]
- Daimler und Benz Stiftung / Daimler and Benz Foundation
- Deutsche Forschungsgemeinschaft / German Research Foundation [DFG]
- Ministerium für Wissenschaft, Forschung und Kunst Baden-Württemberg / Ministry of Science, Research and the Arts Baden-Württemberg
- VolkswagenStiftung / Volkswagen Foundation