
SKATR: A self-supervised summary transformer for SKA

Ayodele Ore, Caroline Heneka, Tilman Plehn

SciPost Phys. 18, 155 (2025) · published 14 May 2025

Abstract

The Square Kilometer Array will initiate a new era of radio astronomy by allowing 3D imaging of the Universe during Cosmic Dawn and Reionization. Modern machine learning is crucial to analyze the highly structured and complex signal. However, accurate training data is expensive to simulate, and supervised learning may not generalize. We introduce a self-supervised vision transformer, SKATR, whose learned encoding can be cheaply adapted for downstream tasks on 21cm maps. Focusing on regression and generative inference of astrophysical and cosmological parameters, we demonstrate that SKATR representations are maximally informative and that SKATR generalizes out-of-domain to differently-simulated, noised, and higher-resolution datasets.
