SciPost Submission Page

SKATR: A Self-Supervised Summary Transformer for SKA

by Ayodele Ore, Caroline Heneka, Tilman Plehn

Submission summary

Authors (as registered SciPost users): Ayodele Ore · Tilman Plehn
Submission information
Preprint Link: https://arxiv.org/abs/2410.18899v1  (pdf)
Code repository: https://github.com/heidelberg-hepml/skatr/
Date submitted: 2024-10-30 15:59
Submitted by: Ore, Ayodele
Submitted to: SciPost Physics
Ontological classification
Academic field: Physics
Specialties:
  • Gravitation, Cosmology and Astroparticle Physics
Approach: Computational

Abstract

The Square Kilometre Array (SKA) will initiate a new era of radio astronomy by enabling 3D imaging of the Universe during Cosmic Dawn and Reionization. Modern machine learning is crucial for analyzing this highly structured and complex signal. However, accurate training data are expensive to simulate, and supervised learning may not generalize. We introduce a self-supervised vision transformer, SKATR, whose learned encoding can be cheaply adapted for downstream tasks on 21cm maps. Focusing on regression and generative inference of astrophysical and cosmological parameters, we demonstrate that SKATR representations are maximally informative and that SKATR generalizes out-of-domain to differently simulated, noised, and higher-resolution datasets.
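The abstract's core pattern, adapting a frozen pretrained encoding cheaply for downstream tasks, can be illustrated with a minimal sketch. All names and shapes here are hypothetical: the real SKATR encoder is a vision transformer trained self-supervised on 21cm lightcones, whereas this stand-in is a fixed random projection, and the "downstream task" is a closed-form linear regression head on the frozen summaries.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a pretrained encoder: maps a flattened
# input to a fixed-length summary vector. In SKATR this would be a
# frozen vision transformer; here it is a fixed random projection.
D_IN, D_SUMMARY, N_PARAMS = 512, 32, 6
W_enc = rng.normal(size=(D_IN, D_SUMMARY)) / np.sqrt(D_IN)

def encode(x):
    """Frozen encoder: its weights are never updated downstream."""
    return np.tanh(x @ W_enc)

# Toy synthetic dataset: inputs paired with the parameters that
# (notionally) generated them.
X = rng.normal(size=(256, D_IN))
theta = rng.normal(size=(256, N_PARAMS))

# Cheap downstream adaptation: fit only a lightweight linear head on
# the frozen summaries, via closed-form least squares.
Z = encode(X)                                   # (256, 32)
W_head, *_ = np.linalg.lstsq(Z, theta, rcond=None)  # (32, 6)

def predict(x):
    return encode(x) @ W_head

preds = predict(X)
print(preds.shape)  # (256, 6)
```

The design point is that all expensive learning lives in the encoder, done once; each new task (here, parameter regression) only needs a small, cheaply trained head on top of the fixed summaries.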

Author indications on fulfilling journal expectations

  • Provide a novel and synergetic link between different research areas.
  • Open a new pathway in an existing or a new research direction, with clear potential for multi-pronged follow-up work.
  • Detail a groundbreaking theoretical/experimental/computational discovery.
  • Present a breakthrough on a previously-identified and long-standing research stumbling block.
Current status:
In refereeing
