SciPost Submission Page
TrackCore-F: Deploying Transformer-Based Subatomic Particle Tracking on FPGAs
by Arjan Blankestijn, Uraz Odyurt, Amirreza Yousefzadeh
Submission summary
| Submission information | |
|---|---|
| Authors (as registered SciPost users): | Uraz Odyurt |
| Preprint Link: | scipost_202509_00061v1 (pdf) |
| Date submitted: | Sept. 30, 2025, 4:51 p.m. |
| Submitted by: | Uraz Odyurt |
| Submitted to: | SciPost Physics Proceedings |
| Proceedings issue: | The 2nd European AI for Fundamental Physics Conference (EuCAIFCon2025) |

| Ontological classification | |
|---|---|
| Academic field: | Physics |
| Specialties: | |
Abstract
The Transformer Machine Learning (ML) architecture has gained considerable momentum in recent years. In particular, computational High-Energy Physics tasks such as jet tagging and particle track reconstruction (tracking) have either achieved proper solutions or reached considerable milestones using Transformers. At the same time, the use of specialised hardware accelerators, especially FPGAs, is an effective method for achieving online or pseudo-online latencies. The development and integration of Transformer-based ML on FPGAs is still ongoing, and support from current tools ranges from very limited to non-existent. Additionally, FPGA resources present a significant constraint. Considering model size alone, smaller models can be deployed directly, whereas larger models must be partitioned in a meaningful and, ideally, automated way. We aim to develop methodologies and tools for monolithic or partitioned Transformer synthesis, specifically targeting inference. Our primary use-case involves two machine learning model designs for tracking, derived from the TrackFormers project. We elaborate on our development approach, present preliminary results, and provide comparisons.
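The partitioning idea mentioned in the abstract can be illustrated with a minimal sketch. The function below greedily groups consecutive model layers so that each partition fits a per-device resource budget; the function name, the single scalar cost model, and the example numbers are illustrative assumptions for exposition, not the actual TrackCore-F scheme.

```python
def partition_layers(layer_costs, budget):
    """Group consecutive layers into partitions whose total resource
    cost stays within a single device's budget (greedy first-fit).

    layer_costs: per-layer resource estimates (e.g. DSP slices).
    budget: maximum total cost allowed per partition.
    Returns a list of partitions, each a list of layer indices.
    """
    partitions, current, used = [], [], 0
    for i, cost in enumerate(layer_costs):
        if cost > budget:
            # A single layer larger than the budget would itself need
            # finer-grained (intra-layer) partitioning.
            raise ValueError(f"layer {i} alone exceeds the budget")
        if used + cost > budget:  # close the current partition
            partitions.append(current)
            current, used = [], 0
        current.append(i)
        used += cost
    if current:
        partitions.append(current)
    return partitions

# Hypothetical per-layer cost estimates for a six-block encoder,
# against a budget of 100 resource units per device.
costs = [40, 35, 30, 45, 25, 20]
print(partition_layers(costs, 100))  # → [[0, 1], [2, 3, 4], [5]]
```

A real tool would replace the scalar cost with multi-dimensional FPGA resource estimates (LUTs, DSPs, BRAM) and could also balance partition latencies, but the greedy grouping above captures the basic automated-partitioning step.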
