The proposed Super Tau–Charm Facility (STCF) is a next-generation electron–positron collider designed for high-precision studies in the tau–charm energy region. However, its unprecedented luminosity and event rates place severe demands on real-time data processing, most critically within the High-Level Trigger (HLT) system, where rapid and accurate track reconstruction is paramount.
In this work, we present a fast track identification algorithm tailored for low-transverse-momentum particles, based on an optimized conformal–Hough transform framework. Building upon the classical Hough voting scheme, the proposed method reformulates the parameter-space accumulation as a matrix operation, enabling vectorization and thereby significantly improving computational throughput. Furthermore, to exploit the sparsity of the Hough accumulator under realistic STCF detector occupancies, a sparse-mapping representation is introduced that reduces redundant memory accesses and storage overhead.
Performance studies are conducted within the STCF offline reconstruction and simulation environment using representative physics channels. The results demonstrate that the proposed algorithm retains high signal efficiency while running substantially faster than conventional implementations. In particular, the combined matrix-based and sparsity-aware optimizations yield a significant increase in computational throughput and a pronounced reduction in memory consumption, thereby meeting the stringent latency constraints of the STCF HLT system.
These findings indicate that the proposed approach provides a viable and scalable solution for real-time track finding at STCF. More broadly, this work illustrates how algorithmic reformulation and data-structure optimization, informed by detector-specific statistical characteristics, can effectively bridge the gap between tracking performance and real-time processing requirements in next-generation high-luminosity collider experiments.