Locality-Sensitive Hashing-Based Efficient Point Transformer with Applications in High-Energy Physics
Authors: Siqi Miao, Zhiyuan Lu, Mia Liu, Javier Duarte, Pan Li
Abstract: This study introduces a novel transformer model optimized for large-scale point cloud processing in scientific domains such as high-energy physics (HEP) and astrophysics. Addressing the limitations of graph neural networks and standard transformers, our model integrates local inductive bias and achieves near-linear complexity with hardware-friendly regular operations. One contribution of this work is the quantitative analysis of the error-complexity tradeoff of various sparsification techniques for building efficient transformers. Our findings highlight the superiority of using locality-sensitive hashing (LSH), especially OR & AND-construction LSH, in kernel approximation for large-scale point cloud data with local inductive bias. Based on this finding, we propose the LSH-based Efficient Point Transformer (HEPT), which combines E2LSH with OR & AND constructions and is built upon regular computations. HEPT demonstrates remarkable performance in two critical yet time-consuming HEP tasks, significantly outperforming existing GNNs and transformers in accuracy and computational speed, marking a significant advancement in geometric deep learning and large-scale scientific data processing. Our code is available at https://github.com/Graph-COM/HEPT.
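To make the abstract's key ingredient concrete, the following is a minimal illustrative sketch (not the HEPT implementation) of E2LSH with AND and OR constructions: each hash is a random projection quantized into buckets of width r, an AND-construction concatenates k hashes into one table key, and an OR-construction declares two points colliding if they share a key in any of L independent tables. All class and parameter names here are hypothetical.

```python
import numpy as np

class E2LSH:
    """Illustrative E2LSH with AND (k hashes per table) and
    OR (L independent tables) constructions."""

    def __init__(self, dim, k=4, L=8, r=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # One (k, dim) Gaussian projection matrix and one offset
        # vector per table: h_{l,j}(x) = floor((a_{l,j} . x + b_{l,j}) / r)
        self.a = rng.normal(size=(L, k, dim))
        self.b = rng.uniform(0.0, r, size=(L, k))
        self.r = r

    def keys(self, x):
        # AND-construction: each table's key is the tuple of its k bucket ids,
        # so a match requires all k hashes to agree.
        codes = np.floor((self.a @ x + self.b) / self.r).astype(int)
        return [tuple(row) for row in codes]  # one key per table

    def collide(self, x, y):
        # OR-construction: points collide if they share a key in ANY table.
        return any(kx == ky for kx, ky in zip(self.keys(x), self.keys(y)))
```

Under this scheme, raising k suppresses collisions between distant points while raising L recovers collisions between nearby ones, which is the error-complexity tradeoff the abstract refers to.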