Locality-Sensitive Hashing-Based Efficient Point Transformer with Applications in High-Energy Physics
Authors: Siqi Miao, Zhiyuan Lu, Mia Liu, Javier Duarte, Pan Li
Abstract: This study introduces a novel transformer model optimized for large-scale point cloud processing in scientific domains such as high-energy physics (HEP) and astrophysics. Addressing the limitations of graph neural networks and standard transformers, our model integrates local inductive bias and achieves near-linear complexity with hardware-friendly regular operations. One contribution of this work is the quantitative analysis of the error-complexity tradeoff of various sparsification techniques for building efficient transformers. Our findings highlight the superiority of using locality-sensitive hashing (LSH), especially OR & AND-construction LSH, in kernel approximation for large-scale point cloud data with local inductive bias. Based on this finding, we propose the LSH-based Efficient Point Transformer (HEPT), which combines E2LSH with OR & AND constructions and is built upon regular computations. HEPT demonstrates remarkable performance in two critical yet time-consuming HEP tasks, significantly outperforming existing GNNs and transformers in accuracy and computational speed, marking a significant advancement in geometric deep learning and large-scale scientific data processing. Our code is available at https://github.com/Graph-COM/HEPT.
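To make the OR & AND-construction E2LSH idea mentioned in the abstract concrete, below is a minimal illustrative sketch in Python (NumPy only). It is not the paper's HEPT implementation; the bucket width r, the number of concatenated hashes per table k (AND-construction), and the number of independent tables m (OR-construction) are hypothetical parameters chosen for illustration. The sketch shows how candidate neighbor pairs can be restricted to points that share a hash bucket, which is the basic mechanism behind LSH-based kernel/attention sparsification.

    import numpy as np

    # Illustrative OR & AND-construction E2LSH (Datar et al.-style Euclidean LSH).
    # Not the actual HEPT code; r, k, m are example values.

    rng = np.random.default_rng(0)

    def e2lsh_tables(points, r=1.0, k=4, m=8):
        """Hash each point into m tables; each table key concatenates k E2LSH hashes (AND)."""
        n, d = points.shape
        tables = []
        for _ in range(m):                       # OR-construction: m independent tables
            a = rng.normal(size=(k, d))          # random projection directions
            b = rng.uniform(0.0, r, size=k)      # random offsets in [0, r)
            keys = np.floor((points @ a.T + b) / r).astype(int)  # (n, k) hash codes
            buckets = {}
            for i, key in enumerate(map(tuple, keys)):  # AND: all k codes must match
                buckets.setdefault(key, []).append(i)
            tables.append(buckets)
        return tables

    def candidate_pairs(tables):
        """Two points are neighbor candidates if they collide in any table (OR)."""
        pairs = set()
        for buckets in tables:
            for idx in buckets.values():
                for i in idx:
                    for j in idx:
                        if i < j:
                            pairs.add((i, j))
        return pairs

    # Usage: compute attention/kernel values only on candidate pairs instead of all O(n^2) pairs.
    pts = rng.normal(size=(1000, 3))             # e.g., 3D hit coordinates from a detector
    pairs = candidate_pairs(e2lsh_tables(pts))
    print(f"{len(pairs)} candidate pairs vs. {1000 * 999 // 2} in the dense case")

Raising k makes a collision stricter (fewer false positives per table), while raising m gives more chances to collide (fewer false negatives); the error-complexity tradeoff of such choices is what the abstract refers to.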