To tackle these challenges, a novel lightweight long-range context fusion network, named LightCF-Net, is proposed in this paper. This network attempts to model long-range spatial dependencies while maintaining real-time performance, to better distinguish polyps from...
Spatial-temporal; Graph convolution. Intelligent Transportation Systems aim to alleviate traffic congestion and enhance urban traffic management. Transformer-based methods have shown promise in traffic prediction due to their capability to handle long-range dependencies. However, they disregard local context during...
To this end, we propose a novel Propagation Delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction. Specifically, we design a spatial self-attention module to capture the dynamic spatial dependencies. Then, two graph masking matrices are introduced to high...
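As a rough illustration of the idea of spatial self-attention restricted by a graph mask (a minimal sketch only, with assumed dimensions and a single mask rather than PDFormer's actual geographic/semantic mask pair and delay-aware features):

```python
import torch
import torch.nn as nn

class MaskedSpatialSelfAttention(nn.Module):
    """Sketch: self-attention over road-network nodes, blocked by a graph mask."""
    def __init__(self, d_model, n_heads):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x, graph_mask):
        # x: (batch, n_nodes, d_model) node features at one time step
        # graph_mask: (n_nodes, n_nodes) bool, True where attention is disallowed
        out, _ = self.attn(x, x, x, attn_mask=graph_mask)
        return out
```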
Fractional Gaussian fields have inspired extensive research in spatial statistics. Fractional fields generalize the notion of fractional noise in two or higher dimensions and are particularly important for studying power laws and modeling long-range dependencies. The early mathematical development of fractiona...
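For concreteness, one standard way to make "fractional noise in two or higher dimensions" precise is the Lévy fractional Brownian field; the covariance below and the power-law decay of increment correlations are textbook formulations given for illustration, not a summary of the surveyed development:
\[
\mathbb{E}\bigl[B_H(s)\,B_H(t)\bigr] \;=\; \tfrac{1}{2}\Bigl(\lVert s\rVert^{2H} + \lVert t\rVert^{2H} - \lVert s-t\rVert^{2H}\Bigr),
\qquad s,t \in \mathbb{R}^d,\; H \in (0,1),
\]
and in one dimension the increment process (fractional Gaussian noise) has autocovariance
\[
\gamma(k) \;\sim\; H(2H-1)\,k^{2H-2} \quad (k \to \infty),
\]
which decays as a power law and is non-summable for \(H > 1/2\), i.e. exhibits long-range dependence.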
However, CNNs face limitations in capturing long-range dependencies, while transformers suffer from high computational complexity. To address this, we propose RWKV-UNet, a novel model that integrates the RWKV (Receptance Weighted Key Value) structure into the U-Net architecture. This integration enhances...
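A minimal sketch of where such a global token-mixing block could sit inside a U-Net encoder stage is shown below. The mixer here is a simple recurrent stand-in, not the actual RWKV formulation, and all layer names and sizes are assumptions:

```python
import torch
import torch.nn as nn

class SimpleTokenMixer(nn.Module):
    """Placeholder for an RWKV-style block: global mixing over flattened tokens."""
    def __init__(self, dim):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)

    def forward(self, x):
        # x: (B, C, H, W) -> sequence of H*W tokens of width C
        b, c, h, w = x.shape
        seq = x.flatten(2).transpose(1, 2)            # (B, H*W, C)
        mixed, _ = self.rnn(self.norm(seq))           # long-range mixing over tokens
        return x + mixed.transpose(1, 2).reshape(b, c, h, w)

class EncoderStage(nn.Module):
    """Hypothetical U-Net encoder stage: local conv features, then global mixing."""
    def __init__(self, c_in, c_out):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True))
        self.mixer = SimpleTokenMixer(c_out)
        self.down = nn.MaxPool2d(2)

    def forward(self, x):
        x = self.mixer(self.conv(x))
        return self.down(x), x    # downsampled feature + skip connection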
that captures both neighborhood and long-range dependencies to enhance the prediction capacity. Specifically, ML-Former first conducts a time-series embedding that integrates neighborhood dependencies, positions, and timestamps. Then, it captures neighborhood and long-range dependencies by using a time-...
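A hedged sketch of an embedding that fuses local (neighborhood) context, position, and timestamp information, in the spirit of the description above; the layer choices and sizes are assumptions, not the paper's specification:

```python
import torch
import torch.nn as nn

class TimeSeriesEmbedding(nn.Module):
    """Sketch: value + position + timestamp embedding for a multivariate series."""
    def __init__(self, n_vars, d_model, n_time_feats, max_len=5000):
        super().__init__()
        # Local convolution along the time axis injects neighborhood dependencies.
        self.value_emb = nn.Conv1d(n_vars, d_model, kernel_size=3, padding=1)
        self.pos_emb = nn.Embedding(max_len, d_model)      # learnable positions
        self.time_emb = nn.Linear(n_time_feats, d_model)   # e.g. hour/day/month

    def forward(self, x, time_feats):
        # x: (B, T, n_vars), time_feats: (B, T, n_time_feats)
        v = self.value_emb(x.transpose(1, 2)).transpose(1, 2)  # (B, T, d_model)
        pos = torch.arange(x.size(1), device=x.device)
        return v + self.pos_emb(pos) + self.time_emb(time_feats)
```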
to-end automatic seizure detection model named CNN-Informer, which leverages the capability of a Convolutional Neural Network (CNN) to extract local features from multi-channel EEGs, and the Informer's low computational complexity and memory usage to capture long-range dependencies....
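A rough sketch of this CNN-plus-long-range-encoder idea follows; a standard transformer encoder is used here as a stand-in for the Informer's ProbSparse encoder, and all names and sizes are assumptions:

```python
import torch
import torch.nn as nn

class CNNTransformerDetector(nn.Module):
    """Sketch: 1-D CNN extracts local EEG features; an encoder models long range."""
    def __init__(self, n_channels, d_model=128, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, d_model, kernel_size=7, stride=2, padding=3),
            nn.ReLU(inplace=True),
            nn.Conv1d(d_model, d_model, kernel_size=5, stride=2, padding=2),
            nn.ReLU(inplace=True),
        )
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, eeg):
        # eeg: (B, n_channels, T) raw multi-channel signal
        tokens = self.cnn(eeg).transpose(1, 2)   # (B, T', d_model) local feature tokens
        enc = self.encoder(tokens)               # long-range mixing across the window
        return self.head(enc.mean(dim=1))        # global pooling -> class logits
```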
In this survey, we give a global picture of the lifecycle of long-context LLMs from four perspectives: architecture, infrastructure, training, and evaluation, including length extrapolation, cache optimization, memory management, architecture innovation, training infrastructure, inference infrastructure, long-context pre-tra...
LSTMs are particularly popular in O3 time series prediction because of their ability to capture long-range temporal dependencies in O3 data. This is a significant advantage over traditional RNNs, which often struggle with long-term dependencies. LSTMs also excel at capturing nonlinear patterns in O3 da...
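A minimal sketch of an LSTM forecaster of the kind described, assuming a sliding window of past O3 and covariate values; the feature set, window length, and layer sizes are illustrative assumptions:

```python
import torch
import torch.nn as nn

class O3LSTMForecaster(nn.Module):
    """Sketch: LSTM over a window of past observations predicts the next O3 value."""
    def __init__(self, n_features, hidden=64, n_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, n_layers, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (B, T, n_features), e.g. past O3 plus meteorological covariates
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # next-step O3 from the last hidden state
```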
We study a general class of percolation models in Euclidean space including long-range percolation, scale-free percolation, the weight-dependent random con
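As an illustration of the model class named here (standard forms from the long-range and scale-free percolation literature, not a claim about this paper's exact parametrization): in long-range percolation on \(\mathbb{Z}^d\), distinct vertices \(x, y\) are joined independently with probability
\[
p_{x,y} \;=\; 1 - \exp\!\bigl(-\beta\,|x-y|^{-\alpha}\bigr) \;\asymp\; \beta\,|x-y|^{-\alpha},
\]
while scale-free percolation attaches i.i.d. heavy-tailed weights \(W_x\) to the vertices and sets
\[
p_{x,y} \;=\; 1 - \exp\!\Bigl(-\lambda\,\frac{W_x W_y}{|x-y|^{\alpha}}\Bigr).
\]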