Authors
- Xiaoming Liu*
- Zhanwei Zhang*
- Lingjuan Lyu
- Zhaohan Zhang*
- Shuai Xiao*
- Chao Shen*
- Philip Yu*
* External authors
Venue
- TKDE
Date
- 2022
Traffic Anomaly Prediction Based on Joint Static-Dynamic Spatio-Temporal Evolutionary Learning
Abstract
Accurate traffic anomaly prediction offers an opportunity to bring help to the wounded at the right location in time. However, the complex process behind traffic anomalies is shaped by both diverse static factors and dynamic interactions. Recent advances in evolving representation learning offer a new way to understand this complicated process, but they face the challenges of imbalanced data distributions and heterogeneous features. To tackle these problems, this paper proposes a spatio-temporal evolution model named SNIPER that learns intricate feature interactions to predict traffic anomalies. Specifically, we design spatio-temporal encoders that transform spatio-temporal information into a vector space reflecting its natural relationships. We then propose a temporally dynamic evolving embedding method that pays more attention to rare traffic anomalies, and develop an effective attention-based multiple graph convolutional network that formulates mutual spatial influence from three different perspectives. An FC-LSTM is adopted to aggregate the heterogeneous features while accounting for spatio-temporal influences. Finally, a loss function is designed to overcome 'over-smoothing' and address the imbalanced data problem. Extensive experiments show that SNIPER outperforms state-of-the-art methods on average by 3.9%, 0.9%, 1.9%, and 1.6% on Chicago datasets, and by 2.4%, 0.6%, 2.6%, and 1.3% on New York City datasets, in terms of AUC-PR, AUC-ROC, F1 score, and accuracy, respectively.
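To make the described pipeline more concrete, below is a minimal, hypothetical PyTorch sketch of the two components the abstract names explicitly: an attention-based convolution over multiple graph views and an FC-LSTM that aggregates the resulting features over time. This is not the authors' released code; all class names, argument names, and the softmax-over-views attention are assumptions, and the paper's dedicated imbalance- and over-smoothing-aware loss is not reproduced here (a standard imbalance-aware loss such as focal loss could stand in for it in practice).

```python
# Hypothetical sketch (not the authors' implementation): attention-weighted
# multi-graph convolution followed by an FC-LSTM aggregator, loosely matching
# the architecture described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiGraphAttentionConv(nn.Module):
    """Fuse several adjacency views (e.g. distance, similarity, interaction) with learned attention."""

    def __init__(self, in_dim: int, out_dim: int, num_graphs: int = 3):
        super().__init__()
        self.weight = nn.Linear(in_dim, out_dim, bias=False)
        # One attention logit per graph view, learned jointly with the model.
        self.graph_logits = nn.Parameter(torch.zeros(num_graphs))

    def forward(self, x: torch.Tensor, adjs: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, in_dim); adjs: (num_graphs, num_nodes, num_nodes), row-normalized.
        alpha = torch.softmax(self.graph_logits, dim=0)      # attention over graph views
        mixed_adj = torch.einsum("k,kij->ij", alpha, adjs)   # weighted fusion of the views
        return F.relu(mixed_adj @ self.weight(x))            # one graph-convolution step


class AnomalyPredictor(nn.Module):
    """Spatial multi-graph convolution + temporal FC-LSTM, ending in a per-node anomaly score."""

    def __init__(self, feat_dim: int, hidden_dim: int, num_graphs: int = 3):
        super().__init__()
        self.spatial = MultiGraphAttentionConv(feat_dim, hidden_dim, num_graphs)
        self.temporal = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)  # fully-connected LSTM
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, seq: torch.Tensor, adjs: torch.Tensor) -> torch.Tensor:
        # seq: (T, num_nodes, feat_dim) node features per time step.
        spatial_out = torch.stack([self.spatial(x_t, adjs) for x_t in seq])  # (T, N, H)
        # Treat each node as a time series for the LSTM: (N, T, H).
        _, (h_n, _) = self.temporal(spatial_out.permute(1, 0, 2))
        return torch.sigmoid(self.head(h_n[-1])).squeeze(-1)  # (N,) anomaly probability per region
```

In this sketch the attention weights decide how much each of the three graph perspectives contributes at every layer, and the FC-LSTM consumes each region's sequence of fused spatial features to produce a final anomaly probability.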