Hybrid Graph Models for Traffic Prediction
Abstract
1. Introduction
2. Related Work
2.1. Traditional Methods
2.2. Deep Learning Methods
2.3. Attention
3. Preliminaries
3.1. Traffic Prediction on Road Graphs
3.2. Graph Neural Network
4. Methodology
4.1. STA Module
4.1.1. Temporal Attention and Spatial Attention
4.1.2. Gated Function
4.2. DGCN Block
5. Experiments
5.1. Settings
5.2. Parameter Settings
5.3. Compared Methods
- HA: HA uses the average of historical data to predict future road conditions (a minimal sketch of this baseline appears after this list).
- ARIMA [11]: ARIMA employs autoregressive and moving average methods for traffic prediction.
- SVR [44]: SVR uses a support vector machine to extract spatial–temporal features of the traffic network.
- FC-LSTM [45]: FC-LSTM uses LSTM to analyze spatial–temporal dependencies.
- DCRNN [26]: DCRNN extracts spatial–temporal correlations with diffusion convolution and a recurrent neural network.
- STGCN [31]: STGCN combines graph convolution and 1D convolution operations to extract spatial–temporal dependencies.
- GraphWaveNet [29]: GraphWaveNet proposes a hybrid model of graph convolution and dilated convolution to dynamically extract features.
- ASTGCN [30]: ASTGCN extracts spatial and temporal correlations through a convolutional neural network and an attention mechanism.
- MTGNN [46]: MTGNN is a GNN- and CNN-based model that employs an adaptive graph, mix-hop propagation layers, and dilated inception layers to capture spatial–temporal correlations.
- GMAN [38]: GMAN captures spatial–temporal information using a spatial–temporal attention mechanism.
- MRA-DGCN [27]: MRA-DGCN captures complex dependencies among nodes using a dynamic adjacency matrix.
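
As a point of reference for the simplest baseline above (HA), the snippet below sketches one common formulation: predicting each future step as the mean of all historical observations at the same time-of-day slot. This is an illustrative assumption, not the authors' implementation; the function name, the 5 min sampling interval, and the daily period of 288 steps are choices made here for the example.

```python
import numpy as np

def historical_average(history: np.ndarray, horizon: int, period: int = 288) -> np.ndarray:
    """Historical Average (HA) baseline, in one common formulation (assumed here).

    history : array of shape (T, N) with past observations for N sensors,
              sampled every 5 minutes (period = 288 steps per day).
    horizon : number of future steps to predict.

    Each future step is predicted as the mean of all past observations that
    fall on the same time-of-day slot; the paper may use a different window.
    """
    T, N = history.shape
    preds = np.empty((horizon, N))
    for h in range(horizon):
        slot = (T + h) % period              # time-of-day slot of the future step
        same_slot = history[slot::period]    # all past days at that slot
        preds[h] = same_slot.mean(axis=0)
    return preds

# Example with synthetic data: 7 days of 5-minute readings from 207 sensors (METR-LA size).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hist = rng.normal(60, 5, size=(7 * 288, 207))      # synthetic speeds
    print(historical_average(hist, horizon=12).shape)  # (12, 207): next 60 minutes
```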
6. Experimental Results
6.1. Predictive Performance
6.2. Ablation Studies
6.2.1. Impact of STA Module
6.2.2. Impact of DGCN
6.2.3. Impact of Gated Function
6.3. Time Cost
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Dai, G.; Ma, C.; Xu, X. Short-term traffic flow prediction method for urban road sections based on space–time analysis and GRU. IEEE Access 2019, 7, 143025–143035.
2. Li, F.; Feng, J.; Yan, H.; Jin, G.; Yang, F.; Sun, F.; Jin, D.; Li, Y. Dynamic graph convolutional recurrent network for traffic prediction: Benchmark and solution. ACM Trans. Knowl. Discov. Data 2023, 17, 1–21.
3. Medina-Salgado, B.; Sanchez-DelaCruz, E.; Pozos-Parra, P.; Sierra, J.E. Urban traffic flow prediction techniques: A review. Sustain. Comput. Inform. Syst. 2022, 35, 100739.
4. Ye, X.; Fang, S.; Sun, F.; Zhang, C.; Xiang, S. Meta graph transformer: A novel framework for spatial–temporal traffic prediction. Neurocomputing 2022, 491, 544–563.
5. Kashyap, A.A.; Raviraj, S.; Devarakonda, A.; Nayak K, S.R.; KV, S.; Bhat, S.J. Traffic flow prediction models—A review of deep learning techniques. Cogent Eng. 2022, 9, 2010510.
6. Bui, K.H.N.; Cho, J.; Yi, H. Spatial-temporal graph neural network for traffic forecasting: An overview and open research issues. Appl. Intell. 2022, 52, 2763–2774.
7. Lohrasbinasab, I.; Shahraki, A.; Taherkordi, A.; Delia Jurcut, A. From statistical- to machine learning-based network traffic prediction. Trans. Emerg. Telecommun. Technol. 2022, 33, e4394.
8. Zhou, X.; Zhang, Y.; Li, Z.; Wang, X.; Zhao, J.; Zhang, Z. Large-scale cellular traffic prediction based on graph convolutional networks with transfer learning. Neural Comput. Appl. 2022, 34, 5549–5559.
9. Jin, J.; Rong, D.; Zhang, T.; Ji, Q.; Guo, H.; Lv, Y.; Ma, X.; Wang, F. A GAN-based short-term link traffic prediction approach for urban road networks under a parallel learning framework. IEEE Trans. Intell. Transp. Syst. 2022, 23, 16185–16196.
10. Jiang, W. Internet traffic prediction with deep neural networks. Internet Technol. Lett. 2022, 5, e314.
11. Box, G.E.P.; Jenkins, G.M.; Reinsel, G.C.; Ljung, G.M. Time Series Analysis: Forecasting and Control; John Wiley & Sons: Hoboken, NJ, USA, 2015.
12. Ghimire, S.; Deo, R.C.; Wang, H.; Al-Musaylh, M.S.; Casillas-Pérez, D.; Salcedo-Sanz, S. Stacked LSTM sequence-to-sequence autoencoder with feature selection for daily solar radiation prediction: A review and new modeling results. Energies 2022, 15, 1061.
13. Fan, W.; Ma, Y.; Li, Q.; He, Y.; Zhao, E.; Tang, J.; Yin, D. Graph neural networks for social recommendation. In Proceedings of the World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019; pp. 417–426.
14. Jiang, W.; Luo, J. Graph neural network for traffic forecasting: A survey. Expert Syst. Appl. 2022, 207, 117921.
15. Zeng, Z.; Huang, Y.; Wu, T.; Deng, H.; Xu, J.; Zheng, B. Graph-based weakly supervised framework for semantic relevance learning in e-commerce. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, Atlanta, GA, USA, 17–21 October 2022; pp. 3634–3643.
16. Zhou, J.; Cui, G.; Hu, S.; Zhang, Z.; Yang, C.; Liu, Z.; Wang, L.; Li, C.; Sun, M. Graph neural networks: A review of methods and applications. AI Open 2020, 1, 57–81.
17. Pan, B.; Demiryurek, U.; Shahabi, C. Utilizing real-world transportation data for accurate traffic prediction. In Proceedings of the 2012 IEEE 12th International Conference on Data Mining, Brussels, Belgium, 10–13 December 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 595–604.
18. Boukerche, A.; Wang, J. Machine learning-based traffic prediction models for intelligent transportation systems. Comput. Netw. 2020, 181, 107530.
19. Zhang, W.; Zhu, F.; Lv, Y.; Tan, C.; Liu, W.; Zhang, X.; Wang, F.Y. AdapGL: An adaptive graph learning algorithm for traffic prediction based on spatiotemporal neural networks. Transp. Res. Part C Emerg. Technol. 2022, 139, 103659.
20. Zhou, J.; Han, T.; Xiao, F.; Gui, G.; Adebisi, B.; Gacanin, H.; Sari, H. Multiscale network traffic prediction method based on deep echo-state network for internet of things. IEEE Internet Things J. 2022, 9, 21862–21874.
21. Yao, H.; Tang, X.; Wei, H.; Zheng, G.; Li, Z. Revisiting spatial-temporal similarity: A deep learning framework for traffic prediction. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; Volume 33, pp. 5668–5675.
22. Yu, H.; Wu, Z.; Wang, S.; Wang, Y.; Ma, X. Spatiotemporal recurrent convolutional networks for traffic prediction in transportation networks. Sensors 2017, 17, 1501.
23. Zhao, L.; Song, Y.; Zhang, C.; Liu, Y.; Wang, P.; Lin, T.; Deng, M.; Li, H. T-GCN: A temporal graph convolutional network for traffic prediction. IEEE Trans. Intell. Transp. Syst. 2019, 21, 3848–3858.
24. Zhang, X.; Huang, C.; Xu, Y.; Xia, L.; Dai, P.; Bo, L.; Zhang, J.; Zheng, Y. Traffic flow forecasting with spatial-temporal graph diffusion network. In Proceedings of the AAAI Conference on Artificial Intelligence, Virtual, 2–9 February 2021; Volume 35, pp. 15008–15015.
25. Yin, X.; Wu, G.; Wei, J.; Shen, Y.; Qi, H.; Yin, B. Multi-stage attention spatial-temporal graph networks for traffic prediction. Neurocomputing 2021, 428, 42–53.
26. Li, Y.; Yu, R.; Shahabi, C.; Liu, Y. Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. arXiv 2017, arXiv:1707.01926.
27. Yao, H.; Chen, R.; Xie, Z.; Yang, J.; Hu, M.; Guo, J. MRA-DGCN: Multi-range attention-based dynamic graph convolutional network for traffic prediction. In Proceedings of the 2022 IEEE International Conference on Big Data (Big Data), Osaka, Japan, 17–20 December 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1613–1621.
28. Lv, M.; Hong, Z.; Chen, L.; Chen, T.; Zhu, T.; Ji, S. Temporal multi-graph convolutional network for traffic flow prediction. IEEE Trans. Intell. Transp. Syst. 2020, 22, 3337–3348.
29. Wu, Z.; Pan, S.; Long, G.; Jiang, J.; Zhang, C. Graph WaveNet for deep spatial-temporal graph modeling. arXiv 2019, arXiv:1906.00121.
30. Guo, S.; Lin, Y.; Feng, N.; Song, C.; Wan, H. Attention based spatial-temporal graph convolutional networks for traffic flow forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; Volume 33, pp. 922–929.
31. Yu, B.; Yin, H.; Zhu, Z. Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting. arXiv 2017, arXiv:1709.04875.
32. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I. Attention is all you need. Adv. Neural Inf. Process. Syst. 2017, 30.
33. Young, T.; Hazarika, D.; Poria, S.; Cambria, E. Recent trends in deep learning based natural language processing. IEEE Comput. Intell. Mag. 2018, 13, 55–75.
34. Dai, Z.; Yang, Z.; Yang, Y.; Carbonell, J.; Le, Q.V.; Salakhutdinov, R. Transformer-XL: Attentive language models beyond a fixed-length context. arXiv 2019, arXiv:1901.02860.
35. Hossain, M.Z.; Sohel, F.; Shiratuddin, M.F.; Laga, H. A comprehensive survey of deep learning for image captioning. ACM Comput. Surv. (CSUR) 2019, 51, 1–36.
36. Deng, L.; Hinton, G.; Kingsbury, B. New types of deep neural network learning for speech recognition and related applications: An overview. In Proceedings of the 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, BC, Canada, 26–31 May 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 8599–8603.
37. Chen, W.; Chen, L.; Xie, Y.; Cao, W.; Gao, Y.; Feng, X. Multi-range attentive bicomponent graph convolutional network for traffic forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; Volume 34, pp. 3529–3536.
38. Zheng, C.; Fan, X.; Wang, C.; Qi, J. GMAN: A graph multi-attention network for traffic prediction. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; Volume 34, pp. 1234–1241.
39. Shuman, D.I.; Narang, S.K.; Frossard, P.; Ortega, A.; Vandergheynst, P. The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains. IEEE Signal Process. Mag. 2013, 30, 83–98.
40. Micheli, A. Neural network for graphs: A contextual constructive approach. IEEE Trans. Neural Netw. 2009, 20, 498–511.
41. Oloulade, B.M.; Gao, J.; Chen, J.; Lyu, T.; Al-Sabri, R. Graph neural architecture search: A survey. Tsinghua Sci. Technol. 2021, 27, 692–708.
42. Kipf, T.N.; Welling, M. Semi-supervised classification with graph convolutional networks. arXiv 2016, arXiv:1609.02907.
43. Veličković, P.; Cucurull, G.; Casanova, A.; Romero, A.; Lio, P.; Bengio, Y. Graph attention networks. arXiv 2017, arXiv:1710.10903.
44. Wu, C.-H.; Ho, J.-M.; Lee, D.T. Travel-time prediction with support vector regression. IEEE Trans. Intell. Transp. Syst. 2004, 5, 276–281.
45. Graves, A. Long short-term memory. In Supervised Sequence Labelling with Recurrent Neural Networks; Springer: Berlin/Heidelberg, Germany, 2012; pp. 37–45.
46. Wu, Z.; Pan, S.; Long, G.; Jiang, J.; Chang, X.; Zhang, C. Connecting the dots: Multivariate time series forecasting with graph neural networks. In Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Virtual, 6–10 July 2020; pp. 753–763.
| Data | Method | 15 min MAE | 15 min RMSE | 15 min MAPE | 30 min MAE | 30 min RMSE | 30 min MAPE | 60 min MAE | 60 min RMSE | 60 min MAPE |
|---|---|---|---|---|---|---|---|---|---|---|
| PeMS | HA | 2.88 | 5.59 | 6.80% | 2.88 | 5.59 | 6.80% | 2.88 | 5.59 | 6.80% |
| PeMS | ARIMA | 1.62 | 3.30 | 3.50% | 2.33 | 4.76 | 5.40% | 3.38 | 6.50 | 8.30% |
| PeMS | SVR | 1.85 | 3.59 | 3.82% | 2.48 | 5.18 | 5.50% | 3.28 | 7.08 | 8.00% |
| PeMS | FC-LSTM | 2.05 | 4.19 | 4.80% | 2.20 | 4.55 | 5.20% | 2.37 | 4.96 | 5.70% |
| PeMS | DCRNN | 1.38 | 2.95 | 2.90% | 1.74 | 3.97 | 3.90% | 2.07 | 4.74 | 4.90% |
| PeMS | STGCN | 1.36 | 2.96 | 2.90% | 1.81 | 4.27 | 4.17% | 2.49 | 5.69 | 5.79% |
| PeMS | Graph WaveNet | 1.30 | 2.74 | 2.73% | 1.63 | 3.70 | 3.67% | 1.95 | 4.52 | 4.63% |
| PeMS | ASTGCN | 1.53 | 3.13 | 3.22% | 2.01 | 4.27 | 4.48% | 2.61 | 5.42 | 6.00% |
| PeMS | MTGNN | 1.32 | 2.79 | 2.77% | 1.65 | 3.74 | 3.69% | 1.94 | 4.49 | 4.53% |
| PeMS | GMAN | 1.34 | 2.82 | 2.81% | 1.62 | 3.72 | 3.63% | 1.86 | 4.32 | 4.31% |
| PeMS | MRA-DGCN | 1.28 | 2.75 | 2.68% | 1.59 | 3.62 | 3.60% | 1.87 | 4.33 | 4.42% |
| PeMS | HGM (Ours) | 1.25 | 2.66 | 2.64% | 1.54 | 3.57 | 3.47% | 1.85 | 4.31 | 4.29% |
| METR-LA | HA | 4.16 | 7.80 | 13.00% | 4.16 | 7.80 | 13.00% | 4.16 | 7.80 | 13.00% |
| METR-LA | ARIMA | 3.99 | 8.21 | 9.60% | 5.15 | 10.45 | 12.70% | 6.90 | 13.23 | 17.40% |
| METR-LA | SVR | 3.99 | 8.45 | 9.30% | 5.05 | 10.87 | 12.10% | 6.72 | 13.76 | 16.70% |
| METR-LA | FC-LSTM | 3.44 | 6.30 | 9.60% | 3.77 | 7.23 | 10.90% | 4.37 | 8.69 | 13.20% |
| METR-LA | DCRNN | 2.77 | 5.38 | 7.30% | 3.15 | 6.45 | 8.80% | 3.60 | 7.60 | 10.50% |
| METR-LA | STGCN | 2.88 | 5.74 | 7.62% | 3.47 | 7.24 | 9.57% | 4.59 | 9.40 | 12.70% |
| METR-LA | Graph WaveNet | 2.69 | 5.15 | 6.90% | 3.07 | 6.22 | 8.37% | 3.53 | 7.37 | 10.01% |
| METR-LA | ASTGCN | 4.86 | 9.27 | 9.21% | 5.43 | 10.61 | 10.13% | 6.51 | 12.52 | 11.64% |
| METR-LA | MTGNN | 2.69 | 5.18 | 6.86% | 3.05 | 6.17 | 8.19% | 3.49 | 7.23 | 9.87% |
| METR-LA | GMAN | 2.80 | 5.55 | 7.41% | 3.12 | 6.49 | 8.73% | 3.44 | 7.35 | 10.07% |
| METR-LA | MRA-DGCN | 2.62 | 5.14 | 6.68% | 3.01 | 6.13 | 8.05% | 3.38 | 7.22 | 9.98% |
| METR-LA | HGM (Ours) | 2.58 | 5.06 | 6.60% | 3.04 | 6.10 | 8.19% | 3.35 | 7.03 | 9.65% |
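
The tables report MAE, RMSE, and MAPE at the 15, 30, and 60 min horizons. As a reading aid, the sketch below shows how these metrics are commonly computed for traffic benchmarks, with missing readings (encoded as zeros in METR-LA) masked out of the evaluation; the masking convention and the function name are assumptions for illustration, since the authors' exact evaluation code is not reproduced here.

```python
import numpy as np

def masked_metrics(y_true: np.ndarray, y_pred: np.ndarray, null_val: float = 0.0):
    """MAE, RMSE and MAPE with invalid entries (e.g. zero speed readings) masked out.

    y_true, y_pred : arrays of identical shape, e.g. (samples, horizon, sensors).
    null_val       : sentinel marking missing ground truth (0.0 in METR-LA).
    """
    mask = ~np.isclose(y_true, null_val)
    err = y_pred[mask] - y_true[mask]
    mae = np.abs(err).mean()
    rmse = np.sqrt((err ** 2).mean())
    mape = np.abs(err / y_true[mask]).mean() * 100.0
    return mae, rmse, mape

# Example: evaluate the 15-minute horizon (third 5-minute step) of some predictions.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = rng.normal(60, 5, size=(100, 12, 207))
    preds = truth + rng.normal(0, 2, size=truth.shape)
    mae, rmse, mape = masked_metrics(truth[:, 2], preds[:, 2])
    print(f"MAE={mae:.2f}  RMSE={rmse:.2f}  MAPE={mape:.2f}%")
```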
| Data | Method | 15 min MAE | 15 min RMSE | 15 min MAPE | 30 min MAE | 30 min RMSE | 30 min MAPE | 60 min MAE | 60 min RMSE | 60 min MAPE |
|---|---|---|---|---|---|---|---|---|---|---|
| PeMS | HGM-V1 | 1.33 | 2.76 | 2.74% | 1.66 | 3.71 | 3.68% | 1.95 | 4.47 | 4.57% |
| PeMS | HGM-V2 | 1.50 | 3.15 | 3.10% | 2.02 | 4.54 | 4.52% | 2.66 | 5.93 | 6.59% |
| PeMS | HGM-V3 | 1.32 | 2.78 | 2.75% | 1.70 | 3.95 | 3.80% | 1.93 | 4.45 | 4.58% |
| PeMS | HGM-V4 | 1.32 | 2.79 | 2.73% | 1.64 | 3.71 | 3.64% | 1.97 | 4.52 | 4.60% |
| PeMS | HGM-V5 | 1.30 | 2.74 | 2.67% | 1.62 | 3.69 | 3.60% | 1.90 | 4.41 | 4.48% |
| PeMS | HGM | 1.25 | 2.66 | 2.64% | 1.54 | 3.57 | 3.47% | 1.85 | 4.31 | 4.29% |
| METR-LA | HGM-V1 | 2.66 | 5.11 | 6.86% | 3.03 | 6.15 | 8.35% | 3.45 | 7.20 | 10.14% |
| METR-LA | HGM-V2 | 3.19 | 6.09 | 8.41% | 3.83 | 7.60 | 10.74% | 4.69 | 9.20 | 14.55% |
| METR-LA | HGM-V3 | 2.65 | 5.08 | 6.77% | 3.04 | 6.15 | 8.42% | 3.50 | 7.34 | 10.36% |
| METR-LA | HGM-V4 | 2.70 | 5.25 | 6.97% | 3.12 | 6.42 | 8.64% | 3.55 | 7.47 | 10.21% |
| METR-LA | HGM-V5 | 2.67 | 5.17 | 6.84% | 3.10 | 6.30 | 8.57% | 3.57 | 7.49 | 10.54% |
| METR-LA | HGM | 2.58 | 5.06 | 6.60% | 3.04 | 6.10 | 8.19% | 3.35 | 7.03 | 9.65% |
| Data | Method | 15 min MAE | 15 min RMSE | 15 min MAPE | 30 min MAE | 30 min RMSE | 30 min MAPE | 60 min MAE | 60 min RMSE | 60 min MAPE |
|---|---|---|---|---|---|---|---|---|---|---|
| PeMS | HGM-V6 | 1.30 | 2.73 | 2.68% | 1.62 | 3.68 | 3.62% | 1.93 | 4.49 | 4.60% |
| PeMS | HGM-V7 | 1.30 | 2.74 | 2.67% | 1.62 | 3.69 | 3.60% | 1.90 | 4.41 | 4.48% |
| PeMS | HGM | 1.25 | 2.66 | 2.64% | 1.54 | 3.57 | 3.47% | 1.85 | 4.31 | 4.29% |
| METR-LA | HGM-V6 | 2.68 | 5.18 | 6.87% | 3.06 | 6.21 | 8.42% | 3.53 | 7.44 | 10.43% |
| METR-LA | HGM-V7 | 2.67 | 5.17 | 6.84% | 3.10 | 6.30 | 8.57% | 3.57 | 7.49 | 10.54% |
| METR-LA | HGM | 2.58 | 5.06 | 6.60% | 3.04 | 6.10 | 8.19% | 3.35 | 7.03 | 9.65% |
| Dataset | Time cost | HA | ARIMA | SVR | FC-LSTM | DCRNN | STGCN | Graph WaveNet | ASTGCN | MTGNN | GMAN | MRA-DGCN | HGM |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| METR-LA | Training time | 0.00 | 64.21 | 84.37 | 463.49 | 4598.65 | 6021.55 | 5967.32 | 6218.35 | 7463.74 | 6458.56 | 8165.25 | 7489.54 |
| METR-LA | Inference time | 0.11 | 0.74 | 0.42 | 1.21 | 0.98 | 1.31 | 0.98 | 1.56 | 1.01 | 1.32 | 1.79 | 1.01 |