Citation: L. Wang, K. Liu, and Y. Yuan, IEEE/CAA J. Autom. Sinica, 2024. doi: 10.1109/JAS.2024.124863
[1] L. Sun, Z. Zhang, F. Wang, P. Ji, J. Wen, S. Su, and P. Yu, “Aligning dynamic social networks: An optimization over dynamic graph autoencoder,” IEEE Trans. Knowl. Data Eng., vol. 35, no. 6, pp. 5597–5611, 2023.
[2] X. Xu, T. Zhang, C. Xu, Z. Cui, and J. Yang, “Spatial-temporal tensor graph convolutional network for traffic speed prediction,” IEEE Trans. Intell. Transp. Syst., vol. 24, no. 1, pp. 92–103, 2023. doi: 10.1109/TITS.2022.3215613
[3] Y. Zhou, X. Luo, and M. Zhou, “Cryptocurrency transaction network embedding from static and dynamic perspectives: An overview,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 5, pp. 1105–1121, 2023. doi: 10.1109/JAS.2023.123450
[4] Y. Xu, Y. Yuan, Z. Wang, and X. Li, “Noncooperative model predictive game with Markov jump graph,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 4, pp. 931–944, 2023. doi: 10.1109/JAS.2023.123129
[5] F. Manessi, A. Rozza, and M. Manzo, “Dynamic graph convolutional networks,” Pattern Recognit., vol. 97, 2020.
[6] A. Pareja, G. Domeniconi, J. Chen, T. F. Ma, T. Suzumura, H. Kanezashi, T. Kaler, T. B. Schardl, and C. E. Leiserson, “EvolveGCN: Evolving graph convolutional networks for dynamic graphs,” in Proc. AAAI Conf. Artif. Intell., 2020, pp. 5363–5370.
[7] O. A. Malik, S. Ubaru, L. Horesh, M. E. Kilmer, and H. Avron, “Dynamic graph convolutional networks using the tensor M-product,” in Proc. SIAM Int. Conf. Data Mining, 2021, pp. 729–737.
[8] A. Sankar, Y. Wu, L. Gou, W. Zhang, and H. Yang, “DySAT: Deep neural representation learning on dynamic graphs via self-attention networks,” in Proc. ACM Int. Conf. Web Search and Data Mining, 2020, pp. 519–527.
[9] M. Zhang, S. Wu, X. Yu, Q. Liu, and L. Wang, “Dynamic graph neural networks for sequential recommendation,” IEEE Trans. Knowl. Data Eng., vol. 35, no. 5, pp. 4741–4753, 2023.
[10] R. Jiang, Z. Wang, J. Yong, P. Jeph, Q. Chen, Y. Kobayashi, X. Song, S. Fukushima, and T. Suzumura, “Spatio-temporal meta-graph learning for traffic forecasting,” in Proc. AAAI Conf. Artif. Intell., 2023, pp. 8078–8086.
[11] M. Zhang, Y. Xia, Q. Liu, S. Wu, and L. Wang, “Learning long- and short-term representations for temporal knowledge graph reasoning,” in Proc. ACM Web Conf., 2023, pp. 2412–2422.
[12] M. Fang, L. Tang, X. Yang, Y. Chen, C. Li, and Q. Li, “FTPG: A fine-grained traffic prediction method with graph attention network using big trace data,” IEEE Trans. Intell. Transp. Syst., vol. 23, no. 6, pp. 5163–5175, 2022. doi: 10.1109/TITS.2021.3049264
[13] P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Liò, and Y. Bengio, “Graph attention networks,” in Proc. Int. Conf. Learn. Represent., 2018.
[14] T. N. Kipf and M. Welling, “Semi-supervised classification with graph convolutional networks,” in Proc. Int. Conf. Learn. Represent., 2017.
[15] P. Wu, H. Li, L. W. Hu, J. Ge, and N. Zeng, “A local-global attention fusion framework with tensor decomposition for medical diagnosis,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 0, pp. 1–3, 2023.
[16] H. Wu, X. Luo, M. Zhou, M. Rawa, K. Sedraoui, and A. Albeshri, “A PID-incorporated latent factorization of tensors approach to dynamically weighted directed network analysis,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 3, pp. 533–546, 2022. doi: 10.1109/JAS.2021.1004308
[17] X. Tong, L. Cheng, and Y. Wu, “Bayesian tensor Tucker completion with a flexible core,” IEEE Trans. Signal Process., vol. 71, pp. 4077–4091, 2023. doi: 10.1109/TSP.2023.3327845
[18] A. Paranjape, A. R. Benson, and J. Leskovec, “Motifs in temporal networks,” in Proc. ACM Int. Conf. Web Search and Data Mining, 2017, pp. 601–610.
[19] X. Luo, Y. Yuan, S. Chen, N. Zeng, and Z. Wang, “Position-transitional particle swarm optimization-incorporated latent factor analysis,” IEEE Trans. Knowl. Data Eng., vol. 34, no. 8, pp. 3958–3970, 2022. doi: 10.1109/TKDE.2020.3033324
[20] Y. Yu, G. Zhou, N. Zheng, Y. Qiu, S. Xie, and Q. Zhao, “Graph-regularized non-negative tensor-ring decomposition for multiway representation learning,” IEEE Trans. Cybern., vol. 53, no. 5, pp. 3114–3127, 2023. doi: 10.1109/TCYB.2022.3157133