Citation: L. Wang, Y. Yuan, and X. Luo, “Advanced high-order graph convolutional networks with assorted time-frequency transforms,” IEEE/CAA J. Autom. Sinica, vol. 13, no. 2, pp. 1–15, Feb. 2026. doi: 10.1109/JAS.2025.125429
[1] Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, and P. S. Yu, “A comprehensive survey on graph neural networks,” IEEE Trans. Neural Netw. Learn. Syst., vol. 32, no. 1, pp. 4–24, Jan. 2021. doi: 10.1109/TNNLS.2020.2978386
|
[2] J. Zhou, G. Cui, S. Hu, Z. Zhang, C. Yang, Z. Liu, L. Wang, C. Li, and M. Sun, “Graph neural networks: A review of methods and applications,” AI Open, vol. 1, pp. 57–81, Dec. 2020. doi: 10.1016/j.aiopen.2021.01.001
|
[3] Z. Liu, X. Luo, and M. Zhou, “Symmetry and graph bi-regularized non-negative matrix factorization for precise community detection,” IEEE Trans. Autom. Sci. Eng., vol. 21, no. 2, pp. 1406–1420, Apr. 2024. doi: 10.1109/TASE.2023.3240335
|
[4] M. Lee, G. Yu, and H. Dai, “Decentralized inference with graph neural networks in wireless communication systems,” IEEE Trans. Mob. Comput., vol. 22, no. 5, pp. 2582–2598, May 2023. doi: 10.1109/TMC.2021.3125793
|
[5] Q. Zhu, Q. Xiong, Z. Yang, and Y. Yu, “RGCNU: Recurrent graph convolutional network with uncertainty estimation for remaining useful life prediction,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 7, pp. 1640–1642, Jul. 2023. doi: 10.1109/JAS.2023.123369
|
[6] S. Ghosh, N. R. Tallent, and M. Halappanavar, “Characterizing performance of graph neighborhood communication patterns,” IEEE Trans. Parallel Distrib. Syst., vol. 33, no. 4, pp. 915–928, Apr. 2022.
|
[7] T. He, Y. Liu, Y. S. Ong, X. Wu, and X. Luo, “Polarized message-passing in graph neural networks,” Artif. Intell., vol. 331, p. 104129, Jun. 2024. doi: 10.1016/j.artint.2024.104129
|
[8] H. Zhou, T. He, Y. S. Ong, G. Cong, and Q. Chen, “Differentiable clustering for graph attention,” IEEE Trans. Knowl. Data Eng., vol. 36, no. 8, pp. 3751–3764, Aug. 2024. doi: 10.1109/TKDE.2024.3363703
|
[9] F. Bi, T. He, Y. Xie, and X. Luo, “Two-stream graph convolutional network-incorporated latent feature analysis,” IEEE Trans. Serv. Comput., vol. 16, no. 4, pp. 3027–3042, Jul.–Aug. 2023. doi: 10.1109/TSC.2023.3241659
|
[10] S. M. Kazemi, R. Goel, K. Jain, I. Kobyzev, A. Sethi, P. Forsyth, and P. Poupart, “Representation learning for dynamic graphs: A survey,” J. Mach. Learn. Res., vol. 21, no. 1, pp. 2648–2720, Jan. 2020.
|
[11] Z. Zhang, P. Cui, and W. Zhu, “Deep learning on graphs: A survey,” IEEE Trans. Knowl. Data Eng., vol. 34, no. 1, pp. 249–270, Jan. 2022. doi: 10.1109/TKDE.2020.2981333
|
[12] H. Yuan, Q. Sun, X. Fu, C. Ji, and J. Li, “Dynamic graph information bottleneck,” in Proc. of the ACM Web Conf., Singapore, Singapore, 2024, pp. 469–480.
|
[13] A. Cini, I. Marisca, F. M. Bianchi, and C. Alippi, “Scalable spatiotemporal graph neural networks,” in Proc. of the 37th AAAI Conf. Artificial Intelligence, Washington, USA, 2023, pp. 7218–7226.
|
[14] Y. Shin and Y. Yoon, “PGCN: Progressive graph convolutional networks for spatial-temporal traffic forecasting,” IEEE Trans. Intell. Transp. Syst., vol. 25, no. 7, pp. 7633–7644, Jul. 2024. doi: 10.1109/TITS.2024.3349565
|
[15] X. Luo, H. Wu, H. Yuan, and M. Zhou, “Temporal pattern-aware QoS prediction via biased non-negative latent factorization of tensors,” IEEE Trans. Cybern., vol. 50, no. 5, pp. 1798–1809, May 2020. doi: 10.1109/TCYB.2019.2903736
|
[16] F. Manessi, A. Rozza, and M. Manzo, “Dynamic graph convolutional networks,” Pattern Recognit., vol. 97, p. 107000, Jan. 2020. doi: 10.1016/j.patcog.2019.107000
|
[17] A. Pareja, G. Domeniconi, J. Chen, T. Ma, T. Suzumura, H. Kanezashi, T. Kaler, T. B. Schardl, and C. E. Leiserson, “EvolveGCN: Evolving graph convolutional networks for dynamic graphs,” in Proc. of the 34th AAAI Conf. Artificial Intelligence, New York, USA, 2020, pp. 5363–5370.
|
[18] M. Zhang, Y. Xia, Q. Liu, S. Wu, and L. Wang, “Learning long- and short-term representations for temporal knowledge graph reasoning,” in Proc. of the ACM Web Conf., Austin, USA, 2023, pp. 2412–2422.
|
[19] C. Gao, J. Zhu, F. Zhang, Z. Wang, and X. Li, “A novel representation learning for dynamic graphs based on graph convolutional networks,” IEEE Trans. Cybern., vol. 53, no. 6, pp. 3599–3612, Jun. 2023. doi: 10.1109/TCYB.2022.3159661
|
[20] G. Liang, K. U, X. Ning, P. Tiwari, S. Nowaczyk, and N. Kumar, “Semantics-aware dynamic graph convolutional network for traffic flow forecasting,” IEEE Trans. Veh. Technol., vol. 72, no. 6, pp. 7796–7809, Jun. 2023. doi: 10.1109/TVT.2023.3239054
|
[21] M. E. Kilmer and C. D. Martin, “Factorization strategies for third-order tensors,” Linear Algebra Appl., vol. 435, no. 3, pp. 641–658, Aug. 2011. doi: 10.1016/j.laa.2010.09.020
|
[22] X. Luo, H. Wu, Z. Wang, J. Wang, and D. Meng, “A novel approach to large-scale dynamically weighted directed network representation,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 44, no. 12, pp. 9756–9773, Dec. 2022. doi: 10.1109/TPAMI.2021.3132503
|
[23] X. Luo, H. Wu, and Z. Li, “NeuLFT: A novel approach to nonlinear canonical polyadic decomposition on high-dimensional incomplete tensors,” IEEE Trans. Knowl. Data Eng., vol. 35, no. 6, pp. 6148–6166, Jun. 2023.
|
[24] X. Tian, W. Zhang, D. Yu, and J. Ma, “Sparse tensor prior for hyperspectral, multispectral, and panchromatic image fusion,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 1, pp. 284–286, Jan. 2023. doi: 10.1109/JAS.2022.106013
|
[25] H. Wu, X. Luo, M. Zhou, M. J. Rawa, K. Sedraoui, and A. Albeshri, “A PID-incorporated latent factorization of tensors approach to dynamically weighted directed network analysis,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 3, pp. 533–546, Mar. 2022. doi: 10.1109/JAS.2021.1004308
|
[26] M. E. Kilmer, L. Horesh, H. Avron, and E. Newman, “Tensor-tensor algebra for optimal representation and compression of multiway data,” Proc. Natl. Acad. Sci. USA, vol. 118, no. 28, p. e2015851118, Jul. 2021.
|
[27] E. Kernfeld, M. E. Kilmer, and S. Aeron, “Tensor-tensor products with invertible linear transforms,” Linear Algebra Appl., vol. 485, pp. 545–570, Nov. 2015. doi: 10.1016/j.laa.2015.07.021
|
[28] Y. S. Luo, X. L. Zhao, T. X. Jiang, Y. Chang, M. K. Ng, and C. Li, “Self-supervised nonlinear transform-based tensor nuclear norm for multi-dimensional image recovery,” IEEE Trans. Image Process., vol. 31, pp. 3793–3808, May 2022. doi: 10.1109/TIP.2022.3176220
|
[29] O. A. Malik, S. Ubaru, L. Horesh, M. E. Kilmer, and H. Avron, “Dynamic graph convolutional networks using the tensor M-product,” in Proc. of the SIAM Int. Conf. Data Mining, 2021, pp. 729–737.
|
[30] S. Winograd, “On computing the discrete Fourier transform,” Proc. Natl. Acad. Sci. USA, vol. 73, no. 4, pp. 1005–1006, Apr. 1976.
|
[31] R. N. Bracewell, “Discrete Hartley transform,” J. Opt. Soc. Am., vol. 73, no. 12, pp. 1832–1835, Dec. 1983. doi: 10.1364/JOSA.73.001832
|
[32] N. Ahmed, T. Natarajan, and K. R. Rao, “Discrete cosine transform,” IEEE Trans. Comput., vol. C-23, no. 1, pp. 90–93, Jan. 1974. doi: 10.1109/T-C.1974.223784
|
[33] K. R. Rao, M. A. Narasimham, and K. Revuluri, “A family of discrete Haar transforms,” Comput. Electr. Eng., vol. 2, no. 4, pp. 367–388, Nov. 1975. doi: 10.1016/0045-7906(75)90023-3
|
[34] B. J. Fino and V. R. Algazi, “Unified matrix treatment of the fast Walsh-Hadamard transform,” IEEE Trans. Comput., vol. C-25, no. 11, pp. 1142–1146, Nov. 1976. doi: 10.1109/TC.1976.1674569
|
[35] M. M. Anguh and R. R. Martin, “A truncation method for computing slant transforms with applications to image processing,” IEEE Trans. Commun., vol. 43, no. 6, pp. 2103–2110, Jun. 1995. doi: 10.1109/26.387451
|
[36] Z. Wei, H. Zhao, Z. Li, X. Bu, Y. Chen, X. Zhang, Y. Lv, and F. Y. Wang, “STGSA: A novel spatial-temporal graph synchronous aggregation model for traffic prediction,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 1, pp. 226–238, Jan. 2023. doi: 10.1109/JAS.2023.123033
|
[37] T. N. Kipf and M. Welling, “Semi-supervised classification with graph convolutional networks,” in Proc. of the 5th Int. Conf. Learning Representations, Toulon, France, 2017, pp. 1–14.
|
[38] J. Chen, Y. Yuan, and X. Luo, “SDGNN: Symmetry-preserving dual-stream graph neural networks,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 7, pp. 1717–1719, Jul. 2024. doi: 10.1109/JAS.2024.124410
|
[39] F. Bi, T. He, and X. Luo, “A two-stream light graph convolution network-based latent factor model for accurate cloud service QoS estimation,” in Proc. of the IEEE Int. Conf. Data Mining, Orlando, USA, 2022, pp. 855–860.
|
[40] J. Gilmer, S. S. Schoenholz, P. F. Riley, O. Vinyals, and G. E. Dahl, “Neural message passing for quantum chemistry,” in Proc. of the 34th Int. Conf. Machine Learning, Sydney, Australia, 2017, pp. 1263–1272.
|
[41] T. He, Y. S. Ong, and L. Bai, “Learning conjoint attentions for graph neural nets,” in Proc. of the 35th Conf. Neural Information Processing Systems, 2021, pp. 1–13.
|
[42] X. Wang, X. He, M. Wang, F. Feng, and T. S. Chua, “Neural graph collaborative filtering,” in Proc. of the 42nd Int. ACM SIGIR Conf. Research and Development in Information Retrieval, Paris, France, 2019, pp. 165–174.
|
[43] K. Wang, H. Wen, and G. Li, “Accurate frequency estimation by using three-point interpolated discrete Fourier transform based on rectangular window,” IEEE Trans. Ind. Inf., vol. 17, no. 1, pp. 73–81, Jan. 2021. doi: 10.1109/TII.2020.2981542
|
[44] J. Domingos and J. M. F. Moura, “Graph Fourier transform: A stable approximation,” IEEE Trans. Signal Process., vol. 68, pp. 4422–4437, Jul. 2020. doi: 10.1109/TSP.2020.3009645
|
[45] C. C. Tseng, “Eigenvalues and eigenvectors of generalized DFT, generalized DHT, DCT-IV, and DST-IV matrices,” IEEE Trans. Signal Process., vol. 50, no. 4, pp. 866–877, Apr. 2002. doi: 10.1109/78.992133
|
[46] Y. Zeng, G. Bi, and A. R. Leyman, “New algorithms for multidimensional discrete Hartley transform,” Signal Process., vol. 82, no. 8, pp. 1086–1095, Aug. 2002. doi: 10.1016/S0165-1684(02)00241-4
|
[47] Z. M. Hafed and M. D. Levine, “Face recognition using the discrete cosine transform,” Int. J. Comput. Vis., vol. 43, no. 3, pp. 167–188, Jul. 2001.
|
[48] Z. Qiu, H. Yang, J. Fu, and D. Fu, “Learning spatiotemporal frequency-transformer for compressed video super-resolution,” in Proc. of the 17th European Conf. Computer Vision, Tel Aviv, Israel, 2022, pp. 257–273.
|
[49] S. A. More and P. J. Deore, “Gait recognition by cross wavelet transform and graph model,” IEEE/CAA J. Autom. Sinica, vol. 5, no. 3, pp. 718–726, May 2018. doi: 10.1109/JAS.2018.7511081
|
[50] F. A. Dharejo, F. Deeba, Y. Zhou, B. Das, M. A. Jatoi, M. Zawish, Y. Du, and X. Wang, “TWIST-GAN: Towards wavelet transform and transferred GAN for spatio-temporal single image super resolution,” ACM Trans. Intell. Syst. Technol., vol. 12, no. 6, Art. no. 71, pp. 1–20, Dec. 2021.
|
[51] E. Ergün and O. Aydemir, “A hybrid BCI using singular value decomposition values of the fast Walsh-Hadamard transform coefficients,” IEEE Trans. Cognit. Dev. Syst., vol. 15, no. 2, pp. 454–463, Jun. 2023. doi: 10.1109/TCDS.2020.3028785
|
[52] A. K. Jain, “A sinusoidal family of unitary transforms,” IEEE Trans. Pattern Anal. Mach. Intell., vol. PAMI-1, no. 4, pp. 356–365, Oct. 1979.
|
[53] D. Jiang, L. Liu, L. Zhu, X. Wang, X. Rong, and H. Chai, “Adaptive embedding: A novel meaningful image encryption scheme based on parallel compressive sensing and slant transform,” Signal Process., vol. 188, p. 108220, Nov. 2021. doi: 10.1016/j.sigpro.2021.108220
|
[54] R. Nainvarapu, R. B. Tummala, and M. K. Singh, “A slant transform and diagonal Laplacian based fusion algorithm for visual sensor network applications,” in Proc. of High Performance Computing and Networking, Singapore, Singapore: Springer, vol. 853, 2021, pp. 181–191.
|
[55] M. A. Ganaie, M. Hu, A. K. Malik, M. Tanveer, and P. N. Suganthan, “Ensemble deep learning: A review,” Eng. Appl. Artif. Intell., vol. 115, p. 105151, Oct. 2022. doi: 10.1016/j.engappai.2022.105151
|
[56] H. Wu, X. Luo, and M. Zhou, “Advancing non-negative latent factorization of tensors with diversified regularization schemes,” IEEE Trans. Serv. Comput., vol. 15, no. 3, pp. 1334–1344, May–Jun. 2022. doi: 10.1109/TSC.2020.2988760
|
[57] Z. Hao, Z. Lu, G. Li, F. Nie, R. Wang, and X. Li, “Ensemble clustering with attentional representation,” IEEE Trans. Knowl. Data Eng., vol. 36, no. 2, pp. 581–593, Feb. 2024.
|
[58] Z. Li, S. Li, O. O. Bamasag, A. Alhothali, and X. Luo, “Diversified regularization enhanced training for effective manipulator calibration,” IEEE Trans. Neural Netw. Learn. Syst., vol. 34, no. 11, pp. 8778–8790, Nov. 2023. doi: 10.1109/TNNLS.2022.3153039
|
[59] A. Liu, J. Lu, and G. Zhang, “Diverse instance-weighting ensemble based on region drift disagreement for concept drift adaptation,” IEEE Trans. Neural Netw. Learn. Syst., vol. 32, no. 1, pp. 293–307, Jan. 2021. doi: 10.1109/TNNLS.2020.2978523
|
[60] K. Yang, Z. Yu, C. L. P. Chen, W. Cao, J. You, and H. S. Wong, “Incremental weighted ensemble broad learning system for imbalanced data,” IEEE Trans. Knowl. Data Eng., vol. 34, no. 12, pp. 5809–5824, Dec. 2022. doi: 10.1109/TKDE.2021.3061428
|
[61] Y. Yuan, X. Luo, M. Shang, and Z. Wang, “A Kalman-filter-incorporated latent factor analysis model for temporally dynamic sparse data,” IEEE Trans. Cybern., vol. 53, no. 9, pp. 5788–5801, Sept. 2023. doi: 10.1109/TCYB.2022.3185117
|
[62] D. Wu, X. Luo, Y. He, and M. Zhou, “A prediction-sampling-based multilayer-structured latent factor model for accurate representation to high-dimensional and sparse data,” IEEE Trans. Neural Netw. Learn. Syst., vol. 35, no. 3, pp. 3845–3858, Mar. 2024. doi: 10.1109/TNNLS.2022.3200009
|
[63] Y. Yuan, R. Wang, G. Yuan, and X. Luo, “An adaptive divergence-based non-negative latent factor model,” IEEE Trans. Syst. Man Cybern.: Syst., vol. 53, no. 10, pp. 6475–6487, Oct. 2023. doi: 10.1109/TSMC.2023.3282950
|
[64] X. Xu, T. Zhang, C. Xu, Z. Cui, and J. Yang, “Spatial-temporal tensor graph convolutional network for traffic speed prediction,” IEEE Trans. Intell. Transp. Syst., vol. 24, no. 1, pp. 92–103, Jan. 2023. doi: 10.1109/TITS.2022.3215613
|
[65] W. Hoeffding, “Probability inequalities for sums of bounded random variables,” J. Am. Stat. Assoc., vol. 58, no. 301, pp. 13–30, Mar. 1963. doi: 10.1080/01621459.1963.10500830
|
[66] A. Paszke, S. Gross, F. Massa, A. Lerer, J. Bradbury, G. Chanan, et al., “PyTorch: An imperative style, high-performance deep learning library,” in Proc. of the 33rd Conf. Neural Information Processing Systems, Vancouver, Canada, 2019, pp. 8026–8037.
|
[67] X. Shi, Q. He, X. Luo, Y. Bai, and M. Shang, “Large-scale and scalable latent factor analysis via distributed alternative stochastic gradient descent for recommender systems,” IEEE Trans. Big Data, vol. 8, no. 2, pp. 420–431, Apr. 2022.
|
[68] L. Wei, L. Jin, and X. Luo, “Noise-suppressing neural dynamics for time-dependent constrained nonlinear optimization with applications,” IEEE Trans. Syst. Man Cybern.: Syst., vol. 52, no. 10, pp. 6139–6150, Oct. 2022.
|
[69] Y. Zhang, X. Wei, X. Zhang, Y. Hu, and B. Yin, “Self-attention graph convolution residual network for traffic data completion,” IEEE Trans. Big Data, vol. 9, no. 2, pp. 528–541, Apr. 2023. doi: 10.1109/TBDATA.2022.3181068
|
[70] C. Y. Zhang, Z. L. Yao, H. Y. Yao, F. Huang, and C. L. P. Chen, “Dynamic representation learning via recurrent graph neural networks,” IEEE Trans. Syst. Man Cybern.: Syst., vol. 53, no. 2, pp. 1284–1297, Feb. 2023. doi: 10.1109/TSMC.2022.3196506
|
[71] K. Liang, L. Meng, M. Liu, Y. Liu, W. Tu, S. Wang, S. Zhou, and X. Liu, “Learn from relational correlations and periodic events for temporal knowledge graph reasoning,” in Proc. of the 46th Int. ACM SIGIR Conf. Research and Development in Information Retrieval, Taipei, China, 2023, pp. 1559–1568.
|
[72] L. Chen, Y. Fu, L. Gu, C. Yan, T. Harada, and G. Huang, “Frequency-aware feature fusion for dense image prediction,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 46, no. 12, pp. 10763–10780, Dec. 2024. doi: 10.1109/TPAMI.2024.3449959
|
[73] Q. Zhang, D. Guo, X. Zhao, L. Yuan, and L. Luo, “Discovering frequency bursting patterns in temporal graphs,” in Proc. of the IEEE 39th Int. Conf. Data Engineering, Anaheim, USA, 2023, pp. 599–611.
|
[74] Y. Liu, Z. Yu, and M. Xie, “Cascading time-frequency transformer and spatio-temporal graph attention network for rotating machinery fault diagnosis,” IEEE Trans. Instrum. Meas., vol. 73, Art. no. 3530310, pp. 1–10, Jan. 2024.
|
[75] L. Xiao, R. Zhang, Z. Chen, and J. Chen, “Tucker decomposition with frequency attention for temporal knowledge graph completion,” in Findings of the Association for Computational Linguistics, Toronto, Canada, 2023, pp. 7286–7300.
|
[76] V. Md, S. Misra, G. Ma, R. Mohanty, E. Georganas, A. Heinecke, D. Kalamkar, N. K. Ahmed, and S. Avancha, “DistGNN: Scalable distributed training for large-scale graph neural networks,” in Proc. of the Int. Conf. for High Performance Computing, Networking, Storage and Analysis, St. Louis, USA, 2021, pp. 1–14.
|
[77] X. Li, J. Ma, Z. Wu, D. Su, W. Zhang, R. Li, and G. Wang, “Rethinking node-wise propagation for large-scale graph learning,” in Proc. of the ACM Web Conf., Singapore, Singapore, 2024, pp. 560–569.
|
[78] J. Ma, L. Tang, F. Fan, J. Huang, X. Mei, and Y. Ma, “SwinFusion: Cross-domain long-range learning for general image fusion via swin transformer,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 7, pp. 1200–1217, Jul. 2022. doi: 10.1109/JAS.2022.105686
|
[79] Y. K. Xu, D. Huang, C. D. Wang, and J. H. Lai, “GLAC-GCN: Global and local topology-aware contrastive graph clustering network,” IEEE Trans. Artif. Intell., vol. 6, no. 6, pp. 1448–1459, Jun. 2025.
|