Citation: C. Fei, Y. Li, X. K. Huang, G. Zhang, and R. Lu, “Unsupervised dynamic discrete structure learning: A geometric evolution method,” IEEE/CAA J. Autom. Sinica, 2025. doi: 10.1109/JAS.2025.125165
[1] Y. Ma, D. Tsao, and H.-Y. Shum, “On the principles of parsimony and self-consistency for the emergence of intelligence,” Frontiers of Information Technology & Electronic Engineering, vol. 23, no. 9, pp. 1298–1323, 2022.
[2] H. S. Seung and D. D. Lee, “The manifold ways of perception,” Science, vol. 290, no. 5500, pp. 2268–2269, 2000. doi: 10.1126/science.290.5500.2268
[3] T. Willmore, Riemannian Geometry, Oxford: Oxford University Press, 1997.
[4] V. de Silva and J. B. Tenenbaum, “Global versus local methods in nonlinear dimensionality reduction,” Advances in Neural Information Processing Systems, pp. 705–712, 2003.
[5] S. Yan, D. Xu, B. Zhang, and H.-J. Zhang, “Graph embedding and extension: a general framework for dimensionality reduction,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 29, no. 1, pp. 40–51, 2007. doi: 10.1109/TPAMI.2007.250598
[6] L. Fang, D. Zhu, B. Zhang, and M. He, “Geometric-spectral reconstruction learning for multi-source open-set classification with hyperspectral and LiDAR data,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 10, pp. 1892–1895, 2022. doi: 10.1109/JAS.2022.105893
[7] D. Zang, W. Su, B. Zhang, and H. Liu, “DCANet: Dense Convolutional Attention Network for infrared small target detection,” Measurement, vol. 240, art. no. 115595, 2025.
[8] H. Cai, V. W. Zheng, and K. C. Chang, “A comprehensive survey of graph embedding: problems, techniques, and applications,” IEEE Trans. on Knowledge and Data Engineering, vol. 30, no. 9, pp. 1616–1637, 2018. doi: 10.1109/TKDE.2018.2807452
[9] J. Bruna, W. Zaremba, A. Szlam, and Y. LeCun, “Spectral networks and locally connected networks on graphs,” arXiv preprint arXiv:1312.6203, 2013.
[10] S. Hauberg, O. Freifeld, and M. J. Black, “A geometric take on metric learning,” Int. Conf. on Neural Information Processing Systems, 2012.
[11] J. B. Tenenbaum, V. de Silva, and J. C. Langford, “A global geometric framework for nonlinear dimensionality reduction,” Science, vol. 290, no. 5500, pp. 2319–2322, 2000. doi: 10.1126/science.290.5500.2319
[12] M. A. A. Cox and T. F. Cox, “Multidimensional scaling,” in Handbook of Data Visualization, pp. 315–347, 2008.
[13] G. Rosman, M. M. Bronstein, A. M. Bronstein, and R. Kimmel, “Nonlinear dimensionality reduction by topologically constrained isometric embedding,” Int. J. of Computer Vision, vol. 89, pp. 56–68, 2010. doi: 10.1007/s11263-010-0322-1
[14] T. Zhu, F. Shen, J. Zhao, and Y. Liang, “Topology learning embedding: a fast and incremental method for manifold learning,” Int. Conf. on Neural Information Processing, pp. 43–52, 2007.
[15] S. Roweis and L. Saul, “Nonlinear dimensionality reduction by locally linear embedding,” Science, vol. 290, no. 5500, pp. 2323–2326, 2000. doi: 10.1126/science.290.5500.2323
[16] M. Belkin and P. Niyogi, “Laplacian eigenmaps and spectral techniques for embedding and clustering,” Advances in Neural Information Processing Systems, pp. 585–591, 2001.
[17] Z. Zhang and H. Zha, “Principal manifolds and nonlinear dimension reduction via local tangent space alignment,” SIAM J. on Scientific Computing, vol. 26, no. 1, pp. 313–338, 2004. doi: 10.1137/S1064827502419154
[18] D. L. Donoho and C. Grimes, “Hessian eigenmaps: locally linear embedding techniques for high-dimensional data,” PNAS, vol. 100, no. 10, pp. 5591–5596, 2003. doi: 10.1073/pnas.1031596100
[19] Z. Zhang and J. Wang, “MLLE: modified locally linear embedding using multiple weights,” Advances in Neural Information Processing Systems, pp. 1593–1600, 2006.
[20] L. van der Maaten and G. Hinton, “Visualizing data using t-SNE,” Journal of Machine Learning Research, vol. 9, pp. 2579–2605, 2008.
[21] K. I. Kim, J. Tompkin, and C. Theobalt, “Curvature-aware regularization on Riemannian submanifolds,” Int. Conf. on Computer Vision, pp. 881–888, 2013.
[22] W. Xu, E. R. Hancock, and R. C. Wilson, “Ricci flow embedding for rectifying non-Euclidean dissimilarity data,” Pattern Recognition, vol. 47, no. 11, pp. 3709–3725, 2014. doi: 10.1016/j.patcog.2014.04.021
[23] Y. Y. Li, “Curvature-aware manifold learning,” Pattern Recognition, vol. 83, pp. 273–286, 2018. doi: 10.1016/j.patcog.2018.06.007
[24] L. Du, P. Zhou, L. Shi, H. Wang, M. Fan, W. Wang, and Y.-D. Shen, “Robust multiple kernel k-means using $l_{2,1}$-norm,” Proc. of the Twenty-Fourth Int. Joint Conf. on Artificial Intelligence, pp. 3476–3482, 2015.
[25] K. Zhan, C. Zhang, J. Guan, and J. Wang, “Graph learning for multiview clustering,” IEEE Trans. on Cybernetics, vol. 48, no. 10, pp. 2887–2895, 2018. doi: 10.1109/TCYB.2017.2751646
[26] X. Zhu, C. C. Loy, and S. Gong, “Constructing robust affinity graphs for spectral clustering,” Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition, pp. 1450–1457, 2014.
[27] J. Huang, F. Nie, and H. Huang, “A new simplex sparse learning model to measure data similarity for clustering,” Proc. of the Twenty-Fourth Int. Joint Conf. on Artificial Intelligence, pp. 3569–3575, 2015.
[28] Z. Kang, H. Pan, S. C. H. Hoi, and Z. Xu, “Robust graph learning from noisy data,” IEEE Trans. on Cybernetics, vol. 50, no. 5, pp. 1833–1843, 2019.
[29] L. Franceschi, M. Niepert, M. Pontil, and X. He, “Learning discrete structures for graph neural networks,” Proc. of the 36th Int. Conf. on Machine Learning, pp. 1972–1982, 2019.
[30] Y. Li, C. Fei, C. Wang, H. Shan, and R. Lu, “Geometry flow-based deep Riemannian metric learning,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 9, pp. 1882–1892, 2023. doi: 10.1109/JAS.2023.123399
[31] T. N. Kipf and M. Welling, “Semi-supervised classification with graph convolutional networks,” Int. Conf. on Learning Representations, 2017.
[32] P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Lio, and Y. Bengio, “Graph attention networks,” Int. Conf. on Learning Representations, 2018.
[33] Y. Zhu, Y. Xu, F. Yu, Q. Liu, S. Wu, and L. Wang, “Deep graph contrastive representation learning,” arXiv preprint arXiv:2006.04131, 2020.
[34] Q. Sun, W. Zhang, and X. Lin, “Progressive hard negative masking: from global uniformity to local tolerance,” IEEE Trans. on Knowledge and Data Engineering, vol. 35, no. 12, pp. 12932–12943, 2023. doi: 10.1109/TKDE.2023.3269795
[35] P. Velickovic, W. Fedus, W. L. Hamilton, P. Lio, and R. D. Hjelm, “Deep graph infomax,” Int. Conf. on Learning Representations, 2019.
[36] Y. Ollivier, “Ricci curvature of Markov chains on metric spaces,” Journal of Functional Analysis, vol. 256, no. 3, pp. 810–864, 2009. doi: 10.1016/j.jfa.2008.11.001
[37] Y. Ollivier, “A survey of Ricci curvature for metric spaces and Markov chains,” Probabilistic Approach to Geometry, pp. 343–381, 2010.
[38] R. P. Sreejith, K. Mohanraj, J. Jost, E. Saucan, and A. Samal, “Forman curvature for complex networks,” J. Stat. Mech.: Theory Exp., p. 063206, 2016.
[39] Z. Ye, K. S. Liu, T. Ma, J. Gao, and C. Chen, “Curvature graph network,” Int. Conf. on Learning Representations, 2019.
[40] H. Li, J. Cao, J. Zhu, Y. Liu, Q. Zhu, and G. Wu, “Curvature graph neural network,” Information Sciences, vol. 592, pp. 50–66, 2022. doi: 10.1016/j.ins.2021.12.077
[41] R. Sandhu, T. Georgiou, E. Reznik, L. Zhu, I. Kolesov, Y. Senbabaoglu, and A. Tannenbaum, “Graph curvature for differentiating cancer networks,” Scientific Reports, vol. 5, p. 12323, 2015. doi: 10.1038/srep12323
[42] J. Sia, E. Jonckheere, and P. Bogdan, “Ollivier-Ricci curvature-based method for community detection in complex networks,” Scientific Reports, vol. 9, no. 1, pp. 1–12, 2019. doi: 10.1038/s41598-018-37186-2
[43] C.-C. Ni, Y.-Y. Lin, F. Luo, and J. Gao, “Community detection on networks with Ricci flow,” Scientific Reports, vol. 9, no. 1, pp. 1–12, 2019. doi: 10.1038/s41598-018-37186-2
[44] C.-C. Ni, Y.-Y. Lin, J. Gao, and X. Gu, “Network alignment by discrete Ollivier-Ricci flow,” Int. Symp. on Graph Drawing and Network Visualization, pp. 447–462, 2018.
[45] A. Samal, R. P. Sreejith, J. Gu, S. Liu, E. Saucan, and J. Jost, “Comparative analysis of two discretizations of Ricci curvature for complex networks,” Scientific Reports, vol. 8, p. 8650, 2018. doi: 10.1038/s41598-018-27001-3
[46] M. Weber, J. Jost, and E. Saucan, “Forman-Ricci flow for change detection in large dynamic data sets,” Axioms, vol. 5, no. 4, p. 26, 2016. doi: 10.3390/axioms5040026
[47] Y. Goldberg, A. Zakai, D. Kushnir, and Y. Ritov, “Manifold learning: the price of normalization,” Journal of Machine Learning Research, vol. 9, pp. 1909–1939, 2008.
[48] P. van der Hoorn, W. J. Cunningham, G. Lippner, C. Trugenberger, and D. Krioukov, “Ollivier-Ricci curvature convergence in random geometric graphs,” Physical Review Research, vol. 3, p. 013211, 2021. doi: 10.1103/PhysRevResearch.3.013211
[49] X. Lai, S. Bai, and Y. Lin, “Normalized discrete Ricci flow used in community detection,” Physica A, vol. 597, art. no. 127251, 2022.
[50] M. Carfora, J. Isenberg, and M. Jackson, “Convergence of the Ricci flow for metrics with indefinite Ricci curvature,” J. of Differential Geometry, vol. 31, pp. 249–263, 1990.
[51] L. Du, P. Zhou, L. Shi, H. Wang, M. Fan, W. Wang, and Y.-D. Shen, “Robust multiple kernel k-means using $l_{2,1}$-norm,” Proc. of the Twenty-Fourth Int. Joint Conf. on Artificial Intelligence, pp. 3476–3482, 2015.
[52] A. Y. Ng, M. I. Jordan, and Y. Weiss, “On spectral clustering: Analysis and an algorithm,” Advances in Neural Information Processing Systems, vol. 2, pp. 849–856, 2002.
[53] X. Zhu, C. C. Loy, and S. Gong, “Constructing robust affinity graphs for spectral clustering,” Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition, pp. 1450–1457, 2014.
[54] T. N. Kipf and M. Welling, “Variational graph auto-encoders,” arXiv preprint arXiv:1611.07308, 2016.
[55] X. Glorot and Y. Bengio, “Understanding the difficulty of training deep feedforward neural networks,” Proc. of the Thirteenth Int. Conf. on Artificial Intelligence and Statistics, pp. 249–256, 2010.
[56] D. P. Kingma and J. L. Ba, “Adam: A method for stochastic optimization,” Proc. of the 3rd Int. Conf. on Learning Representations, 2015.
[57] N. Srivastava, G. E. Hinton, A. Krizhevsky, I. Sutskever, and R. R. Salakhutdinov, “Dropout: A simple way to prevent neural networks from overfitting,” Journal of Machine Learning Research, vol. 15, pp. 1929–1958, 2014.
[58] P. Mernyei and C. Cangea, “Wiki-CS: A Wikipedia-based benchmark for graph neural networks,” ICML Workshop on Graph Representation Learning and Beyond, 2020.