A journal of the IEEE and the CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation
Volume 10, Issue 4, Apr. 2023

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 11.8, Top 4% (SCI Q1)
  • CiteScore: 17.6, Top 3% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: Y. Tian, Y. D. Feng, X. Y. Zhang, and C. Y. Sun, “A fast clustering based evolutionary algorithm for super-large-scale sparse multi-objective optimization,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 4, pp. 1048–1063, Apr. 2023. doi: 10.1109/JAS.2022.105437

A Fast Clustering Based Evolutionary Algorithm for Super-Large-Scale Sparse Multi-Objective Optimization

doi: 10.1109/JAS.2022.105437
Funds: This work was supported in part by the National Key Research and Development Program of China (2018AAA0100100), the National Natural Science Foundation of China (61822301, 61876123, 61906001), the Collaborative Innovation Program of Universities in Anhui Province (GXXT-2020-051), the Hong Kong Scholars Program (XJ2019035), and the Anhui Provincial Natural Science Foundation (1908085QF271)
Abstract: During the last three decades, evolutionary algorithms (EAs) have shown superiority in solving complex optimization problems, especially those with multiple objectives and non-differentiable landscapes. However, due to their stochastic search strategies, the performance of most EAs deteriorates drastically when handling a large number of decision variables. To tackle the curse of dimensionality, this work proposes an efficient EA for solving super-large-scale multi-objective optimization problems with sparse optimal solutions. The proposed algorithm estimates the sparse distribution of optimal solutions by optimizing a binary vector for each solution, and provides a fast clustering method to greatly reduce the dimensionality of the search space. More importantly, all the operations related to the decision variables consist of only a few matrix calculations, which can be directly accelerated by GPUs. While existing EAs are capable of handling fewer than 10 000 real variables, the proposed algorithm is verified to be effective in handling 1 000 000 real variables. Furthermore, since the proposed algorithm handles the large number of variables via accelerated matrix calculations, its runtime can be reduced to less than 10% of the runtime of existing EAs.
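To make the mechanism described in the abstract concrete, the following minimal NumPy sketch illustrates the general two-layer encoding used for sparse multi-objective optimization: each solution is a real decision vector paired with a binary mask, and variation is performed as whole-matrix operations. This is an illustrative sketch under assumed operators (uniform crossover and bit-flip mutation), and the names dec, mask, and vary_masks are placeholders chosen for clarity, not the authors' exact algorithm.

```python
import numpy as np

# A minimal sketch, NOT the paper's exact operators: each solution is a real
# decision vector paired with a binary mask, and every variation step is a
# whole-matrix operation, which is what makes GPU acceleration straightforward.
N, D = 50, 100_000              # population size, number of decision variables
rng = np.random.default_rng(0)

dec  = rng.random((N, D))                          # real-valued layer
mask = (rng.random((N, D)) < 0.01).astype(float)   # binary layer (~1% nonzero)

def vary_masks(mask, rng):
    """Uniform crossover and bit-flip mutation on the binary layer, vectorized."""
    order = rng.permutation(N)
    p1, p2 = mask[order[: N // 2]], mask[order[N // 2:]]
    coin = rng.random(p1.shape) < 0.5              # uniform crossover as coin flips
    child = np.where(coin, p1, p2)
    flip = rng.random(child.shape) < 1.0 / D       # bit-flip mutation with rate 1/D
    return np.abs(child - flip)                    # |0-1| = 1, |1-1| = 0: flips bits

child_mask = vary_masks(mask, rng)
offspring = dec[: len(child_mask)] * child_mask    # masked variables stay exactly zero
print(offspring.shape, child_mask.mean())          # offspring size and sparsity level
```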




    Article Metrics

    Article views: 1099; PDF downloads: 125

    Highlights

    • SLMEA is the first attempt at super-large-scale multi-objective optimization
    • SLMEA solves problems with up to one million decision variables in the experiments
    • SLMEA is a fully black-box algorithm that uses no problem-specific information
    • SLMEA has simple procedures that can be fully accelerated by GPUs (see the sketch below)
    • All experimental results can be reproduced via PlatEMO (available on GitHub)
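As a companion to the GPU highlight above, here is a hedged sketch of how a matrix-only variation step of this kind could be moved to a GPU simply by swapping the array backend. CuPy is used purely as an example backend; the function masked_offspring and all sizes are illustrative assumptions, not the authors' PlatEMO implementation.

```python
# A hedged sketch (not the authors' implementation): because every variable-level
# operation is a plain matrix calculation, the same code can run on CPU (NumPy)
# or GPU (CuPy) just by switching the array module passed in.
import numpy as np
try:
    import cupy as cp          # optional GPU backend; skipped if not installed
except ImportError:
    cp = None

def masked_offspring(xp, n=50, d=100_000, sparsity=0.01, seed=0):
    """One vectorized variation step on a (dec, mask) population using backend xp."""
    xp.random.seed(seed)
    dec     = xp.random.random((n, d))              # real-valued layer
    mask    = (xp.random.random((n, d)) < sparsity).astype(xp.float32)
    coin    = xp.random.random((n, d)) < 0.5
    partner = xp.roll(mask, 1, axis=0)              # pair each row with the next one
    child   = xp.where(coin, mask, partner)         # uniform crossover
    flip    = xp.random.random((n, d)) < 1.0 / d    # bit-flip mutation with rate 1/d
    child   = xp.abs(child - flip)
    return dec * child                              # masked decision matrix

cpu = masked_offspring(np)
print("CPU result:", cpu.shape, float(cpu.astype(bool).mean()))
if cp is not None:
    gpu = masked_offspring(cp)                      # identical code path on the GPU
    print("GPU result:", gpu.shape, float(cp.asnumpy(gpu.astype(bool)).mean()))
```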
