A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical and experimental research and development in all areas of automation.
Volume 9, Issue 10, Oct. 2022

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 11.8, Top 4% (SCI Q1)
  • CiteScore: 17.6, Top 3% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: Y. Tian, H. W. Chen, H. P. Ma, X. Y. Zhang, K. C. Tan, and Y. C. Jin, “Integrating conjugate gradients into evolutionary algorithms for large-scale continuous multi-objective optimization,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 10, pp. 1801–1817, Oct. 2022. doi: 10.1109/JAS.2022.105875

Integrating Conjugate Gradients Into Evolutionary Algorithms for Large-Scale Continuous Multi-Objective Optimization

doi: 10.1109/JAS.2022.105875
Funds: This work was supported in part by the National Key Research and Development Program of China (2018AAA0100100), the National Natural Science Foundation of China (61906001, 62136008, U21A20512), the Key Program of the Natural Science Project of the Educational Commission of Anhui Province (KJ2020A0036), and the Alexander von Humboldt Professorship for Artificial Intelligence funded by the Federal Ministry of Education and Research, Germany.
  • Abstract: Large-scale multi-objective optimization problems (LSMOPs) pose challenges to existing optimizers, since a set of well-converged and diverse solutions must be found in huge search spaces. While evolutionary algorithms are effective on small-scale multi-objective optimization problems, they are criticized for converging slowly to the optima of LSMOPs. By contrast, mathematical programming methods converge quickly on large-scale single-objective optimization problems, but they have difficulty finding diverse solutions for LSMOPs. How to integrate evolutionary algorithms with mathematical programming methods to solve LSMOPs remains largely unexplored. In this paper, a hybrid algorithm is tailored to LSMOPs by coupling differential evolution with a conjugate gradient method. On the one hand, conjugate gradients and differential evolution are used to update different decision variables of a set of solutions: the former drives the solutions to converge quickly towards the Pareto front, while the latter promotes the diversity of the solutions so that they cover the whole Pareto front. On the other hand, the objective decomposition strategy of evolutionary multi-objective optimization is used to differentiate the conjugate gradients of the solutions, and the line search strategy of mathematical programming is used to ensure that each offspring is of higher quality than its parent. In comparison with state-of-the-art evolutionary algorithms, mathematical programming methods, and hybrid algorithms, the proposed algorithm exhibits better convergence and diversity on a variety of benchmark and real-world LSMOPs.
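    To make the division of labor described in the abstract concrete, below is a minimal, self-contained Python sketch of one generation of such a hybrid. It is not the authors' implementation: a DE/rand/1 mutation generates a diverse trial point, a Fletcher-Reeves conjugate-gradient refinement with an Armijo backtracking line search pulls each trial towards the front along its own weighted-sum scalarization, and the child replaces its parent only if it improves. The toy objectives f1 and f2, the helper names (cg_refine, de_trial), and all parameter settings are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy bi-objective problem standing in for an LSMOP (illustrative only).
    def f1(x):
        return float(np.sum(x ** 2))

    def f2(x):
        return float(np.sum((x - 2.0) ** 2))

    def scalarized(x, w):
        # Weighted-sum decomposition: each solution gets its own weight w,
        # so the conjugate-gradient directions differ across the population.
        return w * f1(x) + (1.0 - w) * f2(x)

    def grad(x, w, eps=1e-6):
        # Forward-difference gradient of the scalarized objective.
        g = np.empty_like(x)
        fx = scalarized(x, w)
        for i in range(x.size):
            xe = x.copy()
            xe[i] += eps
            g[i] = (scalarized(xe, w) - fx) / eps
        return g

    def cg_refine(x, w, n_steps=5):
        # Fletcher-Reeves conjugate gradient with Armijo backtracking line search.
        g = grad(x, w)
        d = -g
        for _ in range(n_steps):
            alpha, fx = 1.0, scalarized(x, w)
            # Shrink the step until it yields sufficient decrease (Armijo rule).
            while scalarized(x + alpha * d, w) > fx + 1e-4 * alpha * (g @ d):
                alpha *= 0.5
                if alpha < 1e-12:
                    break
            x = x + alpha * d
            g_new = grad(x, w)
            beta = (g_new @ g_new) / (g @ g + 1e-12)  # Fletcher-Reeves coefficient
            d = -g_new + beta * d
            g = g_new
        return x

    def de_trial(pop, i, F=0.5):
        # DE/rand/1 mutation: base vector plus a scaled difference of two others.
        others = [j for j in range(len(pop)) if j != i]
        a, b, c = pop[rng.choice(others, 3, replace=False)]
        return a + F * (b - c)

    # One generation over a small population on a 50-dimensional toy problem.
    dim, n = 50, 10
    pop = rng.uniform(-5.0, 5.0, size=(n, dim))
    weights = np.linspace(0.05, 0.95, n)  # one scalarizing weight per member
    for i in range(n):
        child = cg_refine(de_trial(pop, i), weights[i])
        # Accept the child only if it improves its own scalarized objective.
        if scalarized(child, weights[i]) < scalarized(pop[i], weights[i]):
            pop[i] = child
    print("member 0 objectives:", f1(pop[0]), f2(pop[0]))
    ```

    Assigning each population member its own scalarizing weight mirrors the decomposition idea the abstract mentions: it differentiates the gradient directions across the population, so the conjugate-gradient steps push different members towards different regions of the Pareto front while the DE step keeps the population spread out.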

     

  • Supplementary material: All_Data-2022-0315.rar


Figures (9) / Tables (4)

    Article Metrics

Article views: 573; PDF downloads: 98

    Highlights

    • This work deeply integrates evolutionary computation (EC) with mathematical programming (MP)
    • This work customizes a hybrid algorithm for large-scale multi-objective optimization
    • The proposed algorithm inherits the good convergence of MP and good diversity of EC
    • The proposed algorithm is verified on time-varying ratio error estimation tasks
    • The proposed algorithm is verified on intensity-modulated radiotherapy tasks
