A journal of the IEEE and the CAA that publishes high-quality papers in English on original theoretical/experimental research and development in all areas of automation.
Volume 10, Issue 8
Aug. 2023

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 11.8, Top 4% (SCI Q1)
  • CiteScore: 17.6, Top 3% (Q1)
  • Google Scholar h5-index: 77, Top 5
X. F. Chen, M. Liu, and  S. Li,  “Echo state network with probabilistic regularization for time series prediction,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 8, pp. 1743–1753, Aug. 2023. doi: 10.1109/JAS.2023.123489

Echo State Network With Probabilistic Regularization for Time Series Prediction

doi: 10.1109/JAS.2023.123489
Funds: This work was supported in part by the National Natural Science Foundation of China (62176109), the CAAI-Huawei MindSpore Open Fund (CAAIXSJLJJ-2022-020A), the Natural Science Foundation of Gansu Province (21JR7RA531, 22JR5RA427, 22JR5RA487), the Fundamental Research Funds for the Central Universities (lzujbky-2022-kb12, lzujbky-2022-23), the Science and Technology Project of Chengguan District of Lanzhou (2021-1-2), and the Supercomputing Center of Lanzhou University.
  • Recent decades have witnessed a trend in which the echo state network (ESN) is widely utilized in the field of time series prediction due to its powerful computational ability. However, most existing research on ESNs is conducted under the assumption that the data are free of noise or polluted only by Gaussian noise, which lacks robustness and may even fail on real-world tasks. This work handles this issue by proposing a probabilistic regularized ESN (PRESN) with guaranteed robustness. Specifically, we design a novel objective function that minimizes both the mean and the variance of the modeling error, and a scheme is then derived for obtaining the output weights of the PRESN. Furthermore, the generalization performance, robustness, and unbiased-estimation ability of the PRESN are revealed by theoretical analyses. Finally, experiments on a benchmark dataset and two real-world datasets are conducted to verify the performance of the proposed PRESN. The source code is publicly available at https://github.com/LongJin-lab/probabilistic-regularized-echo-state-network.
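The paper's exact PRESN derivation is available in the linked repository and is not reproduced here. The sketch below only illustrates the general idea in the abstract, under assumptions: a toy random reservoir, illustrative hyperparameters `alpha`, `beta`, and `lam` (not the authors' settings), and a readout obtained in closed form by minimizing `alpha*mean(e)^2 + beta*var(e) + lam*||w||^2` over the training error `e`. Since `mean(e^2) = mean(e)^2 + var(e)`, setting `alpha = beta = 1` recovers ordinary ridge regression, which is a useful sanity check on the objective.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_reservoir(u, n_res=100, rho=0.9, in_scale=0.5):
    """Drive a random reservoir with a scalar input series and collect states."""
    W_in = rng.uniform(-in_scale, in_scale, n_res)
    W = rng.uniform(-1.0, 1.0, (n_res, n_res))
    W *= rho / np.abs(np.linalg.eigvals(W)).max()  # rescale to spectral radius rho
    x = np.zeros(n_res)
    X = np.empty((len(u), n_res))
    for t, ut in enumerate(u):
        x = np.tanh(W_in * ut + W @ x)
        X[t] = x
    return X

def mean_var_readout(H, y, alpha=1.0, beta=1.0, lam=1e-4):
    """Closed-form readout minimizing
       alpha*mean(e)^2 + beta*var(e) + lam*||w||^2,  e = y - H w.
    Both terms are quadratic in w, so the optimum solves a linear system."""
    N = len(y)
    ones = np.ones((N, 1))
    C = np.eye(N) - ones @ ones.T / N                 # centering matrix
    M = alpha * ones @ ones.T / N**2 + beta * C / N   # mean^2 + variance weighting
    return np.linalg.solve(H.T @ M @ H + lam * np.eye(H.shape[1]), H.T @ M @ y)

# One-step-ahead prediction of a noisy sine wave (illustrative data only).
t = np.arange(400)
series = np.sin(0.1 * t) + 0.05 * rng.standard_normal(len(t))
X = run_reservoir(series[:-1])
w = mean_var_readout(X, series[1:])
pred = X @ w
print("train RMSE:", np.sqrt(np.mean((pred - series[1:]) ** 2)))
```

The closed form follows because the objective can be written as `e.T @ M @ e + lam*w.T @ w` with the symmetric matrix `M` built from the averaging and centering operators, giving the normal equations `(H.T M H + lam*I) w = H.T M y`.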





    Figures(7)  / Tables(5)



    • This work puts forward an improved echo state network for predicting time series in the presence of various kinds of noise
    • The mean and variance of the modeling error are minimized by optimizing the constructed objective function of the proposed model
    • Experiments on a benchmark dataset and two real-world datasets, with comparisons against different prediction models, verify the effectiveness and superiority of the proposed model

