Citation: Y. Xia, J. Wang, and C. Sun, “Analysis and application of a discrete-time neurodynamic approach for fast constrained l1-norm minimization,” IEEE/CAA J. Autom. Sinica, vol. 13, no. 3, pp. 1–14, Mar. 2026. doi: 10.1109/JAS.2026.125729
[1] Y. Dodge and J. Jurečková, Adaptive Regression. New York, USA: Springer, 2000.

[2] Y. Tsaig and D. L. Donoho, “Breakdown of equivalence between the minimal $\ell^1$-norm solution and the sparsest solution,” Signal Process., vol. 86, no. 3, pp. 533–548, Mar. 2006. doi: 10.1016/j.sigpro.2005.05.028

[3] J. Wright, A. Y. Yang, A. Ganesh, S. S. Sastry, and Y. Ma, “Robust face recognition via sparse representation,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 31, no. 2, pp. 210–227, Feb. 2009. doi: 10.1109/TPAMI.2008.79

[4] B. Xu, Q. Liu, and T. Huang, “A discrete-time projection neural network for sparse signal reconstruction with application to face recognition,” IEEE Trans. Neural Networks Learn. Syst., vol. 30, no. 1, pp. 151–162, Jan. 2019. doi: 10.1109/TNNLS.2018.2836933

[5] Y. Zhao, X. Liao, and X. He, “Distributed continuous and discrete time projection neurodynamic approaches for sparse recovery,” IEEE Trans. Emerg. Top. Comput. Intell., vol. 6, no. 6, pp. 1411–1426, Dec. 2022. doi: 10.1109/TETCI.2022.3170514

[6] Y. Xia and M. S. Kamel, “Novel cooperative neural fusion algorithms for image restoration and image fusion,” IEEE Trans. Image Process., vol. 16, no. 2, pp. 367–381, Feb. 2007. doi: 10.1109/TIP.2006.888340

[7] S. Boyd and L. Vandenberghe, Convex Optimization. Cambridge, UK: Cambridge University Press, 2016.

[8] D. A. Lorenz and Q. Tran-Dinh, “Non-stationary Douglas-Rachford and alternating direction method of multipliers: Adaptive step-sizes and convergence,” Comput. Optim. Appl., vol. 74, no. 1, pp. 67–92, May 2019. doi: 10.1007/s10589-019-00106-9

[9] MOSEK Modeling Cookbook: MOSEK Optimization Toolbox for MATLAB, 2021.
[10] Y. Xia and M. S. Kamel, “A cooperative recurrent neural network for solving $L_1$ estimation problems with general linear constraints,” Neural Comput., vol. 20, no. 3, pp. 844–872, Mar. 2008. doi: 10.1162/neco.2007.10-06-376

[11] Y. Xia, “A compact cooperative recurrent neural network for computing general constrained $L_1$ norm estimators,” IEEE Trans. Signal Process., vol. 57, no. 9, pp. 3693–3697, Sep. 2009. doi: 10.1109/TSP.2009.2021499

[12] X. Hu, C. Sun, and B. Zhang, “Design of recurrent neural networks for solving constrained least absolute deviation problems,” IEEE Trans. Neural Networks, vol. 21, no. 7, pp. 1073–1086, Jul. 2010. doi: 10.1109/TNN.2010.2048123

[13] C. Li and X. Gao, “One-layer neural network for solving least absolute deviation problem with box and equality constraints,” Neurocomputing, vol. 330, pp. 483–489, Feb. 2019. doi: 10.1016/j.neucom.2018.11.037

[14] M. Mohammadi, “A compact neural network for fused lasso signal approximator,” IEEE Trans. Cybern., vol. 51, no. 8, pp. 4327–4336, Aug. 2021. doi: 10.1109/TCYB.2019.2925707

[15] Y. Xia, H. Leung, and J. Wang, “A projection neural network and its application to constrained optimization problems,” IEEE Trans. Circuits Syst. I: Fundam. Theory Appl., vol. 49, no. 4, pp. 447–458, Apr. 2002. doi: 10.1109/81.995659

[16] Y. Zhao, X. He, M. Zhou, and T. Huang, “Accelerated primal-dual projection neurodynamic approach with time scaling for linear and set constrained convex optimization problems,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 6, pp. 1485–1498, Jun. 2024. doi: 10.1109/JAS.2024.124380

[17] A. Cichocki and R. Unbehauen, Neural Networks for Optimization and Signal Processing. New York, USA: Wiley, 1993.

[18] Y. Xia and J. Wang, “A discrete-time recurrent neural network for shortest-path routing,” IEEE Trans. Autom. Control, vol. 45, no. 11, pp. 2129–2134, Nov. 2000. doi: 10.1109/9.887639

[19] M. J. Perez-Ilzarbe, “New discrete-time recurrent neural network proposal for quadratic optimization with general linear constraints,” IEEE Trans. Neural Networks Learn. Syst., vol. 24, no. 2, pp. 322–328, Feb. 2013. doi: 10.1109/TNNLS.2012.2223484

[20] Y. Xia, C. Sun, and W. X. Zheng, “Discrete-time neural network for fast solving large linear $L_1$ estimation problems and its application to image restoration,” IEEE Trans. Neural Networks Learn. Syst., vol. 23, no. 5, pp. 812–820, May 2012. doi: 10.1109/TNNLS.2012.2184800

[21] Q. Liu and J. Cao, “Global exponential stability of discrete-time recurrent neural network for solving quadratic programming problems subject to linear constraints,” Neurocomputing, vol. 74, no. 17, pp. 3494–3501, Oct. 2011. doi: 10.1016/j.neucom.2011.06.003
[22] Q. Liu, T. Huang, and J. Wang, “One-layer continuous- and discrete-time projection neural networks for solving variational inequalities and related optimization problems,” IEEE Trans. Neural Networks Learn. Syst., vol. 25, no. 7, pp. 1308–1318, Jul. 2014. doi: 10.1109/TNNLS.2013.2292893

[23] Q. Liu and J. Wang, “$L_1$-minimization algorithms for sparse signal reconstruction based on a projection neural network,” IEEE Trans. Neural Networks Learn. Syst., vol. 27, no. 3, pp. 698–707, Mar. 2016. doi: 10.1109/TNNLS.2015.2481006
[24] M. Mohammadi, “A new discrete-time neural network for quadratic programming with general linear constraints,” Neurocomputing, vol. 424, pp. 107–116, Feb. 2021. doi: 10.1016/j.neucom.2019.11.028

[25] L. Luan, X. Wen, and S. Qin, “Distributed neurodynamic approaches to nonsmooth optimization problems with inequality and set constraints,” Complex Intell. Syst., vol. 8, no. 6, pp. 5511–5530, May 2022. doi: 10.1007/s40747-022-00770-1

[26] Q. Xiao, T. Huang, and Z. Zeng, “On exponential stability of delayed discrete-time complex-valued inertial neural networks,” IEEE Trans. Cybern., vol. 52, no. 5, pp. 3483–3494, May 2022. doi: 10.1109/TCYB.2020.3009761

[27] C.-R. Wang, Y. He, C.-K. Zhang, and M. Wu, “Stability analysis of discrete-time neural networks with a time-varying delay: Extended free-weighting matrices zero equation approach,” IEEE Trans. Cybern., vol. 54, no. 2, pp. 1109–1118, Feb. 2024. doi: 10.1109/TCYB.2022.3201686

[28] M. Di Marco, M. Forti, L. Pancioni, and A. Tesi, “On convergence properties of the brain-state-in-a-convex-domain,” Neural Networks, vol. 178, p. 106481, Oct. 2024. doi: 10.1016/j.neunet.2024.106481

[29] X. Gao and L.-Z. Liao, “Novel continuous- and discrete-time neural networks for solving quadratic minimax problems with linear equality constraints,” IEEE Trans. Neural Networks Learn. Syst., vol. 35, no. 7, pp. 9814–9828, Jul. 2024. doi: 10.1109/TNNLS.2023.3236695

[30] Y. Xia, J. Wang, Z. Lu, and L. Huang, “Two recurrent neural networks with reduced model complexity for constrained $l_1$-norm optimization,” IEEE Trans. Neural Networks Learn. Syst., vol. 34, no. 9, pp. 6173–6185, Sep. 2023. doi: 10.1109/TNNLS.2021.3133836
[31] L. Cheng, Z.-G. Hou, Y. Lin, M. Tan, W. C. Zhang, and F.-X. Wu, “Recurrent neural network for non-smooth convex optimization problems with application to the identification of genetic regulatory networks,” IEEE Trans. Neural Networks, vol. 22, no. 5, pp. 714–726, May 2011. doi: 10.1109/TNN.2011.2109735
[32] L. Guo, X. Shi, and J. Cao, “Exponential convergence of primal-dual dynamical system for linear constrained optimization,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 4, pp. 745–748, Apr. 2022. doi: 10.1109/JAS.2022.105485

[33] J. Yang, X. He, and T. Huang, “Neurodynamic approaches for sparse recovery problem with linear inequality constraints,” Neural Networks, vol. 155, pp. 592–601, 2022.

[34] J. M. Ortega and W. G. Rheinboldt, Iterative Solution of Nonlinear Equations in Several Variables. New York, USA: Academic Press, 1970.

[35] D. Zhang, Q. Lian, Y. Su, and T. Ren, “Dual-prior integrated image reconstruction for quanta image sensors using multi-agent consensus equilibrium,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 6, pp. 1407–1420, Jun. 2023. doi: 10.1109/JAS.2023.123390

[36] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 3rd ed. Englewood Cliffs, USA: Prentice Hall, 2007.

[37] A. M. Thompson, J. C. Brown, J. W. Kay, and D. M. Titterington, “A study of methods of choosing the smoothing parameter in image restoration by regularization,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 13, no. 4, pp. 326–339, Apr. 1991. doi: 10.1109/34.88568

[38] S. Ramani, Z. Liu, J. Rosen, J.-F. Nielsen, and J. A. Fessler, “Regularization parameter selection for nonlinear iterative image restoration and MRI reconstruction using GCV and SURE-based methods,” IEEE Trans. Image Process., vol. 21, no. 8, pp. 3659–3672, Aug. 2012. doi: 10.1109/TIP.2012.2195015

[39] S.-S. Kuo and R. J. Mammone, “Image restoration by convex projections using adaptive constraints and the L1 norm,” IEEE Trans. Signal Process., vol. 40, no. 1, pp. 159–168, Jan. 1992. doi: 10.1109/78.157191

[40] J. Yang, Y. Zhang, and W. Yin, “An efficient TVL1 algorithm for deblurring multichannel images corrupted by impulsive noise,” SIAM J. Sci. Comput., vol. 31, no. 4, pp. 2842–2865, Jan. 2009. doi: 10.1137/080732894

[41] L. Wang, P.-A. Fayolle, and A. G. Belyaev, “Reverse image filtering with clean and noisy filters,” Signal, Image Video Process., vol. 17, no. 2, pp. 333–341, Mar. 2023. doi: 10.1007/s11760-022-02236-w
[42] Z. Zhang, Y. Cheng, J. Suo, L. Bian, and Q. Dai, “INFWIDE: Image and feature space Wiener deconvolution network for non-blind image deblurring in low-light conditions,” IEEE Trans. Image Process., vol. 32, pp. 1390–1402, Feb. 2023. doi: 10.1109/TIP.2023.3244417