A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation
Volume 11, Issue 6
Jun. 2024

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 15.3, Top 4% (SCI Q1)
    CiteScore: 23.5, Top 2% (Q1)
    Google Scholar h5-index: 77, Top 5
Y. Zhao, X. He, M. Zhou, and  T. Huang,  “Accelerated primal-dual projection neurodynamic approach with time scaling for linear and set constrained convex optimization problems,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 6, pp. 1485–1498, Jun. 2024. doi: 10.1109/JAS.2024.124380

Accelerated Primal-Dual Projection Neurodynamic Approach With Time Scaling for Linear and Set Constrained Convex Optimization Problems

doi: 10.1109/JAS.2024.124380
Funds:  This work was supported by the National Natural Science Foundation of China (62176218, 62176027), the Fundamental Research Funds for the Central Universities (XDJK2020TY003), and the Funds for Chongqing Talent Plan (cstc2024ycjh-bgzxm0082)
  • The Nesterov accelerated dynamical approach is an essential tool for solving convex optimization problems with accelerated convergence rates, yet most previous studies in this field have concentrated on unconstrained smooth convex optimization problems. In this paper, building on the primal-dual dynamical approach, the Nesterov accelerated dynamical approach, projection operators, and directional gradients, we present two accelerated primal-dual projection neurodynamic approaches with time scaling for convex optimization problems with smooth or nonsmooth objective functions subject to linear and set constraints. Each approach consists of a second-order ODE (ordinary differential equation) or differential inclusion system for the primal variables and a first-order ODE for the dual variables. Under suitable conditions on the time-scaling function, we show that the proposed approaches achieve faster convergence rates while requiring only convexity of the objective function. We validate the effectiveness of the two proposed accelerated primal-dual projection neurodynamic approaches through numerical experiments.
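The structure described in the abstract (a second-order ODE for the primal variables, a first-order ODE for the dual variables, a projection onto the constraint set, and a time-scaling factor) can be illustrated with a small numerical sketch. The coefficients below (the damping alpha/t, the scaling beta(t) = t, and the t/(alpha-1) extrapolation in the dual flow) are generic choices from the accelerated primal-dual literature, not the paper's exact APDPNA-S system, and the forward Euler discretization is purely illustrative:

```python
import numpy as np

def apd_projection_flow(grad_f, A, b, proj, x0, t0=1.0, T=30.0, h=1e-3, alpha=3.0):
    """Sketch of an inertial primal-dual projection dynamic with time scaling,
    discretized by a semi-implicit Euler scheme.  Coefficients are illustrative."""
    x = np.asarray(x0, dtype=float).copy()
    y = np.zeros_like(x)           # y = x'(t), the primal velocity
    lam = np.zeros(A.shape[0])     # dual variable for the constraint Ax = b
    t = t0
    while t < T:
        beta = t                   # time-scaling factor beta(t) = t
        # second-order primal ODE:  x'' + (alpha/t) x' = -beta (grad f(x) + A^T lam)
        y += h * (-(alpha / t) * y - beta * (grad_f(x) + A.T @ lam))
        x = proj(x + h * y)        # Euler step, then projection onto Omega
        # first-order dual ODE, evaluated at an extrapolated primal point:
        lam += h * beta * (A @ (x + (t / (alpha - 1.0)) * y) - b)
        t += h
    return x, lam

# Toy problem: min 0.5 ||x||^2  s.t.  x1 + x2 + x3 = 1,  x >= 0  (optimum: x_i = 1/3)
A = np.ones((1, 3))
b = np.array([1.0])
x, lam = apd_projection_flow(
    grad_f=lambda z: z,
    A=A, b=b,
    proj=lambda z: np.maximum(z, 0.0),   # Omega = nonnegative orthant
    x0=[0.6, 0.3, 0.1],
)
print(x)   # should end up close to the optimizer (1/3, 1/3, 1/3)
```

The projection keeps the trajectory in the set constraint Omega at every step, while the dual ODE drives the linear-constraint residual Ax - b to zero; the growing factor beta(t) is what the time-scaling conditions in the paper control to obtain the faster convergence rates.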








    • Proposed accelerated primal-dual projection neurodynamic approach (APDPNA-S) with fast convergence rates
    • APDPNA-S can handle convex optimization problems with linear and set constraints, providing wider applicability
    • Extended APDPNA-S to a differential inclusion dynamical approach (APDPNA-NS) with the same convergence results
    • Two new Lyapunov functions are proposed to prove the accelerated convergence properties of APDPNA-S and APDPNA-NS

