DOI
10.34229/KCA2522-9664.24.5.10
UDC 517.988
CONVERGENCE RATE OF THE EXTRAPOLATION FROM THE PAST
AND OPERATOR EXTRAPOLATION ALGORITHMS
Abstract. The paper considers variational inequalities in a Hilbert space and two methods for their approximate solution: the extrapolation from the past algorithm and the operator extrapolation algorithm. Both algorithms have less computationally expensive iterations than the classical extragradient algorithm: only one operator evaluation is needed per iteration instead of two. Non-asymptotic linear convergence rate estimates are proved for variational inequalities whose operator is Lipschitz continuous and satisfies the generalized strong monotonicity condition. The results are new and improve the previously known estimates.
Keywords: variational inequality, saddle-point problem, linear convergence, extrapolation from the past, operator extrapolation.
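The cost advantage stated in the abstract can be illustrated with a minimal numerical sketch. This is not the paper's exact schemes (the problem below is unconstrained, so the projection step is omitted, and the affine operator, matrix `A`, vector `b`, and step size are illustrative assumptions): `F(x) = Ax - b` with `A = mu*I + S` for skew-symmetric `S` is Lipschitz continuous and strongly monotone, and each method converges to the solution of `Ax = b`. The extragradient method evaluates `F` twice per iteration; the extrapolation from the past (Popov-type) and operator extrapolation methods evaluate it once.

```python
import numpy as np

# Hypothetical test problem: F(x) = A x - b with A = mu*I + S, S skew-symmetric,
# so F is Lipschitz continuous and strongly monotone; the solution solves A x = b.
mu = 1.0
A = np.array([[mu, 1.0], [-1.0, mu]])
b = np.array([1.0, 2.0])
F = lambda x: A @ x - b
x_star = np.linalg.solve(A, b)

L = np.linalg.norm(A, 2)      # Lipschitz constant of F
tau = 1.0 / (3.0 * L)         # conservative step size, safe for all three methods

def extragradient(x0, iters):
    """Korpelevich's method: two F-evaluations per iteration."""
    x = x0.copy()
    for _ in range(iters):
        y = x - tau * F(x)    # evaluation 1
        x = x - tau * F(y)    # evaluation 2
    return x

def extrapolation_from_the_past(x0, iters):
    """Popov-type method: one new F-evaluation per iteration (F(y) is reused)."""
    x = x0.copy()
    Fy_prev = F(x0)           # evaluated once before the loop
    for _ in range(iters):
        y = x - tau * Fy_prev  # extrapolation step uses the past value
        Fy = F(y)              # the only new evaluation this iteration
        x = x - tau * Fy
        Fy_prev = Fy
    return x

def operator_extrapolation(x0, iters):
    """Operator extrapolation (optimistic-gradient form): one new F-evaluation per iteration."""
    x = x0.copy()
    Fx_prev = F(x0)
    for _ in range(iters):
        Fx = F(x)              # the only new evaluation this iteration
        x = x - tau * (2.0 * Fx - Fx_prev)
        Fx_prev = Fx
    return x

x0 = np.array([5.0, -3.0])
for method in (extragradient, extrapolation_from_the_past, operator_extrapolation):
    print(method.__name__, np.linalg.norm(method(x0, 400) - x_star))
```

On this strongly monotone example all three iterations contract linearly, which is the regime the paper's estimates address; the single-evaluation methods reach comparable accuracy at roughly half the operator-evaluation budget.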
REFERENCES
- 1. Nagurney A. Network economics: A variational inequality approach. Dordrecht: Kluwer Academic Publishers, 1999. 325 p.
- 2. Facchinei F., Pang J.-S. Finite-dimensional variational inequalities and complementarity problems. Vol. 2. New York: Springer, 2003. 666 p.
- 3. Bauschke H.H., Combettes P.L. Convex analysis and monotone operator theory in Hilbert spaces. Berlin; Heidelberg; New York: Springer, 2011. 408 p.
- 4. Bakushinskii A.B., Goncharskii A.V. Iterative methods for solving ill-posed problems. Moscow: Nauka, 1989. 126 p.
- 5. Alber Y., Ryazantseva I. Nonlinear ill-posed problems of monotone type. Dordrecht: Springer, 2006. 410 p.
- 6. Semenov V.V. On the parallel proximal decomposition method for solving the problems of convex optimization. Journal of Automation and Information Sciences. 2010. Vol. 42, Iss. 4. P. 13–18. doi.org/10.1615/JAutomatInfScien.v42.i4.20.
- 7. Gidel G., Berard H., Vincent P., Lacoste-Julien S. A variational inequality perspective on generative adversarial networks. arXiv preprint arXiv:1802.10551. 2018.
- 8. Nemirovski A. Prox-method with rate of convergence O(1/t) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM Journal on Optimization. 2004. Vol. 15. P. 229–251.
- 9. Cohen M.B., Sidford A., Tian K. Relative Lipschitzness in extragradient methods and a direct recipe for acceleration. arXiv preprint arXiv:2011.06572. 2021.
- 10. Wang J.-K., Abernethy J., Levy K.Y. No-regret dynamics in the Fenchel game: A unified framework for algorithmic convex optimization. arXiv preprint arXiv:2111.11309. 2021.
- 11. Goodfellow I., Pouget-Abadie J., Mirza M., Xu B., Warde-Farley D., Ozair Sh., Courville A., Bengio Y. Generative adversarial networks. Advances in Neural Information Processing Systems, 2014. P. 2672–2680.
- 12. Arjovsky M., Chintala S., Bottou L. Wasserstein GAN. arXiv preprint arXiv:1701.07875. 2017.
- 13. Daskalakis C., Ilyas A., Syrgkanis V., Zeng H. Training GANs with optimism. arXiv preprint arXiv:1711.00141. 2018.
- 14. Chavdarova T., Gidel G., Fleuret F., Lacoste-Julien S. Reducing noise in GAN training with variance reduced extragradient. arXiv preprint arXiv:1904.08598. 2019.
- 15. Mishchenko K., Kovalev D., Shulgin E., Richtarik P., Malitsky Y. Revisiting stochastic extragradient. arXiv preprint arXiv:1905.11373. 2019.
- 16. Liu M., Zhang W., Mroueh Y., Cui X., Ross J., Yang T., Das P. A decentralized parallel algorithm for training generative adversarial nets. arXiv preprint arXiv:1910.12999. 2019.
- 17. Korpelevich G.M. An extragradient method for finding saddle points and for other problems. Matecon. 1976. Vol. 12, N 4. P. 747–756.
- 18. Gorbunov E., Loizou N., Gidel G. Extragradient method: last-iterate convergence for monotone variational inequalities and connections with cocoercivity. arXiv preprint arXiv:2110.04261. 2021.
- 19. Yoon T., Ryu E.K. Accelerated algorithms for smooth convex-concave minimax problems with rate on squared gradient norm. Proceedings of the 38th International Conference on Machine Learning. Proceedings of Machine Learning Research. 2021. Vol. 139. P. 12098–12109.
- 20. Nesterov Y. Dual extrapolation and its applications to solving variational inequalities and related problems. Math. Program. 2007. Vol. 109, N 2–3. P. 319–344.
- 21. Tseng P. A modified forward-backward splitting method for maximal monotone mappings. SIAM Journal on Control and Optimization. 2000. Vol. 38. P. 431–446.
- 22. Censor Y., Gibali A., Reich S. The subgradient extragradient method for solving variational inequalities in Hilbert space. Journal of Optimization Theory and Applications. 2011. Vol. 148. P. 318–335. doi.org/10.1007/s10957-010-9757-3.
- 23. Denisov S.V., Nomirovskii D.A., Rublyov B.V., Semenov V.V. Convergence of extragradient algorithm with monotone step size strategy for variational inequalities and operator equations. Journal of Automation and Information Sciences. 2019. Vol. 51, Iss. 6. P. 12–24. doi.org/10.1615/JAutomatInfScien.v51.i6.20.
- 24. Popov L.D. A modification of the Arrow-Hurwicz method for search of saddle points. Mathematical Notes of the Academy of Sciences of the USSR. 1980. Vol. 28, Iss. 5. P. 845–848.
- 25. Malitsky Y., Tam M.K. A forward-backward splitting method for monotone inclusions without cocoercivity. SIAM Journal on Optimization. 2020. Vol. 30. P. 1451–1472. doi.org/10.1137/18M1207260.
- 26. Kotsalis G., Lan G., Li T. Simple and optimal methods for stochastic variational inequalities. I: Operator extrapolation. arXiv preprint arXiv:2011.02987. 2020.
- 27. Semenov V.V., Denisov S.V., Sandrakov G.V., Kharkov O.S. Convergence of the operator extrapolation method for variational inequalities in Banach spaces. Cybernetics and Systems Analysis. 2022. Vol. 58, N 6. P. 740–753. doi.org/10.1007/s10559-022-00507-5.
- 28. Malitsky Yu.V., Semenov V.V. An extragradient algorithm for monotone variational inequalities. Cybernetics and Systems Analysis. 2014. Vol. 50, N 2. P. 271–277. doi.org/10.1007/s10559-014-9614-8.
- 29. Malitsky Yu. Projected reflected gradient methods for monotone variational inequalities. SIAM Journal on Optimization. 2015. Vol. 25. P. 502–520.
- 30. Chabak L., Semenov V., Vedel Y. A new non-Euclidean proximal method for equilibrium problems. In: Chertov O., Mylovanov T., Kondratenko Y., Kacprzyk J., Kreinovich V., Stefanuk V. (eds.) Recent Developments in Data Science and Intelligent Analysis of Information. ICDSIAI 2018. Advances in Intelligent Systems and Computing. Vol. 836. Cham: Springer, 2019. P. 50–58. doi.org/10.1007/978-3-319-97885-7_6.
- 31. Mokhtari A., Ozdaglar A., Pattathil S. A unified analysis of extra-gradient and optimistic gradient methods for saddle point problems: proximal point approach. arXiv preprint arXiv:1901.08511. 2019.
- 32. Semenov V.V., Denisov S.V., Kravets A.V. Adaptive two-stage Bregman method for variational inequalities. Cybernetics and Systems Analysis. 2021. Vol. 57, N 6. P. 959–967. doi.org/10.1007/s10559-021-00421-2.
- 33. Vedel Y.I., Denisov S.V., Semenov V.V. An adaptive algorithm for the variational inequality over the set of solutions of the equilibrium problem. Cybernetics and Systems Analysis. 2021. Vol. 57, N 1. P. 91–100. doi.org/10.1007/s10559-021-00332-2.
- 34. Vedel Y.I., Sandrakov G.V., Semenov V.V. An adaptive two-stage proximal algorithm for equilibrium problems in Hadamard spaces. Cybernetics and Systems Analysis. 2020. Vol. 56, N 6. P. 978–989. doi.org/10.1007/s10559-020-00318-6.
- 35. Ramirez J., Sukumaran R., Bertrand Q., Gidel G. Omega: Optimistic EMA gradients. arXiv preprint arXiv:2306.07905. 2023.
- 36. Khanh P.D., Vuong P.T. Modified projection method for strongly pseudomonotone variational inequalities. J. Glob. Optim. 2014. Vol. 58. P. 341–350.