Proximal extrapolated gradient methods for variational inequalities

Yu Malitsky*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


This paper concerns novel first-order methods for monotone variational inequalities. The methods use a very simple linesearch procedure that takes into account local information about the operator. They do not require Lipschitz continuity of the operator, and the linesearch uses only operator values. Moreover, when the operator is affine, the linesearch becomes particularly simple: it requires only vector–vector operations. For all the proposed methods we establish an ergodic convergence rate. In addition, we modify one of the methods for the case of composite minimization. Preliminary numerical results are quite promising.
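To illustrate the kind of scheme the abstract describes, here is a minimal sketch of a projected extragradient method whose backtracking linesearch evaluates only the operator (no Lipschitz constant is needed), applied to an affine monotone operator F(x) = Mx + q over a box. This is an illustrative sketch in the spirit of the abstract, not the paper's exact algorithm; the operator, feasible set, parameters, and stopping rule are assumptions for the demo.

```python
import numpy as np

def proj_box(x, lo, hi):
    """Projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def extragradient_linesearch(F, x0, lo, hi, lam=1.0, beta=0.7, tau=0.9,
                             iters=500):
    """Projected extragradient with a backtracking linesearch.

    The linesearch accepts the step size lam once
        lam * ||F(y) - F(x)|| <= tau * ||y - x||,
    a condition that uses only operator values and so does not
    require knowing a Lipschitz constant of F.
    """
    x = x0.astype(float).copy()
    for _ in range(iters):
        Fx = F(x)
        # backtrack: shrink lam until the local condition holds
        while True:
            y = proj_box(x - lam * Fx, lo, hi)
            if lam * np.linalg.norm(F(y) - Fx) <= tau * np.linalg.norm(y - x):
                break
            lam *= beta
        # extragradient update using the operator at the trial point
        x = proj_box(x - lam * F(y), lo, hi)
    return x

# demo: affine monotone operator F(x) = M x + q with M positive definite
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])
F = lambda x: M @ x + q

x_star = extragradient_linesearch(F, np.zeros(2), lo=-5.0, hi=5.0)
```

For this demo the solution of the VI is the interior point solving Mx + q = 0, i.e. x = (1/3, 1/3); the linesearch settles on a step size satisfying lam·||M|| ≤ tau without ever being told ||M||.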

Original language: English
Pages (from-to): 140-164
Number of pages: 25
Journal: Optimization Methods and Software
Issue number: 1
Publication status: Published - 2 Jan 2018


Keywords

  • convex optimization
  • ergodic convergence
  • linesearch
  • monotone operator
  • nonmonotone stepsizes
  • proximal methods
  • variational inequality

ASJC Scopus subject areas

  • Software
  • Control and Optimization
  • Applied Mathematics


