Show simple item record
A note on hybridization process applied on transformed double step size model
dc.contributor.author | Petrović, Milena | |
dc.contributor.author | Rakočević, Vladimir | |
dc.contributor.author | Valjarević, Dragana | |
dc.contributor.author | Ilić, Dejan | |
dc.date.accessioned | 2023-04-24T10:59:59Z | |
dc.date.available | 2023-04-24T10:59:59Z | |
dc.date.issued | 2019-12-03 | |
dc.identifier.citation | Projekat br.174025, Ministarstvo prosvete, nauke i tehnološkog razvoja Republike Srbije | en_US |
dc.identifier.citation | Projekat br.174024, Ministarstvo prosvete, nauke i tehnološkog razvoja Republike Srbije | en_US |
dc.identifier.uri | https://platon.pr.ac.rs/handle/123456789/1259 | |
dc.description.abstract | We introduce a hybrid gradient model for solving unconstrained optimization problems, based on a specific accelerated gradient iteration. By applying a three-term hybridization relation to the transformed accelerated double step size model, we develop an efficient hybrid accelerated scheme. The iterative step size is determined by the backtracking line search technique, starting from an optimally calculated initial value for the proposed method. In the convergence analysis, we show that the proposed method is at least linearly convergent on the sets of uniformly convex functions and strictly convex quadratic functions. Numerical computations confirm a significant improvement over some relevant hybrid and accelerated gradient processes. More precisely, with respect to the number of iterations, CPU time, and the number of objective function evaluations, the proposed process outperforms the comparative schemes several times over. | en_US |
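The abstract mentions determining the step size via a backtracking line search. As a minimal illustrative sketch (not the paper's exact scheme), a standard backtracking line search with the Armijo sufficient-decrease condition can be written as follows; the function names, the parameters `t0`, `beta`, `sigma`, and the quadratic test function are all illustrative assumptions, not taken from the paper:

```python
import numpy as np

def backtracking(f, grad_f, x, d, t0=1.0, beta=0.5, sigma=1e-4):
    """Generic backtracking (Armijo) line search sketch.

    Shrinks the trial step t by factor beta until the sufficient-decrease
    condition f(x + t*d) <= f(x) + sigma * t * grad_f(x)^T d holds.
    Illustrative only; the paper uses its own optimally calculated t0.
    """
    t = t0
    fx = f(x)
    slope = grad_f(x) @ d  # directional derivative; negative for a descent direction
    while f(x + t * d) > fx + sigma * t * slope:
        t *= beta
    return t

# Usage on the convex quadratic f(x) = x^T x with the steepest descent direction
f = lambda x: x @ x
g = lambda x: 2 * x
x = np.array([1.0, -2.0])
d = -g(x)
t = backtracking(f, g, x, d)
```

For this quadratic example the search halves the unit step once and returns `t = 0.5`, which happens to land exactly on the minimizer; in general the accepted step only guarantees sufficient decrease, not optimality.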
dc.language.iso | en_US | en_US |
dc.publisher | Springer | en_US |
dc.title | A note on hybridization process applied on transformed double step size model | en_US |
dc.title.alternative | Numerical Algorithms | en_US |
dc.type | clanak-u-casopisu | en_US |
dc.description.version | publishedVersion | en_US |
dc.identifier.doi | https://doi.org/10.1007/s11075-019-00821-8 | |
dc.citation.volume | 85 | |
dc.citation.spage | 449 | |
dc.citation.epage | 465 | |
dc.subject.keywords | Unconstrained optimization | en_US |
dc.subject.keywords | Line search | en_US |
dc.subject.keywords | Gradient descent methods | en_US |
dc.subject.keywords | Newton method | en_US |
dc.subject.keywords | Convergence rate | en_US |
dc.type.mCategory | M21a | en_US |
dc.type.mCategory | closedAccess | en_US |