Line search

In optimization, line search is a basic iterative approach to find a local minimum of an objective function f. The line-search method first finds a descent direction along which the objective function will be reduced, and then computes a step size that determines how far to move along that direction. The step-size problem amounts to minimizing the one-dimensional function h obtained by restricting f to the chosen direction; it can be solved exactly, or loosely, by asking only for a sufficient decrease in h that does not necessarily approximate the optimum.

First-order methods assume that f is continuously differentiable, and that we can evaluate not only f but also its derivative.[1]: sec.5

Curve-fitting methods try to attain superlinear convergence by assuming that f has some analytic form, e.g. a polynomial of finite degree. At each iteration, there is a set of "working points" at which we know the value of f (and possibly also its derivative). Based on these points, we can compute a polynomial that fits the known values and find its minimum analytically. The minimum point becomes a new working point, and we proceed to the next iteration.[1]: sec.5 Curve-fitting methods have superlinear convergence when started close enough to the local minimum, but might diverge otherwise.

Like other optimization methods, line search may be combined with simulated annealing to allow it to jump over some local minima.

The following sketches illustrate, in turn, an inexact (sufficient-decrease) step-size rule, a first-order method, and a curve-fitting method.
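A widely used inexact rule is backtracking with an Armijo-type sufficient-decrease condition: start from a trial step and shrink it until the objective has decreased by at least a fixed fraction of the decrease predicted by the first-order model. The Python sketch below illustrates the idea; the function names, the quadratic test objective, and the parameter values are illustrative choices, not part of the article.

    import numpy as np

    def backtracking_line_search(f, grad_f, x, direction, alpha0=1.0, c=1e-4, tau=0.5):
        # Shrink the trial step until f decreases by at least the fraction c
        # of the decrease predicted by the linear (first-order) model.
        alpha = alpha0
        fx = f(x)
        slope = np.dot(grad_f(x), direction)   # directional derivative; negative for a descent direction
        while f(x + alpha * direction) > fx + c * alpha * slope:
            alpha *= tau
        return alpha

    # Illustrative use: one steepest-descent step on f(x) = x1^2 + 10*x2^2.
    f = lambda v: v[0] ** 2 + 10 * v[1] ** 2
    grad_f = lambda v: np.array([2 * v[0], 20 * v[1]])

    x = np.array([1.0, 1.0])
    d = -grad_f(x)                             # descent direction from gradient descent
    alpha = backtracking_line_search(f, grad_f, x, d)
    x_next = x + alpha * d                     # the accepted step

Because each rejected trial merely multiplies the step by tau, the loop terminates after finitely many reductions whenever the chosen direction is a genuine descent direction.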
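Under the first-order assumption, one standard approach applies the bisection method to the derivative of the one-dimensional function h along the search direction: if the derivative changes sign over a bracketing interval, halving the interval at each step converges linearly to a stationary point. A minimal sketch, assuming a bracket [lo, hi] with h'(lo) < 0 < h'(hi) is available (the names and tolerance are illustrative):

    def bisection_line_search(dh, lo, hi, tol=1e-8):
        # Assumes dh(lo) < 0 < dh(hi), so [lo, hi] brackets a minimum of h.
        # Each iteration halves the bracket, giving linear convergence.
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if dh(mid) > 0:        # minimum lies in the left half
                hi = mid
            else:                  # minimum lies in the right half
                lo = mid
        return 0.5 * (lo + hi)

    # Illustrative use: h(a) = (a - 3)^2 has derivative 2*(a - 3) and minimum at a = 3.
    step = bisection_line_search(lambda a: 2 * (a - 3), 0.0, 10.0)   # returns ≈ 3.0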
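A simple curve-fitting scheme of this kind is successive parabolic interpolation: fit a quadratic through three working points, take the abscissa of its vertex as the new working point, and repeat. The sketch below uses a cosine test function purely for illustration; robust implementations (e.g. Brent's method) add safeguards against degenerate fits.

    import math

    def parabolic_step(h, a, b, c):
        # Fit a parabola through (a, h(a)), (b, h(b)), (c, h(c)) and return the
        # abscissa of its vertex, i.e. the analytic minimum of the fitted quadratic.
        ha, hb, hc = h(a), h(b), h(c)
        num = (b - a) ** 2 * (hb - hc) - (b - c) ** 2 * (hb - ha)
        den = (b - a) * (hb - hc) - (b - c) * (hb - ha)
        return b - 0.5 * num / den

    # Illustrative use: minimize h(x) = cos(x) near its minimum at pi, starting
    # from three working points that bracket it.
    h = math.cos
    a, b, c = 2.0, 2.5, 4.0
    for _ in range(20):
        x_new = parabolic_step(h, a, b, c)
        if abs(x_new - c) < 1e-8:      # working points have effectively converged
            break
        a, b, c = b, c, x_new          # the fitted minimum becomes a new working point
    # x_new ≈ math.pi ≈ 3.14159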