Golden-section search

The golden-section search is a technique for finding an extremum (minimum or maximum) of a function inside a specified interval. For a strictly unimodal function with an extremum inside the interval, it will find that extremum, while for an interval containing multiple extrema (possibly including the interval boundaries), it will converge to one of them. The method operates by successively narrowing the range of values on the specified interval, which makes it relatively slow, but very robust. The technique derives its name from the fact that the algorithm maintains the function values for four points whose three interval widths are in the ratio φ:1:φ, where φ is the golden ratio. These ratios are maintained for each iteration and are maximally efficient. The algorithm is the limit of Fibonacci search (also described below) as the number of function evaluations grows large.

Basic idea

Unlike finding a zero, where two function evaluations with opposite sign are sufficient to bracket a root, when searching for a minimum, three values are necessary. The golden-section search is an efficient way to progressively reduce the interval locating the minimum.

The diagram illustrates a single step in the technique for finding a minimum. The function has already been evaluated at the three points x1, x2 and x3; since f(x2) is smaller than both f(x1) and f(x3), it is clear that a minimum lies inside the interval from x1 to x3. The next step in the minimization process is to "probe" the function by evaluating it at a new value of x, namely x4, chosen inside the larger of the two sub-intervals. If that evaluation yields f(x4) = f4a, then a minimum lies between x1 and x4 and the new triplet of points is x1, x2, x4; if it yields f(x4) = f4b, then a minimum lies between x2 and x3 and the new triplet is x2, x4, x3. Thus, in either case, we can construct a new narrower search interval that is guaranteed to contain the function's minimum.

Probe point selection

Let a = x2 − x1, b = x3 − x2 and c = x4 − x2, so that the new search interval is either between x1 and x4 with a length of a + c, or between x2 and x3 with a length of b. The golden-section search requires that these intervals be equal. If they are not, a run of "bad luck" could lead to the wider interval being used many times, thus slowing down the rate of convergence. By maintaining the same proportion of spacing throughout the algorithm, we avoid a situation in which x2 is very close to x1 or x3, and guarantee that the interval width shrinks by the same constant proportion in each step.

If the new triplet of points is x1, x2 and x4, then we want c/a = a/b; however, if the new triplet is x2, x4 and x3, then we want c/(b − c) = a/b. Eliminating c from these two simultaneous equations yields (b/a)² − (b/a) = 1, or b/a = φ, where φ is the golden ratio: φ = (1 + √5)/2 = 1.618... The appearance of the golden ratio in the proportional spacing of the evaluation points is how this search algorithm gets its name.

Termination condition

Any number of termination conditions may be applied, depending upon the application. The interval ΔX = X4 − X1 is a measure of the absolute error in the estimation of the minimum X and may be used to terminate the algorithm. The value of ΔX is reduced by a factor of r = φ − 1 for each iteration, so the number of iterations to reach an absolute error of ΔX is about ln(ΔX/ΔX0) / ln(r), where ΔX0 is the initial value of ΔX.

Because smooth functions are flat (their first derivative is close to zero) near a minimum, attention must be paid not to expect too great an accuracy in locating the minimum. The termination condition provided in the book Numerical Recipes in C is based on testing the gaps among x1, x2, x3 and x4, terminating when |x3 − x1| < τ(|x2| + |x4|), where τ is a tolerance parameter of the algorithm and |x| is the absolute value of x. The check is based on the bracket size relative to its central value, because near the minimum the absolute error in f(x) grows only as the square of the absolute error in x. For that same reason, the Numerical Recipes text recommends that τ = √ε, where ε is the required absolute precision of f(x).

Algorithm

The examples here describe an algorithm for finding the minimum of a function; for a maximum, the comparison operators need to be reversed.
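As a concrete illustration, here is a minimal sketch of the minimization procedure in Python; the function name gss, the tolerance argument tol and the test function are illustrative choices, not part of any standard library. It keeps the three interval widths in the ratio c:cr:c described in the caption below, so only one new function evaluation is needed per iteration, and it terminates on the simple absolute-width condition ΔX ≤ tol.

    import math

    invphi = (math.sqrt(5) - 1) / 2     # r = phi - 1 = 0.618...

    def gss(f, a, b, tol=1e-8):
        """Golden-section search: return an interval of width <= tol that
        brackets a minimum of a unimodal function f on [a, b]."""
        xl = b - invphi * (b - a)       # left interior probe
        xr = a + invphi * (b - a)       # right interior probe
        fl, fr = f(xl), f(xr)
        while b - a > tol:
            if fl < fr:
                # Minimum is bracketed by [a, xr]; the old left probe
                # becomes the new right probe, so only one new function
                # evaluation (at the new left probe) is needed.
                b, xr, fr = xr, xl, fl
                xl = b - invphi * (b - a)
                fl = f(xl)
            else:
                # Mirror image: minimum is bracketed by [xl, b].
                a, xl, fl = xl, xr, fr
                xr = a + invphi * (b - a)
                fr = f(xr)
        return a, b

    # Example: f(x) = (x - 2)**2 on [1, 5] has its minimum at x = 2.
    lo, hi = gss(lambda x: (x - 2) ** 2, 1, 5)
    print((lo + hi) / 2)                # approximately 2.0

Since the interval shrinks by the constant factor r = φ − 1 ≈ 0.618 per iteration, reducing the initial width of 4 in this example to 10⁻⁸ takes about ln(10⁻⁸/4) / ln(0.618) ≈ 41 iterations, in line with the iteration-count formula above.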
Fibonacci search

A very similar algorithm can also be used to find the extremum (minimum or maximum) of a sequence of values that has a single local minimum or local maximum. In order to approximate the probe positions of golden-section search while probing only integer sequence indices, the variant of the algorithm for this case typically maintains a bracketing of the solution in which the length of the bracketed interval is a Fibonacci number; a sketch of this variant is given after the note on related algorithms below. Fibonacci search was first devised by Kiefer (1953) as a minimax search for the maximum (minimum) of a unimodal function in an interval (see also Avriel and Wilde (1966)).

Related algorithms

The Bisection method is a similar algorithm for finding a zero of a function.
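As promised above, here is a minimal sketch of the Fibonacci-search variant for sequences, again in Python; the function name fib_search_min, the convention of padding past the end of the sequence with +∞, and the tie-breaking rule are illustrative choices rather than a canonical implementation. The exclusive bracket (a, a + F[k]) always has a Fibonacci number as its length.

    def fib_search_min(seq):
        """Return the index of the minimum of a non-empty sequence that
        has a single local minimum, probing only integer indices."""
        n = len(seq)

        def val(i):
            # Indices past the end behave like +infinity, so the initial
            # bracket can be padded up to the next Fibonacci length.
            return seq[i] if i < n else float("inf")

        F = [1, 2]                  # Fibonacci bracket lengths: 1, 2, 3, 5, 8, ...
        while F[-1] < n + 1:
            F.append(F[-1] + F[-2])
        k = len(F) - 1

        a = -1                      # bracket is (a, a + F[k]), bounds exclusive
        while k > 1:
            p1 = a + F[k - 2]       # the two probes split the bracket into
            p2 = a + F[k - 1]       # pieces whose lengths are Fibonacci numbers
            if val(p1) > val(p2):   # minimum lies in (p1, a + F[k])
                a = p1
            # otherwise the minimum lies in (a, p2) and the left edge stays;
            # either way the new bracket length is F[k - 1]
            k -= 1
        return a + 1                # exactly one interior index remains

    # Example: the single local minimum of this sequence is at index 3.
    print(fib_search_min([9, 7, 5, 3, 4]))  # prints 3

For brevity this sketch re-evaluates both probes on every iteration; the point of the Fibonacci spacing is that one of the two probes always coincides with a probe from the previous step, so a more careful implementation caches one function value per iteration, just as the golden-section sketch above does.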
Diagram of a golden-section search. The initial triplet of x values is {x1, x2, x3}. If f(x4) = f4a, the triplet {x1, x2, x4} is chosen for the next iteration. If f(x4) = f4b, the triplet {x2, x4, x3} is chosen.
Diagram of the golden-section search for a minimum. The initial interval enclosed by X1 and X4 is divided into three intervals, and f[X] is evaluated at each of the four Xi. The two intervals containing the minimum of f(Xi) are then selected, and a third interval and functional value are calculated, and the process is repeated until termination conditions are met. The three interval widths are always in the ratio c:cr:c, where r = φ − 1 = 0.618... and c = 1 − r = 0.382..., φ being the golden ratio. This choice of interval ratios is the only one that allows the ratios to be maintained during an iteration.
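The self-reproducing property claimed in this caption is easy to check numerically. The following few lines are purely illustrative (the names X1 through X4 are the hypothetical labels from the caption): they verify that after one step, the three widths of the surviving bracket are again in the ratio c:cr:c.

    # Verify that the c:cr:c spacing reproduces itself after one step.
    r = (5 ** 0.5 - 1) / 2                      # r = phi - 1 = 0.618...
    c = 1 - r                                   # c = 0.382...
    X1, X2, X3, X4 = 0.0, c, c + c * r, 1.0     # widths c : c*r : c

    # Suppose the minimum is bracketed by [X1, X3]; probe the wider
    # sub-interval at the golden-section point of the new bracket.
    Xnew = X1 + c * (X3 - X1)
    w = X3 - X1
    widths = (Xnew - X1, X2 - Xnew, X3 - X2)
    print(tuple(round(d / w, 3) for d in widths))   # (0.382, 0.236, 0.382), i.e. c : c*r : c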
See also

Ternary search
Brent's method
Binary search

References

Kiefer, J. (1953). "Sequential minimax search for a maximum". Proceedings of the American Mathematical Society. 4 (3): 502–506.
Avriel, Mordecai; Wilde, Douglass J. (1966). "Optimality proof of the symmetric Fibonacci search technique". Fibonacci Quarterly. 4 (3): 265–269.