In optimization, line search is a basic iterative approach to find a local minimum x* of an objective function f. The line-search method first finds a descent direction p_k along which the objective function will be reduced, and then computes a step size that determines how far the current iterate x_k should move along that direction. Choosing the step size amounts to minimizing the one-dimensional function h(t) = f(x_k + t·p_k) over t > 0. This subproblem can be solved exactly, or loosely, by asking for a sufficient decrease in h that does not necessarily approximate the optimum.

First-order methods assume that f is continuously differentiable, and that we can evaluate not only f but also its derivative.[1]: sec.5

Curve-fitting methods try to attain superlinear convergence by assuming that f has some analytic form, e.g. a polynomial of finite degree. At each iteration, there is a set of "working points" at which we know the value of f (and possibly also its derivative). Based on these points, we can compute a polynomial that fits the known values and find its minimum analytically; the minimum point becomes a new working point, and we proceed to the next iteration.[1]: sec.5 Curve-fitting methods achieve superlinear convergence when started close enough to the local minimum, but might diverge otherwise.

Like other optimization methods, line search may be combined with simulated annealing to allow it to jump over some local minima.
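As a concrete illustration of a first-order method, here is a minimal sketch of bisection on the derivative, assuming the minimum is bracketed in [a, b] so that f′(a) < 0 < f′(b); the function name and tolerance are illustrative choices, not fixed by the source:

```python
def bisection_line_search(df, a, b, tol=1e-8):
    """Find a stationary point of f on [a, b] using only its derivative df.
    Assumes df(a) < 0 < df(b), i.e. the bracket contains the minimum.
    Each step halves the bracket, so convergence is linear."""
    while b - a > tol:
        mid = 0.5 * (a + b)
        if df(mid) > 0:
            b = mid  # the minimum lies to the left of mid
        else:
            a = mid  # the minimum lies to the right of mid
    return 0.5 * (a + b)

# Example: f(t) = (t - 2)^2 has derivative 2(t - 2) and minimum at t = 2.
t_star = bisection_line_search(lambda t: 2.0 * (t - 2.0), 0.0, 5.0)
```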
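A minimal instance of the curve-fitting idea is successive parabolic interpolation: fit a parabola through the three current working points and take its vertex as the new working point. The sketch below is illustrative only; it keeps no safeguarding bracket and abandons degenerate fits, so it can diverge from a poor start, exactly as noted above:

```python
def parabolic_line_search(f, x1, x2, x3, iters=30, tol=1e-10):
    """Successive parabolic interpolation: repeatedly fit a parabola
    through three working points and move to its vertex.  Superlinear
    near a minimum, but unsafeguarded, so it may diverge otherwise."""
    pts = sorted([(x, f(x)) for x in (x1, x2, x3)])
    for _ in range(iters):
        (a, fa), (b, fb), (c, fc) = pts
        num = (b - a) ** 2 * (fb - fc) - (b - c) ** 2 * (fb - fa)
        den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
        if den == 0:
            break                       # degenerate fit: points collinear
        x_new = b - 0.5 * num / den     # vertex of the fitted parabola
        if min(abs(x_new - x) for x, _ in pts) < tol:
            break                       # converged: vertex repeats a point
        pts.append((x_new, f(x_new)))   # the vertex is a new working point
        pts.sort(key=lambda p: p[1])    # keep the three lowest values...
        pts = sorted(pts[:3])           # ...reordered by position
    return min(pts, key=lambda p: p[1])[0]

# Example: the minimum of f(t) = (t - 2)^2 is recovered in one fit,
# since a parabola fits a quadratic exactly.
t_star = parabolic_line_search(lambda t: (t - 2.0) ** 2, 0.0, 1.0, 4.0)
```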
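One standard way to ask only for a sufficient decrease, rather than the exact minimizer of h, is the Armijo backtracking rule: start from a full step and shrink it geometrically until f(x + t·p) ≤ f(x) + c·t·∇f(x)ᵀp. A minimal sketch follows; the constants c and rho are conventional defaults, not prescribed by the source:

```python
import numpy as np

def backtracking_line_search(f, grad, x, p, t0=1.0, c=1e-4, rho=0.5):
    """Inexact line search: shrink the step until the Armijo
    sufficient-decrease condition holds.  Makes no attempt to find
    the exact minimizer of h(t) = f(x + t*p)."""
    fx = f(x)
    slope = np.dot(grad(x), p)          # h'(0); negative for a descent direction
    t = t0
    while f(x + t * p) > fx + c * t * slope:
        t *= rho                        # backtrack: shrink the step size
    return t

# One gradient-descent step on f(x, y) = x^2 + 3y^2:
f = lambda v: v[0] ** 2 + 3.0 * v[1] ** 2
grad = lambda v: np.array([2.0 * v[0], 6.0 * v[1]])
x = np.array([1.0, 1.0])
p = -grad(x)                            # steepest-descent direction
x = x + backtracking_line_search(f, grad, x, p) * p   # moves toward the minimum
```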