Given a function f(x) of a variable x tabulated at m values y_1 = f(x_1), ..., y_m = f(x_m), assume the function is of known analytic form depending on n parameters, f(x; λ_1, ..., λ_n), and consider the overdetermined set of m equations

    y_1 = f(x_1; λ_1, λ_2, ..., λ_n)
    ⋮
    y_m = f(x_m; λ_1, λ_2, ..., λ_n).

We wish to solve these equations to obtain the values λ_1, ..., λ_n which best satisfy this system. Pick an initial guess for the λ_i and then define the residuals

    dβ_i = y_i - f(x_i; λ_1, ..., λ_n).
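
As a concrete illustration of this setup, the following minimal sketch evaluates the residuals dβ_i = y_i - f(x_i; λ_1, ..., λ_n) at an initial parameter guess. The use of NumPy, the particular exponential-decay model f(x; A, k), and the synthetic tabulated data are assumptions made for the example only; any model and data of the form described above could be substituted.

    import numpy as np

    # Illustrative model f(x; lambda_1, ..., lambda_n); here n = 2 with
    # parameters lam = (A, k).  This specific form is an assumption for
    # the example, not prescribed by the derivation.
    def f(x, lam):
        A, k = lam
        return A * np.exp(-k * x)

    # Tabulated values y_i = f(x_i) at m points (synthetic data with a
    # little noise, for illustration only).
    x = np.linspace(0.0, 4.0, 20)            # x_1, ..., x_m
    rng = np.random.default_rng(0)
    y = f(x, (2.0, 0.8)) + 0.05 * rng.standard_normal(x.size)

    # Initial guess for the parameters lambda_i.
    lam0 = (1.0, 1.0)

    # Residuals dbeta_i = y_i - f(x_i; lambda_1, ..., lambda_n).
    dbeta = y - f(x, lam0)
    print(dbeta)

These residuals are the quantities that the subsequent fitting steps attempt to drive toward zero by adjusting the parameters λ_i away from the initial guess.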