Dud (Doesn't Use Derivatives)

Dud is a derivative-free optimization algorithm: it does not require any derivative of the function being evaluated. It can be seen as a Gauss-Newton-type method, in the sense that it transforms the nonlinear least squares problem into the well-known linear least squares problem. The difference is that instead of approximating the nonlinear function by its tangent, Dud linearizes it with an affine function. For N calibration parameters, Dud requires (N+1) sets of parameter estimates. The affine function used for the linearization is constructed through all (N+1) guesses and reproduces the function values exactly at each of these (N+1) points. The resulting linear least squares problem is then solved along the affine function to obtain a new estimate whose cost is smaller than the costs of all previous estimates. If this does not produce a better estimate, Dud performs additional steps, such as searching in the opposite direction and/or decreasing the step size, until a better estimate is found. The estimate with the largest cost is then replaced by the new one, and the procedure is repeated for the new set of (N+1) estimates. The procedure stops when one of the stopping criteria is fulfilled.
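
The procedure can be sketched as follows in Python. This is a minimal illustration of the idea described above, not the implementation used here; the function name dud, its arguments, the fixed number of line-search retries, and the simple convergence test are all choices made for this sketch.

    import numpy as np

    def dud(f, y, initial_params, max_iter=50, tol=1e-8):
        """Minimal Dud sketch: minimize ||y - f(p)||^2 without derivatives.

        f              -- model function mapping a parameter vector (length N)
                          to a prediction vector with the same length as y
        y              -- observation vector
        initial_params -- (N+1) parameter vectors used as initial estimates
        """
        params = [np.asarray(p, dtype=float) for p in initial_params]
        preds = [f(p) for p in params]
        costs = [np.sum((y - g) ** 2) for g in preds]

        for _ in range(max_iter):
            # Order the estimates so the last one is the best (smallest cost).
            order = np.argsort(costs)[::-1]
            params = [params[i] for i in order]
            preds = [preds[i] for i in order]
            costs = [costs[i] for i in order]
            p_best, g_best = params[-1], preds[-1]

            # Affine model through the (N+1) points:
            #   f(p_best + dP @ alpha) ~ g_best + dG @ alpha
            dP = np.column_stack([p - p_best for p in params[:-1]])  # N x N
            dG = np.column_stack([g - g_best for g in preds[:-1]])   # M x N

            # Solve the linear least squares problem along the affine model.
            alpha, *_ = np.linalg.lstsq(dG, y - g_best, rcond=None)

            # Line search: flip the direction and shrink the step until the
            # new estimate beats the current worst one.
            step = 1.0
            for _ in range(10):
                p_new = p_best + dP @ (step * alpha)
                g_new = f(p_new)
                c_new = np.sum((y - g_new) ** 2)
                if c_new < costs[0]:          # better than the worst estimate
                    break
                step = -step / 2.0            # opposite direction, smaller step
            else:
                return p_best                  # no improvement found

            # Replace the estimate with the largest cost by the new one.
            params[0], preds[0], costs[0] = p_new, g_new, c_new

            # Simple stopping criterion: costs have (nearly) converged.
            if abs(costs[0] - costs[-1]) < tol:
                break

        return params[int(np.argmin(costs))]

In this sketch the affine linearization is held implicitly in the matrices dP and dG, whose columns are the parameter and prediction differences with respect to the current best estimate; solving the linear least squares problem for alpha is the Gauss-Newton-like step described above.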