They are abbreviations for three line search algorithms introduced in Al-Baali and Fletcher (1986), "An Efficient Line Search for Nonlinear Least Squares", J. Opt. Theo. Appl., 48, pp. 359-378. BSSearch stands for the Bracketing and Sectioning line Search algorithm, SSearch for the Sectioning line Search algorithm, and RSSearch for the Revised Sectioning line Search algorithm.

In theory, an accurate line search over the step length is not necessary in descent-direction algorithms (Newton-like methods such as DFP and BFGS, and conjugate-gradient methods such as FR and PR). However, there are necessary and sufficient conditions that guarantee the convergence of these methods; you can find them, for example, in Fletcher, "Practical Methods of Optimization". Please do not dig into the details of the "constants", and please do not change them. The key points to be aware of are the following:

1. Write down your objective function and its GRADIENT in explicit C++ form. (The golden section search needs no derivatives, but DFP, BFGS, FR, and PR are all gradient-based algorithms.)

2. The constant "sigma" controls the desired accuracy of the three line search algorithms: the smaller sigma is, the more accurate the line search. Keep in mind that accuracy in the line search phase is independent of accuracy in the solution; even a value of about 0.5 for sigma achieves the convergence criterion, just with more iterations. A more accurate line search usually means fewer gradient evaluations at the cost of more objective-function evaluations, which is overall a desirable trade-off.

3. Test every gradient-based algorithm with each of the four line search algorithms on your problem, measure the total number of calls to the objective function and to the gradient, and record how successful each combination is.