class HCL_LineSearch_d : public HCL_Base

Abstract base class for line search routines

Inheritance:

HCL_Base

Public Methods

virtual Table& Parameters ()
Access to the parameter table
virtual HCL_EvaluateFunctionalGrad_d* Search ( HCL_FunctionalGrad_d & f, const HCL_Vector_d & xcur, HCL_Vector_d & xnext, HCL_Vector_d & dir, HCL_EvaluateFunctionalGrad_d * xinit_eval)
Line search algorithm
virtual void SetScaling ( const HCL_LinearOpAdj_d * S )
SetScaling allows the specification of a different inner product (and hence norm)
virtual void UnSetScaling ()
UnSetScaling causes the line search to use the default inner product

Public

Term codes

Private

Input parameters
double FcnDecreaseTol
Tolerance for decreasing function value (Armijo-Goldstein condition)
double SlopeDecreaseTol
Slope at approximate minimizer must be reduced by a factor of SlopeDecreaseTol (exact interpretation varies)
double MinStep
Minimum step length.
double MaxStep
Maximum step length.
int MaxSample
Maximum number of function evaluations to perform
double ZeroDivideTol
Tolerance for divide-by-zero
int DispFlag
DispFlag controls the amount of output sent to the screen during execution of the Search method
int DumpFlag
DumpFlag has exactly the same purpose as DispFlag, except the information is sent to a file, whose name is defined by DumpFile
char DumpFile [81]
DumpFile is the name of the file to which results are written (if DumpFlag is positive)
int DispPrecision
DispPrecision gives the number of digits in numbers sent to the screen.
int DumpPrecision
DumpPrecision gives the number of digits in numbers sent to the file.
Required output parameters
int TermCode
exit status (<0 = no reduction in function value, 0 = success, >0 = function value reduced but termination criteria not all satisfied)
int MaxTkn
flag indicating whether the step of maximum length (MaxStep) was taken (1 = taken, 0 = not taken)

Inherited from HCL_Base:

Public Methods

int Count()
void DecCount()
void IncCount()

Private Fields

int ReferenceCount

Documentation

Abstract base class for line search routines. The purpose of a line search algorithm is to locate an approximate minimizer of a function $\phi:[0,\infty) \rightarrow R$. Such problems arise in certain algorithms for unconstrained and constrained optimization, where $\phi$ is a 1D "slice" of the objective or merit function. That is, the original problem is to minimize some function $f:X\rightarrow R$, where $X$ is a Hilbert space, and $\phi$ is defined by $\phi(a) = f(x+ay)$, where $x$ and $y$ are vectors in $X$.
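
For illustration only (this worked special case is not part of the HCL interface), suppose $f$ is the quadratic $f(x) = \frac{1}{2}(x,Ax) - (b,x)$ with $A$ symmetric positive definite. Then the slice along $y$ is itself a quadratic in $a$,
$$
\phi(a) = f(x+ay) = f(x) + a(\nabla f(x),y) + \frac{1}{2}a^2(y,Ay),
$$
so $\phi'(0) = (\nabla f(x),y)$ and the exact minimizer is $a^* = -(\nabla f(x),y)/(y,Ay)$. For a general nonlinear $f$ no such closed form is available, and a line search instead seeks a step satisfying the approximate conditions described below.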

The main method in a HCL_LineSearch_d class is Search, which is invoked as follows:

       MyFcnl f;  // MyFcnl must be derived from HCL_FunctionalGrad_d
       MyVector x( "x0" );        // Assume MyVector has a constructor which
                                  // reads from a file
       MyVector y( "direction" );
       MyVector * xnext = new MyVector( x );  // Assume MyVector has a copy
                                              // constructor.

       HCL_LineSearch_DS_d lsearch( "lsearch.dat" );  // Dennis-Schnabel
                                                      // line search
       HCL_EvaluateFunctionalGrad_d * eval = f.EvaluateGrad( x );
       HCL_EvaluateFunctionalGrad_d * evalnext =
                                      lsearch.Search( f, x, *xnext, y, eval );
    
Note that it is necessary to evaluate the functional at the initial point $x$ and pass the resulting evaluation object to the Search method. The method attempts to locate a suitable point, which it returns in the vector pointed to by xnext; the evaluation object of $f$ at that point is the return value of the method.

Standard line search algorithms are usually based on one or two goals. Essentially all algorithms attempt to reduce the value of the function $\phi$ from $\phi(0)$, usually by satisfying a condition
$$
\phi(a) \le \phi(0) + tol\, a\, \phi'(0)
$$
(the assumption is made that $\phi'(0)$ is negative). The number $tol$ in the above inequality is a standard input parameter to a line search algorithm; in HCL this parameter is called FcnDecreaseTol. A second goal which is often sought is the reduction of the slope from $\phi'(0)$:
$$
|\phi'(a)| \le tol\, |\phi'(0)|.
$$
The tolerance in this condition is called SlopeDecreaseTol in HCL.
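
As an illustration only (these helper functions are hypothetical and not part of HCL), the two tests amount to simple comparisons on sampled values of $\phi$ and $\phi'$:

      #include <math.h>   // for fabs

      // Hypothetical helpers illustrating the two acceptance tests used by
      // typical line searches; phi0, dphi0, phia, dphia stand for phi(0),
      // phi'(0), phi(a) and phi'(a).
      bool SufficientDecrease( double phi0, double dphi0, double phia,
                               double a, double FcnDecreaseTol )
      {
         // Armijo-Goldstein condition: phi(a) <= phi(0) + tol * a * phi'(0)
         return phia <= phi0 + FcnDecreaseTol * a * dphi0;
      }

      bool SlopeReduced( double dphi0, double dphia, double SlopeDecreaseTol )
      {
         // curvature-type condition: |phi'(a)| <= tol * |phi'(0)|
         return fabs( dphia ) <= SlopeDecreaseTol * fabs( dphi0 );
      }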

Other important parameters appearing in line search algorithms are MaxSample (the maximum number of times that the function can be sampled before the algorithm gives up), and MinStep and MaxStep (the smallest and largest allowable steps). For a complete list of algorithmic parameters, see the Input parameters documentation for the base class, and for each concrete derived class (note that a specific algorithm may use a parameter not mentioned in the base class).

Algorithmic parameters can be accessed or modified through the Parameters method of the class, which returns a reference to the parameter table (see the Table class for more details about parameter tables):

      lsearch.Parameters().GetValue( "MinStep",m ); // Gets the value of
                                                    // the parameter MinStep
                                                    // and puts it in m.
      lsearch.Parameters().PutValue( "MinStep",2*m ); // Sets the value of
                                                      // the parameter MinStep
                                                      // to 2*m.
   
In addition, derived line search classes should have constructors which take a file name and read parameters from that file (using the Table file format). All input parameters have default values which are given in the (derived) class documentation.

All line searches are required to set two output parameters in the parameter table, TermCode and MaxTkn. The possible values of TermCode are given by the enum variable ErrorCodes. The documentation of ErrorCodes should be consulted for a complete list of codes, but the following convention is followed: 0 means successful completion, a positive value means that the algorithm succeeded in reducing the function value although the termination criteria were not all satisfied, and a negative value indicates that the search was unable to reduce the function value. MaxTkn is a boolean integer variable; if it is set, the step taken was of length MaxStep.
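
For example, after Search returns, a caller might inspect these outputs through the parameter table (a sketch only; it assumes GetValue also handles int-valued entries, and that <iostream.h> or <iostream> has been included):

      int termcode, maxtkn;
      lsearch.Parameters().GetValue( "TermCode", termcode );
      lsearch.Parameters().GetValue( "MaxTkn", maxtkn );
      if( termcode < 0 )
         cerr << "no reduction in the function value" << endl;
      else if( termcode > 0 )
         cerr << "function value reduced, but termination criteria "
              << "not all satisfied" << endl;
      if( maxtkn )
         cerr << "step of maximum length (MaxStep) was taken" << endl;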

Input parameters

double FcnDecreaseTol
Tolerance for decreasing function value (Armijo-Goldstein condition)

double SlopeDecreaseTol
Slope at approximate minimizer must be reduced by a factor of SlopeDecreaseTol (exact interpretation varies)

double MinStep
Minimum step length.

double MaxStep
Maximum step length.

int MaxSample
Maximum number of function evaluations to perform

double ZeroDivideTol
Tolerance for divide-by-zero

int DispFlag
DispFlag controls the amount of output sent to the screen during execution of the Search method. The usual choices are

   0 --- no output
   1 --- a summary at the end of execution
   2 --- a summary at each iteration

In addition, values of DispFlag greater than 2 may send an increasing amount of detail to the screen (this is intended primarily for development use, i.e. for debugging).

int DumpFlag
DumpFlag has exactly the same purpose as DispFlag, except the information is sent to a file, whose name is defined by DumpFile

char DumpFile[81]
DumpFile is the name of the file to which results are written (if DumpFlag is positive)

int DispPrecision
DispPrecision gives the number of digits in numbers sent to the screen.

int DumpPrecision
DumpPrecision gives the number of digits in numbers sent to the file.
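
These inputs can be adjusted through the parameter table before Search is called; for instance (a sketch, assuming PutValue accepts int-valued entries in addition to the double-valued usage shown earlier):

      lsearch.Parameters().PutValue( "MaxSample", 20 );   // allow at most 20
                                                          // function evaluations
      lsearch.Parameters().PutValue( "DispFlag", 1 );     // print a summary at
                                                          // the end of execution
      lsearch.Parameters().PutValue( "MinStep", 1.0e-8 ); // smallest allowable
                                                          // step length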

Required output parameters

int TermCode
exit status (<0 = no reduction in function value, 0 = success, >0 = function value reduced but termination criteria not all satisfied)

int MaxTkn
flag indicating whether the step of maximum length (MaxStep) was taken (1 = taken, 0 = not taken)

Term codes

virtual Table& Parameters()
Access to the parameter table

virtual void SetScaling( const HCL_LinearOpAdj_d * S )
SetScaling allows the specification of a different inner product (and hence norm). Instead of $(x,y)$, the scaled inner product $(x,S^{-1}y)$ will be used, where $S$ is a symmetric positive definite operator. Through this method, a line search can be made consistent with a minimization algorithm which scales the norm.
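
A minimal usage sketch (MyScalingOp is a hypothetical symmetric positive definite operator class derived from HCL_LinearOpAdj_d; it is not part of HCL):

      MyScalingOp S;             // hypothetical operator derived from
                                 // HCL_LinearOpAdj_d
      lsearch.SetScaling( &S );  // subsequent searches use the inner
                                 // product (x, S^{-1} y)
      // ... call lsearch.Search( ... ) as usual ...
      lsearch.UnSetScaling();    // revert to the default inner product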

virtual void UnSetScaling()
UnSetScaling causes the line search to use the default inner product

virtual HCL_EvaluateFunctionalGrad_d* Search( HCL_FunctionalGrad_d & f, const HCL_Vector_d & xcur, HCL_Vector_d & xnext, HCL_Vector_d & dir, HCL_EvaluateFunctionalGrad_d * xinit_eval)
Line search algorithm
Parameters:
f - functional to be minimized along the line
xcur - starting point
xnext - vector in which the computed new point is returned
dir - search direction
xinit_eval - evaluation object for the starting point


Direct child classes:
HCL_LineSearch_MT_d
HCL_LineSearch_Fl_d
HCL_LineSearch_DS_d
