
Random objective functions

For the tests, we will use the following objective function:

$\displaystyle f(x)= \sum_{i=1}^n \bigg[ a_i - \sum_{j=1}^n ( S_{ij} \sin x_j + C_{ij} \cos x_j ) \bigg]^2, \qquad x \in \Re^n$ (7.1)

The way of generating the parameters of $ f(x)$ is taken from [RP63] and is as follows. The elements of the $ n \times n$ matrices $ S$ and $ C$ are random integers from the interval $ [-100,\; 100]$, and a vector $ x^*$ is chosen whose components are random numbers from $ [-\pi,\; \pi]$. The parameters $ a_i,\; i=1, \ldots, n$ are then defined by the equation $ f(x^*)=0$, and the starting vector $ x_{start}$ is formed by adding random perturbations from $ [-0.1 \pi,\; 0.1 \pi]$ to the components of $ x^*$. All random numbers are drawn from uniform distributions. A remark on this objective function: by varying the dimension $ n$, it covers every kind of problem, from the easiest to the most difficult.
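The generation procedure above can be sketched as follows. This is a minimal Python illustration, not CONDOR's actual test harness; the function name `make_test_problem` and the use of a seeded NumPy generator are our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility (our assumption)

def make_test_problem(n):
    """Build one instance of the random trigonometric test function (7.1).

    S, C    : n x n matrices of random integers in [-100, 100]
    x_star  : minimizer, components uniform in [-pi, pi]
    a       : chosen so that f(x_star) = 0
    x_start : x_star plus uniform perturbations in [-0.1*pi, 0.1*pi]
    """
    S = rng.integers(-100, 101, size=(n, n))
    C = rng.integers(-100, 101, size=(n, n))
    x_star = rng.uniform(-np.pi, np.pi, size=n)
    # Defining a_i as the i-th component of S sin(x*) + C cos(x*) forces f(x*) = 0.
    a = S @ np.sin(x_star) + C @ np.cos(x_star)
    x_start = x_star + rng.uniform(-0.1 * np.pi, 0.1 * np.pi, size=n)

    def f(x):
        r = a - (S @ np.sin(x) + C @ np.cos(x))
        return float(r @ r)  # sum of squared residuals, Eq. (7.1)

    return f, x_star, x_start

f, x_star, x_start = make_test_problem(5)
print(f(x_star))   # zero up to rounding error, by construction
print(f(x_start))  # strictly positive: the perturbed start is not the minimizer
```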
We will compare the CONDOR algorithm with an older algorithm, CFSQP, which uses line-search techniques. In CFSQP, the Hessian matrix of the function is reconstructed using a BFGS update, and the gradient is obtained by finite differences.
Parameters of CONDOR: $ \rho_{start}=0.1$, $ \rho_{end}=10^{-8}$.
Parameters of CFSQP: $ \epsilon=10^{-10}$. The algorithm stops when the step size is smaller than $ \epsilon$.
Recalling that $ f(x^*)=0$, we say that a run is a success when the value of the objective function at the final point of the optimization algorithm is lower than $ 10^{-9}$.
After 100 runs of both algorithms, we obtain the following results:

Dimension $ n$   Mean number of           Number of      Mean best value of
of the space     function evaluations     successes      the objective function
                 CONDOR     CFSQP         CONDOR  CFSQP  CONDOR        CFSQP
 3               44.96      246.19        100     46     3.060873e-17  5.787425e-11
 5               99.17      443.66        99      27     5.193561e-16  8.383238e-11
10               411.17     991.43        100     14     1.686634e-15  1.299753e-10
20               1486.10    --            100     --     3.379322e-16  --


We can now give an example of execution of the algorithm to illustrate the discussion of Section 6.2:
Rosenbrock's function ($ n=2$)

Function evaluations   Best value so far             $ \rho_{old}$
 33                    $ 5.354072 \times 10^{-1}$    $ 10^{-1}$
 88                    $ 7.300849 \times 10^{-8}$    $ 10^{-2}$
 91                    $ 1.653480 \times 10^{-8}$    $ 10^{-3}$
 94                    $ 4.480416 \times 10^{-11}$   $ 10^{-4}$
 97                    $ 4.906224 \times 10^{-17}$   $ 10^{-5}$
100                    $ 7.647780 \times 10^{-21}$   $ 10^{-6}$
101                    $ 7.647780 \times 10^{-21}$   $ 10^{-7}$
103                    $ 2.415887 \times 10^{-30}$   $ 10^{-8}$


Rosenbrock's function is $ \displaystyle f(x) = 100\,(x_1-x_0^2)^2+(1-x_0)^2$.
We use the same choice of parameters (for $ \rho_{end}$ and $ \rho_{start}$) as before. The starting point is $ (-1.2 \; ; \; 1.0)$.
As you can see, the number of evaluations performed each time $ \rho $ is reduced is much smaller than $ (n+1)(n+2)/2=6$.
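As a sanity check, the Rosenbrock function above can be written down directly. This is a minimal Python sketch using the indexing $ (x_0, x_1)$ of the text; the function name is our own.

```python
def rosenbrock(x0, x1):
    """Rosenbrock's function as given in the text: 100(x1 - x0^2)^2 + (1 - x0)^2."""
    return 100.0 * (x1 - x0 ** 2) ** 2 + (1.0 - x0) ** 2

# The global minimum is at (1, 1), where the function value is exactly 0.
print(rosenbrock(1.0, 1.0))
# Value at the classical starting point (-1.2, 1.0) used in the text (approx. 24.2):
print(rosenbrock(-1.2, 1.0))
```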

Frank Vanden Berghen 2004-04-19