
Formal description

This thesis is about the optimization of non-linear continuous functions subject to box, linear and non-linear constraints. We want to find $x^* \in \Re^n$ which satisfies:

$$
{\cal F}(x^*) = \min_x {\cal F}(x) \;\; \textrm{subject to: }
\begin{cases}
b_l \leq x \leq b_u, & b_l, b_u \in \Re^n \\
A x \geq b, & A \in \Re^{m \times n}, \; b \in \Re^m \\
c_i(x) \geq 0, & i = 1, \ldots, l
\end{cases}
\qquad (1.1)
$$

${\cal F}(x): \Re^n \rightarrow \Re$ is called the objective function.
$b_l$ and $b_u$ are the box constraints.
$A x \geq b$ are the linear constraints.
$c_i(x)$ are the non-linear constraints.
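
As a purely illustrative instance of (1.1) (this particular problem is not taken from the thesis), consider minimizing the classical Rosenbrock banana function in dimension $n=2$ with box, linear and non-linear constraints:

$$
\min_{x \in \Re^2} \; 100 (x_2 - x_1^2)^2 + (1 - x_1)^2 \;\; \textrm{subject to: }
\begin{cases}
-2 \leq x_i \leq 2, & i = 1, 2 \\
x_1 + x_2 \geq 0 \\
1 - x_1^2 - x_2^2 \geq 0
\end{cases}
$$

Here $A = (1 \;\; 1)$, $b = 0$ and $c_1(x) = 1 - x_1^2 - x_2^2$.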
The following notation will be used:
$g_i = \frac{\partial {\cal F}}{\partial x_i}$: $g$ is the gradient of ${\cal F}$.
$H_{i,j} = \frac{\partial^2 {\cal F}}{\partial x_i \partial x_j}$: $H$ is the Hessian matrix of ${\cal F}$.

The choice of the algorithm used to solve an optimization problem mainly depends on the characteristics of the problem. If there is a lot of noise on ${\cal F}$, if a global minimum is needed, or if some of the variables are discrete, we will use an evolutionary algorithm (such as a genetic algorithm). These kinds of algorithms can usually handle only box constraints.

In the rest of the thesis, we will in particular assume that one evaluation of the objective function is very expensive (in terms of computing time). An optimization (minimization) algorithm is nearly always based on this simple principle (a small illustrative sketch of this loop is given just after the list):
  1. Build an approximation (also called ``local model'') of the objective function around the current point.
  2. Find the minimum of this model and move the current point to this minimum. Go back to step 1.
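
To make this principle concrete, here is a minimal one-dimensional sketch in C++ (it is not taken from the thesis, and the objective $f(x) = e^x - 2x$ is only a hypothetical example): the quadratic local model is built from finite differences and its minimum gives the next point.

#include <cmath>
#include <cstdio>

// Hypothetical objective: f(x) = exp(x) - 2x, whose minimum is at x = ln(2).
static double f(double x) { return std::exp(x) - 2.0 * x; }

int main()
{
    double x = 0.0;              // current point
    const double h = 1e-4;       // finite-difference step
    for (int k = 0; k < 20; k++) {
        // Step 1: build the local model m(x+d) = f(x) + g*d + 0.5*H*d^2,
        // estimating g and H by central finite differences.
        double g = (f(x + h) - f(x - h)) / (2.0 * h);
        double H = (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h);
        if (std::fabs(g) < 1e-10 || H <= 0.0) break;  // flat or non-convex model: stop
        // Step 2: the minimum of the model is at d = -g/H; move the current point there.
        x -= g / H;
        std::printf("iteration %2d: x = %.10f   f(x) = %.10f\n", k, x, f(x));
    }
    return 0;
}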
Like most optimization algorithms, CONDOR uses a polynomial of degree two as local model. There are several ways of building this quadratic. CONDOR uses a multivariate Lagrange interpolation technique to build its model. This technique is particularly well-suited when the dimension of the search space is low. When there is no noise on the objective function, we can use another, cheaper technique called the ``BFGS update'' to construct the quadratic. It allows us to build local models at very low CPU cost (it is very fast).
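
For reference (this is the standard quasi-Newton formula, not something specific to CONDOR), the BFGS update of the current Hessian approximation $B_k$ after a step $s_k = x_{k+1} - x_k$ producing a gradient change $y_k = g_{k+1} - g_k$ is:

$$
B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k}
$$

Only the gradients at two successive points are needed, which is why the quadratic can be updated at negligible CPU cost; on the other hand, the update degrades quickly when the gradients are noisy, which is consistent with the remark above that the ``BFGS update'' should only be used when there is no noise on the objective function.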

We have made the assumption that an evaluation of the objective function is very expensive (in terms of computing time). If this is not the case, we must construct a local model in a very short time. Indeed, there is no point in building a perfect local model, using many computer resources, just to carefully choose the direction to explore. It is better to use an approximate, cheap-to-obtain search direction. This leads to a few more function evaluations but, since they are cheap, the overall process is faster. An example of such an algorithm is ``Rosenbrock's method''. If the objective function is very cheap to evaluate, it is a good choice. You will find in the annexe (section 13.9) a personal implementation in C++ of this method. Algorithms based on the ``BFGS update'' are also able to construct a good model in a very short time. Even so, this time can become non-negligible when the dimension of the search space is high (greater than 1000). For higher dimensions, the choice of the algorithm is less clear but, if an approximation of the Hessian matrix $H$ of the objective function is directly available, a good choice is a conjugate-gradient/Lanczos method.
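
The full C++ implementation of Rosenbrock's method is given in the annexe; the following is only a rough, simplified sketch of its trial-step loop (the real method also periodically re-orthogonalizes its search directions with a Gram-Schmidt procedure, which is omitted here, and the objective used below is just the classical Rosenbrock banana function as a cheap test case).

#include <cmath>
#include <cstdio>
#include <vector>

// Cheap test objective: the classical Rosenbrock banana function.
static double f(const std::vector<double> &x)
{
    return 100.0 * std::pow(x[1] - x[0] * x[0], 2) + std::pow(1.0 - x[0], 2);
}

int main()
{
    const int n = 2;
    std::vector<double> x = {-1.2, 1.0};         // starting point
    std::vector<double> step(n, 0.1);            // one step length per direction
    const double expand = 3.0, contract = -0.5;  // classical Rosenbrock factors
    double fx = f(x);
    for (int iter = 0; iter < 2000; iter++) {
        for (int i = 0; i < n; i++) {
            // Try a move along direction i (here simply the i-th coordinate axis).
            std::vector<double> trial = x;
            trial[i] += step[i];
            double ft = f(trial);
            if (ft <= fx) {          // success: keep the point, enlarge the step
                x = trial;
                fx = ft;
                step[i] *= expand;
            } else {                 // failure: shrink and reverse the step
                step[i] *= contract;
            }
        }
    }
    std::printf("f = %.6e at (%.4f, %.4f)\n", fx, x[0], x[1]);
    return 0;
}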

Currently, most of the research on optimization algorithms is oriented towards huge-dimensional search spaces. In these algorithms, approximate search directions are constructed. CONDOR is one of the very few algorithms which adopt the opposite point of view: CONDOR builds the most precise local model possible of the objective function and tries to reduce at all costs the number of function evaluations.

One of the goals of this thesis is to give a detailed explanation of the CONDOR algorithm. The thesis is structured as follows:
