The general problem considered is

$$\min_{x \in \mathbb{R}^n} f(x) \tag{1.1}$$

subject to

$$b_l \le x \le b_u, \qquad A\,x \ge b, \qquad c_i(x) \ge 0, \quad i = 1, \ldots, l.$$

$f(x)$ is called the objective function.

$b_l$ and $b_u$ are the box-constraints.

$A\,x \ge b$ are the linear constraints.

$c_i(x) \ge 0, \; i = 1, \ldots, l,$ are the non-linear constraints.

The following notation will be used:

$g(x) = \nabla f(x)$ is the gradient of $f$.

$H(x) = \nabla^2 f(x)$ is the Hessian matrix of $f$.
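In a derivative-free setting these quantities are not given analytically and must be approximated when needed. As a minimal illustration (the helper name and step size below are my own, not part of CONDOR), the gradient can be estimated by central finite differences:

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Central finite-difference approximation of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

# Example: f(x) = x0^2 + 3*x1 has gradient (2*x0, 3); at (1, 2) this is (2, 3).
f = lambda x: x[0]**2 + 3 * x[1]
print(fd_gradient(f, [1.0, 2.0]))  # ≈ [2., 3.]
```

Note that each such gradient estimate costs $2n$ evaluations of $f$, which is exactly why expensive objective functions call for different techniques.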

The choice of the algorithm to solve an optimization problem mainly depends on:

- The dimension of the search space.
- Whether or not the derivatives of $f$ are available.
- The time needed for one evaluation of $f$ at a given point $x$.
- The necessity to find a global or a local minimum of $f$.
- The noise on the evaluations of $f$.
- Whether the objective function is smooth or not.
- Whether the search space is continuous (i.e., there are no discrete variables, such as a variable which can only take the values red, green or blue).
- The presence of (non-linear) constraints.

In the rest of the thesis, we will make the following assumptions:

- The objective function is smooth.
- The non-linear constraints are ``cheap'' to evaluate.
- We only want a local minimum.

The algorithms we will consider iterate the following two steps:

- Build an approximation (also called a ``local model'') of the objective function around the current point.
- Find the minimum of this model, move the current point to this minimum, and go back to step 1.
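The two steps above can be sketched as follows. This is only an illustrative quadratic-model loop (the function name, step size and tolerances are my own), not the CONDOR algorithm itself, which builds its model far more economically:

```python
import numpy as np

def local_model_loop(f, x0, h=1e-4, tol=1e-8, max_iter=100):
    """Iterate: build a quadratic local model around x, then move x to the
    minimum of that model. Illustrative sketch using finite differences."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    for _ in range(max_iter):
        # Step 1: build a local quadratic model (gradient g, Hessian H).
        g = np.zeros(n)
        H = np.zeros((n, n))
        for i in range(n):
            ei = np.zeros(n); ei[i] = h
            g[i] = (f(x + ei) - f(x - ei)) / (2 * h)
            for j in range(n):
                ej = np.zeros(n); ej[j] = h
                H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                           - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
        # Step 2: move to the minimum of the model (a Newton step).
        step = np.linalg.solve(H, -g)
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

# Minimizing f(x) = (x0 - 1)^2 + 2*(x1 + 0.5)^2 converges to (1, -0.5).
f = lambda x: (x[0] - 1)**2 + 2 * (x[1] + 0.5)**2
print(local_model_loop(f, [5.0, 5.0]))
```

Each model here costs $O(n^2)$ function evaluations, which is affordable only when $f$ is cheap; the whole point of CONDOR is to reach a comparable model quality with far fewer evaluations.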

We have made the assumption that an evaluation of the objective function is very expensive (in terms of computing time). If this is not the case, we must construct the local model in a very short time. Indeed, there is no point in spending many computer resources on building a perfect local model in order to carefully choose the direction to explore; it is better to use an approximate, cheap-to-obtain search direction. This leads to a few more function evaluations but, since they are cheap, the overall process is faster. An example of such an algorithm is Rosenbrock's method, which is a good choice when the objective function is very cheap to evaluate. You will find in the annex (section 13.9) a personal implementation of this method in C++. Algorithms based on the ``BFGS update'' are also able to construct a good model in a very short time, although this time can become non-negligible when the dimension of the search space is high (greater than 1000). For higher dimensions, the choice of the algorithm is not clear-cut but, if an approximation of the Hessian matrix of the objective function is directly available, a good choice is a conjugate-gradient/Lanczos method.
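The ``BFGS update'' mentioned above maintains a Hessian approximation cheaply from successive gradient differences rather than rebuilding it from scratch. A minimal sketch of the standard update formula (variable names are my own):

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of the Hessian approximation B, given the
    step s = x_{k+1} - x_k and the gradient change y = g_{k+1} - g_k."""
    Bs = B @ s
    return (B
            - np.outer(Bs, Bs) / (s @ Bs)   # remove old curvature along s
            + np.outer(y, y) / (y @ s))     # insert observed curvature

# For a quadratic with true Hessian H we have y = H s, and the update
# enforces the secant condition: B_new @ s equals y exactly.
H = np.array([[2.0, 0.0], [0.0, 4.0]])
B = np.eye(2)                # initial guess for the Hessian
s = np.array([1.0, 1.0])     # step taken
y = H @ s                    # observed gradient change
B_new = bfgs_update(B, s, y)
print(B_new @ s)  # equals y = [2., 4.]
```

The cost of one update is $O(n^2)$, which explains the remark above: negligible for small $n$, but no longer so when $n$ exceeds a thousand or so.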

Currently, most research on optimization algorithms is oriented towards huge-dimensional search spaces, where approximate search directions are constructed. CONDOR is one of the very few algorithms that adopt the opposite point of view: it builds the most precise local model of the objective function possible and tries to reduce the number of function evaluations at all costs.

One of the goals of this thesis is to give a detailed explanation of the CONDOR algorithm. The thesis is structured as follows:

**Unconstrained Optimization:** We will describe the algorithm in the case where there are no constraints. The parallelization of the algorithm will also be explained.

- **Chapter 2:** A basic description of the CONDOR algorithm.
- **Chapter 3:** How to construct the local model of the objective function? How to assess its validity?
- **Chapter 4:** How to compute the minimum of the local model?
- **Chapter 5:** When we want to check the validity (the precision) of our local model, we need to approximately solve an auxiliary optimization problem subject to a bound on the distance from the current point. How do we do it?
- **Chapter 6:** The precise description of the CONDOR algorithm.
- **Chapter 7:** Numerical results of the CONDOR algorithm on unconstrained problems.

**Constrained Optimization:**

- **Chapter 8:** We will make a short review of the algorithms available for constrained optimization and motivate the choice of our algorithm.
- **Chapter 9:** A detailed discussion of the chosen and implemented algorithm.
- **Chapter 10:** Numerical results for constrained problems.

**The METHOD project** (chapter 11): The goal of this project is to optimize the shape of the blades inside a centrifugal compressor (see illustration of the compressor's blades in Figure 11.1). This is a concrete, industrial example of the use of CONDOR.

**Conclusion** (chapter 12)

**Annexes** (chapter 13)

**Code** (chapter 14)