MULTIPLE-GRADIENT DESCENT ALGORITHM FOR MULTIOBJECTIVE OPTIMIZATION
Abstract
The steepest-descent method is a well-known and effective single-objective descent algorithm when the gradient of the objective function is known. Here, we propose a particular generalization of this method to multi-objective optimization by considering the concurrent minimization of n smooth criteria {J_i} (i = 1, …, n). The novel algorithm is based on the following observation: consider a finite set of vectors {u_i} (u_i ∈ R^N, n ≤ N); in the convex hull of this family, there exists a unique element of minimal norm, say ω ∈ R^N; then, the scalar product of ω with any vector in the convex hull, and in particular with any u_i, is at least equal to ||ω||^2 ≥ 0. Applying this to the objective-function gradients (u_i = ∇J_i), we conclude that either ω = 0 and the current design point belongs to the Pareto set, or −ω is a descent direction common to all objective functions. We propose to construct a fixed-point iteration in which updates of the element ω are used as successive directions of search. This method converges to a point on the Pareto set. The result applies to both finite-dimensional and functional design spaces. Numerical illustrations have been provided in both cases, using either analytical objective functions or (discretized) functionals, in [9] and [5]. Here, following [6], a domain-decomposition framework is used to illustrate the necessity, in a (discretized) functional setting, of scaling the gradients appropriately.
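As an illustration of the construction described above, the following minimal sketch computes the minimal-norm element ω of the convex hull of the gradients for the two-objective case (n = 2), where a closed-form solution is available, and uses −ω as the common search direction in a fixed-point iteration. The quadratic test criteria J1, J2, the points a and b, the fixed step size rho, and the stopping tolerance are illustrative assumptions, not taken from the report.

    import numpy as np

    def min_norm_element(u, v):
        # Minimal-norm element of the convex hull {alpha*u + (1-alpha)*v : 0 <= alpha <= 1}.
        d = v - u
        denom = d @ d
        if denom == 0.0:
            return u.copy()
        alpha = np.clip((v @ d) / denom, 0.0, 1.0)
        return alpha * u + (1.0 - alpha) * v

    # Illustrative smooth criteria (assumption): J1(x) = ||x - a||^2, J2(x) = ||x - b||^2.
    a = np.array([1.0, 0.0])
    b = np.array([0.0, 1.0])
    grad_J1 = lambda x: 2.0 * (x - a)
    grad_J2 = lambda x: 2.0 * (x - b)

    x = np.array([2.0, 2.0])   # initial design point (assumption)
    rho = 0.1                  # fixed step size (assumption)
    for k in range(500):
        omega = min_norm_element(grad_J1(x), grad_J2(x))
        if np.linalg.norm(omega) < 1e-8:   # omega = 0: the point is Pareto-stationary
            break
        x = x - rho * omega                # -omega is a descent direction for both criteria
    print(x)  # approaches a point of the segment [a, b], the Pareto set of this toy problem

For n > 2 criteria, ω has no closed form in general and is obtained by minimizing ||Σ_i α_i u_i||^2 over the simplex (a small quadratic program); the fixed-point structure of the iteration is unchanged.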