Multi-Objective Optimization
Many engineering and scientific problems involve optimizing multiple conflicting objectives simultaneously. Instead of a single optimal point, the solution is a Pareto front — the set of all non-dominated trade-off solutions.
A solution is Pareto-optimal if no other feasible solution improves one objective without worsening another: x* is Pareto-optimal when there is no feasible x such that fᵢ(x) ≤ fᵢ(x*) for every objective i and fⱼ(x) < fⱼ(x*) for at least one j.
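The dominance relation underlying this definition can be sketched as a small standalone helper (illustrative only; not part of the numra API; all objectives are minimized):

```rust
/// Returns true if `a` Pareto-dominates `b`: `a` is no worse than `b`
/// in every objective and strictly better in at least one.
pub fn dominates(a: &[f64], b: &[f64]) -> bool {
    assert_eq!(a.len(), b.len());
    let no_worse = a.iter().zip(b).all(|(ai, bi)| ai <= bi);
    let strictly_better = a.iter().zip(b).any(|(ai, bi)| ai < bi);
    no_worse && strictly_better
}
```

Note that dominance is a partial order: two points can be mutually non-dominated, which is exactly what makes the solution a front rather than a single point.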
NSGA-II Algorithm
Numra implements NSGA-II (Non-dominated Sorting Genetic Algorithm II), a widely used evolutionary multi-objective optimizer. The algorithm uses:

- Non-dominated sorting: The combined parent+offspring population is sorted into Pareto fronts F1, F2, …, where F1 contains the non-dominated solutions.
- Crowding distance: Within each front, individuals are ranked by crowding distance, a measure of how isolated they are in objective space. Higher crowding distance means more diversity.
- Tournament selection: Parents for the next generation are chosen by binary tournament comparing (a) front rank (lower is better), then (b) crowding distance (higher is better).
- SBX crossover: Simulated Binary Crossover with distribution index η_c (crossover_eta) creates offspring near the parents with polynomial probability.
- Polynomial mutation: Each variable is mutated with probability p_m (mutation_prob), using a polynomial distribution controlled by η_m (mutation_eta).
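The crowding-distance step above can be sketched as a standalone function (an illustrative implementation, not numra's internal code): for each objective the front is sorted, the two boundary points get infinite distance, and each interior point accumulates the normalized gap between its two neighbors.

```rust
/// Crowding distance for one front of objective vectors (illustrative sketch).
/// Boundary points receive f64::INFINITY; interior points sum, over each
/// objective, the gap between their neighbors normalized by that objective's range.
pub fn crowding_distance(front: &[Vec<f64>]) -> Vec<f64> {
    let n = front.len();
    let mut dist = vec![0.0_f64; n];
    if n == 0 {
        return dist;
    }
    let m = front[0].len();
    for obj in 0..m {
        // Indices of front members, sorted by this objective.
        let mut idx: Vec<usize> = (0..n).collect();
        idx.sort_by(|&a, &b| front[a][obj].partial_cmp(&front[b][obj]).unwrap());
        dist[idx[0]] = f64::INFINITY;
        dist[idx[n - 1]] = f64::INFINITY;
        let range = front[idx[n - 1]][obj] - front[idx[0]][obj];
        if range <= 0.0 {
            continue; // all points identical in this objective
        }
        for w in 1..n.saturating_sub(1) {
            dist[idx[w]] += (front[idx[w + 1]][obj] - front[idx[w - 1]][obj]) / range;
        }
    }
    dist
}
```

Boundary points are kept at infinite distance so that the extremes of each objective always survive selection, preserving the spread of the front.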
Basic Bi-Objective Example
```rust
use numra::optim::{nsga2_optimize, NsgaIIOptions};

// f1 = x^2, f2 = (x - 2)^2
// Pareto front: x in [0, 2] (any x in this range is Pareto-optimal)
let f1 = |x: &[f64]| x[0] * x[0];
let f2 = |x: &[f64]| (x[0] - 2.0).powi(2);
let objectives: Vec<&dyn Fn(&[f64]) -> f64> = vec![&f1, &f2];

let bounds = vec![(0.0, 4.0)];
let opts = NsgaIIOptions {
    pop_size: 50,
    max_generations: 100,
    ..Default::default()
};

let result = nsga2_optimize(&objectives, &bounds, &opts).unwrap();

// Extract the Pareto front
let pareto = result.pareto.as_ref().unwrap();
println!("Pareto front size: {}", pareto.points.len());

for p in &pareto.points {
    println!("  x={:.3}, f1={:.3}, f2={:.3}", p.x[0], p.objectives[0], p.objectives[1]);
}

// All Pareto points should have x in [0, 2]
for p in &pareto.points {
    assert!(p.x[0] >= -0.5 && p.x[0] <= 2.5);
}
```

ZDT1 Benchmark
The ZDT1 test function is a standard benchmark with a convex Pareto front:

```rust
use numra::optim::{nsga2_optimize, NsgaIIOptions};

let n = 3;
let bounds = vec![(0.0, 1.0); n];

let f1 = |x: &[f64]| x[0];
let f2 = |x: &[f64]| {
    let g = 1.0 + 9.0 * x[1..].iter().sum::<f64>() / (x.len() - 1) as f64;
    g * (1.0 - (x[0] / g).sqrt())
};

let objectives: Vec<&dyn Fn(&[f64]) -> f64> = vec![&f1, &f2];
let opts = NsgaIIOptions {
    pop_size: 50,
    max_generations: 100,
    ..Default::default()
};

let result = nsga2_optimize(&objectives, &bounds, &opts).unwrap();
let pareto = result.pareto.as_ref().unwrap();

assert!(pareto.points.len() >= 5);
println!("ZDT1 Pareto front: {} points", pareto.points.len());
```

Three-Objective Example
NSGA-II handles any number of objectives:

```rust
use numra::optim::{nsga2_optimize, NsgaIIOptions};

let bounds = vec![(0.0, 2.0), (0.0, 2.0)];
let f1 = |x: &[f64]| x[0] * x[0];
let f2 = |x: &[f64]| x[1] * x[1];
let f3 = |x: &[f64]| (x[0] - 1.0).powi(2) + (x[1] - 1.0).powi(2);

let objectives: Vec<&dyn Fn(&[f64]) -> f64> = vec![&f1, &f2, &f3];
let opts = NsgaIIOptions {
    pop_size: 50,
    max_generations: 100,
    ..Default::default()
};

let result = nsga2_optimize(&objectives, &bounds, &opts).unwrap();
let pareto = result.pareto.as_ref().unwrap();

for p in &pareto.points {
    assert_eq!(p.objectives.len(), 3);
}
println!("3-objective Pareto front: {} points", pareto.points.len());
```

NSGA-II Options
| Option | Default | Description |
|---|---|---|
| pop_size | 100 | Population size (must be even) |
| max_generations | 200 | Maximum number of generations |
| crossover_eta | 20.0 | SBX crossover distribution index |
| mutation_eta | 20.0 | Polynomial mutation distribution index |
| crossover_prob | 0.9 | Crossover probability |
| mutation_prob | None (= 1/n) | Per-variable mutation probability |
| seed | 42 | Random seed for reproducibility |
Tuning guidelines:
- Population size: Use at least 50 for bi-objective, 100+ for three objectives.
- Crossover eta: Higher values (20–30) create offspring closer to parents. Lower values (2–5) allow more exploration.
- Mutation eta: Same interpretation as crossover eta: higher values produce smaller perturbations around the parent value.
- Generations: More generations improve Pareto front quality but increase cost.
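The effect of the distribution index can be seen in a minimal SBX sketch (standalone and deterministic for illustration: the uniform draw u is passed in rather than sampled, and this is not numra's internal implementation):

```rust
/// One SBX offspring pair from parents p1, p2, given a uniform draw
/// u in (0, 1) and distribution index eta (illustrative sketch).
pub fn sbx_pair(p1: f64, p2: f64, u: f64, eta: f64) -> (f64, f64) {
    // Spread factor beta: concentrates toward 1.0 as eta grows,
    // so children land closer to the parents.
    let beta = if u <= 0.5 {
        (2.0 * u).powf(1.0 / (eta + 1.0))
    } else {
        (1.0 / (2.0 * (1.0 - u))).powf(1.0 / (eta + 1.0))
    };
    let c1 = 0.5 * ((1.0 + beta) * p1 + (1.0 - beta) * p2);
    let c2 = 0.5 * ((1.0 - beta) * p1 + (1.0 + beta) * p2);
    (c1, c2)
}
```

For the same draw u, a larger eta yields children nearer the parents, while the pair always preserves the parents' midpoint (c1 + c2 = p1 + p2).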
Result Structure
The OptimResult from NSGA-II contains:

```rust
// result.x: best point for the first objective
// result.f: best first-objective value
// result.pareto: Some(ParetoResult { points: Vec<ParetoPoint> })

let pareto = result.pareto.unwrap();
for point in &pareto.points {
    let x = &point.x;            // decision variables
    let obj = &point.objectives; // objective values [f1, f2, ...]
}
```

Each ParetoPoint contains:

- x (Vec<S>): the decision variable values.
- objectives (Vec<S>): the objective function values.
The returned Pareto front is guaranteed to be non-dominated: no point in the set dominates any other.
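This guarantee can be checked with a pairwise scan (a hypothetical standalone helper, assuming the front is given as plain objective vectors and every objective is minimized):

```rust
/// Returns true if no point in `front` Pareto-dominates any other point.
pub fn is_mutually_non_dominated(front: &[Vec<f64>]) -> bool {
    // a dominates b: no worse in every objective, strictly better in one.
    let dom = |a: &[f64], b: &[f64]| {
        a.iter().zip(b).all(|(x, y)| x <= y) && a.iter().zip(b).any(|(x, y)| x < y)
    };
    for i in 0..front.len() {
        for j in 0..front.len() {
            if i != j && dom(&front[i], &front[j]) {
                return false;
            }
        }
    }
    true
}
```

The scan is O(n² · m) for n points and m objectives, which is cheap for typical front sizes.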
Declarative Builder API
The OptimProblem builder supports multi-objective optimization:

```rust
use numra::optim::OptimProblem;

let result = OptimProblem::new(1)
    .multi_objective(vec![
        Box::new(|x: &[f64]| x[0] * x[0]),
        Box::new(|x: &[f64]| (x[0] - 2.0).powi(2)),
    ])
    .all_bounds(&[(0.0, 4.0)])
    .solve()
    .unwrap();

let pareto = result.pareto.as_ref().unwrap();
println!("Pareto front: {} points", pareto.points.len());
```

Practical Considerations
- Scaling objectives: If objectives have very different magnitudes, NSGA-II’s crowding distance may favor one over another. Consider normalizing objectives to similar ranges.
- Constraint handling: NSGA-II does not directly support constraints. For constrained multi-objective problems, use penalty terms in the objectives.
- Post-processing: The Pareto front contains all trade-off solutions. Use domain knowledge to select a preferred operating point. Common selection criteria include:
  - Minimum distance to a utopia point.
  - Knee point detection (maximum curvature).
  - Weighted sum preference.
- Deterministic runs: NSGA-II is deterministic given the same seed, population size, and bounds, enabling reproducible experiments.
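The utopia-point criterion, for example, can be sketched as a small post-processing step (hypothetical helper; the front is given as plain objective vectors and every objective is minimized):

```rust
/// Index of the front point closest (Euclidean distance in objective space)
/// to the utopia point formed from the per-objective minima (illustrative sketch).
pub fn closest_to_utopia(front: &[Vec<f64>]) -> Option<usize> {
    let m = front.first()?.len();
    // Utopia point: best value achieved for each objective, taken independently.
    let utopia: Vec<f64> = (0..m)
        .map(|j| front.iter().map(|p| p[j]).fold(f64::INFINITY, f64::min))
        .collect();
    let dist2 = |p: &Vec<f64>| -> f64 {
        p.iter().zip(&utopia).map(|(x, u)| (x - u).powi(2)).sum()
    };
    (0..front.len())
        .min_by(|&a, &b| dist2(&front[a]).partial_cmp(&dist2(&front[b])).unwrap())
}
```

If the objectives have different scales, normalize them before computing distances, for the same reason given under "Scaling objectives" above.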