Abstract:
[en] Traditional optimisation tools focus on deterministic problems: scheduling airline flight crews (using as few employees as possible while still meeting legal constraints, such as maximum working time), finding the shortest path in a graph (used by navigation systems to compute directions, typically from a GPS position), etc.
However, this deterministic assumption sometimes yields useless solutions: actual parameters cannot always be known to full precision, one reason being their inherent randomness. For example, when scheduling trucks for freight transportation, unexpected road congestion may cause deadlines to be missed, and the company may then have to pay compensation for that delay, as well as for the subsequent deliveries that could not be made on schedule.
Two main approaches have been developed in the literature to take this uncertainty into account: making decisions based on probability distributions of the uncertain parameters (stochastic programming), or assuming only that they lie in a given uncertainty set (robust programming). In general, the first leads to a large increase in the size of the problems to solve (and thus requires algorithms that work around this curse of dimensionality), while the second is more conservative but tends to change the nature of the programs (which may require a different solver technology).
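As a minimal illustration (not drawn from the thesis itself; the cost function f, decision variable x, uncertain parameter ξ, distribution P, and uncertainty set U are all assumed for the sake of the sketch), the two paradigms can be contrasted on an abstract program:

    % Stochastic programming: optimise the expected cost over the
    % (assumed known) distribution P of the uncertain parameter \xi.
    \min_{x} \; \mathbb{E}_{\xi \sim P} \left[ f(x, \xi) \right]

    % Robust programming: guard against the worst realisation of \xi
    % within the (assumed known) uncertainty set \mathcal{U}.
    \min_{x} \; \max_{\xi \in \mathcal{U}} f(x, \xi)

Approximating P by many sampled scenarios is what inflates the size of the stochastic program, while the inner maximisation is what makes the robust program both more conservative and structurally different from its deterministic counterpart.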
Some authors claim that these two mindsets are equivalent, in the sense that the solutions they provide behave similarly when faced with the same uncertainty. The goal of this thesis is to explore this question: for various problems, implement both approaches and compare them.
Is one solution better insulated from variations in the uncertain parameters?
Do they bring benefits over a deterministic approach?
Is one cheaper than the other to compute?