Abstract:
Safeguarding engineering infrastructure in a healthy condition is of paramount importance for sustaining the economic and societal growth of most countries. Deterioration mechanisms and mechanical stressors degrade structural performance, inducing a risk of failure that may, in some cases, carry considerable economic, societal, and environmental consequences. Although the estimation of deterioration processes is subject to significant uncertainties, information from inspections and monitoring can be collected, at a cost, to support more informed maintenance decisions. Inspection and Maintenance (I&M) planning therefore demands methods capable of identifying optimal management strategies in stochastic environments and under imperfect information. Addressing these needs, this thesis explores efficient methods for controlling the risk of adverse events by planning inspections in a timely manner and optimally prescribing maintenance actions. Throughout the investigation, the I&M planning decision-making problem is formally formulated as a Partially Observable Markov Decision Process (POMDP), which provides the principled mathematical foundation underlying the stochastic control optimization. In medium- to high-dimensional state space settings, infinite- and finite-horizon policies are efficiently computed via POMDP point-based solvers, whereas for higher-dimensional state, action, and observation spaces, POMDPs are integrated with a multi-agent actor-critic deep reinforcement learning approach. Besides overcoming dimensionality limitations by approximating both policy and value functions with artificial neural networks, formulating the POMDPs through conditional formulations enables the treatment of structural systems under deterioration, reliability, and cost dependence.
Sequential monitoring decisions, influenced by the condition of the sensors themselves, can also be appropriately allocated through the proposed approach. Extensive numerical experiments have been conducted in both traditional and detailed I&M planning settings, with a strong emphasis on offshore wind substructures, thoroughly comparing POMDP policies against corrective, calendar-based, and heuristic strategies. The results reveal that POMDP-based policies offer substantial cost savings over their counterparts in all tested settings.
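To illustrate the POMDP machinery summarized above, the sketch below performs the Bayesian belief update that underlies POMDP-based I&M planning: a hidden deterioration state is propagated through a transition model and then updated with an imperfect inspection outcome. The four-state damage discretization, the transition matrix `T`, and the detection probabilities `p_detect` are hypothetical values chosen for illustration only; they are not taken from the thesis.

```python
import numpy as np

# Hidden damage states 0..3 (intact -> failed). Under "do nothing",
# damage can only stay the same or grow (illustrative values).
T = np.array([
    [0.9, 0.1, 0.0, 0.0],
    [0.0, 0.8, 0.2, 0.0],
    [0.0, 0.0, 0.7, 0.3],
    [0.0, 0.0, 0.0, 1.0],
])

# Imperfect inspection: P(damage detected | state), illustrative values.
p_detect = np.array([0.05, 0.4, 0.8, 0.99])

def predict(belief):
    """Propagate the belief one step through the deterioration model."""
    return belief @ T

def update(belief, detected):
    """Bayes update of the belief given a binary inspection outcome."""
    likelihood = p_detect if detected else 1.0 - p_detect
    posterior = belief * likelihood
    return posterior / posterior.sum()

belief = np.array([1.0, 0.0, 0.0, 0.0])  # start from the intact state
belief = predict(belief)                  # one deterioration step
belief = update(belief, detected=True)    # inspection flags damage
print(np.round(belief, 3))                # prints [0.529 0.471 0.    0.   ]
```

A POMDP policy maps such beliefs to actions (do nothing, inspect, repair); point-based solvers and deep reinforcement learning, as discussed in the thesis, differ mainly in how they represent and optimize that mapping.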