dc.description.abstract | Engineered systems are typically designed to be robust and to have very low probabilities of failure. However, developing accurate estimates of these probabilities can be challenging given the complex (and typically nonlinear) nature of the system behavior and the computational cost of simulating enough realizations/scenarios to capture the failure modes. Among other approaches, importance sampling can reduce the computational cost and/or the variance of failure probability estimates; however, the optimal importance sampling distribution can only be computed if the failure probability is already known. This work proposes to use normalizing flows (NFs), a type of machine learning model, to learn a near-optimal importance sampling distribution. NFs are generative models that admit exact yet efficient density evaluation. The approach is first evaluated on a suite of challenging benchmark reliability estimation problems, comparing against two techniques widely adopted for similar tasks: subset simulation and the cross-entropy method. The results show that the proposed approach can estimate rare-event probabilities in cases with extremely low failure probabilities (on the order of 10^(-7)), high dimensionality, and multiple failure modes. Finally, the proposed approach is applied to estimate the reliability of two structural mechanics examples. | |
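
The abstract's point that the optimal importance sampling density cannot be computed directly follows from its form, q*(x) proportional to 1{g(x) <= 0} p(x), whose normalizing constant is the unknown failure probability itself. The sketch below is a minimal illustration of the importance sampling estimator on a hypothetical linear limit state with Gaussian inputs; the limit-state function, threshold, and hand-shifted Gaussian proposal are illustrative assumptions, with the shifted Gaussian merely standing in for the trained normalizing flow that would supply the proposal density in the described approach.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical toy reliability problem: X ~ N(0, I) in d dimensions,
# with "failure" when the limit-state function g(x) = threshold - sum(x)
# drops to zero or below. Values here are illustrative only.
d, threshold = 2, 8.0
p = stats.multivariate_normal(mean=np.zeros(d))  # nominal density p(x)

def g(x):
    return threshold - x.sum(axis=-1)            # failure iff g(x) <= 0

# Stand-in proposal q(x): a Gaussian shifted toward the failure region.
# In the proposed approach, a trained normalizing flow would supply q and
# its exact density; the estimator below is unchanged.
q = stats.multivariate_normal(mean=np.full(d, threshold / d))

n = 100_000
x = q.rvs(size=n, random_state=rng)
weights = np.exp(p.logpdf(x) - q.logpdf(x))      # importance weights p(x)/q(x)
p_fail_is = np.mean((g(x) <= 0) * weights)       # importance sampling estimate of P_F

# Crude Monte Carlo comparison: with P_F ~ 1e-8, n = 1e5 nominal samples
# almost surely contain no failures, so the estimate is typically zero.
x_mc = p.rvs(size=n, random_state=rng)
p_fail_mc = np.mean(g(x_mc) <= 0)

# Exact value for this linear Gaussian case: sum(X) ~ N(0, d).
p_fail_exact = stats.norm.sf(threshold / np.sqrt(d))

print(f"IS estimate:    {p_fail_is:.3e}")
print(f"MC estimate:    {p_fail_mc:.3e}")
print(f"Exact P_F:      {p_fail_exact:.3e}")
```

For this linear limit state the mean shift used above coincides with the most probable failure point, which is why the plain Gaussian proposal already works well; the appeal of a learned proposal such as a normalizing flow is handling problems where no such simple shift exists, e.g., multiple failure modes or strongly nonlinear limit states.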