Conic Optimization for Robust Quadratic Regression: Deterministic Bounds and Statistical Analysis
Abstract
This paper is concerned with the robust quadratic regression problem, where the goal is to find the unknown parameters (state) of a system modeled by nonconvex quadratic equations from observational data. In this problem, a sparse subset of the equations is subject to errors (noise values) of arbitrary magnitudes. We propose two techniques based on conic optimization to address this problem. The first is a penalized conic relaxation; the second is a more complex iterative conic optimization method equipped with a hard thresholding operator. We derive a deterministic bound for the penalized conic relaxation that quantifies how many bad measurements the algorithm can tolerate while still producing a zero estimation error. This bound is then analyzed for Gaussian systems, and it is proved that the proposed method tolerates gross errors in up to the order of the square root of the total number of measurements. If the number of measurements is sufficiently large, we show that the iterative conic optimization method recovers the unknown state exactly in the Gaussian case, even when up to a constant fraction of the equations are arbitrarily wrong. The efficacy of the developed methods is demonstrated on synthetic data and a European power grid.
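The second technique, alternating between hard thresholding of residuals and a state update, can be illustrated with a minimal sketch. All names below (`robust_ht_recovery`, `quad_measurements`) are illustrative, not from the paper, and the paper's conic-optimization update is replaced here by a plain gradient step on the retained equations; this is a simplified stand-in for the actual method, assuming symmetric sensing matrices and an initial point near the true state.

```python
import numpy as np

def quad_measurements(A_list, x):
    """Quadratic measurements y_i = x^T A_i x of the unknown state x."""
    return np.array([x @ A @ x for A in A_list])

def robust_ht_recovery(A_list, y, k, x0, n_iter=500, lr=0.01):
    """Recover x from quadratic measurements, at most k of which are
    grossly corrupted.  Each iteration hard-thresholds the k largest
    residuals (treating those equations as bad) and takes a gradient
    step on the remaining ones.  NOTE: the gradient step is an
    illustrative substitute for the paper's conic-optimization update.
    """
    x = x0.copy()
    m = len(y)
    for _ in range(n_iter):
        r = quad_measurements(A_list, x) - y
        # hard thresholding: keep the m - k equations with the
        # smallest residual magnitudes
        good = np.argsort(np.abs(r))[: m - k]
        grad = np.zeros_like(x)
        for i in good:
            # gradient of (x^T A_i x - y_i)^2 for symmetric A_i
            grad += 4.0 * r[i] * (A_list[i] @ x)
        x -= lr * grad / len(good)
    return x
```

On a synthetic Gaussian instance with a few grossly corrupted equations, the thresholding step isolates the bad measurements (their residuals dwarf those of the clean equations), after which the update converges on the clean subset.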