On Representing (Mixed-Integer) Linear Programs by Graph Neural Networks
Abstract: While mixed-integer linear programming (MILP) is NP-hard in general, practical MILP solvers have achieved a roughly 100-fold speedup over the past twenty years. Still, many classes of MILPs quickly become unsolvable as their sizes increase, motivating researchers to seek new acceleration techniques. With deep learning, they have obtained strong empirical results, many of which come from applying graph neural networks (GNNs) to make decisions at various stages of the MILP solution process.
We study the theoretical foundations of this approach and discover a fundamental limitation: there exist feasible and infeasible MILPs that every GNN treats identically, indicating that GNNs lack the power to express general MILPs. We then show that linear programs (LPs) without integer constraints do not suffer from this limitation and that, by restricting MILPs to unfoldable ones or by adding random features, there exist GNNs that can reliably predict MILP feasibility, optimal objective values, and optimal solutions up to a prescribed precision. We conduct small-scale numerical experiments to validate these theoretical findings.
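To make the limitation concrete, below is a minimal sketch (not taken from the talk itself) of the kind of indistinguishable pair the abstract alludes to, assuming the standard variable-constraint bipartite encoding of an MILP. The untrained random-weight network stands in for any trained message-passing GNN, and all function names and the tiny instances are illustrative.

import numpy as np

# Bipartite variable-constraint encoding of an MILP
#     min c^T x  s.t.  A x = b,  x integer:
# variable node j carries c_j, constraint node i carries b_i,
# and edge (i, j) has weight A_ij.

def gnn_embedding(A, b, c, params):
    """Sum-aggregation message passing with fixed weights `params`,
    followed by sum pooling into a permutation-invariant vector.
    A stand-in for a trained GNN: any such GNN has this structure."""
    Wv, Wu, layers = params
    v = np.tanh(np.outer(c, Wv))          # (n, d) variable embeddings
    u = np.tanh(np.outer(b, Wu))          # (m, d) constraint embeddings
    for W1, W2 in layers:
        u = np.tanh(u @ W1 + A @ v)       # constraints gather from variables
        v = np.tanh(v @ W2 + A.T @ u)     # variables gather from constraints
    return v.sum(axis=0) + u.sum(axis=0)  # permutation-invariant readout

rng = np.random.default_rng(0)
d = 8
params = (rng.normal(size=d), rng.normal(size=d),
          [(rng.normal(size=(d, d)), rng.normal(size=(d, d))) for _ in range(2)])

def edge_constraints(edges, n):
    """x_i + x_j = 1 for every edge, x in {0,1}: a 2-coloring MILP."""
    A = np.zeros((len(edges), n))
    for k, (i, j) in enumerate(edges):
        A[k, i] = A[k, j] = 1.0
    return A, np.ones(len(edges)), np.zeros(n)

# One 6-cycle (feasible: alternate 0 and 1 around the even cycle) vs.
# two 3-cycles (infeasible: an odd cycle has no proper 2-coloring).
# Both graphs are 2-regular with identical node features, so message
# passing sees the same neighborhood everywhere.
six_cycle     = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
two_triangles = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]

e1 = gnn_embedding(*edge_constraints(six_cycle, 6), params)
e2 = gnn_embedding(*edge_constraints(two_triangles, 6), params)
print(np.allclose(e1, e2))  # True: the GNN cannot tell them apart

Note that dropping the integrality requirement makes both instances feasible (x_j = 1/2 satisfies every constraint), which is consistent with the claim that LPs do not suffer from this limitation; appending an i.i.d. random feature to each node would break the symmetry that makes the two embeddings coincide.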
This talk is based on joint work with Jialin Liu, Xinshang Wang, Jianfeng Lu, and Wotao Yin.