Strong mixed-integer programming formulations for trained neural networks

20 Nov 2018 · Ross Anderson, Joey Huchette, Christian Tjandraatmadja, Juan Pablo Vielma

We present an ideal mixed-integer programming (MIP) formulation for a rectified linear unit (ReLU) appearing in a trained neural network. Our formulation requires a single binary variable and no additional continuous variables beyond the input and output variables of the ReLU...
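As background for the abstract, the following is a minimal sketch of the standard big-M MIP encoding of a single ReLU neuron y = max(0, pre), assuming known bounds L ≤ pre ≤ U on the pre-activation. This is the baseline formulation that the paper's ideal formulation strengthens, not the paper's own construction; the constraint set and bounds below are illustrative assumptions.

```python
def relu_bigM_feasible(pre, y, z, L, U):
    """Check the standard big-M constraints for one ReLU.

    pre : pre-activation value (w.x + b), assumed in [L, U]
    y   : candidate output variable
    z   : binary indicator (1 if the ReLU is active)
    """
    return (
        y >= pre and              # y is at least the pre-activation
        y <= pre - L * (1 - z) and  # when z = 0, forces y <= pre - L
        y <= U * z and            # when z = 0, forces y <= 0
        y >= 0 and                # output is nonnegative
        z in (0, 1)               # z is binary
    )

def relu(pre):
    return max(0.0, pre)

# For any pre-activation in [L, U], the true ReLU output is feasible
# with the natural indicator choice z = 1 iff pre > 0.
L, U = -3.0, 5.0
for pre in [-2.5, -0.1, 0.0, 1.7, 4.9]:
    z = 1 if pre > 0 else 0
    assert relu_bigM_feasible(pre, relu(pre), z, L, U)
```

The big-M formulation is exact at integer points but can have a weak linear relaxation; the paper's contribution is an ideal formulation (one whose relaxation's extreme points are integral in z) using the same single binary variable.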
