Core Operations

Our ability to cast the input-output constraints of a neural net to an efficient set of linear and integer constraints boils down to the following basic operations, over which the layers provide a convenient layer of abstraction.

Internal

MIPVerify.abs_ge (Method)
abs_ge(x)

Expresses a one-sided absolute-value constraint: output is constrained to be at least as large as |x|.

Only use when you are minimizing over the output in the objective.
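
A one-sided absolute value needs no binary variables: two linear constraints bound the output from below by both `x` and `-x`. A minimal sketch in JuMP (variable names are illustrative, and HiGHS is used only as an example solver, not MIPVerify's internals):

```julia
using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
@variable(model, x)
@variable(model, y)
# y >= x and y >= -x together give y >= |x|.
@constraint(model, y >= x)
@constraint(model, y >= -x)
# Valid only because the objective drives y down onto |x|;
# otherwise y could float arbitrarily far above it.
@objective(model, Min, y)
```

Because the constraint is one-sided, `y` equals `|x|` only when the objective pushes it down, which is why this is safe only under minimization.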

MIPVerify.is_constant (Method)

is_constant(x)

Checks whether a JuMPLinearType is constant (and thus has no model associated with it). This can only be true if it is an affine expression with no stored variables.
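
For instance, in JuMP an affine expression that stores no variable terms is constant. A plausible check along these lines (a sketch, not necessarily MIPVerify's exact implementation):

```julia
using JuMP

# Constant iff the affine expression stores no variable terms.
is_constant_sketch(e::AffExpr) = isempty(e.terms)

model = Model()
@variable(model, v)

c = zero(AffExpr) + 1.5        # affine expression with no variables
is_constant_sketch(c)          # true
is_constant_sketch(2.0v + 1.0) # false: stores the variable v
```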

MIPVerify.lazy_tight_lowerbound (Method)

Calculates the lowerbound only if `u` is positive; otherwise, returns `u` (since we expect the ReLU to be fixed to zero anyway).
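
In sketch form, with `tighten` standing in for the expensive solver-based bound computation:

```julia
# When u <= 0 the ReLU output is fixed to zero, so u itself is already
# a valid (and free) lower bound; no solve is needed.
lazy_lowerbound_sketch(tighten, x, u::Real) = u > 0 ? tighten(x) : u
```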

MIPVerify.masked_relu (Method)

masked_relu(x, m; nta)

Expresses a masked rectified-linearity constraint, with three possibilities depending on the value of the mask (a dispatch sketch follows the list). Output is constrained to be:

1) max(x, 0) if m=0,
2) 0 if m<0,
3) x if m>0.
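
A sketch of that three-way dispatch, assuming a `relu(x)` helper that applies the full rectified-linearity encoding and ignoring the `nta` keyword (which would be forwarded to that encoding):

```julia
# m is a fixed real mask value known before solving.
function masked_relu_sketch(x, m::Real)
    if m < 0
        return 0        # unit proven inactive: output fixed to zero
    elseif m > 0
        return x        # unit proven active: output passes x through
    else
        return relu(x)  # undetermined: full ReLU encoding required
    end
end
```
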
MIPVerify.maximum (Method)

maximum(xs)

Expresses a maximization constraint: output is constrained to be equal to max(xs).

MIPVerify.maximum_ge (Method)

maximum_ge(xs)

Expresses a one-sided maximization constraint: output is constrained to be at least max(xs).

Only use when you are minimizing over the output in the objective.

NB: If all of xs are constant, we simply return the largest of them.
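
For the two-sided version, the usual mixed-integer encoding introduces one binary indicator per input. A sketch, assuming precomputed element-wise bounds `ls[i] <= xs[i] <= us[i]` (names illustrative, not MIPVerify's internals):

```julia
using JuMP

function maximum_sketch(model::Model, xs, ls, us)
    n = length(xs)
    y = @variable(model)
    a = @variable(model, [1:n], binary = true)
    @constraint(model, sum(a) == 1)  # exactly one input attains the maximum
    u_max = maximum(us)
    for i in 1:n
        @constraint(model, y >= xs[i])  # y dominates every input
        # Binding only when a[i] = 1, forcing y <= xs[i]; slack otherwise,
        # since xs[i] + (u_max - ls[i]) >= u_max >= y.
        @constraint(model, y <= xs[i] + (1 - a[i]) * (u_max - ls[i]))
    end
    return y
end
```

The one-sided maximum_ge keeps only the `y >= xs[i]` constraints and no binaries, which is why it is valid only when the objective minimizes over the output.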

MIPVerify.relu (Method)
relu(x)
relu(x; nta)

Expresses a rectified-linearity constraint: output is constrained to be equal to max(x, 0).
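
The standard big-M formulation of this constraint, given precomputed bounds `l <= x <= u` with `l < 0 < u`, uses a single binary indicator. A minimal sketch (names illustrative):

```julia
using JuMP

function relu_sketch(model::Model, x, l::Real, u::Real)
    y = @variable(model, lower_bound = 0)  # y >= 0
    a = @variable(model, binary = true)    # a = 1 iff the unit is active
    @constraint(model, y >= x)
    @constraint(model, y <= x - l * (1 - a))  # a = 1 forces y <= x, so y = x
    @constraint(model, y <= u * a)            # a = 0 forces y = 0
    return y
end
```

If the bounds already fix the sign of `x` (`u <= 0` or `l >= 0`), no binary is needed: the output is the constant 0 or `x` itself. This is what motivates bound tightening and the mask in masked_relu.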

MIPVerify.set_max_indexes (Method)

set_max_indexes(model, xs, target_indexes; tolerance)

Imposes constraints ensuring that one of the elements at the target_indexes is the largest element of the array xs. More specifically, we require `xs[j] - xs[i] ≥ tolerance` for some `j ∈ target_indexes` and for all `i ∉ target_indexes`.
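
When there is a single target index, the requirement is purely linear. A sketch (hypothetical names):

```julia
using JuMP

function set_max_index_sketch(model::Model, xs, j::Integer; tolerance = 0)
    for i in eachindex(xs)
        i == j && continue
        # The target element must dominate every other element.
        @constraint(model, xs[j] - xs[i] >= tolerance)
    end
end
```

With multiple target indexes, the disjunction ("for some j") requires binary variables to select which target element must dominate.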

MIPVerify.tight_bound (Method)

Calculates a tight bound of type `bound_type` on the variable `x` using the specified tightening algorithm `nta`.

If an upper bound is proven to be below cutoff, or a lower bound is proven to be above cutoff, the algorithm returns early with whatever value was found.
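
In outline, tightening an upper bound amounts to maximizing `x` over the model's current feasible set. A simplified sketch that omits the early-return logic (real code would interrupt the solver once the proven bound crosses the cutoff):

```julia
using JuMP

# Assumes `model` already has an optimizer attached.
function tight_upperbound_sketch(model::Model, x)
    @objective(model, Max, x)
    optimize!(model)
    # The solver's proven bound on the objective is a valid upper bound
    # on x even if the solve stops before reaching optimality.
    return objective_bound(model)
end
```

A lower bound is computed symmetrically by minimizing instead of maximizing.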