linprog(method='Simplex')
Solve the following linear programming problem via a two-phase simplex algorithm:

minimize:     c @ x

subject to:   A_ub @ x <= b_ub
              A_eq @ x == b_eq
c : array_like
Coefficients of the linear objective function to be minimized.
A_ub : array_like
2-D array which, when matrix-multiplied by x, gives the values of the upper-bound inequality constraints at x.
b_ub : array_like
1-D array of values representing the upper-bound of each inequality constraint (row) in A_ub.
A_eq : array_like
2-D array which, when matrix-multiplied by x, gives the values of the equality constraints at x.
b_eq : array_like
1-D array of values representing the RHS of each equality constraint (row) in A_eq.
bounds : array_like
The bounds for each independent variable in the solution, which can take one of three forms:
None : The default bounds; all variables are non-negative.
(lb, ub) : If a 2-element sequence is provided, the same lower bound (lb) and upper bound (ub) will be applied to all variables.
[(lb_0, ub_0), (lb_1, ub_1), ...] : If an n x 2 sequence is provided, each variable x_i will be bounded by lb[i] and ub[i].
Infinite bounds are specified using -np.inf (negative) or np.inf (positive).
callback : callable
If a callback function is provided, it will be called within each iteration of the simplex algorithm; a usage sketch follows this parameter list. The callback must have the signature callback(xk, **kwargs), where xk is the current solution vector and kwargs is a dictionary containing the following:
"tableau" : The current simplex algorithm tableau.
"nit" : The current iteration.
"pivot" : The pivot (row, column) used for the next iteration.
"phase" : Whether the algorithm is in Phase 1 or Phase 2.
"bv" : A structured array containing a string representation of each basic variable and its current value.
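As a rough illustration (not part of the original documentation), the sketch below assumes the callback(xk, **kwargs) interface described above, which belongs to older SciPy releases; newer releases instead pass a single OptimizeResult to the callback. The function and problem data are illustrative.

from scipy.optimize import linprog

def simplex_logger(xk, **kwargs):
    # Report the phase, iteration count, and current solution vector.
    # kwargs.get() is used so the logger degrades gracefully if a key
    # is absent (for example, under a newer callback interface).
    print("phase:", kwargs.get("phase"),
          "iteration:", kwargs.get("nit"),
          "x:", xk)

c = [-1, 4]
A_ub = [[-3, 1], [1, 2]]
b_ub = [6, 4]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, callback=simplex_logger,
              method='simplex')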
A scipy.optimize.OptimizeResult consisting of the following fields:
x : ndarray
The independent variable vector which optimizes the linear programming problem.
fun : float
Value of the objective function.
slack : ndarray
The values of the slack variables. Each slack variable corresponds to an inequality constraint. If the slack is zero, then the corresponding constraint is active.
success : bool
Returns True if the algorithm succeeded in finding an optimal solution.
status : int
An integer representing the exit status of the optimization:
0 : Optimization terminated successfully
1 : Iteration limit reached
2 : Problem appears to be infeasible
3 : Problem appears to be unbounded
nit : int
The number of iterations performed.
message : str
A string descriptor of the exit status of the optimization.
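For example (a hedged sketch, not from the original page), the slack field can be used to identify active inequality constraints as described above; the problem data here are illustrative.

import numpy as np
from scipy.optimize import linprog

res = linprog(c=[-1, 4], A_ub=[[-3, 1], [1, 2]], b_ub=[6, 4],
              method='simplex')

# A (near-)zero slack marks the corresponding inequality constraint as active.
active = np.isclose(res.slack, 0.0)
print("objective:", res.fun)
print("active constraint rows:", np.flatnonzero(active))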
For documentation of the remaining parameters, see scipy.optimize.linprog.
maxiter : int
The maximum number of iterations to perform.
disp : bool
If True, print the exit status message to sys.stdout.
tol : float
The tolerance which determines when a solution is "close enough" to zero in Phase 1 to be considered a basic feasible solution or close enough to positive to serve as an optimal solution.
bland : bool
If True, use Bland’s anti-cycling rule [3] to choose pivots to prevent cycling. If False, choose pivots which should lead to a converged solution more quickly. The latter method is subject to cycling (non-convergence) in rare instances.
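A brief sketch of passing these solver-specific options (the option values shown are arbitrary choices for illustration, not recommendations):

from scipy.optimize import linprog

c = [-1, 4]
A_ub = [[-3, 1], [1, 2]]
b_ub = [6, 4]

# Method-specific options are supplied through the options dictionary.
res = linprog(c, A_ub=A_ub, b_ub=b_ub, method='simplex',
              options={'maxiter': 5000, 'disp': True,
                       'tol': 1e-9, 'bland': True})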
[R806] Dantzig, George B., Linear Programming and Extensions. Rand Corporation Research Study, Princeton Univ. Press, Princeton, NJ, 1963.
[R807] Hillier, S.H. and Lieberman, G.J. (1995), "Introduction to Mathematical Programming", McGraw-Hill, Chapter 4.
[R808] Bland, Robert G. New finite pivoting rules for the simplex method. Mathematics of Operations Research (2), 1977: pp. 103-107.
Consider the following problem:

Minimize:     f = -1*x[0] + 4*x[1]

Subject to:   -3*x[0] + 1*x[1] <= 6
               1*x[0] + 2*x[1] <= 4
                          x[1] >= -3

where:  -inf <= x[0] <= inf

This problem deviates from the standard linear programming problem. In standard form, linear programming problems assume the variables x are non-negative. Since the problem variables do not have the standard bounds of 0 <= x < inf, the variable bounds must be set explicitly using the bounds argument.

There are two upper-bound constraints, which can be expressed as

A_ub @ x <= b_ub

The input for this problem is as follows:
>>> from scipy.optimize import linprog
>>> c = [-1, 4]
>>> A = [[-3, 1], [1, 2]]
>>> b = [6, 4]
>>> x0_bnds = (None, None)
>>> x1_bnds = (-3, None)
>>> res = linprog(c, A, b, bounds=(x0_bnds, x1_bnds))
>>> print(res)
     fun: -22.0
 message: 'Optimization terminated successfully.'
     nit: 1
   slack: array([ 39.,   0.])
  status: 0
 success: True
       x: array([ 10.,  -3.])
linprog(method='simplex')
Linear programming: minimize a linear objective function subject to linear equality and inequality constraints using the tableau-based simplex method.
Deprecated since version 1.9.0: method=’simplex’ will be removed in SciPy 1.11.0. It is replaced by method=’highs’ because the latter is faster and more robust.
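As a hedged sketch of the recommended migration, the problem from the example earlier on this page can be passed unchanged to the replacement solver:

from scipy.optimize import linprog

c = [-1, 4]
A_ub = [[-3, 1], [1, 2]]
b_ub = [6, 4]
bounds = [(None, None), (-3, None)]

# Same problem as in the example above, solved with the replacement solver.
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method='highs')
print(res.x, res.fun)   # expected: approximately [10., -3.] and -22.0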
Linear programming solves problems of the following form:

\[\begin{split}
\min_x \ & c^T x \\
\mbox{such that} \ & A_{ub} x \leq b_{ub}, \\
& A_{eq} x = b_{eq}, \\
& l \leq x \leq u,
\end{split}\]

where \(x\) is a vector of decision variables; \(c\), \(b_{ub}\), \(b_{eq}\), \(l\), and \(u\) are vectors; and \(A_{ub}\) and \(A_{eq}\) are matrices.

Alternatively, that's:

minimize:     c @ x

such that:    A_ub @ x <= b_ub
              A_eq @ x == b_eq
              lb <= x <= ub

Note that by default lb = 0 and ub = None unless specified with bounds.
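A small sketch (illustrative, not from the original page) of the default bounds in action: with no bounds argument each variable is restricted to be non-negative, so the one-variable problem below has a finite minimum, whereas removing the lower bound is expected to make it unbounded.

from scipy.optimize import linprog

# With the default bounds (0, None) the single variable is non-negative,
# so minimizing x subject to x <= 5 should return x = 0 ...
res_default = linprog(c=[1], A_ub=[[1]], b_ub=[5], method='simplex')
print(res_default.x, res_default.status)    # expected: roughly [0.], status 0

# ... whereas removing the lower bound leaves the problem unbounded below.
res_free = linprog(c=[1], A_ub=[[1]], b_ub=[5],
                   bounds=[(None, None)], method='simplex')
print(res_free.status)                      # expected: 3 (problem appears unbounded)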
Parameters :

c : 1-D array
The coefficients of the linear objective function to be minimized.
A_ub : 2-D array, optional
The inequality constraint matrix. Each row of A_ub specifies the coefficients of a linear inequality constraint on x .
b_ub : 1-D array, optional
The inequality constraint vector. Each element represents an upper bound on the corresponding value of A_ub @ x .
A_eq : 2-D array, optional
The equality constraint matrix. Each row of A_eq specifies the coefficients of a linear equality constraint on x .
b_eq : 1-D array, optional
The equality constraint vector. Each element of A_eq @ x must equal the corresponding element of b_eq .
bounds : sequence, optional
A sequence of (min, max) pairs for each element in x , defining the minimum and maximum values of that decision variable. Use None to indicate that there is no bound. By default, bounds are (0, None) (all decision variables are non-negative). If a single tuple (min, max) is provided, then min and max will serve as bounds for all decision variables.
method : str
This is the method-specific documentation for 'simplex'. 'highs' (default), 'highs-ds', 'highs-ipm', 'interior-point', and 'revised simplex' are also available.
callback : callable, optional
Callback function to be executed once per iteration (see the sketch after this parameter list).
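A minimal callback sketch, assuming the single-argument interface used by recent SciPy releases, in which the solver passes an intermediate OptimizeResult each iteration (older releases used a callback(xk, **kwargs) signature instead); the names and data are illustrative.

from scipy.optimize import linprog

def report_iteration(res):
    # res is an intermediate OptimizeResult; report progress each iteration.
    print("iteration", res.nit, "status", res.status, "x =", res.x)

c = [-1, 4]
A_ub = [[-3, 1], [1, 2]]
b_ub = [6, 4]
linprog(c, A_ub=A_ub, b_ub=b_ub, callback=report_iteration, method='simplex')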
Returns :

res : OptimizeResult
x : 1-D array
The values of the decision variables that minimize the objective function while satisfying the constraints.
fun : float
The optimal value of the objective function c @ x.
slack : 1-D array
The (nominally positive) values of the slack variables, b_ub - A_ub @ x.
con : 1-D array
The (nominally zero) residuals of the equality constraints, b_eq - A_eq @ x.
success : bool
True when the algorithm succeeds in finding an optimal solution.
status : int
An integer representing the exit status of the algorithm:
0 : Optimization terminated successfully.
1 : Iteration limit reached.
2 : Problem appears to be infeasible.
3 : Problem appears to be unbounded.
4 : Numerical difficulties encountered.
message : str
A string descriptor of the exit status of the algorithm.
nit : int
The total number of iterations performed in all phases.
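As a hedged usage sketch of the fields above (the problem data are illustrative):

from scipy.optimize import linprog

res = linprog(c=[-1, 4], A_ub=[[-3, 1], [1, 2]], b_ub=[6, 4],
              method='simplex')

# Check the exit status before using the solution.
if res.success:
    print("optimal x:", res.x)
    print("objective:", res.fun)
    print("slack:", res.slack)
else:
    print("solver failed with status", res.status, ":", res.message)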
Options :

For documentation of the remaining parameters, see scipy.optimize.linprog.

maxiter : int
The maximum number of iterations to perform in either phase.
disp : bool (default: False)
Set to True if indicators of optimization status are to be printed to the console each iteration.
presolve : bool (default: True)
Presolve attempts to identify trivial infeasibilities, identify trivial unboundedness, and simplify the problem before sending it to the main solver. It is generally recommended to keep the default setting True ; set to False if presolve is to be disabled.
tol : float (default: 1e-12)
The tolerance which determines when a solution is “close enough” to zero in Phase 1 to be considered a basic feasible solution or close enough to positive to serve as an optimal solution.
autoscale : bool (default: False)
Set to True to automatically perform equilibration. Consider using this option if the numerical values in the constraints are separated by several orders of magnitude.
rr : bool (default: True)
Set to False to disable automatic redundancy removal.
bland : bool
If True, use Bland’s anti-cycling rule [3] to choose pivots to prevent cycling. If False, choose pivots which should lead to a converged solution more quickly. The latter method is subject to cycling (non-convergence) in rare instances.
unknown_options : dict
Optional arguments not used by this particular solver. If unknown_options is non-empty, a warning is issued listing all unused options.
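A sketch of supplying several of these options together (the values are arbitrary; an unrecognized key in the dictionary would trigger the unknown_options warning described above):

from scipy.optimize import linprog

c = [-1, 4]
A_ub = [[-3, 1], [1, 2]]
b_ub = [6, 4]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, method='simplex',
              options={'maxiter': 4000, 'tol': 1e-9,
                       'bland': True, 'presolve': False})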
[1] Dantzig, George B., Linear Programming and Extensions. Rand Corporation Research Study, Princeton Univ. Press, Princeton, NJ, 1963.
[2] Hillier, S.H. and Lieberman, G.J. (1995), "Introduction to Mathematical Programming", McGraw-Hill, Chapter 4.
[3] Bland, Robert G. New finite pivoting rules for the simplex method. Mathematics of Operations Research (2), 1977: pp. 103-107.