Decomposing the Capacitated p-Median Problem

The capacitated p-median problem (CPMP) is a well-studied combinatorial optimization problem. Given \(n \in \mathbb{N}\) locations, the task is to select \(p \in \mathbb{N}\) median locations with \(p \leq n\) and to assign each location to exactly one median. For any two locations \(i,j \in [n]\), the distance between them is given by \(d_{ij} \in \mathbb{R}\). The objective is to minimize the total distance between the locations and their assigned medians. Every location \(i \in [n]\) has a demand \(q_i \in \mathbb{R}\) and a maximum capacity \(Q_i \in \mathbb{R}\). For every selected median, the sum of the demands of the locations assigned to it must not exceed its capacity.

The CPMP can be formulated as a mixed-integer program (MIP). Here is a classical compact formulation of the problem:

\[\begin{split}\begin{alignat*}{3} z^{\star}_{IP}\,\,\,=\,\,\,\min{} &\sum_{i=1}^n \sum_{j=1}^n d_{ij} x_{ij} \hspace{-2em} &&&&\\ \text{s. t.} &\sum_{j=1}^n x_{ij} &&= 1 \quad &\forall i &\in [n]\\ &\sum_{i=1}^{n} q_i x_{ij} &&\leq Q_j y_j \quad &\forall j &\in [n] \\ &\sum_{j=1}^n y_j &&= p && \\ &x_{ij} \in \{0,1\}, y_j \in \{0,1\} \hspace{-6em} &&\hspace{6em}\quad& \forall i,j &\in [n]. \end{alignat*}\end{split}\]
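Before turning to MIP solvers, the definition can be made concrete with a tiny brute-force sketch (the toy data is hypothetical and only illustrates the constraints; this enumeration is viable only for very small \(n\)): enumerate every set of \(p\) medians and every assignment, and keep the cheapest feasible one.

```python
from itertools import combinations, product

def solve_cpmp_brute_force(n, p, d, q, Q):
    """Enumerate all median sets and assignments; return (cost, medians, assignment)."""
    best = None
    for medians in combinations(range(n), p):
        for assign in product(medians, repeat=n):  # assign[i] = median of location i
            # capacity: demands assigned to each open median must not exceed Q
            if any(sum(q[i] for i in range(n) if assign[i] == j) > Q[j] for j in medians):
                continue
            cost = sum(d[i][assign[i]] for i in range(n))
            if best is None or cost < best[0]:
                best = (cost, medians, assign)
    return best

# Hypothetical 3-location example with a full symmetric distance dict:
d = {0: {0: 0, 1: 2, 2: 4}, 1: {0: 2, 1: 0, 2: 1}, 2: {0: 4, 1: 1, 2: 0}}
q = {0: 1, 1: 1, 2: 1}
Q = {0: 3, 1: 3, 2: 3}
print(solve_cpmp_brute_force(3, 1, d, q, Q))  # → (3, (1,), (1, 1, 1))
```

Location 1 is the best single median here: it serves all three locations at total distance \(2 + 0 + 1 = 3\).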

There are different approaches to solving the CPMP. Since the problem has an exploitable structure, we can try to solve it using a Branch-Cut-and-Price approach. For that, we want to use the PyGCGOpt interface to interact with GCG. We will consider three use-cases: (1) the Automatic Mode, (2) Exploring different Decompositions, and (3) Building a custom Decomposition.

To follow along with this tutorial interactively, please download the Jupyter notebook from the examples folder.

Reading in the Instance

The PyGCGOpt interface offers two methods to specify a problem. The first is to load the model from a standardized file format (e.g., .lp or .mps) that is supported by SCIP using Model.readProb(). Optionally, one can also read in a decomposition from a .dec file in the same manner. In this example, we will use the second method: Modeling a problem directly in Python.

Execute the following cell, which includes a trivial test instance and a function to load more instances from a custom JSON file format.

[1]:
import json

def get_simple_instance():
    n = 5
    p = 2
    d = {0: {0: 0, 1: 25, 2: 46, 3: 43, 4: 30}, 1: {1: 0, 2: 22, 3: 20, 4: 22}, 2: {2: 0, 3: 22, 4: 40}, 3: {3: 0, 4: 22}, 4: {4: 0}}
    q = {0: 14, 1: 13, 2: 9, 3: 15, 4: 6}
    Q = {i: 33 for i in range(5)}
    return n, p, d, q, Q

def read_instance_json(path):
    with open(path) as f:
        instance = json.load(f)
    d = {int(k): {int(kk): vv for kk, vv in v.items()} for k, v in instance["d"].items()}
    q = {int(k): v for k, v in instance["q"].items()}
    Q = {int(k): v for k, v in instance["Q"].items()}
    return instance["n"], instance["p"], d, q, Q

n_locations, n_clusters, distances, demands, capacities = read_instance_json("instances/p550-01.json")
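Note that the instance format stores the symmetric distance matrix as an upper triangle: `d[i][j]` exists only for `i <= j`, as you can see in the trivial instance above. This is why the model-building code indexes `distances[min(i,j)][max(i,j)]`. A small helper (not part of the tutorial's API, just an illustration) makes the lookup explicit:

```python
def dist(d, i, j):
    """Look up the distance between locations i and j in an
    upper-triangular distance dict (entries exist only for i <= j)."""
    return d[min(i, j)][max(i, j)]

# The trivial test instance from get_simple_instance():
d = {0: {0: 0, 1: 25, 2: 46, 3: 43, 4: 30}, 1: {1: 0, 2: 22, 3: 20, 4: 22},
     2: {2: 0, 3: 22, 4: 40}, 3: {3: 0, 4: 22}, 4: {4: 0}}
print(dist(d, 1, 3), dist(d, 3, 1))  # → 20 20
```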

Setting up the Model

Now, we want to build the model based on the above formulation. Please note that this part is not specific to GCG; it is identical to how one would build the same model with PySCIPOpt. In technical terms, Model in PyGCGOpt is a subclass of Model in PySCIPOpt and, therefore, you can use all methods of PySCIPOpt's Model class to build your problem.

In order to recreate the model multiple times during this example, we define a function that builds and returns the model. The function also returns the constraints added to the model, grouped by type. This will become important later in use-case 3.

[2]:
from pygcgopt import Model, quicksum

def build_model(n_locations, n_clusters, distances, demands, capacities):
    m = Model()

    m.printVersion()
    m.redirectOutput()

    m.setMinimize()

    x = {}
    y = {}

    for j in range(n_locations):
        y[j] = m.addVar(f"y_{j}", vtype="B")
        for i in range(n_locations):
            x[i, j] = m.addVar(f"x_{i}_{j}", vtype="B", obj=distances[min(i,j)][max(i,j)])

    # Hold different constraint types
    conss_assignment = []
    conss_capacity = []
    cons_pmedian = None

    # Create the assignment constraints
    for i in range(n_locations):
        conss_assignment.append(
            m.addCons(quicksum(x[i, j] for j in range(n_locations)) == 1)
        )

    # Create the capacity constraints
    for j in range(n_locations):
        conss_capacity.append(
            m.addCons(quicksum(demands[i] * x[i, j] for i in range(n_locations)) <= capacities[j] * y[j])
        )

    # Create the p-median constraint
    cons_pmedian = m.addCons(quicksum(y[j] for j in range(n_locations)) == n_clusters)

    return m, conss_assignment, conss_capacity, cons_pmedian

Use-Case 1: The Automatic Mode

With the model built, we can now simply call the optimize() function and let GCG do its magic.

[3]:
m, *conss = build_model(n_locations, n_clusters, distances, demands, capacities)
m.optimize()
GCG version 3.5.0 [GitHash: 202908425-dirty]
Copyright (C) 2010-2021 Operations Research, RWTH Aachen University
                        Konrad-Zuse-Zentrum fuer Informationstechnik Berlin (ZIB)

SCIP version 8.0.0 [precision: 8 byte] [memory: block] [mode: optimized] [LP solver: SoPlex 6.0.0] [GitHash: 8441670cd5-dirty]
Copyright (C) 2002-2021 Konrad-Zuse-Zentrum fuer Informationstechnik Berlin (ZIB)
presolving:
(round 1, exhaustive) 0 del vars, 0 del conss, 0 add conss, 0 chg bounds, 0 chg sides, 0 chg coeffs, 100 upgd conss, 0 impls, 50 clqs
   (0.0s) probing: 51/2550 (2.0%) - 0 fixings, 0 aggregations, 0 implications, 0 bound changes
   (0.0s) probing aborted: 50/50 successive totally useless probings
presolving (2 rounds: 2 fast, 2 medium, 2 exhaustive):
 0 deleted vars, 0 deleted constraints, 0 added constraints, 0 tightened bounds, 0 added holes, 0 changed sides, 0 changed coefficients
 0 implications, 2550 cliques
presolved problem has 2550 variables (2550 bin, 0 int, 0 impl, 0 cont) and 101 constraints
     50 constraints of type <knapsack>
     50 constraints of type <setppc>
      1 constraints of type <linear>
transformed objective value is always integral (scale: 1)
Presolving Time: 0.04
 Consclassifier "nonzeros" yields a classification with 2  different constraint classes
 Consclassifier "constypes" yields a classification with 3 different constraint classes
 Consclassifier "constypes according to miplib" yields a classification with 3 different constraint classes
 Conspartition "constypes according to miplib" is not considered since it offers the same structure as "constypes" conspartition
 Consclassifier "gamsdomain" yields a classification with 1  different constraint classes
 Consclassifier "gamssymbols" yields a classification with 1  different constraint classes
 Conspartition "gamssymbols" is not considered since it offers the same structure as "gamsdomain" conspartition
 Varclassifier "gamsdomain" yields a classification with 1  different variable classes
 Varclassifier "gamssymbols" yields a classification with 1  different variable classes
 Varpartition "gamssymbols" is not considered since it offers the same structure as "gamsdomain"
 Varclassifier "vartypes" yields a classification with 1 different variable classes
 Varpartition "vartypes" is not considered since it offers the same structure as "gamsdomain"
 Varclassifier "varobjvals" yields a classification with 116 different variable classes
 Varclassifier "varobjvalsigns" yields a classification with 2 different variable classes
 the current varclass distribution includes 116 classes but only 18 are allowed for GCGconshdlrDecompCalcCandidatesNBlocks()
 in dec_consclass: there are 3 different constraint classes
  the current constraint classifier "nonzeros" consists of 2 different classes
  the current constraint classifier "constypes" consists of 3 different classes
  the current constraint classifier "gamsdomain" consists of 1 different classes
 dec_consclass found 11 new partialdecs
dec_densemasterconss found 1 new partialdec
dec_neighborhoodmaster found 1 new partialdec
POSTPROCESSING of decompositions. Added 0 new decomps.
Found 11 finished decompositions.
Measured running time per detector:
Detector consclass                 worked on        7 finished decompositions and took a total time of      0.000
Detector neighborhoodmaster        worked on        1 finished decompositions and took a total time of      0.000
Detector connectedbase             worked on       10 finished decompositions and took a total time of      0.007
Detector varclass                  worked on        2 finished decompositions and took a total time of      0.001
Detection Time: 0.02

A Dantzig-Wolfe reformulation is applied to solve the original problem.
Chosen structure has 50 blocks and 51 linking constraints.
This decomposition has a maxwhite score of 0.485149.
Matrix has 50 blocks, using 50 pricing problems.

 the current varclass distribution includes 116 classes but only 9 are allowed for propagatePartialdec() of var class detector
  time | node  | left  |SLP iter|MLP iter|LP it/n| mem |mdpt |ovars|mvars|ocons|mcons|mcuts|  dualbound   | primalbound  |  deg   |  gap
p  0.2s|     1 |     0 |      0 |      0 |     - |  34M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 2.711000e+03 |   --   |    Inf
p  0.2s|     1 |     0 |      0 |      0 |     - |  33M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 1.785000e+03 |   --   |    Inf
p  0.3s|     1 |     0 |      0 |      0 |     - |  33M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 1.102000e+03 |   --   |    Inf
   0.3s|     1 |     0 |      0 |      0 |     - |  33M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 1.102000e+03 |   --   |    Inf
   0.3s|     1 |     0 |      0 |      0 |     - |  34M|   0 |2550 |  50 | 102 | 102 |   0 | 0.000000e+00 | 1.102000e+03 |   0.00%|    Inf
   0.3s|     1 |     0 |      0 |      0 |     - |  34M|   0 |2550 | 150 | 102 | 102 |   0 | 0.000000e+00 | 1.102000e+03 |   0.00%|    Inf
Starting reduced cost pricing...
   0.5s|     1 |     0 |   2081 |   2081 |     - |  41M|   0 |2550 |1699 | 102 | 102 |   0 | 7.050000e+02 | 1.102000e+03 |  36.95%|  56.31%
d  1.1s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1749 | 102 | 102 |   0 | 7.050000e+02 | 1.016000e+03 |  36.95%|  44.11%
X  1.4s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1799 | 102 | 102 |   0 | 7.050000e+02 | 7.730000e+02 |  36.95%|   9.65%
Y  1.5s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.580000e+02 |  36.95%|   7.52%
   1.5s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.580000e+02 |  36.95%|   7.52%
   1.6s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.580000e+02 |  36.95%|   7.52%
   1.6s|     1 |     2 |   3055 |   3055 |     - |  57M|   0 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.580000e+02 |  36.95%|   7.52%
*r 1.6s|     2 |     1 |   3063 |   3063 | 316.0 |  58M|   1 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.140000e+02 |   --   |   1.28%
*r 2.3s|    16 |     5 |   6962 |   6962 | 281.0 |  61M|   5 |2550 |2305 | 102 | 102 |   0 | 7.102222e+02 | 7.130000e+02 |   --   |   0.39%

SCIP Status        : problem is solved [optimal solution found]
Solving Time (sec) : 2.62
Solving Nodes      : 21
Primal Bound       : +7.13000000000000e+02 (13 solutions)
Dual Bound         : +7.13000000000000e+02
Gap                : 0.00 %

Use-Case 2: Exploring different Decompositions

Above, we have seen GCG in its fully automatic mode. If we want to dig deeper, we can inspect the different decompositions that GCG detects. For that, we recreate the model and manually execute presolve() and detect() for the model. At this stage it is possible to list and visualize the found decompositions.

[4]:
m, *conss = build_model(n_locations, n_clusters, distances, demands, capacities)
m.presolve()
m.detect()

decomps = m.listDecompositions()

print("GCG found {} finished decompositions.".format(len(decomps)))
GCG version 3.5.0 [GitHash: 202908425-dirty]
Copyright (C) 2010-2021 Operations Research, RWTH Aachen University
                        Konrad-Zuse-Zentrum fuer Informationstechnik Berlin (ZIB)

SCIP version 8.0.0 [precision: 8 byte] [memory: block] [mode: optimized] [LP solver: SoPlex 6.0.0] [GitHash: 8441670cd5-dirty]
Copyright (C) 2002-2021 Konrad-Zuse-Zentrum fuer Informationstechnik Berlin (ZIB)
presolving:
(round 1, exhaustive) 0 del vars, 0 del conss, 0 add conss, 0 chg bounds, 0 chg sides, 0 chg coeffs, 100 upgd conss, 0 impls, 50 clqs
   (0.0s) probing: 51/2550 (2.0%) - 0 fixings, 0 aggregations, 0 implications, 0 bound changes
   (0.0s) probing aborted: 50/50 successive totally useless probings
presolving (2 rounds: 2 fast, 2 medium, 2 exhaustive):
 0 deleted vars, 0 deleted constraints, 0 added constraints, 0 tightened bounds, 0 added holes, 0 changed sides, 0 changed coefficients
 0 implications, 2550 cliques
presolved problem has 2550 variables (2550 bin, 0 int, 0 impl, 0 cont) and 101 constraints
     50 constraints of type <knapsack>
     50 constraints of type <setppc>
      1 constraints of type <linear>
transformed objective value is always integral (scale: 1)
Presolving Time: 0.04
starting detection
 Consclassifier "nonzeros" yields a classification with 2  different constraint classes
 Consclassifier "constypes" yields a classification with 3 different constraint classes
 Consclassifier "constypes according to miplib" yields a classification with 3 different constraint classes
 Conspartition "constypes according to miplib" is not considered since it offers the same structure as "constypes" conspartition
 Consclassifier "gamsdomain" yields a classification with 1  different constraint classes
 Consclassifier "gamssymbols" yields a classification with 1  different constraint classes
 Conspartition "gamssymbols" is not considered since it offers the same structure as "gamsdomain" conspartition
 Varclassifier "gamsdomain" yields a classification with 1  different variable classes
 Varclassifier "gamssymbols" yields a classification with 1  different variable classes
 Varpartition "gamssymbols" is not considered since it offers the same structure as "gamsdomain"
 Varclassifier "vartypes" yields a classification with 1 different variable classes
 Varpartition "vartypes" is not considered since it offers the same structure as "gamsdomain"
 Varclassifier "varobjvals" yields a classification with 116 different variable classes
 Varclassifier "varobjvalsigns" yields a classification with 2 different variable classes
 the current varclass distribution includes 116 classes but only 18 are allowed for GCGconshdlrDecompCalcCandidatesNBlocks()
 in dec_consclass: there are 3 different constraint classes
  the current constraint classifier "nonzeros" consists of 2 different classes
  the current constraint classifier "constypes" consists of 3 different classes
  the current constraint classifier "gamsdomain" consists of 1 different classes
 dec_consclass found 11 new partialdecs
dec_densemasterconss found 1 new partialdec
dec_neighborhoodmaster found 1 new partialdec
POSTPROCESSING of decompositions. Added 0 new decomps.
Found 11 finished decompositions.
Measured running time per detector:
Detector consclass                 worked on        7 finished decompositions and took a total time of      0.000
Detector neighborhoodmaster        worked on        1 finished decompositions and took a total time of      0.000
Detector connectedbase             worked on       10 finished decompositions and took a total time of      0.006
Detector varclass                  worked on        2 finished decompositions and took a total time of      0.001
Detection Time: 0.02
GCG found 11 finished decompositions.
 the current varclass distribution includes 116 classes but only 9 are allowed for propagatePartialdec() of var class detector

Inspecting Decompositions

The call to listDecompositions() returns a list of PartialDecomposition objects. We can print a decomposition using the Python print() function to get a summary or access different fields directly.

For a full overview of available methods, take a look at the online documentation for the PartialDecomposition class, or execute help(d) where d is a decomposition object.

[5]:
print(decomps)

d = decomps[2]

print(
    f"Decomp scores: {d.classicScore=:.04f}, {d.borderAreaScore=:.04f}, {d.maxWhiteScore=:.04f}, {d.maxForWhiteScore=:.04f}"
)
[<PartialDecomposition: nBlocks=0, nMasterConss=101, nMasterVars=2550, nLinkingVars=0, maxForWhiteScore=0.0>, <PartialDecomposition: nBlocks=1, nMasterConss=0, nMasterVars=0, nLinkingVars=0, maxForWhiteScore=0.0>, <PartialDecomposition: nBlocks=50, nMasterConss=51, nMasterVars=0, nLinkingVars=0, maxForWhiteScore=0.48514851485148514>, <PartialDecomposition: nBlocks=51, nMasterConss=50, nMasterVars=0, nLinkingVars=0, maxForWhiteScore=0.49504950495049505>, <PartialDecomposition: nBlocks=1, nMasterConss=50, nMasterVars=0, nLinkingVars=0, maxForWhiteScore=0.0>, <PartialDecomposition: nBlocks=1, nMasterConss=100, nMasterVars=2500, nLinkingVars=0, maxForWhiteScore=0.009706853038245034>, <PartialDecomposition: nBlocks=1, nMasterConss=1, nMasterVars=0, nLinkingVars=0, maxForWhiteScore=0.0>, <PartialDecomposition: nBlocks=50, nMasterConss=51, nMasterVars=50, nLinkingVars=0, maxForWhiteScore=0.4853426519122501>, <PartialDecomposition: nBlocks=1, nMasterConss=1, nMasterVars=0, nLinkingVars=0, maxForWhiteScore=0.0>, <PartialDecomposition: nBlocks=101, nMasterConss=0, nMasterVars=0, nLinkingVars=2550, maxForWhiteScore=0.019291161956034086>, <PartialDecomposition: nBlocks=0, nMasterConss=101, nMasterVars=100, nLinkingVars=2450, maxForWhiteScore=0.0>]
Decomp scores: d.classicScore=-1.0000, d.borderAreaScore=-1.0000, d.maxWhiteScore=0.4851, d.maxForWhiteScore=0.4851
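As a sanity check, the max-white score of 0.4851 reported here (and in the detection log above) can be reproduced by hand. The interpretation below is our assumption, not taken from the GCG documentation: the score as the fraction of the coefficient matrix that stays "white", i.e. is covered neither by the master rows nor by the block submatrices. The block dimensions follow from the model structure.

```python
# Reproducing the reported max-white score for the 50-block decomposition.
# Assumed reading of the score: share of the matrix area that is neither
# in a master row nor inside a block submatrix.
n_vars, n_conss = 2550, 101     # presolved problem size from the log
master_rows = 51                # 50 assignment constraints + the p-median constraint
n_blocks = 50                   # one block per capacity constraint
block_rows, block_cols = 1, 51  # each block: 1 row over 50 x-variables + 1 y-variable

total_area = n_conss * n_vars
covered = master_rows * n_vars + n_blocks * block_rows * block_cols
print(round(1 - covered / total_area, 6))  # → 0.485149
```

This matches the `maxWhiteScore` printed above, which supports the assumed interpretation for this instance.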

Visualizing Decompositions

In addition, GCG can create graphical visualizations of decompositions. They can easily be displayed in a Jupyter notebook like so:

[6]:
d
[6]:
[SVG visualization of the decomposition]

Selecting Decompositions

After this investigation, we decide that we like this particular decomposition. The following code tells GCG to use our decomposition instead of one chosen automatically. Then, the optimization process is started.

[7]:
d.isSelected = True

m.optimize()

A Dantzig-Wolfe reformulation is applied to solve the original problem.
Chosen structure has 50 blocks and 51 linking constraints.
This decomposition has a maxwhite score of 0.485149.
Matrix has 50 blocks, using 50 pricing problems.

  time | node  | left  |SLP iter|MLP iter|LP it/n| mem |mdpt |ovars|mvars|ocons|mcons|mcuts|  dualbound   | primalbound  |  deg   |  gap
p  0.2s|     1 |     0 |      0 |      0 |     - |  34M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 2.711000e+03 |   --   |    Inf
p  0.2s|     1 |     0 |      0 |      0 |     - |  33M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 1.785000e+03 |   --   |    Inf
p  0.3s|     1 |     0 |      0 |      0 |     - |  33M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 1.102000e+03 |   --   |    Inf
   0.3s|     1 |     0 |      0 |      0 |     - |  33M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 1.102000e+03 |   --   |    Inf

   0.3s|     1 |     0 |      0 |      0 |     - |  34M|   0 |2550 |  50 | 102 | 102 |   0 | 0.000000e+00 | 1.102000e+03 |   0.00%|    Inf
   0.3s|     1 |     0 |      0 |      0 |     - |  34M|   0 |2550 | 150 | 102 | 102 |   0 | 0.000000e+00 | 1.102000e+03 |   0.00%|    Inf
Starting reduced cost pricing...
   0.5s|     1 |     0 |   2081 |   2081 |     - |  41M|   0 |2550 |1699 | 102 | 102 |   0 | 7.050000e+02 | 1.102000e+03 |  36.95%|  56.31%
d  1.0s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1749 | 102 | 102 |   0 | 7.050000e+02 | 1.016000e+03 |  36.95%|  44.11%
X  1.4s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1799 | 102 | 102 |   0 | 7.050000e+02 | 7.730000e+02 |  36.95%|   9.65%
Y  1.5s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.580000e+02 |  36.95%|   7.52%
   1.5s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.580000e+02 |  36.95%|   7.52%
   1.5s|     1 |     0 |   3055 |   3055 |     - |  57M|   0 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.580000e+02 |  36.95%|   7.52%
   1.5s|     1 |     2 |   3055 |   3055 |     - |  57M|   0 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.580000e+02 |  36.95%|   7.52%
*r 1.5s|     2 |     1 |   3063 |   3063 | 316.0 |  58M|   1 |2550 |1849 | 102 | 102 |   0 | 7.050000e+02 | 7.140000e+02 |   --   |   1.28%
*r 2.2s|    16 |     5 |   6962 |   6962 | 281.0 |  61M|   5 |2550 |2305 | 102 | 102 |   0 | 7.102222e+02 | 7.130000e+02 |   --   |   0.39%

SCIP Status        : problem is solved [optimal solution found]
Solving Time (sec) : 2.55
Solving Nodes      : 21
Primal Bound       : +7.13000000000000e+02 (13 solutions)
Dual Bound         : +7.13000000000000e+02
Gap                : 0.00 %

Use-Case 3: Building a custom Decomposition

In the previous use-cases, we ran GCG with automatically detected decompositions. This is useful if we do not know the exact structure of a model but still want to exploit GCG’s decomposition approach.

However, for our model we do know its structure. If you take another look at our build_model() method, you will notice that we store the created constraints in different variables based on their type. This is a typical approach when we want to specify a custom decomposition after building the model using the Python interface.

In the following code, we recreate our model and use the different constraint groups to select which constraints are reformulated into blocks and which go into the Dantzig-Wolfe master problem.

[8]:
m, conss_assignment, conss_capacity, cons_pmedian = build_model(
    n_locations, n_clusters, distances, demands, capacities
)

conss_master = conss_assignment + [cons_pmedian]
conss_reform = conss_capacity

pd = m.createPartialDecomposition()
pd.fixConssToMaster(conss_master)

for block, c in enumerate(conss_reform):
    pd.fixConsToBlock(c, block)

m.addPreexistingPartialDecomposition(pd)

m.optimize()
GCG version 3.5.0 [GitHash: 202908425-dirty]
Copyright (C) 2010-2021 Operations Research, RWTH Aachen University
                        Konrad-Zuse-Zentrum fuer Informationstechnik Berlin (ZIB)

SCIP version 8.0.0 [precision: 8 byte] [memory: block] [mode: optimized] [LP solver: SoPlex 6.0.0] [GitHash: 8441670cd5-dirty]
Copyright (C) 2002-2021 Konrad-Zuse-Zentrum fuer Informationstechnik Berlin (ZIB)
 added complete decomp for original problem with 50 blocks and 51 masterconss, 0 linkingvars, 0 mastervars, and max white score of   0.485149
there is an original decomposition and problem is not presolved yet -> disable presolving and start optimizing (rerun with presolve command before detect command for detecting in presolved problem)
presolving:
presolving (0 rounds: 0 fast, 0 medium, 0 exhaustive):
 0 deleted vars, 0 deleted constraints, 0 added constraints, 0 tightened bounds, 0 added holes, 0 changed sides, 0 changed coefficients
 0 implications, 0 cliques
presolved problem has 2550 variables (2550 bin, 0 int, 0 impl, 0 cont) and 101 constraints
    101 constraints of type <linear>
transformed objective value is always integral (scale: 1)
Presolving Time: 0.00
 calculated translation; number of missing constraints: 0; number of other partialdecs: 1
Preexisting decomposition found. Solution process started...

A Dantzig-Wolfe reformulation is applied to solve the original problem.
Chosen structure has 50 blocks and 51 linking constraints.
This decomposition has a maxwhite score of 0.485149.
Matrix has 50 blocks, using 50 pricing problems.

  time | node  | left  |SLP iter|MLP iter|LP it/n| mem |mdpt |ovars|mvars|ocons|mcons|mcuts|  dualbound   | primalbound  |  deg   |  gap
p  0.1s|     1 |     0 |      0 |      0 |     - |  28M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 2.848000e+03 |   --   |    Inf
   0.1s|     1 |     0 |      0 |      0 |     - |  28M|   0 |2550 |   0 | 101 |   0 |   0 | 0.000000e+00 | 2.848000e+03 |   --   |    Inf

   0.1s|     1 |     0 |      0 |      0 |     - |  28M|   0 |2550 |  50 | 102 | 102 |   0 | 0.000000e+00 | 2.848000e+03 |   0.00%|    Inf
   0.1s|     1 |     0 |      0 |      0 |     - |  28M|   0 |2550 |  50 | 102 | 102 |   0 | 0.000000e+00 | 2.848000e+03 |   0.00%|    Inf
   0.3s|     1 |     0 |   1839 |   1839 |     - |  38M|   0 |2550 |2302 | 102 | 102 |   0 | 0.000000e+00 | 2.848000e+03 |  49.42%|    Inf
   0.6s|     1 |     0 |   3939 |   3939 |     - |  41M|   0 |2550 |2933 | 102 | 102 |   0 | 7.024809e+02 | 2.848000e+03 |  35.22%| 305.42%
   0.6s|     1 |     0 |   3939 |   3939 |     - |  41M|   0 |2550 |2933 | 102 | 102 |   0 | 7.050000e+02 | 2.848000e+03 |  35.29%| 303.97%
Starting reduced cost pricing...
d  1.2s|     1 |     0 |   5187 |   5187 |     - |  51M|   0 |2550 |2983 | 102 | 102 |   0 | 7.050000e+02 | 1.016000e+03 |  35.29%|  44.11%
X  1.6s|     1 |     0 |   5187 |   5187 |     - |  51M|   0 |2550 |3033 | 102 | 102 |   0 | 7.050000e+02 | 8.040000e+02 |  35.29%|  14.04%
   1.7s|     1 |     0 |   5187 |   5187 |     - |  51M|   0 |2550 |3083 | 102 | 102 |   0 | 7.050000e+02 | 8.040000e+02 |  35.29%|  14.04%
   1.8s|     1 |     0 |   5187 |   5187 |     - |  51M|   0 |2550 |3083 | 102 | 102 |   0 | 7.050000e+02 | 8.040000e+02 |  35.29%|  14.04%
   1.8s|     1 |     2 |   5187 |   5187 |     - |  51M|   0 |2550 |3083 | 102 | 102 |   0 | 7.050000e+02 | 8.040000e+02 |  35.29%|  14.04%
c  2.1s|     5 |     6 |   6968 |   6968 | 585.0 |  53M|   2 |2550 |3315 | 102 | 102 |   0 | 7.093889e+02 | 7.350000e+02 |   --   |   3.61%
*r 2.3s|     7 |     4 |   7395 |   7395 | 461.2 |  54M|   2 |2550 |3362 | 102 | 102 |   0 | 7.093889e+02 | 7.190000e+02 |   --   |   1.35%
*r 2.3s|     7 |     2 |   7534 |   7534 | 484.3 |  54M|   2 |2550 |3469 | 102 | 102 |   0 | 7.093889e+02 | 7.130000e+02 |   --   |   0.51%

SCIP Status        : problem is solved [optimal solution found]
Solving Time (sec) : 2.54
Solving Nodes      : 9
Primal Bound       : +7.13000000000000e+02 (9 solutions)
Dual Bound         : +7.13000000000000e+02
Gap                : 0.00 %

Summary

With that, we have seen the most important features for using GCG as a solver through the Python interface. In a different example, we will take a look at how GCG can be extended using Python code.