With IBP 1911, we added some new expert settings to the TS-based supply optimizer profile. In the future, these options will also be available for the constrained forecast run using the optimizer in order-based planning.
In general, these expert settings should only be used when performance and/or numerical problems arise; in such cases, corresponding messages usually appear in the business log.
Normally, these parameters should be kept at their defaults and only be changed in case of performance issues or if product/development support recommends different settings.
Preferably, the numerical issues themselves should be reduced first (e.g. by reducing the cost range).
Please note that these settings can also result in worse performance.
As stated in the help, the Numerical Focus parameter sets the degree to which the optimizer tries to manage numerical issues.
- Automatic (default): The optimizer checks the numerical accuracy of intermediate results, but optimizer runtime takes the priority.
- Soft: Numerical accuracy is more important than with the default setting, but checks are less thorough than with the strong setting.
- Strong: This setting provides the highest level of numerical accuracy, at the expense of an increased runtime.
- Off: Only basic checks on numerical accuracy are performed.
With the soft and strong settings, the optimizer performs extended checks on numerical precision and uses more accurate but slower arithmetic computations. A longer runtime is therefore usually to be expected. However, if the numerical issues make it hard to find feasible solutions at all, these settings can result in a faster optimization run overall. Sometimes a solution can only be found with these enhanced methods.
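The accuracy-versus-speed trade-off behind these settings can be illustrated in miniature. This is a generic numerics sketch, not the optimizer's actual internals: a naive floating-point sum over values spanning a huge magnitude range silently loses small contributions, while compensated summation recovers them at extra cost per addition.

```python
import math

# Values with large cancellations: each triple sums to exactly 1.0,
# but the small term is far below the rounding granularity of 1e16.
values = [1e16, 1.0, -1e16] * 1000

naive = 0.0
for v in values:
    naive += v  # each +1.0 is absorbed by rounding at magnitude 1e16

accurate = math.fsum(values)  # compensated summation, exactly rounded

print(naive)     # 0.0 -- all one thousand +1.0 contributions were lost
print(accurate)  # 1000.0
```

The same principle applies inside a solver: cheaper arithmetic and fewer checks are faster on well-behaved data, but on badly ranged data the extra precision can be the difference between a usable and an unusable result.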
The Numerical Scaling parameter sets the degree to which the optimizer's internal constraint matrix is scaled. Scaling often reduces the runtime because it reduces the numerical difficulties, but it can lead to larger constraint violations in the original, unscaled matrix.
- Automatic (default): The optimizer applies scaling based on the scenario properties.
- Soft: With this setting, a medium level of scaling is applied.
- Strong: With this setting, a high level of scaling is applied.
- Off: Disables scaling. This setting should only be used when recommended by support (e.g. to avoid large constraint violations).
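As a rough sketch of why scaling helps (the optimizer's real scaling algorithm is internal; the matrix and the equilibration scheme below are illustrative assumptions, not the actual method), dividing each row by the geometric mean of its entries shrinks the spread between the largest and smallest coefficient magnitudes:

```python
import math

def equilibrate_rows(matrix):
    """Divide each row by the geometric mean of its nonzero absolute entries."""
    scaled = []
    for row in matrix:
        nonzero = [abs(x) for x in row if x != 0.0]
        factor = math.exp(sum(math.log(v) for v in nonzero) / len(nonzero))
        scaled.append([x / factor for x in row])
    return scaled

def magnitude_spread(matrix):
    """Ratio of the largest to the smallest nonzero coefficient magnitude."""
    entries = [abs(x) for row in matrix for x in row if x != 0.0]
    return max(entries) / min(entries)

A = [[1e6, 2.0], [3.0, 4e-6]]  # made-up, badly ranged constraint rows
B = equilibrate_rows(A)

print(magnitude_spread(A))  # ~2.5e11
print(magnitude_spread(B))  # ~7.5e5, far smaller after scaling
```

Note the flip side mentioned above: a residual violation that looks tiny in the scaled matrix is multiplied back by the row factor when mapped to the original matrix, which is where the larger constraint violations come from.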
Usually, Numerical Focus and Numerical Scaling should be left on Automatic. Based on the scenario properties (e.g. the cost range), the optimizer then adapts these parameters itself.
To improve the automatic parameter selection, we need to analyze corresponding scenarios, especially ones where the soft or strong setting has a high impact. To provide us with such scenarios, please open an incident on component SCM-IBP-SUP-OPT to give us your consent and access to the necessary data. The scenarios will only be used anonymously afterwards.
Numerical pre-optimization splits the optimizer run into two phases, of which pre-optimization is the first.
In this first, pre-optimization phase, the objective function contains only the pseudo-hard decisions or, if the Cost Threshold parameter is set, all decisions with an assigned cost coefficient greater than or equal to that threshold. The optimizer therefore focuses on these high-priority decisions and produces a solution that fulfills them as well as possible.
In the second, main-optimization phase, the results of the decisions selected during pre-optimization are fixed, and the previously removed decisions are included in the objective function.
The Overall Runtime Limit (%) (default: 10%) defines the maximum share of the total optimizer runtime that the pre-optimization phase may consume.
This separate consideration of the pseudo-hard and the non-pseudo-hard decisions reduces the numerical difficulties in each phase:
- In the pre-optimization phase, the cost range is largely reduced. This allows the objective function to be scaled down, which improves the numerics.
- In the main-optimization phase, the optimizer can focus on the non-pseudo-hard decisions. Typically, it can omit the fixed decisions in its internal pre-solving phases. Low-cost decisions, which often lead to sub-optimal solutions due to their very low impact on the objective function, receive higher consideration.
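The cost-range effect of the split can be shown with a toy example. This is only a sketch: the real optimizer solves LP/MIP models internally, and the decision names, cost coefficients, and the Cost Threshold value of 1000 below are all made up for illustration.

```python
COST_THRESHOLD = 1000.0  # assumed value of the Cost Threshold parameter

# Hypothetical decisions with their objective cost coefficients.
decisions = {
    "meet_minimum_stock": 1e9,      # pseudo-hard: huge cost coefficient
    "serve_adjusted_demand": 1e6,   # pseudo-hard as well
    "reduce_inventory": 5.0,        # low-cost, easily drowned out
    "smooth_production": 2.0,
}

# Phase 1 (pre-optimization): only the high-cost decisions enter the
# objective; their results would then be fixed for phase 2.
phase1 = {k: c for k, c in decisions.items() if c >= COST_THRESHOLD}

# Phase 2 (main optimization): the remaining low-cost decisions are
# optimized on top of the fixed phase-1 results.
phase2 = {k: c for k, c in decisions.items() if c < COST_THRESHOLD}

def cost_range(objective):
    """Ratio of the largest to the smallest cost coefficient."""
    return max(objective.values()) / min(objective.values())

print(cost_range(decisions))  # 5e8: hard to scale in one run
print(cost_range(phase1))     # 1e3
print(cost_range(phase2))     # 2.5
```

Each phase now works on a cost range that is many orders of magnitude narrower than the combined one, which is exactly what allows the down-scaling of the objective function and the better treatment of the low-cost decisions described above.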
Numerical pre-optimization helps when there are many pseudo-hard constraints that cannot all be fulfilled. Quite often we see adjusted or minimum values that cannot be fully respected, e.g. due to limitations in resource capacities or material availability.
On the downside, it may result in a longer runtime, and in very rare cases the overall optimal solution cannot be reached.