I’ve been working with Supply Chain solutions since the early 2000s. At that time we did not have many tool options, but in general the ones available were very good.
The challenges then were more around user interfaces, integration with legacy systems, customization options, and so on (some of these problems, by the way, are still present today in some cases).
One of the most difficult topics to deal with at that time was how to set up your supply chain model, network, and business rules, and whether or not to use optimization in order to achieve the best results.
At that time, I was mostly working as a consultant and had several clients in many different industries dealing with the same questions, all trying to come up with a solution that was simple enough to maintain and still generated relevant data to support their decision-making process.
When evaluating optimization methods, comparing many different parameters and variables to support business scenarios, it is never easy to find the right balance between complexity and expected results. Usually, getting more relevant results means growing the number of parameters, and that directly impacts execution time.
Given that most companies run those optimization processes every night, along with data updates, backups, and other jobs, the execution window is limited, and in most cases the running time of each process step has to be capped to fit its allotted slot. That also impacts the results, since the optimizer will most likely not have enough time to arrive at the optimal plan considering all parameters. In addition, the hardware required to support those solutions was very expensive and usually a bottleneck we had to take into consideration.
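To make the trade-off concrete, here is a minimal sketch of the pattern described above: an optimizer that is given a wall-clock budget and, when the batch window closes, returns the best plan found so far rather than the optimal one. The problem, cost function, and search strategy here are all hypothetical, illustrative stand-ins, not how any particular planning tool works.

```python
import random
import time

def time_boxed_search(cost, n_items, budget_seconds, seed=42):
    """Toy local search over binary decisions (e.g. stock / don't stock
    a SKU) that stops when the time budget expires, returning the best
    solution found so far -- not necessarily the optimal one."""
    rng = random.Random(seed)
    current = [rng.randint(0, 1) for _ in range(n_items)]
    best, best_cost = current[:], cost(current)
    deadline = time.monotonic() + budget_seconds
    while time.monotonic() < deadline:
        candidate = current[:]
        candidate[rng.randrange(n_items)] ^= 1  # flip one decision
        if cost(candidate) <= cost(current):
            current = candidate
            if cost(current) < best_cost:
                best, best_cost = current[:], cost(current)
    return best, best_cost

# Hypothetical cost: penalize deviating from stocking exactly half the SKUs.
def toy_cost(x):
    return abs(sum(x) - len(x) // 2)

solution, value = time_boxed_search(toy_cost, n_items=20, budget_seconds=0.2)
print(len(solution), value)
```

Shrinking `budget_seconds` is exactly the lever a tight nightly window forces you to pull: the search still returns a usable plan, just a potentially worse one.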
This balance between model complexity, parameters, execution time, and expected results was always very difficult to strike, especially at large companies where the number of SKUs was very high.