Optimization Modeling: Everything You Need to Know

The history of optimization modeling goes back to the middle of the last century. This important field of science plays a major role in industry, commerce, engineering and space exploration.

By optimization modeling, we're referring to the use of mathematical techniques to solve problems, selecting an approach based on the problem's characteristics:

  • Linear programming (LP)
  • Mixed integer programming (MIP)
  • Nonlinear programming (NLP)
  • Constraint programming (CP)

These mathematical techniques have their roots in algebra, systematized around 820 AD by the Persian mathematician Muhammad ibn Musa al-Khwarizmi. From everyday practical calculations to highly complex problems, modern civilization could not function without algebra.

What Is Optimization Modeling?

Optimization modeling is a form of mathematics that attempts to determine the optimal maximum or minimum value of a complex equation. A key aspect is that constraints such as resource limitations and the need to arrive at realistic solutions must be respected, something that isn't always an issue in pure mathematics. Mathematical optimization uses the techniques noted above to evaluate complex models that represent real-life planning and decision-support business problems, such as logistics, scheduling, inventory control, network design, and more.
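Concretely, every such model has three ingredients: decision variables, an objective function to maximize or minimize, and constraints. A minimal sketch of a linear program, using SciPy's `linprog` solver (the product-mix numbers below are invented for illustration):

```python
# Hypothetical product-mix LP: maximize profit 3x + 5y subject to
# resource limits. linprog minimizes, so the objective is negated.
from scipy.optimize import linprog

c = [-3, -5]                 # negated profit per unit of products x and y
A_ub = [[1, 0],              # machine hours:  x      <= 4
        [0, 2],              # labor hours:    2y     <= 12
        [3, 2]]              # raw material:   3x + 2y <= 18
b_ub = [4, 12, 18]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)      # optimal production plan
print(-res.fun)   # maximum profit
```

Here the solver finds the plan (x = 2, y = 6) with a maximum profit of 36, an answer that quickly becomes impossible to find by inspection as the numbers of variables and constraints grow.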


A Brief History of Linear Programming

Starting in the 1650s, mathematician Blaise Pascal laid down the foundation for the mathematical theory of probability. He was followed by Newton, Bernoulli and Lagrange, who all made important contributions to mathematical science.

In 1826, Jean-Baptiste-Joseph Fourier stated that certain problems could be defined as linear-programming problems, and Carl Friedrich Gauss proved that elementary row operations could be used to solve a set of linear equations. This work demonstrated that mathematics could be used for solving real-world problems. Unfortunately, the limiting factor at that time was that only small problems could be solved.

The beginning of linear programming and operations research

In the build-up to the Second World War, the British faced serious problems with their early radar systems and turned to the forerunner of modern operations research to solve them. While it did not involve modeling, the exercise combined mathematics, science, and common sense to help the Allies make intelligent decisions, positively contributing to the outcome of the war. By war's end, operations research teams existed in many spheres and were absorbed into other government functions. This didn't go unnoticed, especially by industry and academics, who soon began to apply these techniques and, along the way, defined many famous problems, such as how to optimize a traveling salesperson's route and perform Monte Carlo simulations.

Early mainframe computers

In 1947, Dr. George Dantzig invented the simplex algorithm to solve LP problems involving multiple equations and numerous variables. Using the Card Programmable Calculator, the National Bureau of Standards and the RAND Corporation were able to solve problems with as many as 45 constraints and 70 variables.

By the mid-1950s, IBM machines could solve problems with several hundred constraints. In the early ‘60s, these machines were capable of solving problems with more than 1,000 constraints, a fact that caused the oil industry to take notice.

In 1964, researchers discovered how to solve more complicated problems using mixed integer programming (MIP) based on the branch-and-bound algorithm. While the process was complicated and required tape storage, it was a breakthrough. Recognizing the benefits of MIP, researchers in various fields, including the process industry and the military, began to consider optimization modeling seriously.

During the 1970s, the capabilities of IBM mainframe computers advanced rapidly and, using new algorithms, were able to solve larger and more difficult LP/MIP problems. By the late ‘70s, portable code written in FORTRAN was introduced. Unfortunately, the costs were so high that solving large-scale optimization problems remained largely the domain of academia and well-funded consulting companies.


Early PCs and algebraic modeling languages

The launch of the IBM PC in 1981 changed everything, and by 1983, early versions of the LINDO and XpressMP languages became available. Early PCs were limited to smaller LP models with a maximum of 1,000 constraints and 1,000 variables and were significantly slower than mainframe computers. But as computing speed, memory, and solvers improved, the focus shifted to new methods capable of solving larger LP and MIP problems.

Along with established third-generation programming languages (3GLs) like Basic and FORTRAN, a new class of modeling software called Algebraic Modeling Languages (AMLs) emerged. Considered fourth-generation languages (4GLs), AMLs were created for operations research professionals and shared common characteristics, including:

  • Modeling language specifically designed for large-scale mathematical problems
  • User interface
  • Ability to read/write data
  • Links to solvers

Early AML products included the General Algebraic Modeling System (GAMS), AIMMS, LINGO, AMPL, MathPro, and MPL, all of which remain in use.

The 1990s

The commercial linear optimization solver CPLEX was released in 1988. It was followed by IBM's OSL solver, which was used for building models on AIX (UNIX) servers. Algorithms for the CPLEX, OSL, and Xpress solvers improved to the point that very large problems could be solved using PCs.

Although initially slower than FORTRAN-based languages, AML language packages improved significantly. Subsequently, vendors of packages and commercial solvers added application programming interfaces (APIs), creating viable optimization-based package solutions for their clients.

Moving from third- and fourth-generation to fifth-generation optimization modeling languages

Third-generation programming languages are high-level machine-independent languages that use callable libraries and include C, C++, and Java. They're relatively easy to read and maintain.

Fourth-generation (4GL) languages work at a higher level, using statements similar to those in human language. As such, they are easier to learn and use, especially for non-IT professionals. They include languages such as SQL and MATLAB. Some advanced 3GLs, like Python and Ruby, combine 3GL flexibility with 4GL-style abstractions and libraries, which is why they are often referred to as 4GLs.

Whereas third- and fourth-generation languages use algorithms written to solve problems, fifth-generation programming languages (5GL) work by solving constraints given to the program. Most 5GL languages use graphical or visual interfaces to create programs without the need to write source code. Examples include Prolog, Mercury and River Logic's Enterprise Optimizer.

Optimization Modeling Approaches

There are many possible approaches to optimization modeling. Some optimization modeling software solutions require the services of highly qualified operations research professionals, while others can be implemented in-house, provided the requisite skills are available. While most vendors sell programming software, others sell industry-specific packages. Some offer additional support and will write optimization solutions for their clients.

Each solution has its strong and weak points, ranging from:

  • Implementation cost
  • Time to solution
  • Complexity
  • Skill requirements

There's no one right answer, but here are three common approaches:

  • Using spreadsheets for optimization modeling
  • Modeling with Python
  • Modeling with a fifth-generation programming language

Optimization decisions with spreadsheets

It's tempting to start dabbling with optimization modeling using one of the many Excel solver add-ins. The benefits include low initial cost and the fact that anyone with a reasonable knowledge of spreadsheets and math can do it. A number of guides and books are available.

Steps include creating the model, defining the objective function and specifying decision variables and constraints. Using built-in Excel solvers or third-party add-ons, it's possible to solve the model.

While relatively simple, it's important to be aware of several spreadsheet drawbacks. It may be problematic to define complex models with a spreadsheet, often complicated by the need to use multiple worksheets. Additionally, it's difficult to scale spreadsheet models to business-sized problems, especially those with large numbers of variables and constraints. The nature of spreadsheets means formulas are hidden and errors can go unnoticed, and you're often totally dependent on the person who prepared the model.

Spreadsheet modeling has its place, but it's not the right tool for large and complex optimization models.

Optimization modeling in Python

Python is a flexible and powerful programming language. It has numerous libraries available to help perform optimization and modeling. Given time and resources, Python can be used to create highly complex optimization models with large numbers of constraints and variables. Many optimization solvers, such as IBM's CPLEX and Gurobi, have Python interfaces.
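As an illustration of what such a model looks like in code, here is a small mixed integer program solved with SciPy's `milp` (available in SciPy 1.9+; a stand-in for the commercial solver APIs mentioned above, with knapsack-style data invented for illustration):

```python
# Toy knapsack-style MIP: choose whole units of two items to maximize
# value without exceeding a weight budget. milp uses branch-and-bound
# (via HiGHS) to enforce the integrality constraints.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

values = np.array([8, 11])    # value per unit of each item
weights = np.array([5, 7])    # weight per unit
budget = 24

res = milp(
    c=-values,                                   # milp minimizes, so negate
    constraints=LinearConstraint(weights, ub=budget),
    integrality=np.ones(2),                      # both variables are integers
    bounds=Bounds(0, np.inf),
)
print(res.x, -res.fun)
```

The integer requirement is what makes this a MIP rather than an LP: the LP relaxation would happily produce fractional units, while branch-and-bound finds the best whole-number plan (here, two units of each item for a value of 38).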

Naturally, the capability of the model is largely dependent upon the knowledge and skill of the programmer, and model preparation falls within the scope of high-level operations research and data analytics. Also, the model inevitably becomes a black box: most end users will never understand the details of the model they depend on.

Optimization modeling in fifth-generation languages

The most obvious difference when using a fifth-generation language is the ease of model preparation. Instead of having to laboriously code every element, solutions such as Enterprise Optimizer employ an intuitive drag-and-drop approach that creates a visual model on the screen. There's no need for complicated mathematical coding, and it's possible to create a model in a fraction of the time taken by 4GL solutions, such as IBM's CPLEX Optimization Studio Platform.

With a drag-and-drop interface, it's possible to see how the model works and more easily make checks. Making changes is a relatively simple task, especially when compared to rewriting code.

The Value of Optimization Modeling

Optimization modeling is a key component of business success. Using what-if capabilities, it's possible to determine the best and most appropriate answers to achieve an organization's business objectives. Because optimized solutions represent the best compromise between a number of inter-related variables to achieve a specific goal, it's often impossible to determine the right solution by any other means. An early example of optimization was the design of the aircraft Charles Lindbergh used to cross the Atlantic: his three requirements, namely reducing weight, ensuring flight safety, and maintaining an adequate fuel reserve, resulted in an unconventional design that succeeded.

What This Means

This is an exciting and opportune moment in the history of optimization modeling. The business planning market is beginning to understand what it needs. The trend away from desktop software and on-premise hardware means optimization modeling software is available in the cloud. Multi-user collaboration is a critical feature. 5GL optimization software solutions like Enterprise Optimizer are a core component for driving the value proposition and making it affordable. Being able to model and solve large and complex problems on its own is not enough. There's also a need for complementary components like data blending, scenario management, interactive dashboards and advanced analysis tools.
