## Optimization: A notorious road to Structured Inefficiency and transition to Combinatorial Optimization & Machine Learning | by Meet Patel | Dec, 2020


## Inefficiencies and limitations companies face with traditional optimization methods, and why combinatorial optimization might be the future of the logistics industry.

The title of this article sounds oxymoronic: optimization and inefficiency in the same breath. Yet it rings true given current practices in the logistics industry. In this article we discuss how the prevailing approach to optimization creates significant inefficiencies for organizations, and how large firms (like Amazon, Shopify, and Uber) are exploiting advances in combinatorial and mathematical optimization to identify new opportunities and win on thin margins by carving out a little extra room for profit.

Pure mathematics is, in its way, the poetry of logical ideas.

-Albert Einstein

A few months ago, I was discussing with a friend the idea of a mathematical formulation for a kitchen efficient enough to make 1,500 entirely different recipes (not just variations of vegetables, sauces, and cheese on bread or a bun) with fewer than 100 ingredients in inventory and a minimal set of kitchen appliances. Moreover, unlike many restaurants it would not rely on frozen food, only fresh produce, and every dish could be served in under 10 minutes from scratch. Is it possible? Definitely: it can be modeled with a few modifications of the Travelling Salesman Problem (TSP). Mathematics has a tremendous capacity to translate physical attributes and constraints into computational language. And not only in the restaurant industry: mathematics plays an integral role in optimization and innovation in every other sector.

Combinatorial optimization is a sub-field of mathematical optimization that deals with finding an optimal solution from a finite set of variables and constraints. Its applications are not limited to finding shortest paths, scheduling crews, or routing; it appears in fields as varied as DNA sequencing and tracking global warming.

There are numerous algorithms for solving optimization problems, but broadly they fall into two groups: **approximation methods (heuristics and meta-heuristics)** and **exact algorithms**. (The underlying problems themselves, such as the TSP, are typically NP-hard, which is what makes exact approaches expensive.) Approximation methods use constructions such as nearest neighbor or clustering to find a good solution, but at no point guarantee optimality, even after adding a layer of meta-heuristics such as simulated annealing or genetic algorithms. Small problems might be an exception, but most of the time the result leaves enough slack that, figuratively, you could dine at a restaurant and take a walk in the park in between (inefficiency). Exact algorithms, on the other hand, are guaranteed to find an optimal solution. So simple: use exact methods and make everyone happy. Unfortunately, it is not that easy. You pay a premium for the old rum, in computational time and in data preparation.
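As a concrete illustration of the approximation side, here is a minimal sketch, in plain Python on hypothetical random coordinates, of the pipeline described above: construct a tour with nearest neighbor, then improve it with simulated annealing using 2-opt moves. The instance size and all parameters are invented for illustration only.

```python
import math
import random

random.seed(0)

# Hypothetical stop coordinates for a small routing instance.
points = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(10)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(tour):
    return sum(dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbor(start=0):
    # Greedy construction: always visit the closest unvisited node next.
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist(points[tour[-1]], points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def simulated_annealing(tour, temp=100.0, cooling=0.995, steps=20000):
    # Improve the tour with random 2-opt moves, occasionally accepting
    # worse tours early on to escape local optima.
    best = cur = tour[:]
    best_len = cur_len = tour_length(cur)
    for _ in range(steps):
        i, j = sorted(random.sample(range(1, len(cur)), 2))
        cand = cur[:i] + cur[i:j + 1][::-1] + cur[j + 1:]
        cand_len = tour_length(cand)
        if cand_len < cur_len or random.random() < math.exp((cur_len - cand_len) / temp):
            cur, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = cur[:], cur_len
        temp *= cooling
    return best

nn_tour = nearest_neighbor()
sa_tour = simulated_annealing(nn_tour)
print(f"nearest neighbor: {tour_length(nn_tour):.1f}")
print(f"after annealing:  {tour_length(sa_tour):.1f}")
```

Note that the annealing stage can only ever return a tour at least as good as its starting point, yet still offers no guarantee of reaching the true optimum.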

Decades ago, when industry first started integrating computation into its operations, an approximation algorithm could be solved over lunch, while solving an exact model meant sending employees on a safari in business class, telescope included for sky watching, because they had nothing else to do while they waited. But time keeps clocking, and the capacity to break with that old orthodoxy is what distinguishes a thriving industry from a merely surviving one. With today's computational efficiency, more advanced methods such as machine learning, and active research on exact algorithms, solving exact models in feasible time is far easier than before.

The table below depicts the pros and cons of using exact algorithms over commercial optimization software.

Let's see the difference between approximation and exact algorithm outputs for just 10 nodes. The approximation algorithm developed for this problem is quite efficient: it finds an initial solution using nearest neighbor and feeds it into a simulated annealing algorithm for further optimization. Even so, the solution still looks off compared to the exact algorithm. Both runs use the same dataset, the same constraints, and the same physical attributes. And this is just 10 nodes; imagine a problem with 1,000 nodes, where solutions can differ by as much as 20–35%. That is where inefficiency creeps into an organization. For context, we are discussing a routing problem here, but there are other problems these algorithms can solve, and the same gap shows up in all of them.
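To make the heuristic-versus-exact gap tangible, the following sketch compares, on a hypothetical 9-node instance (kept small so brute force stays fast), a nearest-neighbor tour against the provably optimal tour found by enumerating every permutation. The printed gap is the inefficiency discussed above; the coordinates are invented.

```python
import itertools
import math
import random

random.seed(1)
points = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(9)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(tour):
    return sum(dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbor():
    unvisited = set(range(1, len(points)))
    tour = [0]
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist(points[tour[-1]], points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def exact():
    # Fix node 0 as the start and enumerate every ordering of the rest:
    # guaranteed optimal, but (n-1)! orderings makes this explode fast.
    best = min(itertools.permutations(range(1, len(points))),
               key=lambda p: tour_length((0,) + p))
    return (0,) + best

heur, opt = nearest_neighbor(), exact()
gap = (tour_length(heur) - tour_length(opt)) / tour_length(opt) * 100
print(f"heuristic {tour_length(heur):.1f}, exact {tour_length(opt):.1f}, gap {gap:.1f}%")
```

At 9 nodes the enumeration is 8! = 40,320 tours; at 1,000 nodes it is astronomically infeasible, which is exactly why branch-and-bound and other clever exact methods exist.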

Routes to supply food products to 10 restaurants were computed with the different methods, as shown below.

The LP took 250.8 seconds and 45.1 MB of memory to solve the model. Solve time increases rapidly as the number of variables in the model grows, but the LP still gives us the best solution, as expected.

Long story short, most industries have not adapted to the advancement of this technology; by and large they invest in working with off-the-shelf software rather than developing their own.

I used to work with my father in our automobile workshop. By nature, my father built experience-based process flows: putting tools back where they were picked up, scheduling vehicle pick-up and delivery times, and so on. It helped significantly; we could close the workshop on time and estimate job durations easily. Whenever I asked him the reason behind a certain activity, he always had a plausible explanation for why it was done that way. I can say he ran the workshop very efficiently: right-sizing customer volume, balancing repair and washing operations, predicting operation times almost perfectly. The point is that the most optimal solution does not lie in algorithms alone, but in experience. Algorithms are simply the tool that takes over where human capacity to process inputs stops; they cannot substitute for experience and one's logical capacity. And this is nothing out of the box; it is the very root of machine learning, in which a machine builds up experience and logical capacity from training data, and moves toward more accurate and optimal solutions as the data grows.

In short, experience matters enormously. Each industry has its own unique experience: different styles of executing operations, different kinds of waste, different process flows, and, most importantly, different work cultures. With so much variability in day-to-day operations, it is remarkable that industries still rely so heavily on generic commercial optimization software.

In recent years, with the increase in computational resources, machine learning has been evolving at a phenomenal rate. We have all heard the success stories of AlphaZero, the AI model that defeated the strongest chess engines, and how it is now practically impossible for human players to beat modern engines, which evaluate millions of positions per second while human calculation is far more limited. Similarly, recent research in combinatorial optimization shows that machine learning has the potential to learn and design heuristics better than the traditional heuristics designed by humans.

A common problem faced by any transportation company is finding optimal routes to deliver goods to multiple locations with limited vehicle capacity. This combinatorial optimization problem is called the vehicle routing problem (VRP). Determining the optimal solution of a VRP is NP-hard, making it one of the hardest optimization problems to solve. Mathematical approaches like linear programming and branch-and-bound can find the optimal solution for small problems, but their complexity means they cannot be used at large scale. Heuristic approaches can solve large problems within a reasonable amount of time, but since heuristics are based on approximations, only near-optimal solutions can be found.
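As a minimal illustration of what a VRP heuristic looks like, here is a sketch of a greedy, capacity-constrained route construction on an invented six-customer instance. Real solvers are far more sophisticated, but the basic structure, building routes until the vehicle is full, is the same.

```python
import math

# Hypothetical instance: depot at the origin, customers as (x, y, demand).
depot = (0.0, 0.0)
customers = {1: (4, 4, 3), 2: (-3, 5, 4), 3: (6, -2, 2),
             4: (-5, -4, 5), 5: (2, 7, 3), 6: (-1, -6, 4)}
CAPACITY = 8  # max demand one vehicle can carry

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_routes():
    # Start a route at the depot, keep adding the nearest unserved
    # customer that still fits in the vehicle, then return and repeat.
    unserved = set(customers)
    routes = []
    while unserved:
        load, pos, route = 0, depot, []
        while True:
            feasible = [c for c in unserved
                        if load + customers[c][2] <= CAPACITY]
            if not feasible:
                break
            nxt = min(feasible, key=lambda c: dist(pos, customers[c][:2]))
            route.append(nxt)
            load += customers[nxt][2]
            pos = customers[nxt][:2]
            unserved.remove(nxt)
        routes.append(route)
    return routes

for r in greedy_routes():
    print("route:", r, "load:", sum(customers[c][2] for c in r))
```

Every route respects the capacity constraint and every customer is served exactly once, but nothing guarantees the total distance is minimal; that is precisely the gap between a heuristic and an exact method.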

When it comes to exact algorithms, gathering the data becomes truly expensive. For routing problems, requesting a distance matrix from an API (such as Google Maps or ArcGIS) and preparing the data is more expensive, manual, and tedious than it seems. Commercial optimization software handles this well. A standard practice is to start with distances from the haversine formula and then adjust them toward accurate road distances.
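The haversine starting point mentioned above is straightforward to implement. This is the standard great-circle formula (not code from the article), with an illustrative Toronto-to-Montreal check; the coordinates are approximate city centres.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points in kilometres.
    R = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Toronto to Montreal: roughly 500 km as the crow flies.
print(round(haversine_km(43.65, -79.38, 45.50, -73.57), 1))
```

The result is a straight-line distance, which is why solvers treat it only as a lower-bound seed to be corrected with real road distances later.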

A good graphical user interface (GUI) gives commercial optimization platforms great flexibility in daily management and reduces manual effort. With exact methods, by contrast, you need patience and care: you write a model in a tool such as Python, GUSEK, or AMPL and hand it to a solver such as Gurobi or CPLEX, and analyzing the output takes more manual work than a simple data export. Still, exact algorithms have the great advantage of fitting the needs of any organization, and they can capture the special attributes of an industry's practice in a way commercial solutions cannot.

Both approaches have specific advantages that complement each other. Suitable combinations of exact algorithms and meta-heuristics benefit greatly from this synergy and often exhibit significantly higher performance in both solution quality and run time.

State-of-the-art algorithms rely on handcrafted heuristics to find near-optimal solutions for large problems, but for small companies handcrafted heuristics are too expensive and time-consuming to design. Do you see the point? No method on its own serves the needs of every organization. What is the solution, then? And how are some big firms and their engineers finding intermediate solutions tailored to their organizations? We can call them outliers: they compete on different strategies and core values, deviating from standard industry practice in how they look at problems and derive solutions.

This issue could be resolved if we could design heuristics without human intervention. Neural networks and reinforcement learning offer endless possibilities for that idea. Machine learning and optimization are closely related: machine learning works by minimizing a loss function, which can just as well be a cost or a distance, exactly as in an optimization problem. Promising results have been achieved in combinatorial optimization when machine learning is combined with meta-heuristics such as tabu search and large neighborhood search (LNS) to learn and design heuristics.
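To show the kind of handcrafted meta-heuristic that learned models compete with, here is a minimal LNS sketch on a hypothetical random instance: destroy (remove a few nodes from the tour), repair (greedily reinsert each at its cheapest position), and keep the result if it improves. All sizes and parameters are invented for illustration.

```python
import math
import random

random.seed(2)
points = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(15)]

def dist(i, j):
    return math.hypot(points[i][0] - points[j][0], points[i][1] - points[j][1])

def tour_length(tour):
    return sum(dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

def lns(tour, iters=200, k=3):
    best = tour[:]
    for _ in range(iters):
        # Destroy: drop k random nodes from the current best tour.
        removed = random.sample(best, k)
        partial = [n for n in best if n not in removed]
        # Repair: reinsert each node where it adds the least extra distance.
        for n in removed:
            pos = min(range(len(partial) + 1),
                      key=lambda p: dist(partial[p - 1], n)
                                    + dist(n, partial[p % len(partial)])
                                    - dist(partial[p - 1], partial[p % len(partial)]))
            partial.insert(pos, n)
        if tour_length(partial) < tour_length(best):
            best = partial
    return best

init = list(range(len(points)))
improved = lns(init)
print(f"{tour_length(init):.1f} -> {tour_length(improved):.1f}")
```

The learned approaches in the research cited above essentially replace the hand-designed destroy/repair rules with a neural policy trained on many instances.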

Using a machine learning model that learns to design its own heuristics, instead of traditional handcrafted approaches, can give companies a significant advantage: there is no need to design heuristics by hand, and once trained, the model can tackle very large problems (1,000-plus delivery locations) and produce near-optimal solutions almost instantly, where traditional methods need hours.

The image below shows the difference in optimal distance for 20 customers.

It can be seen that the machine learning model is more efficient, with an improvement of nearly 20%. The table below shows the difference in optimality when tested on other instances.

