Designing a distribution network is both art and science: it requires pleasing finicky retailers while keeping costs in line. Though slick software helps, most of the work is in getting the numbers, and getting them right.
Peter Bradley is an award-winning career journalist with more than three decades of experience in both newspapers and national business magazines. His credentials include seven years as the transportation and supply chain editor at Purchasing Magazine and six years as the chief editor of Logistics Management.
The old aphorism about genius being 99 percent perspiration applies to many things in life and business—writing, research, parenting, invention and more.
One small example from the business logistics corner of the world is the design of a distribution network. Though distribution execs now have software to help them blast through the complexities and provide neatly packaged comparisons of multiple options, in the end, success depends on how well prepared you are going into the process.
For consumer packaged goods (CPG) companies in particular, getting the network right—achieving the right balance of inventory, material handling and transportation costs, while offering service levels that satisfy their retailer customers' unforgiving requirements—is crucial.
A number of experts who provide both software tools and consulting advice for customers involved in network design and implementation acknowledge that the bulk of the work comes up front, primarily in gathering and validating historical data and business forecasts that accurately reflect the business. Bruce Baring, director of strategic services for Peach State Integrated Technologies, says that anyone entering into a network analysis should expect to spend 60 to 80 percent of the time collecting, cleaning and validating data.
That can, of course, vary widely, depending on the project and on how much mastery the company has over its own data. While a typical network analysis project can take three to four months, others unfold much more rapidly. Bob Belshaw, chief operating officer of Insight, cites one such project with Motorola as an example. "The whole study took only six weeks," he says, which is about half the average time required. "It's really a case of whether the team has access to the data."
Only when the data have been crunched does the fun begin—when the modeling engines take over to develop a variety of options.
Be wise, optimize!
Given the difficulty of evaluating and optimizing an entire network—not to mention the time required for the project—it's not something most companies jump into without a compelling reason.
Baring lists several drivers. "A merger or an acquisition is always a good time to review the network," he says. "If you are acquiring manufacturing or distribution, your demand patterns may be changing, and that's a good time to look at your facilities."
A second reason, he says, is growth, particularly in specific geographic areas where growth may be exceeding historical norms.
"The other big indicator," he says, "is a significant change in sourcing."
Dan Sobbott, director of business development for Slim Technologies, a developer of optimization software with a number of retail and consumer goods customers, sees it much the same way. "There are maybe three big drivers that we see," he says. "Growth is one of them. It may be more stores or product launches or the product mix is changing. Planning those things through the distribution network is a motivating factor.
"Second, we see a fair amount of companies interested after a merger or acquisition. Strategic reasons drive companies together, but after that, they face the fundamental questions of how to bring two supply chains together.
"The third largest reason: cost cutting initiatives. They may have done some benchmarking and [have realized] they're sort of out of line with where they should be."
Belshaw, whose company's Sails network optimization engine is used by a number of major CPG players, argues that manufacturers should not necessarily wait for a precipitating event to look at their networks. "It's really something in all industries that needs to be looked at on a very frequent basis," he contends. He says that with the pace of business in most industries constantly accelerating, it makes sense to evaluate the network regularly to see if it needs fine tuning.
Overall, the CPG companies are constantly adding new products, almost as quickly as their high-tech brethren, he says. "They are constantly introducing new products, taking older products off the market, adding whole new brands or divesting. Those have significant impacts on the network … on warehousing or transportation or inventory positioning. When it comes to the supply chain, there is so much ripple effect. When you do something in manufacturing, it has huge implications for distribution and transportation."
Numbers, then more numbers
Whether the review is prompted by a major change in business or represents a regularly occurring event, the first step is the hardest step—gathering and validating data.
That's somewhat easier than it used to be, thanks to the enormous gains in data gathering capabilities in most industries over the last decade or so.
Sobbott points to the retail industry as an example. "Retailers historically have focused less on distribution and more on merchandising," he says. "Now, with substantial information available from point-of-sale (POS) data, they have the data for fact-based planning. So supply chain initiatives are of greater and greater interest."
Sobbott, Belshaw and Baring all agree on the crucial need for good data. In fact, Sobbott calls data readiness "the greatest challenge in a network optimization study. You have to have the right data."
But what data?
"The first thing to do is to develop data profiles," says Baring, "with demand analysis, inbound and outbound transportation costs, and fixed and variable warehousing costs. You have to pull all this historical data. Then you have to figure out where the business is going—you want to plan for the future, not the past. How is business going to evolve? How will the supply chain look three or five years from now? That's a pretty extensive step."
Next comes the validation step—testing the data before moving forward. Essentially, the data fed into the optimization engine are compared to the actual historical events— and they should match.
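The validation step can be pictured as a simple tolerance check: modeled baseline costs are lined up against historical actuals, category by category, and any large gap flags a data problem. The sketch below is illustrative only; the cost categories, figures, and tolerance are hypothetical, not drawn from any particular optimization tool.

```python
# Sketch of a baseline validation check. Cost categories and dollar
# figures are hypothetical; real studies compare dozens of line items.
def validate_baseline(modeled, historical, tolerance=0.05):
    """Return cost categories whose modeled value deviates from the
    historical actual by more than the given relative tolerance."""
    mismatches = {}
    for category, actual in historical.items():
        if actual == 0:
            continue
        deviation = abs(modeled.get(category, 0.0) - actual) / actual
        if deviation > tolerance:
            mismatches[category] = round(deviation, 3)
    return mismatches

historical = {"outbound_freight": 4_200_000, "warehousing": 1_900_000,
              "inbound_freight": 2_750_000}
modeled = {"outbound_freight": 4_150_000, "warehousing": 2_300_000,
           "inbound_freight": 2_760_000}

# Here only the warehousing figure is off by more than 5 percent,
# signaling that those inputs need another look before optimizing.
print(validate_baseline(modeled, historical))
```

A run that returns an empty result is the signal the experts describe: the model is replicating the historical period and the constraints can come off.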
"We build a bubble map based on demand in different parts of the country," Baring says. "That provides a good visual validation. You typically should see bubbles where Wal-Mart or Target DCs are located." But he cautions users to make sure they're collecting and using "ship-to" and not "bill-to" addresses. "You shouldn't see a big bubble around Bentonville, Ark.," he says.
Sobbott explains, "The first thing you try to do is produce a baseline model that replicates a historical model. You want to include all the costs, volumes and activities. That allows you to understand if you have data that is well defined and that is replicating supply chain activities accurately. The validation model is constrained: We're trying to make it act like the historical time period."
For all the power of some of the optimization engines, the enormous volume of data in even a modest supply chain requires aggregation in appropriate ways in order to make it manageable.
"When you build the network optimization, there are only so many variables the software can handle efficiently," Baring says. "You may have demand to the three-digit ZIP code level, or aggregate by product types where you group together like products based on source, handling or similar cube-to-weight ratio. That simplifies the math in the optimization engine."
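The aggregation Baring describes can be sketched in a few lines: order-line demand is rolled up to the 3-digit ZIP and product-family level, using the ship-to address rather than the bill-to address to avoid the Bentonville-bubble trap. The record layout and figures below are invented for illustration.

```python
from collections import defaultdict

# Roll order-line demand up to (3-digit ZIP, product family) buckets.
# Field names and volumes are hypothetical.
def aggregate_demand(order_lines):
    buckets = defaultdict(float)
    for line in order_lines:
        zip3 = line["ship_to_zip"][:3]   # ship-to, never bill-to
        buckets[(zip3, line["product_family"])] += line["weight_lbs"]
    return dict(buckets)

orders = [
    {"ship_to_zip": "30301", "product_family": "beverages", "weight_lbs": 1200.0},
    {"ship_to_zip": "30318", "product_family": "beverages", "weight_lbs": 800.0},
    {"ship_to_zip": "75201", "product_family": "snacks",    "weight_lbs": 500.0},
]
print(aggregate_demand(orders))
# {('303', 'beverages'): 2000.0, ('752', 'snacks'): 500.0}
```

The two Atlanta-area order lines collapse into one demand point, which is exactly the kind of simplification that keeps the variable count within what the optimization engine can handle.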
Multiple models
Once the validation is complete, it's time to take the constraints off the software and let it run.
"We unconstrain the model to do the optimization," Sobbott says. "We run a lot of scenarios." The options, he says, are almost unlimited, going from fine-tuning distribution using existing DCs, to closing some and opening others, to a complete green field analysis.
"We teach people to use a green field analysis," Belshaw says. "If you could put your distribution anywhere, where would that be?" The point, he says, is to see an optimal solution. "We never go there 100 percent," he says. But the exercise sets outer benchmarks for how efficient the supply chain could be.
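The simplest greenfield heuristic is the classic center-of-gravity calculation, which places a single hypothetical DC at the demand-weighted centroid of the customer base. Real engines solve a far richer mixed-integer model; the coordinates and volumes below are made up purely for illustration.

```python
# Center-of-gravity sketch: one hypothetical DC at the demand-weighted
# centroid. Coordinates are (latitude, longitude); volumes are invented.
def center_of_gravity(demand_points):
    """demand_points: list of (x, y, volume) tuples."""
    total = sum(v for _, _, v in demand_points)
    x = sum(px * v for px, _, v in demand_points) / total
    y = sum(py * v for _, py, v in demand_points) / total
    return x, y

points = [(33.7, -84.4, 900),   # Atlanta-area demand
          (41.9, -87.6, 600),   # Chicago-area demand
          (32.8, -96.8, 500)]   # Dallas-area demand
print(center_of_gravity(points))
```

As Belshaw notes, nobody implements the pure answer; the point of the unconstrained run is to see how far the current network sits from that theoretical best.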
Scenarios can compare national versus regional distribution, making use of third parties, segregating some inventory such as slow movers, and more. "It's not uncommon to run dozens of scenarios," Sobbott says. "There may be a dozen runs on regional DC scenarios.
"There may be other scenarios," Sobbott says, "closing or opening DCs and actually changing assets. We may look at service where there's limited time to ship to the customer. Some may look at a regional DC strategy, so the average length of haul gets shorter. We look at overall costs.
"One thing we've started to see more with CPGs is looking at a centralized versus regional distribution strategy." The tradeoffs are obvious to any DC pro: Centralizing DCs means longer shipping distances but lower inventory costs.
Decentralizing and using more regional DCs, on the other hand, means faster customer service but higher inventory costs. What the analysis does is quantify those tradeoffs in dollars.
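One common way to put that tradeoff in dollars is the square-root law of inventory, under which aggregate safety stock grows roughly with the square root of the number of stocking points, while average length of haul (and outbound freight cost) shrinks. The cost model below is a crude stand-in with invented figures, not any vendor's engine.

```python
import math

# Rough centralization-versus-regional tradeoff using the square-root
# law: inventory cost scales with sqrt(n_dcs), outbound cost with
# 1/sqrt(n_dcs) as hauls shorten. All dollar figures are hypothetical.
def network_cost(n_dcs, base_inventory_cost, base_outbound_cost):
    """Estimate annual cost for n_dcs relative to a 1-DC baseline."""
    inventory = base_inventory_cost * math.sqrt(n_dcs)
    outbound = base_outbound_cost / math.sqrt(n_dcs)
    return inventory + outbound

for n in (1, 2, 4, 8):
    print(n, "DCs:", round(network_cost(n, 2_000_000, 6_000_000)))
```

With these made-up numbers the total cost bottoms out at a handful of DCs and then climbs again as inventory carrying costs overtake the freight savings, which is precisely the curve the analysis quantifies.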
"You know what it will cost you," Sobbott says. "You will know optimally where to locate—where to put your DCs and how large they should be."
There are limitations, of course. The scenarios may not tell you much about real estate costs, or labor availability and rates, for instance. But they do provide solid information on where to concentrate your investigations after the optimization study is completed.
Belshaw says that in a typical study, the optimization engine will run 45 to 75 analyses. "We do lots of sensitivity runs," he says. The idea is to see how the proposed solutions would work if some of the forward-looking assumptions prove incorrect—for instance, if demand in a year or two exceeds projections used in the analysis. "You don't want to build a network that's so fragile that if business grows 12 percent, not 10 percent, you're in trouble."
Baring agrees. "Once we come up with the best solution, then we start to test sensitivities. If you said you'd be growing the market at a 20-percent pace but it only grows 5 percent —is it still the right strategy? Or what if the labor rate is 10 percent higher [than modeled], is it still the best solution? That gives an indication if the solution is fairly robust."
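The sensitivity runs Belshaw and Baring describe amount to re-scoring each candidate network under a range of growth assumptions and watching whether the ranking flips. The cost model and every figure below are hypothetical stand-ins chosen to illustrate the idea.

```python
# Illustrative sensitivity run: re-score two candidate networks under
# different demand-growth assumptions. Fixed costs, per-unit costs, and
# demand are invented; a real study would use the optimization engine.
def scenario_cost(fixed_cost, variable_cost_per_unit, base_demand, growth):
    demand = base_demand * (1 + growth)
    return fixed_cost + variable_cost_per_unit * demand

candidates = {
    "3 regional DCs": (5_000_000, 2.10),   # higher fixed, cheaper per unit
    "1 central DC":   (2_500_000, 2.60),   # lower fixed, dearer per unit
}

base_demand = 4_000_000  # units/year (hypothetical)
for growth in (0.05, 0.15, 0.30):
    best = min(candidates,
               key=lambda name: scenario_cost(*candidates[name],
                                              base_demand, growth))
    print(f"growth {growth:.0%}: best option is {best}")
```

With these numbers the central DC wins at modest growth but the regional network wins if demand surges, which is exactly the fragility test Belshaw warns about: a plan that is only best under one growth assumption is a risky plan.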
Human intervention
Analyzing the various scenarios and deciding how to proceed is the final step in this network analysis. "It is not always a case of the lowest cost," Sobbott says. "There are a number of key business factors and strategies [to consider]."
This is where business experience and understanding of strategic objectives is crucial. The software shows options, but human intelligence drives implementation decisions.
"The key thing in any network optimization is trying to balance the cost and service relationship," Baring says. "Network optimization is not the be all and end all, but a strategic tool to support business decisions. The decisions have to make sense. You really have to bring an operational bias to the exercise." For instance, scenarios may point to abandoning some geography or reducing service levels to a key customer—options that may be efficient, but are just plain bad business. "The answer may come back not to serve an area, but no CPG is going to do that," Belshaw says.
The network analysis is just the start. The implementation phase is where the heavy spending and effort take place, where timing and investment decisions are enacted, and the network may be most at risk as facilities are opened, revamped or closed. But all that requires a map—and that's where the heavy lifting up front pays off.
“The past year has been unprecedented, with extreme weather events, heightened geopolitical tension and cybercrime destabilizing supply chains throughout the world. Navigating this year’s looming risks to build a secure supply network has never been more critical,” Corey Rhodes, CEO of Everstream Analytics, said in the firm’s “2025 Annual Risk Report.”
“While some risks are unavoidable, early notice and swift action through a combination of planning, deep monitoring, and mitigation can save inventory and lives in 2025,” Rhodes said.
In its report, Everstream ranked five risk categories by a “risk score metric” to help global supply chain leaders prioritize planning and mitigation efforts for coping with them. They include:
Drowning in Climate Change – 90% Risk Score. Driven by shifting climate patterns and record-high temperatures, extreme weather events are a dominant risk to the supply chain due to concerns such as flooding and elevated ocean temperatures.
Geopolitical Instability with Increased Tariff Risk – 80% Risk Score. These threats could disrupt trade networks and impact economies worldwide, including logistics, transportation, and manufacturing industries. The following major geopolitical events are likely to impact global trade: Red Sea disruptions, Russia-Ukraine conflict, Taiwan trade risks, Middle East tensions, South China Sea disputes, and proposed tariff increases.
More Backdoors for Cybercrime – 75% Risk Score. Supply chain leaders face escalating cybersecurity risks in 2025, driven by the growing reliance on AI and cloud computing within supply chains, the proliferation of IoT-connected devices, vulnerabilities in sub-tier supply chains, and a disproportionate impact on third-party logistics providers (3PLs) and the electronics industry.
Rare Metals and Minerals on Lockdown – 65% Risk Score. Between rising regulations, new tariffs, and long-term or exclusive contracts, rare minerals and metals will be harder than ever, and more expensive, to obtain.
Crackdown on Forced Labor – 60% Risk Score. A growing crackdown on forced labor across industries will increase pressure on companies facing scrutiny to manage and eliminate suppliers that violate human rights. Anticipated risks in 2025 include a push for alternative suppliers, a cascade of legislation addressing forced labor, and challenges for agri-food products such as palm oil and vanilla.
That number is low compared to the widespread unemployment the transportation sector saw during the COVID-19 pandemic, when the rate peaked at 15.7% in both May 2020 and July 2020. But it is slightly above the most recent pre-pandemic rate for the sector, which was 2.8% in December 2019, the BTS said.
For broader context, the nation’s overall unemployment rate for all sectors rose slightly in December, increasing 0.3 percentage points from December 2023 to 3.8%.
On a seasonally adjusted basis, employment in the transportation and warehousing sector rose to 6,630,200 people in December 2024 — up 0.1% from the previous month and up 1.7% from December 2023. Employment in transportation and warehousing grew 15.1% in December 2024 from the pre-pandemic December 2019 level of 5,760,300 people.
The largest portion of those workers was in warehousing and storage, followed by truck transportation, according to a breakout of the total figures into separate modes (seasonally adjusted):
Warehousing and storage rose to 1,770,300 in December 2024 — up 0.1% from the previous month and up 0.2% from December 2023.
Truck transportation fell to 1,545,900 in December 2024 — down 0.1% from the previous month and down 0.4% from December 2023.
Air transportation rose to 578,000 in December 2024 — up 0.4% from the previous month and up 1.4% from December 2023.
Transit and ground passenger transportation rose to 456,000 in December 2024 — up 0.3% from the previous month and up 5.7% from December 2023.
Rail transportation remained virtually unchanged in December 2024 at 150,300 from the previous month but down 1.8% from December 2023.
Water transportation rose to 74,300 in December 2024 — up 0.1% from the previous month and up 4.8% from December 2023.
Pipeline transportation rose to 55,000 in December 2024 — up 0.5% from the previous month and up 6.2% from December 2023.
Parcel carrier and logistics provider UPS Inc. has acquired the German company Frigo-Trans and its sister company BPL, which provide complex healthcare logistics solutions across Europe, the Atlanta-based firm said this week.
According to UPS, the move extends its UPS Healthcare division’s ability to offer end-to-end capabilities for its customers, who increasingly need temperature-controlled and time-critical logistics solutions globally.
UPS Healthcare has 17 million square feet of cGMP and GDP-compliant healthcare distribution space globally, supporting services such as inventory management, cold chain packaging and shipping, storage and fulfillment of medical devices, and lab and clinical trial logistics.
More specifically, UPS Healthcare said that the acquisitions align with its broader mission to provide end-to-end logistics for temperature-sensitive healthcare products, including biologics, specialty pharmaceuticals, and personalized medicine. With 80% of pharmaceutical products in Europe requiring temperature-controlled transportation, investments like these ensure UPS Healthcare remains at the forefront of innovation in the $82 billion complex healthcare logistics market, the company said.
Additionally, Frigo-Trans' presence in Germany—the world's fourth-largest healthcare manufacturing market—strengthens UPS's foothold and enhances its support for critical intra-Germany operations. Frigo-Trans’ network includes temperature-controlled warehousing ranging from cryopreservation (-196°C) to ambient (+15° to +25°C) as well as Pan-European cold chain transportation. And BPL provides logistics solutions including time-critical freight forwarding capabilities.
Terms of the deal were not disclosed. But it fits into UPS's long-term strategy to double its healthcare revenue from $10 billion in 2023 to $20 billion by 2026. To get there, it has also made previous acquisitions of companies like Bomi and MNX. And UPS recently expanded its temperature-controlled fleet in France, Italy, the Netherlands, and Hungary.
"Healthcare customers increasingly demand precision, reliability, and adaptability—qualities that are critical for the future of biologics and personalized medicine. The Frigo-Trans and BPL acquisitions allow us to offer unmatched service across Europe, making logistics a competitive advantage for our pharma partners," says John Bolla, President, UPS Healthcare.
The supply chain risk management firm Overhaul has landed $55 million in backing, saying the financing will fuel its advancements in artificial intelligence and support its strategic acquisition roadmap.
The equity funding round comes from the private equity firm Springcoast Partners, with follow-on participation from existing investors Edison Partners and Americo. As part of the investment, Springcoast’s Chris Dederick and Holger Staude will join Overhaul’s board of directors.
According to Austin, Texas-based Overhaul, the money comes as macroeconomic and global trade dynamics are driving consequential transformations in supply chains. That makes cargo visibility and proactive risk management essential tools as shippers manage new routes and suppliers.
“The supply chain technology space will see significant consolidation over the next 12 to 24 months,” Barry Conlon, CEO of Overhaul, said in a release. “Overhaul is well-positioned to establish itself as the ultimate integrated solution, delivering a comprehensive suite of tools for supply chain risk management, efficiency, and visibility under a single trusted platform.”
Shippers today are praising an 11th-hour contract agreement that has averted the threat of a strike by dockworkers at East and Gulf coast ports that could have frozen container imports and exports as soon as January 16.
The agreement came late last night between the International Longshoremen’s Association (ILA) representing some 45,000 workers and the United States Maritime Alliance (USMX) that includes the operators of port facilities up and down the coast.
Details of the new agreement on those issues have not yet been made public, but in the meantime, retailers and manufacturers are heaving sighs of relief that trade flows will continue.
“Providing certainty with a new contract and avoiding further disruptions is paramount to ensure retail goods arrive in a timely manner for consumers. The agreement will also pave the way for much-needed modernization efforts, which are essential for future growth at these ports and the overall resiliency of our nation’s supply chain,” Gold said.
The next step in the process is for both sides to ratify the tentative agreement, so negotiators have agreed to keep those details private in the meantime, according to identical statements released by the ILA and the USMX. In their joint statement, the groups called the six-year deal a “win-win,” saying: “This agreement protects current ILA jobs and establishes a framework for implementing technologies that will create more jobs while modernizing East and Gulf coasts ports – making them safer and more efficient, and creating the capacity they need to keep our supply chains strong. This is a win-win agreement that creates ILA jobs, supports American consumers and businesses, and keeps the American economy the key hub of the global marketplace.”
The breakthrough hints at broader supply chain trends, which will focus on the tension between operational efficiency and workforce job protection, not just at ports but across other sectors as well, according to a statement from Judah Levine, head of research at Freightos, a freight booking and payment platform. Port automation was the major sticking point leading up to this agreement, as the USMX pushed for technologies to make ports more efficient, while the ILA opposed automation or semi-automation that could threaten jobs.
“This is a six-year détente in the tech-versus-labor tug-of-war at U.S. ports,” Levine said. “Automation remains a lightning rod—and likely one we’ll see in other industries—but this deal suggests a cautious path forward.”
Editor's note: This story was revised on January 9 to include additional input from the ILA, USMX, and Freightos.