Designing a distribution network is both art and science: it requires pleasing finicky retailers while keeping costs in line. Though slick software helps, most of the work is in getting the numbers, and getting them right.
Peter Bradley is an award-winning career journalist with more than three decades of experience in both newspapers and national business magazines. His credentials include seven years as the transportation and supply chain editor at Purchasing Magazine and six years as the chief editor of Logistics Management.
The old aphorism about genius being 99 percent perspiration applies to many things in life and business—writing, research, parenting, invention and more.
One small example from the business logistics corner of the world is the design of a distribution network. Though distribution execs now have software to help them blast through the complexities and provide neatly packaged comparisons of multiple options, in the end, success depends on how well prepared you are going into the process.
For consumer packaged goods (CPG) companies in particular, getting the network right—achieving the right balance of inventory, material handling and transportation costs, while offering service levels that satisfy their retailer customers' unforgiving requirements—is crucial.
A number of experts who provide both software tools and consulting advice for customers involved in network design and implementation acknowledge that the bulk of the work comes up front, primarily in gathering and validating historical data and business forecasts that accurately reflect the business. Bruce Baring, director of strategic services for Peach State Integrated Technologies, says that anyone entering into a network analysis should expect to spend 60 to 80 percent of the time collecting, cleaning and validating data.
That can, of course, vary widely, depending on the project and on how much mastery the company has over its own data. While a typical network analysis project can take three to four months, others unfold much more rapidly. Bob Belshaw, chief operating officer of Insight, cites one such project with Motorola as an example. "The whole study took only six weeks," he says, which is about half the average time required. "It's really a case of whether the team has access to the data."
Only when the data have been crunched does the fun begin—when the modeling engines take over to develop a variety of options.
Be wise, optimize!
Given the difficulty of evaluating and optimizing an entire network—not to mention the time required for the project—it's not something most companies jump into without a compelling reason.
Baring lists several drivers. "A merger or an acquisition is always a good time to review the network," he says. "If you are acquiring manufacturing or distribution, your demand patterns may be changing, and that's a good time to look at your facilities."
A second reason, he says, is growth, particularly in specific geographic areas where growth may be exceeding historical norms.
"The other big indicator," he says, "is a significant change in sourcing."
Dan Sobbott, director of business development for Slim Technologies, a developer of optimization software with a number of retail and consumer goods customers, sees it much the same way. "There are maybe three big drivers that we see," he says. "Growth is one of them. It may be more stores or product launches or the product mix is changing. Planning those things through the distribution network is a motivating factor.
"Second, we see a fair amount of companies interested after a merger or acquisition. Strategic reasons drive companies together, but after that, they face the fundamental questions of how to bring two supply chains together.
"The third largest reason: cost cutting initiatives. They may have done some benchmarking and [have realized] they're sort of out of line with where they should be."
Belshaw, whose company's Sails network optimization engine is used by a number of major CPG players, argues that manufacturers should not necessarily wait for a precipitating event to look at their networks. "It's really something in all industries that needs to be looked at on a very frequent basis," he contends. He says that with the pace of business in most industries constantly accelerating, it makes sense to evaluate the network regularly to see if it needs fine tuning.
Overall, the CPG companies are constantly adding new products, almost as quickly as their high-tech brethren, he says. "They are constantly introducing new products, taking older products off the market, adding whole new brands or divesting. Those have significant impacts on the network … on warehousing or transportation or inventory positioning. When it comes to the supply chain, there is so much ripple effect. When you do something in manufacturing, it has huge implications for distribution and transportation."
Numbers, then more numbers
Whether the review is prompted by a major change in business or represents a regularly occurring event, the first step is the hardest step—gathering and validating data.
That's somewhat easier than it used to be, thanks to the enormous gains in data gathering capabilities in most industries over the last decade or so.
Sobbott points to the retail industry as an example. "Retailers historically have focused less on distribution and more on merchandising," he says. "Now, with substantial information available from point-of-sale (POS) data, they have the data for fact-based planning. So supply chain initiatives are of greater and greater interest."
Sobbott, Belshaw and Baring all agree on the crucial need for good data. In fact, Sobbott calls data readiness "the greatest challenge in a network optimization study. You have to have the right data."
But what data?
"The first thing to do is to develop data profiles," says Baring, "with demand analysis, inbound and outbound transportation costs, and fixed and variable warehousing costs. You have to pull all this historical data. Then you have to figure out where the business is going—you want to plan for the future, not the past. How is business going to evolve? How will the supply chain look three or five years from now? That's a pretty extensive step."
Next comes the validation step—testing the data before moving forward. Essentially, the data fed into the optimization engine are compared to the actual historical events— and they should match.
"We build a bubble map based on demand in different parts of the country," Baring says. "That provides a good visual validation. You typically should see bubbles where Wal-Mart or Target DCs are located." But he cautions users to make sure they're collecting and using "ship-to" and not "bill-to" addresses. "You shouldn't see a big bubble around Bentonville, Ark.," he says.
Sobbott explains, "The first thing you try to do is produce a baseline model that replicates a historical model. You want to include all the costs, volumes and activities. That allows you to understand if you have data that is well defined and that is replicating supply chain activities accurately. The validation model is constrained: We're trying to make it act like the historical time period."
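The baseline check Sobbott describes can be sketched in a few lines. This is an illustrative simplification, not any vendor's actual tool; the cost categories, dollar figures, and the 2 percent tolerance are all assumptions for demonstration.

```python
# Compare a constrained baseline model's costs against historical actuals.
# A baseline that deviates materially from history signals a data problem.

def validate_baseline(modeled: dict, actual: dict, tolerance: float = 0.02) -> dict:
    """Return per-category percent deviation; flag categories outside tolerance."""
    report = {}
    for category, actual_cost in actual.items():
        modeled_cost = modeled.get(category, 0.0)
        deviation = (modeled_cost - actual_cost) / actual_cost
        report[category] = {
            "actual": actual_cost,
            "modeled": modeled_cost,
            "deviation_pct": round(deviation * 100, 2),
            "within_tolerance": abs(deviation) <= tolerance,
        }
    return report

# Hypothetical annual costs (dollars) for one historical year
actual = {"inbound_freight": 4_200_000, "outbound_freight": 7_900_000,
          "warehousing": 3_100_000}
modeled = {"inbound_freight": 4_250_000, "outbound_freight": 8_300_000,
           "warehousing": 3_080_000}

for cat, r in validate_baseline(modeled, actual).items():
    print(cat, r["deviation_pct"], "OK" if r["within_tolerance"] else "INVESTIGATE")
```

In this made-up example the outbound freight deviation of roughly 5 percent would send the team back to the transportation data before any optimization runs.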
For all the power of some of the optimization engines, the enormous volume of data in even a modest supply chain requires aggregation in appropriate ways in order to make it manageable.
"When you build the network optimization, there are only so many variables the software can handle efficiently," Baring says. "You may have demand to the three-digit ZIP code level, or aggregate by product types where you group together like products based on source, handling or similar cube-to-weight ratio. That simplifies the math in the optimization engine."
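The aggregation step Baring describes can be sketched as follows. The field names, SKU-to-family mapping, and records are assumptions for demonstration, not from any actual study.

```python
# Roll raw shipment records up to (3-digit ZIP, product family) demand points
# before feeding them to an optimization engine.
from collections import defaultdict

shipments = [
    {"ship_to_zip": "30301", "sku": "A100", "weight_lb": 1200},
    {"ship_to_zip": "30318", "sku": "A200", "weight_lb": 800},
    {"ship_to_zip": "60614", "sku": "B300", "weight_lb": 500},
]
# Like products grouped by source, handling, or similar cube-to-weight ratio
# (hypothetical mapping)
sku_family = {"A100": "ambient", "A200": "ambient", "B300": "chilled"}

demand = defaultdict(float)
for s in shipments:
    key = (s["ship_to_zip"][:3], sku_family[s["sku"]])  # 3-digit ZIP, family
    demand[key] += s["weight_lb"]

print(dict(demand))
# The two Atlanta-area records collapse into a single ("303", "ambient") point
```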
Multiple models
Once the validation is complete, it's time to take the constraints off the software and let it run.
"We unconstrain the model to do the optimization," Sobbott says. "We run a lot of scenarios." The options, he says, are almost unlimited, going from fine-tuning distribution using existing DCs, to closing some and opening others, to a complete green field analysis.
"We teach people to use a green field analysis," Belshaw says. "If you could put your distribution anywhere, where would that be?" The point, he says, is to see an optimal solution. "We never go there 100 percent," he says. But the greenfield result sets an outer benchmark for how efficient the supply chain could be.
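One common starting point for a greenfield question, sketched below, is a demand-weighted center of gravity: where would a single DC sit if location were completely unconstrained? This is a textbook simplification, not necessarily the method Insight's engine uses, and the coordinates and volumes are hypothetical.

```python
# Demand-weighted center of gravity for unconstrained DC placement.

def center_of_gravity(points):
    """points: list of (lat, lon, annual_volume). Returns weighted centroid."""
    total = sum(v for _, _, v in points)
    lat = sum(la * v for la, _, v in points) / total
    lon = sum(lo * v for _, lo, v in points) / total
    return round(lat, 3), round(lon, 3)

demand_points = [
    (33.75, -84.39, 5000),   # Atlanta-area demand
    (41.88, -87.63, 8000),   # Chicago-area demand
    (32.78, -96.80, 6000),   # Dallas-area demand
]
print(center_of_gravity(demand_points))
```

A real study would go well beyond this, but the centroid gives the "if you could put it anywhere" anchor against which practical scenarios are judged.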
Scenarios can compare national versus regional distribution, making use of third parties, segregating some inventory such as slow movers, and more. "It's not uncommon to run dozens of scenarios," Sobbott says. "There may be a dozen runs on regional DC scenarios.
"There may be other scenarios," Sobbott says, "closing or opening DCs and actually changing assets. We may look at service where there's limited time to ship to the customer. Some may look at a regional DC strategy, so the average length of haul gets shorter. We look at overall costs.
"One thing we've started to see more with CPGs is looking at a centralized versus regional distribution strategy." The tradeoffs are obvious to any DC pro: Centralizing DCs means longer shipping distances but lower inventory costs.
Decentralizing and using more regional DCs, on the other hand, means faster customer service but higher inventory costs. What the analysis does is quantify those tradeoffs in dollars.
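A back-of-the-envelope version of that quantification can be sketched with the classic "square root law" approximation, under which system safety stock scales roughly with the square root of the number of stocking locations. Both the law as used here and every dollar figure are simplifying assumptions for illustration, not the consultants' models.

```python
# Quantify the centralized-vs-regional tradeoff in dollars:
# more DCs shorten hauls (lower transport) but raise inventory carrying cost.
import math

def network_cost(n_dcs, inventory_1dc=2_000_000,
                 transport_saving_per_dc=150_000, transport_1dc=1_800_000):
    inventory = inventory_1dc * math.sqrt(n_dcs)          # square-root-law inventory
    transport = transport_1dc - transport_saving_per_dc * (n_dcs - 1)  # shorter hauls
    return inventory + transport

for n in (1, 2, 4, 8):
    print(f"{n} DCs -> total ${network_cost(n):,.0f}")
```

With these particular (invented) numbers the inventory penalty outweighs the transport savings, so the centralized network wins; with different freight rates the answer flips, which is exactly why the tradeoff has to be modeled rather than assumed.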
"You know what it will cost you," Sobbott says. "You will know optimally where to locate—where to put your DCs and how large they should be."
There are limitations, of course. The scenarios may not tell you much about real estate costs, or labor availability and rates, for instance. But they do provide solid information on where to concentrate your investigations after the optimization study is completed.
Belshaw says that in a typical study, the optimization engine will run 45 to 75 analyses. "We do lots of sensitivity runs," he says. The idea is to see how the proposed solutions would work if some of the forward-looking assumptions prove incorrect—for instance, if demand in a year or two exceeds projections used in the analysis. "You don't want to build a network that's so fragile that if business grows 12 percent, not 10 percent, you're in trouble."
Baring agrees. "Once we come up with the best solution, then we start to test sensitivities. If you said you'd be growing the market at a 20-percent pace but it only grows 5 percent —is it still the right strategy? Or what if the labor rate is 10 percent higher [than modeled], is it still the best solution? That gives an indication if the solution is fairly robust."
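The sensitivity testing both consultants describe can be sketched as re-evaluating a chosen network under growth assumptions that differ from the plan. The cost function below is a hypothetical stand-in; in a real study the optimizer itself would be rerun for each scenario.

```python
# Stress-test a proposed network against demand-growth scenarios.

def scenario_cost(network, growth_rate):
    """Hypothetical cost model: variable cost scales with demand, fixed does not.
    Demand above capacity incurs a per-unit overflow penalty (e.g. overflow 3PL)."""
    demand = network["base_demand"] * (1 + growth_rate)
    over_capacity = max(0.0, demand - network["capacity"])
    return (network["fixed_cost"]
            + network["variable_cost_per_unit"] * demand
            + network["overflow_penalty_per_unit"] * over_capacity)

proposed = {"base_demand": 100_000, "capacity": 115_000, "fixed_cost": 5_000_000,
            "variable_cost_per_unit": 30.0, "overflow_penalty_per_unit": 45.0}

for growth in (0.05, 0.10, 0.12, 0.20):
    print(f"{growth:.0%} growth -> ${scenario_cost(proposed, growth):,.0f}")
```

The point of such runs is exactly the fragility question above: a network that is lowest-cost at 10 percent growth but blows past capacity at 12 percent may not be the right answer.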
Human intervention
Analyzing the various scenarios and deciding how to proceed is the final step in this network analysis. "It is not always a case of the lowest cost," Sobbott says. "There are a number of key business factors and strategies [to consider]."
This is where business experience and understanding of strategic objectives is crucial. The software shows options, but human intelligence drives implementation decisions.
"The key thing in any network optimization is trying to balance the cost and service relationship," Baring says. "Network optimization is not the be all and end all, but a strategic tool to support business decisions. The decisions have to make sense. You really have to bring an operational bias to the exercise." For instance, scenarios may point to abandoning some geography or reducing service levels to a key customer—options that may be efficient, but are just plain bad business. "The answer may come back not to serve an area, but no CPG is going to do that," Belshaw says.
The network analysis is just the start. The implementation phase is where the heavy spending and effort take place, where timing and investment decisions are enacted, and the network may be most at risk as facilities are opened, revamped or closed. But all that requires a map—and that's where the heavy lifting up front pays off.
Container traffic is finally back to typical levels at the port of Montreal, two months after dockworkers returned to work following a strike, port officials said Thursday.
Today, arbitration continues as the two sides work to forge a new contract. And port leaders with the Maritime Employers Association (MEA) are reminding workers represented by the Canadian Union of Public Employees (CUPE) that the CIRB decision “rules out any pressure tactics affecting operations until the next collective agreement expires.”
The Port of Montreal alone said it had to manage a backlog of about 13,350 twenty-foot equivalent units (TEUs) on the ground, as well as 28,000 feet of freight cars headed for export.
Port leaders this week said they had now completed that task. “Two months after operations fully resumed at the Port of Montreal, as directed by the Canada Industrial Relations Board, the Montreal Port Authority (MPA) is pleased to announce that all port activities are now completely back to normal. Both the impact of the labour dispute and the subsequent resumption of activities required concerted efforts on the part of all port partners to get things back to normal as quickly as possible, even over the holiday season,” the port said in a release.
The “2024 Year in Review” report lists the various transportation delays, freight volume restrictions, and infrastructure repair costs of a long string of events. Those disruptions include labor strikes at Canadian ports and postal sites; the U.S. East and Gulf coast port strike; hurricanes Helene, Francine, and Milton; the Francis Scott Key Bridge collapse in Baltimore Harbor; the CrowdStrike software outage; and Red Sea missile attacks on passing cargo ships.
“While 2024 was characterized by frequent and overlapping disruptions that exposed many supply chain vulnerabilities, it was also a year of resilience,” the Project44 report said. “From labor strikes and natural disasters to geopolitical tensions, each event served as a critical learning opportunity, underscoring the necessity for robust contingency planning, effective labor relations, and durable infrastructure. As supply chains continue to evolve, the lessons learned this past year highlight the increased importance of proactive measures and collaborative efforts. These strategies are essential to fostering stability and adaptability in a world where unpredictability is becoming the norm.”
In addition to tallying the supply chain impact of those events, the report also made four broad predictions for trends in 2025 that may affect logistics operations. In Project44’s analysis, they include:
More technology and automation will be introduced into supply chains, particularly ports. This will help make operations more efficient but also increase the risk of cybersecurity attacks and service interruptions due to glitches and bugs. It could also heighten tensions with the labor pool and unions, which do not want jobs to be replaced with automation.
The new administration in the United States introduces a lot of uncertainty, with talks of major tariffs for numerous countries as well as talks of US freight getting preferential treatment through the Panama Canal. If these things do come to fruition, expect to see shifts in global trade patterns and sourcing.
Natural disasters will continue to become more frequent and more severe, as exhibited by the wildfires in Los Angeles and the winter storms throughout the southern states in the U.S. As a result, expect companies to invest more heavily in sustainability to mitigate climate change.
The ceasefire agreement announced on Wednesday between Israel and Hamas in the Middle East could support increased freight volumes returning to the Suez Canal as political crises in the area are resolved.
The French transportation visibility provider Shippeo today said it has raised $30 million in financial backing, saying the money will support its accelerated expansion across North America and APAC, while driving enhancements to its “Real-Time Transportation Visibility Platform” product.
The funding round was led by Woven Capital, Toyota’s growth fund, with participation from existing investors: Battery Ventures, Partech, NGP Capital, Bpifrance Digital Venture, LFX Venture Partners, Shift4Good and Yamaha Motor Ventures. With this round, Shippeo’s total funding exceeds $140 million.
Shippeo says it offers real-time shipment tracking across all transport modes, helping companies create sustainable, resilient supply chains. Its platform enables users to reduce logistics-related carbon emissions by making informed trade-offs between modes and carriers based on carbon footprint data.
“Global supply chains are facing unprecedented complexity, and real-time transport visibility is essential for building resilience,” Prashant Bothra, Principal at Woven Capital, who is joining the Shippeo board, said in a release. “Shippeo’s platform empowers businesses to proactively address disruptions by transforming fragmented operations into streamlined, data-driven processes across all transport modes, offering precise tracking and predictive ETAs at scale—capabilities that would be resource-intensive to develop in-house. We are excited to support Shippeo’s journey to accelerate digitization while enhancing cost efficiency, planning accuracy, and customer experience across the supply chain.”
Donald Trump has been clear that he plans to hit the ground running after his inauguration on January 20, launching ambitious plans that could have significant repercussions for global supply chains.
As Mark Baxa, CSCMP president and CEO, says in the executive foreword to the white paper, the incoming Trump Administration and a majority Republican congress are “poised to reshape trade policies, regulatory frameworks, and the very fabric of how we approach global commerce.”
The paper is written by import/export expert Thomas Cook, managing director for Blue Tiger International, a U.S.-based supply chain management consulting company that focuses on international trade. Cook is the former CEO of American River International in New York and Apex Global Logistics Supply Chain Operation in Los Angeles and has written 19 books on global trade.
In the paper, Cook, of course, takes a close look at tariff implications and new trade deals, emphasizing that Trump will seek revisions that will favor U.S. businesses and encourage manufacturing to return to the U.S. The paper, however, also looks beyond global trade to address topics such as Trump’s tougher stance on immigration and the possibility of mass deportations, greater support of Israel in the Middle East, proposals for increased energy production and mining, and intent to end the war in Ukraine.
In general, Cook believes that many of the administration’s new policies will be beneficial to the overall economy. He does warn, however, that some policies will be disruptive and add risk and cost to global supply chains.
In light of those risks and possible disruptions, Cook’s paper offers 14 recommendations, some of which include:
Create a team responsible for studying the changes Trump will introduce when he takes office;
Attend trade shows and make connections with vendors, suppliers, and service providers who can help you navigate those changes;
Consider becoming C-TPAT (Customs-Trade Partnership Against Terrorism) certified to help mitigate potential import/export issues;
Adopt a risk management mindset and shift from focusing on lowest cost to best value for your spend;
Increase collaboration with internal and external partners;
Expect warehousing costs to rise in the short term as companies look to bring in foreign-made goods ahead of tariffs;
Expect greater scrutiny from U.S. Customs and Border Protection of origin statements for imports in recognition of attempts by some Chinese manufacturers to evade U.S. import policies;
Reduce dependency on China for sourcing; and
Consider manufacturing and/or sourcing in the United States.
Cook advises readers to expect a loosening up of regulations and a reduction in government under Trump. He warns that while some world leaders will look to work with Trump, others will take more of a defiant stance. As a result, companies should expect to see retaliatory tariffs and duties on exports.
Cook concludes by offering advice to the incoming administration, including being sensitive to the effect retaliatory tariffs can have on American exports, working on federal debt reduction, and considering promoting free trade zones. He also proposes an ambitious water works program through the Army Corps of Engineers.
ReposiTrak, a global food traceability network operator, will partner with Upshop, a provider of store operations technology for food retailers, to create an end-to-end grocery traceability solution that reaches from the supply chain to the retail store, the firms said today.
The partnership creates a data connection between suppliers and the retail store. It works by integrating Salt Lake City-based ReposiTrak’s network of thousands of suppliers and their traceability shipment data with Austin, Texas-based Upshop’s network of more than 450 retailers and their retail stores.
That accomplishment is important because it will allow food sector trading partners to meet the U.S. FDA’s Food Safety Modernization Act Section 204d (FSMA 204) requirements that they must create and store complete traceability records for certain foods.
And according to ReposiTrak and Upshop, the traceability solution may also unlock potential business benefits. It could do that by creating margin and growth opportunities in stores by connecting supply chain data with store data, thus allowing users to optimize inventory, labor, and customer experience management.
"Traceability requires data from the supply chain and – importantly – confirmation at the retail store that the proper and accurate lot code data from each shipment has been captured when the product is received. The missing piece for us has been the supply chain data. ReposiTrak is the leader in capturing and managing supply chain data, starting at the suppliers. Together, we can deliver a single, comprehensive traceability solution," Mark Hawthorne, chief innovation and strategy officer at Upshop, said in a release.
"Once the data is flowing the benefits are compounding. Traceability data can be used to improve food safety, reduce invoice discrepancies, and identify ways to reduce waste and improve efficiencies throughout the store,” Hawthorne said.
Under FSMA 204, retailers are required by law to track Key Data Elements (KDEs) to the store-level for every shipment containing high-risk food items from the Food Traceability List (FTL). ReposiTrak and Upshop say that major industry retailers have made public commitments to traceability, announcing programs that require more traceability data for all food product on a faster timeline. The efforts of those retailers have activated the industry, motivating others to institute traceability programs now, ahead of the FDA’s enforcement deadline of January 20, 2026.
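The store-receiving check Hawthorne describes, confirming that the lot codes captured at the store match the traceability data sent with the shipment, can be sketched in a few lines. This is an illustrative sketch only; the field names are loosely modeled on FSMA 204's Key Data Elements and are not ReposiTrak's or Upshop's actual schema.

```python
# Reconcile lot codes scanned at store receiving against the supplier's
# shipment-level traceability record.

shipment_kdes = {  # hypothetical supplier traceability record for one shipment
    "shipment_id": "SHP-1001",
    "lots": {"LOT-A1": {"item": "fresh-cut greens", "qty_cases": 40},
             "LOT-B2": {"item": "soft cheese", "qty_cases": 25}},
}

def verify_receipt(shipment, scanned_lots):
    """Return lots missing a store-level scan and any unexpected scans."""
    expected = set(shipment["lots"])
    scanned = set(scanned_lots)
    return {"missing_scan": sorted(expected - scanned),
            "unexpected": sorted(scanned - expected)}

print(verify_receipt(shipment_kdes, ["LOT-A1"]))
# {'missing_scan': ['LOT-B2'], 'unexpected': []}
```

A gap on either side of the reconciliation is precisely the kind of incomplete traceability record that FSMA 204 compliance is meant to prevent.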