Designing a distribution network is both art and science: it requires pleasing finicky retailers while keeping costs in line. Though slick software helps, most of the work is in getting the numbers, and getting them right.
Peter Bradley is an award-winning career journalist with more than three decades of experience in both newspapers and national business magazines. His credentials include seven years as the transportation and supply chain editor at Purchasing Magazine and six years as the chief editor of Logistics Management.
The old aphorism about genius being 99 percent perspiration applies to many things in life and business: writing, research, parenting, invention and more.
One small example from the business logistics corner of the world is the design of a distribution network. Though distribution execs now have software to help them blast through the complexities and provide neatly packaged comparisons of multiple options, in the end, success depends on how well prepared you are going into the process.
For consumer packaged goods (CPG) companies in particular, getting the network right—achieving the right balance of inventory, material handling and transportation costs, while offering service levels that satisfy their retailer customers' unforgiving requirements—is crucial.
A number of experts who provide both software tools and consulting advice for customers involved in network design and implementation acknowledge that the bulk of the work comes up front, primarily in gathering and validating historical data and business forecasts that accurately reflect the business. Bruce Baring, director of strategic services for Peach State Integrated Technologies, says that anyone entering into a network analysis should expect to spend 60 to 80 percent of the time collecting, cleaning and validating data.
That can, of course, vary widely, depending on the project and on how much mastery the company has over its own data. While a typical network analysis project can take three to four months, others unfold much more rapidly. Bob Belshaw, chief operating officer of Insight, cites one such project with Motorola as an example. "The whole study took only six weeks," he says, which is about half the average time required. "It's really a case of whether the team has access to the data."
Only when the data have been crunched does the fun begin—when the modeling engines take over to develop a variety of options.
Be wise, optimize!
Given the difficulty of evaluating and optimizing an entire network—not to mention the time required for the project—it's not something most companies jump into without a compelling reason.
Baring lists several drivers. "A merger or an acquisition is always a good time to review the network," he says. "If you are acquiring manufacturing or distribution, your demand patterns may be changing, and that's a good time to look at your facilities."
A second reason, he says, is growth, particularly in specific geographic areas where growth may be exceeding historical norms.
"The other big indicator," he says, "is a significant change in sourcing."
Dan Sobbott, director of business development for Slim Technologies, a developer of optimization software with a number of retail and consumer goods customers, sees it much the same way. "There are maybe three big drivers that we see," he says. "Growth is one of them. It may be more stores or product launches or the product mix is changing. Planning those things through the distribution network is a motivating factor.
"Second, we see a fair amount of companies interested after a merger or acquisition. Strategic reasons drive companies together, but after that, they face the fundamental questions of how to bring two supply chains together.
"The third largest reason: cost cutting initiatives. They may have done some benchmarking and [have realized] they're sort of out of line with where they should be."
Belshaw, whose company's Sails network optimization engine is used by a number of major CPG players, argues that manufacturers should not necessarily wait for a precipitating event to look at their networks. "It's really something in all industries that needs to be looked at on a very frequent basis," he contends. He says that with the pace of business in most industries constantly accelerating, it makes sense to evaluate the network regularly to see if it needs fine tuning.
Overall, the CPG companies are constantly adding new products, almost as quickly as their high-tech brethren, he says. "They are constantly introducing new products, taking older products off the market, adding whole new brands or divesting. Those have significant impacts on the network … on warehousing or transportation or inventory positioning. When it comes to the supply chain, there is so much ripple effect. When you do something in manufacturing, it has huge implications for distribution and transportation."
Numbers, then more numbers
Whether the review is prompted by a major change in business or represents a regularly occurring event, the first step is the hardest step—gathering and validating data.
That's somewhat easier than it used to be, thanks to the enormous gains in data gathering capabilities in most industries over the last decade or so.
Sobbott points to the retail industry as an example. "Retailers historically have focused less on distribution and more on merchandising," he says. "Now, with substantial information available from point-of-sale (POS) data, they have the data for fact-based planning. So supply chain initiatives are of greater and greater interest."
Sobbott, Belshaw and Baring all agree on the crucial need for good data. In fact, Sobbott calls data readiness "the greatest challenge in a network optimization study. You have to have the right data."
But what data?
"The first thing to do is to develop data profiles," says Baring, "with demand analysis, inbound and outbound transportation costs, and fixed and variable warehousing costs.You have to pull all this historical data. Then you have to figure out where the business is going—you want to plan for the future, not the past.How is business going to evolve? How will the supply chain look three or five years from now? That's a pretty extensive step."
Next comes the validation step—testing the data before moving forward. Essentially, the data fed into the optimization engine are compared to the actual historical events— and they should match.
"We build a bubble map based on demand in different parts of the country," Baring says. "That provides a good visual validation. You typically should see bubbles where Wal-Mart or Target DCs are located." But he cautions users to make sure they're collecting and using "ship-to" and not "bill-to" addresses. "You shouldn't see a big bubble around Bentonville, Ark.," he says.
Sobbott explains, "The first thing you try to do is produce a baseline model that replicates a historical model. You want to include all the costs, volumes and activities. That allows you to understand if you have data that is well defined and that is replicating supply chain activities accurately. The validation model is constrained: We're trying to make it act like the historical time period."
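The comparison itself can be as simple as checking modeled totals against historical actuals, category by category. The sketch below is illustrative only; the category names, figures and 5 percent tolerance are assumptions, not numbers from any study cited here.

```python
def validate_baseline(modeled: dict[str, float],
                      actual: dict[str, float],
                      tolerance: float = 0.05) -> dict[str, float]:
    """Return the relative gap per cost/volume category and flag large ones."""
    gaps = {}
    for category, actual_value in actual.items():
        gap = abs(modeled.get(category, 0.0) - actual_value) / actual_value
        gaps[category] = gap
        if gap > tolerance:
            print(f"Check inputs: {category} is off by {gap:.1%}")
    return gaps

# Illustrative annual figures only.
validate_baseline(
    modeled={"outbound_freight": 41.2e6, "warehousing": 18.9e6, "units_shipped": 7.4e6},
    actual={"outbound_freight": 40.0e6, "warehousing": 19.5e6, "units_shipped": 7.4e6},
)
```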
For all the power of some of the optimization engines, the enormous volume of data in even a modest supply chain requires aggregation in appropriate ways in order to make it manageable.
"When you build the network optimization, there are only so many variables the software can handle efficiently," Baring says. "You may have demand to the three-digit ZIP code level, or aggregate by product types where you group together like products based on source, handling or similar cube-to-weight ratio. That simplifies the math in the optimization engine."
Multiple models
Once the validation is complete, it's time to take the constraints off the software and let it run.
"We unconstrain the model to do the optimization," Sobbott says. "We run a lot of scenarios." The options, he says, are almost unlimited, going from fine-tuning distribution using existing DCs, to closing some and opening others, to a complete green field analysis.
"We teach people to use a green field analysis," Belshaw says. "If you could put your distribution anywhere, where would that be?" The point, he says, is to see an optimal solution. "We never go there 100 percent," he says. But what it does is set some outlying goals for the potential of an efficient supply chain.
Scenarios can compare national versus regional distribution, making use of third parties, segregating some inventory such as slow movers, and more. "It's not uncommon to run dozens of scenarios," Sobbott says. "There may be a dozen runs on regional DC scenarios.
"There may be other scenarios," Sobbott says, "closing or opening DCs and actually changing assets.We may look at service where there's limited time to ship to the customer. Some may look at a regional DC strategy, so the average length of haul gets shorter.We look at overall costs.
"One thing we've started to see more with CPGs is looking at a centralized versus regional distribution strategy." The tradeoffs are obvious to any DC pro: Centralizing DCs means longer shipping distances but lower inventory costs.
Decentralizing and using more regional DCs, on the other hand, means faster customer service but higher inventory costs. What the analysis does is quantify those tradeoffs in dollars.
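A back-of-the-envelope version of that quantification is sketched below. It leans on two common rules of thumb that are assumptions here, not figures from the article: system inventory tends to grow roughly with the square root of the number of stocking points, while outbound freight falls as the average length of haul shortens.

```python
import math

def network_cost(n_dcs: int,
                 base_inventory_cost: float,   # annual carrying cost with one DC
                 base_outbound_cost: float,    # annual outbound freight with one DC
                 fixed_cost_per_dc: float) -> float:
    inventory = base_inventory_cost * math.sqrt(n_dcs)   # square-root rule of thumb
    outbound = base_outbound_cost / math.sqrt(n_dcs)     # shorter average haul
    return inventory + outbound + fixed_cost_per_dc * n_dcs

# Illustrative numbers only: one central DC vs. several regional DCs.
for n in (1, 2, 4, 6):
    print(n, f"${network_cost(n, 8e6, 20e6, 1.5e6):,.0f}")
```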
"You know what it will cost you," Sobbott says. "You will know optimally where to locate—where to put your DCs and how large they should be."
There are limitations, of course. The scenarios may not tell you much about real estate costs, or labor availability and rates, for instance. But they do provide solid information on where to concentrate your investigations after the optimization study is completed.
Belshaw says that in a typical study, the optimization engine will run 45 to 75 analyses. "We do lots of sensitivity runs," he says. The idea is to see how the proposed solutions would work if some of the forward-looking assumptions prove incorrect—for instance, if demand in a year or two exceeds projections used in the analysis. "You don't want to build a network that's so fragile that if business grows 12 percent, not 10 percent, you're in trouble."
Baring agrees. "Once we come up with the best solution, then we start to test sensitivities. If you said you'd be growing the market at a 20-percent pace but it only grows 5 percent —is it still the right strategy? Or what if the labor rate is 10 percent higher [than modeled], is it still the best solution? That gives an indication if the solution is fairly robust."
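In spirit, a sensitivity pass looks like the sketch below: re-cost each candidate design under off-plan assumptions and see whether the preferred answer changes. The cost function, design names and multipliers are placeholders; a real study re-runs the optimization engine itself.

```python
from itertools import product

def total_cost(design: str, demand_growth: float, labor_multiplier: float) -> float:
    # Placeholder cost model keyed by design name; illustrative numbers only.
    base = {"central": 30e6, "regional_4dc": 33e6}[design]
    return base * (1 + 0.6 * demand_growth) * (0.7 + 0.3 * labor_multiplier)

designs = ["central", "regional_4dc"]
for growth, labor in product([0.05, 0.10, 0.20], [1.0, 1.1]):
    best = min(designs, key=lambda d: total_cost(d, growth, labor))
    print(f"growth={growth:.0%}, labor x{labor}: best design -> {best}")
```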
Human intervention
Analyzing the various scenarios and deciding how to proceed is the final step in this network analysis. "It is not always a case of the lowest cost," Sobbott says. "There are a number of key business factors and strategies [to consider]."
This is where business experience and understanding of strategic objectives is crucial. The software shows options, but human intelligence drives implementation decisions.
"The key thing in any network optimization is trying to balance the cost and service relationship," Baring says. "Network optimization is not the be all and end all, but a strategic tool to support business decisions. The decisions have to make sense. You really have to bring an operational bias to the exercise." For instance, scenarios may point to abandoning some geography or reducing service levels to a key customer—options that may be efficient, but are just plain bad business. "The answer may come back not to serve an area, but no CPG is going to do that," Belshaw says.
The network analysis is just the start. The implementation phase is where the heavy spending and effort take place, where timing and investment decisions are enacted, and the network may be most at risk as facilities are opened, revamped or closed. But all that requires a map—and that's where the heavy lifting up front pays off.
Congestion on U.S. highways is costing the trucking industry big, according to research from the American Transportation Research Institute (ATRI), released today.
The group found that traffic congestion on U.S. highways added $108.8 billion in costs to the trucking industry in 2022, a record high. The information comes from ATRI’s Cost of Congestion study, which is part of the organization’s ongoing highway performance measurement research.
Total hours of congestion fell slightly compared to 2021 due to softening freight market conditions, but the cost of operating a truck increased at a much higher rate, according to the research. As a result, the overall cost of congestion increased by 15% year-over-year—a level equivalent to more than 430,000 commercial truck drivers sitting idle for one work year and an average cost of $7,588 for every registered combination truck.
The analysis also identified metropolitan delays and related impacts, showing that the top 10 most-congested states each experienced added costs of more than $8 billion. That list was led by Texas, at $9.17 billion in added costs; California, at $8.77 billion; and Florida, $8.44 billion. Rounding out the top 10 list were New York, Georgia, New Jersey, Illinois, Pennsylvania, Louisiana, and Tennessee. Combined, the top 10 states account for more than half of the trucking industry’s congestion costs nationwide—52%, according to the research.
The metro areas with the highest congestion costs include New York City, $6.68 billion; Miami, $3.2 billion; and Chicago, $3.14 billion.
ATRI’s analysis also found that the trucking industry wasted more than 6.4 billion gallons of diesel fuel in 2022 due to congestion, resulting in additional fuel costs of $32.1 billion.
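For readers following the arithmetic, the reported totals hang together roughly as follows (reader's math, not ATRI's published method):

```python
# Reader's math on the figures above, not ATRI's methodology.
total_cost = 108.8e9        # 2022 congestion cost to trucking
per_truck = 7_588           # average cost per registered combination truck
wasted_gallons = 6.4e9      # diesel wasted in congestion
fuel_cost = 32.1e9          # added fuel cost

print(f"Implied combination trucks: {total_cost / per_truck:,.0f}")                   # roughly 14 million
print(f"Implied average diesel price: ${fuel_cost / wasted_gallons:.2f} per gallon")  # about $5
```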
ATRI used a combination of data sources, including its truck GPS database and Operational Costs study benchmarks, to calculate the impacts of trucking delays on major U.S. roadways.
There’s a photo from 1971 that John Kent, professor of supply chain management at the University of Arkansas, likes to show. It’s of a shaggy-haired 18-year-old named Glenn Cowan grinning at three-time world table tennis champion Zhuang Zedong, while holding a silk tapestry Zhuang had just given him. Cowan was a member of the U.S. table tennis team that participated in the 1971 World Table Tennis Championships in Nagoya, Japan. As the story goes, one morning he overslept, missed his bus to the tournament, and had to hitch a ride with the Chinese national team, where he met and connected with Zhuang.
Cowan and Zhuang’s interaction led to an invitation for the U.S. team to visit China. At the time, the two countries were just beginning to emerge from a 20-year period of decidedly frosty relations, strict travel bans, and trade restrictions. The highly publicized trip signaled a willingness on both sides to renew relations and launched the term “pingpong diplomacy.”
Kent, who is a senior fellow at the George H. W. Bush Foundation for U.S.-China Relations, believes the photograph is a good reminder that some 50-odd years ago, the economies of the United States and China were not as tightly interwoven as they are today. At the time, the Nixon administration was looking to form closer political and economic ties between the two countries in hopes of reducing chances of future conflict (and to weaken alliances among Communist countries).
The signals coming out of Washington and Beijing are now, of course, much different than they were in the early 1970s. Instead of advocating for better relations, political rhetoric focuses on the need for the U.S. to “decouple” from China. Both Republicans and Democrats have warned that the U.S. economy is too dependent on goods manufactured in China. They see this dependency as a threat to economic strength, American jobs, supply chain resiliency, and national security.
Supply chain professionals, however, know that extricating ourselves from our reliance on Chinese manufacturing is easier said than done. Many pundits push for a “China + 1” strategy, where companies diversify their manufacturing and sourcing options beyond China. But in reality, that “plus one” is often a Chinese company operating in a different country or a non-Chinese manufacturer that is still heavily dependent on material or subcomponents made in China.
This is the problem when supply chain decisions are made on a global scale without input from supply chain professionals. In an article in the Arkansas Democrat-Gazette, Kent argues that, “The discussions on supply chains mainly take place between government officials who typically bring many other competing issues and agendas to the table. Corporate entities—the individuals and companies directly impacted by supply chains—tend to be under-represented in the conversation.”
Kent is a proponent of what he calls “supply chain diplomacy,” where experts from academia and industry from the U.S. and China work collaboratively to create better, more efficient global supply chains. Take, for example, the “Peace Beans” project that Kent is involved with. This project, jointly formed by Zhejiang University and the Bush China Foundation, proposes balancing supply chains by exporting soybeans from Arkansas to tofu producers in China’s Yunnan province, and, in return, importing coffee beans grown in Yunnan to coffee roasters in Arkansas. Kent believes the operation could even use the same transportation equipment.
The benefits of working collaboratively—instead of continuing to build friction in the supply chain through tariffs and adversarial relationships—are numerous, according to Kent and his colleagues. They believe it would be much better if the two major world economies worked together on issues like global inflation, climate change, and artificial intelligence.
And such relations could play a significant role in strengthening world peace, particularly in light of ongoing tensions over Taiwan. Because, as Kent writes, “The 19th-century idea that ‘When goods don’t cross borders, soldiers will’ is as true today as ever. Perhaps more so.”
Hyster-Yale Materials Handling today announced its plans to fulfill the domestic manufacturing requirements of the Build America, Buy America (BABA) Act for certain portions of its lineup of forklift trucks and container handling equipment.
That means the Greenville, North Carolina-based company now plans to expand its existing American manufacturing with a targeted set of high-capacity models, including electric options, that align with the needs of infrastructure projects subject to BABA requirements. The company’s plans include determining the optimal production location in the United States, strategically expanding sourcing agreements to meet local material requirements, and further developing electric power options for high-capacity equipment.
As a part of the 2021 Infrastructure Investment and Jobs Act, the BABA Act aims to increase the use of American-made materials in federally funded infrastructure projects across the U.S., Hyster-Yale says. It was enacted as part of a broader effort to boost domestic manufacturing and economic growth, and mandates that federal dollars allocated to infrastructure – such as roads, bridges, ports and public transit systems – must prioritize materials produced in the USA, including critical items like steel, iron and various construction materials.
Hyster-Yale’s footprint in the U.S. is spread across 10 locations, including three manufacturing facilities.
“Our leadership is fully invested in meeting the needs of businesses that require BABA-compliant material handling solutions,” Tony Salgado, Hyster-Yale’s chief operating officer, said in a release. “We are working to partner with our key domestic suppliers, as well as identifying how best to leverage our own American manufacturing footprint to deliver a competitive solution for our customers and stakeholders. But beyond mere compliance, and in line with the many areas of our business where we are evolving to better support our customers, our commitment remains steadfast. We are dedicated to delivering industry-leading standards in design, durability and performance — qualities that have become synonymous with our brands worldwide and that our customers have come to rely on and expect.”
In a separate move, the U.S. Environmental Protection Agency (EPA) also gave its approval for California to advance its Heavy-Duty Omnibus Rule, which is crafted to significantly reduce smog-forming nitrogen oxide (NOx) emissions from new heavy-duty, diesel-powered trucks.
Both rules are intended to deliver health benefits to California citizens affected by vehicle pollution, according to the environmental group Earthjustice. If the state gets federal approval for the final steps to become law, the rules mean that cars on the road in California will largely be zero-emissions a generation from now, in the 2050s, accounting for the average lifespan of internal combustion engine (ICE) vehicles sold before that 2035 date.
“This might read like checking a bureaucratic box, but EPA’s approval is a critical step forward in protecting our lungs from pollution and our wallets from the expenses of combustion fuels,” Paul Cort, director of Earthjustice’s Right To Zero campaign, said in a release. “The gradual shift in car sales to zero-emissions models will cut smog and household costs while growing California’s clean energy workforce. Cutting truck pollution will help clear our skies of smog. EPA should now approve the remaining authorization requests from California to allow the state to clean its air and protect its residents.”
However, the truck drivers' industry group Owner-Operator Independent Drivers Association (OOIDA) pushed back against the federal decision allowing the Omnibus Low-NOx rule to advance. "The Omnibus Low-NOx waiver for California calls into question the policymaking process under the Biden administration's EPA. Purposefully injecting uncertainty into a $588 billion American industry is bad for our economy and makes no meaningful progress towards purported environmental goals," OOIDA President Todd Spencer said in a release. "EPA's credibility outside of radical environmental circles would have been better served by working with regulated industries rather than ramming through last-minute special interest favors. We look forward to working with the Trump administration's EPA in good faith towards achievable environmental outcomes."
Editor's note: This article was revised on December 18 to add reaction from OOIDA.
Starboard, a Toronto-based startup that provides AI-powered logistics solutions, has raised $5.5 million in seed funding to support its concept of creating a digital platform for global trade, the company announced.
The round was led by Eclipse, with participation from previous backers Garuda Ventures and Everywhere Ventures. The firm says it will use its new backing to expand its engineering team in Toronto and accelerate its AI-driven product development to simplify supply chain complexities.
According to Starboard, the logistics industry is under immense pressure to adapt to the growing complexity of global trade, which has faced recent hurdles such as the strike at U.S. East and Gulf Coast ports. That situation calls for innovative solutions to streamline operations and reduce costs for operators.
As a potential solution, Starboard offers its flagship product, which it defines as an AI-based transportation management system (TMS) and rate management system that helps mid-sized freight forwarders operate more efficiently and win more business. More broadly, Starboard says it is building the virtual infrastructure for global trade, allowing freight companies to leverage AI and machine learning to optimize operations such as processing shipments in real time, reconciling invoices, and following up on payments.
"This investment is a pivotal step in our mission to unlock the power of AI for our customers," said Sumeet Trehan, Co-Founder and CEO of Starboard. "Global trade has long been plagued by inefficiencies that drive up costs and reduce competitiveness. Our platform is designed to empower SMB freight forwarders—the backbone of more than $20 trillion in global trade and $1 trillion in logistics spend—with the tools they need to thrive in this complex ecosystem."