What Biden’s FLOW initiative needs to do right

To say that the supply chain and logistics industries are going through one of the toughest times in their history would be a massive understatement. But perhaps most concerning about the current situation is that many of the problems causing this crisis are not new. Instead, they’ve been swept under the rug for decades.

Perhaps more than any other industry, the supply chain and logistics sectors are synonymous with aversion to change and slow adoption of technology. And while it’s true that many organizations in these spaces have accelerated their digital transformation efforts since the onset of COVID, these industries may be more fragmented than ever, as companies are at different stages of technological maturity. This is especially true when it comes to sharing information and data between stakeholders in these two industries.

With that in mind, one could argue that the Biden administration’s Freight Logistics Optimization Works (FLOW) initiative — which aims to alleviate friction over data sharing in supply chain and logistics — couldn’t have come at a better time. However, for this pilot program to get it right, there are several data considerations that need to be addressed early on, or else FLOW could end up as another failed attempt to modernize America’s stumbling supply chain.

Here are a few areas in particular that need to be considered first.

The role of AI and where it fits

As the popularity of AI continues to grow, the AI supply chain market is expected to be worth over $14 billion by 2028. That said, with many supply chain and logistics companies just dipping their toes into AI waters, the Biden administration needs to find a way to reconcile the disparate data operations environment that exists in these industries today.

If it doesn’t, significant barriers will stand in the way of a reliable, standardized approach to sharing information and data. Additionally, if not managed properly, the initiative will likely create a tiered environment in which some companies are excluded from this new data sharing workflow while others benefit from it more easily.

Address data transparency

Data transparency remains a huge issue for many areas of the supply chain, especially when it comes to sourcing and on-time delivery (OTD). For the industry to operate as efficiently as possible, organizations must have access to all the data they need to communicate clearly with each other and optimize their own performance. Unfortunately, due to a variety of factors — ranging from opaque brokers to siloed internal technology infrastructures — this level of transparency simply does not exist in the supply chain today.

Focus on available capacity

Capacity is a good starting point for determining which areas most need better data and monitoring. The capacity landscape over the past two years has been defined by unpredictability, with long periods of low capacity suddenly interrupted by short periods of high capacity. This has made it incredibly difficult for the industry to plan ahead and operate with any semblance of consistency.

Additionally, this lack of capacity information has had real repercussions: carriers have had to rethink their routes and fuel consumption, leading to new backups at loading docks and higher costs associated with all of these variables. Finding ways to improve capacity data insights should therefore be a top priority.

There is certainly a lot of work to be done to improve data efficiency in the supply chain and logistics space. However, if the FLOW initiative can put this new data infrastructure in place, it could prove to be a huge success in moving the supply chain forward for years to come.
