Today, we’re massively excited to announce our investment in nextmv, which gives developers the building blocks to quickly create, test, and deploy supply chain algorithms. nextmv's decision modeling, simulation, and optimization tools let software engineers in the supply chain make automated decisions related to routing, scheduling, and assignment that would normally require a deep understanding of data science.
We were initially introduced to Carolyn Mooney and Ryan O'Neil, the founders of nextmv, by our good friend Alex Iskold at 2048 Ventures, who had invested in them only a few months earlier. Alex’s feedback on the team was exceptional, and we soon came to agree with his assessment.
While it’s easy to highlight legacy businesses with highly siloed, spaghetti-code systems and their struggle to recruit and retain high-quality data scientists and engineers, emerging startups also find it hugely difficult to maintain complex optimization environments. Both legacy and emerging businesses in the supply chain operate on exceptionally tight margins, making supply chain optimization critical to unit economics and to building a profitable organization that meets and exceeds customer expectations. Many businesses in the supply chain are losing money to suboptimal resource allocation because they have little to no optimization capability. As data continues to grow at an exponential pace, more and more businesses can leverage it to become more algorithmically optimized.
Broadly speaking, businesses across the supply chain are increasingly demanding real-time optimization to maximize their bottom line. Further, there is a knowledge gap between software engineers, the business need, and the “deep calculus” required for supply chain optimization. nextmv allows organizations to use optimization and simulation to improve on existing predictions and forecasts with confidence.
nextmv not only improves a software engineer’s productivity but also creates greater transparency around how decisions are made. Every organization has a different approach to optimization, each with different priorities and criteria. Being able to develop a “personalized” outcome will become increasingly important, rather than relying on an external “black box” solution.
Those companies that do try to optimize decisions tend to build custom experiment infrastructure in-house. Development frequently stagnates due to limited resources, a lack of in-house expertise, and the difficulty of supporting such systems on an ongoing basis.
Carolyn and Ryan both have deep technical knowledge and direct experience building decisioning engines, most recently at Grubhub, where they ran into the problem of running repeatable experiments at scale. The process was highly manual, with long lead times, and required building in-house infrastructure to streamline and optimize decision-making models. By the way, Carolyn is literally a rocket scientist, having worked in ballistic missile engineering at Lockheed Martin, while Ryan (or Dr. Ryan) has a PhD in Operations Research.
We could not be more excited for nextmv to be joining the Dynamo family and look forward to being part of their journey over the years to come.