SpanForge Failure Funnel™
An illustrative representation of where enterprise AI initiatives are typically lost across the lifecycle. Based on publicly reported adoption data from S&P Global Market Intelligence (2025). Specific drop-off rates at each stage are illustrative.
Five numbers that define the crisis
Where initiatives are lost
Most are lost before technical limitations are ever tested. The S&P Global data shows that on average, organisations scrapped 46% of their AI proof-of-concepts before reaching production. Cost, data privacy, and security risks were cited as top obstacles.
The funnel maps to the five stages of the SpanForge Exit Gate System™.
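The compounding effect behind a funnel like this can be sketched in a few lines. The stage names and per-stage drop-off rates below are hypothetical placeholders for illustration only, not figures from the S&P Global survey:

```python
# Illustrative funnel sketch: modest per-stage drop-off rates compound
# into a large overall loss. All rates here are invented placeholders.

stages = [
    ("Ideation",         0.20),  # hypothetical share of initiatives lost here
    ("Proof of concept", 0.30),
    ("Pilot",            0.25),
    ("Production",       0.15),
    ("Scale",            0.10),
]

surviving = 1.0
for name, drop in stages:
    surviving *= (1.0 - drop)          # survivors compound multiplicatively
    print(f"{name:18s} survivors: {surviving:6.1%}")

print(f"Overall loss: {1.0 - surviving:.1%}")
```

Even though no single stage loses more than 30% in this sketch, fewer than a third of initiatives survive all five gates, which is why stage-by-stage measurement matters more than any one drop-off number.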
What the data actually shows
The S&P Global Market Intelligence survey of 1,006 enterprises covers sophisticated organisations with dedicated AI budgets, data science teams, and executive mandates. Approximately 42% reported abandoning the majority of their AI initiatives in 2025, up from 17% in 2024. An abandonment rate that more than doubles in a single year is not a trend; it is a structural collapse.
The McKinsey State of AI 2025 survey of 1,993 organisations across 105 countries reinforces this from a different angle: only 39% report any EBIT impact at the enterprise level. Widespread adoption has not translated into widespread delivery.
Initiatives in unstructured exploration phases — proof-of-concept work without defined success criteria — account for a disproportionate share of casualties. This pattern is consistent with the governance gaps identified across the sources cited in this analysis.
“AI is not failing in the lab — it is failing at the handoff to reality.”
The four accelerants
Four structural conditions explain why the abandonment rate is accelerating rather than declining as AI tooling matures.
Capability inflation
Model releases have been so rapid that organisations chronically restart pilots to incorporate the "latest" version, resetting the maturity clock each time.
Governance vacuum
Most enterprises bolted AI onto governance frameworks designed for deterministic software. Gartner (2025) found that 63% of organisations either lack, or are unsure whether they have, the right data management and governance practices to support AI, creating unresolvable approval loops and decision gaps.
Talent dilution
Demand for AI practitioners has surged, but methodological rigour has not kept pace with tool proliferation.
Expectation misalignment
Executive sponsors hold timelines disconnected from enterprise integration realities. When delivery diverges from those timelines, projects are cancelled rather than recalibrated.
References
S&P Global Market Intelligence. (2025). Voice of the Enterprise: AI & Machine Learning, Use Cases 2025. Survey of 1,006 midlevel and senior IT and line-of-business professionals across North America and Europe, conducted October–November 2024.
Gartner, Inc. (2024, July). Gartner Predicts 30% of Generative AI Projects Will Be Abandoned After Proof of Concept By End of 2025. Gartner Data & Analytics Summit, Sydney.
Gartner, Inc. (2025, February). Lack of AI-Ready Data Puts AI Projects at Risk.
McKinsey & Company. (2025). The State of AI in 2025: Agents, Innovation, and Transformation. Survey of 1,993 participants across 105 nations, conducted June–July 2025.