Pilot Purgatory - Part 6
Pilot Purgatory
A Story of AI Transformation
By Scott Weiner (AI Lead at NeuEon, Inc.), inspired by conversations with Erwann Couesbot (CEO of FlipThrough.ai)
Note: This is a work of fiction. All characters, companies, and events are fictional composites created for illustrative purposes. While the industry statistics cited are real and sourced, the narrative is designed to illuminate common patterns in enterprise AI adoption, not to depict any actual organization or individuals.
This is Part 6 of a serialized story exploring why enterprise AI initiatives fail, not from lack of technology or talent, but from invisible organizational dynamics that doom them from the start.
Reading the series for the first time? Start with Part 1: The Mandate
Previously in Pilot Purgatory…
The screenshot arrived at 2:47 PM on a Thursday.
Gerald Patterson, a maintenance supervisor who’d used Thornfield equipment for twelve years, had asked the AI chatbot a simple question: “Does the extended warranty cover firmware updates?”
The chatbot responded confidently. Detailed. Professional. Completely wrong.
It invented warranty terms that didn’t exist. Section 4.3 of the actual warranty explicitly excluded firmware coverage. The AI just… made it up.
Gerald took a screenshot. Forwarded it to procurement. Bought 14 controllers based on that response. Three weeks later, a firmware update bricked two units. Gerald called for warranty replacement. Thornfield told him firmware wasn’t covered.
Gerald pulled up his screenshot.
The remediation cost $93,000. The forum post got 400 shares. The efficiency gains evaporated overnight.
Sarah, the security engineer, had flagged this risk four months earlier. Her recommendations sat in a backlog marked “deferred.”
The team met to discuss lessons learned. The other pilots—code review AI, predictive maintenance—died quietly, victims of inconsistent results and unexplainable predictions.
And in a late-night email exchange, Marcus and Sarah both arrived at the same recognition: something structural had to change…
Chapter 6: The Weight
The laptop screen dimmed at 11:47 PM, the automatic power saver activating while Marcus stared at a paragraph he had read four times without absorbing.
He reached for his coffee. The cup was cold, a skin forming on the surface that told him he had been sitting here longer than he realized. Through the reflection in the darkened screen, he could see his own face: a man who hadn’t blinked in minutes, trying to force information into a brain that had stopped accepting input hours ago.
The article summarized industry research on AI adoption failures. McKinsey had found that only about one-third of AI projects ever scale beyond pilot phase. Gartner’s data was even more sobering—they predicted 30% of generative AI projects would be abandoned after proof of concept by the end of 2025. The patterns described were starting to feel uncomfortably familiar.
“The most common failure mode is not technical,” Marcus read for the fifth time. “It is structural. Organizations attempt to bolt AI capabilities onto existing technology leadership structures, expecting 10% of a CTO’s attention to produce 100% of an AI transformation’s requirements.”
He highlighted the sentence. Then he unhighlighted it. Then he closed the laptop and sat in the darkness of his home office, the familiar silence of a house where everyone else had long since gone to bed.
Sixteen months. The AI initiative was sixteen months old now, and he was reading papers at midnight trying to understand why it felt like pushing a boulder uphill.
The coffee had gone cold. He drank it anyway.
Monday morning brought the kind of chaos that had become routine.
Marcus arrived at 7:30 to find fourteen Slack messages waiting, all from the integration team, all flagged urgent. Quartzvane had pushed an update over the weekend. The changelog had been characteristically brief: “Minor improvements to API response handling.”
The minor improvements had broken three integrations.
“The authentication handshake is completely different,” the lead integration engineer explained during the 8 AM emergency call. “They changed the token format without telling anyone, and every system that talks to Quartzvane is now getting 401 errors.”
“How long to fix?”
“If we drop everything else? A week. Maybe two, depending on what other surprises we find.”
Marcus thought about the project timeline he had presented to David last month. The milestones that were already slipping. The board meeting that was now ten weeks away.
“Drop everything else,” he said. “Fix it.”
He hung up and stared at the ceiling. Minor improvements. Three broken integrations. Two weeks of unplanned work.
The vendor had made a unilateral decision that cost Thornfield a month of engineering time, and there was nothing in the contract that made them accountable for it. Linda had negotiated the best terms she could, but enterprise software contracts didn’t anticipate the chaos of AI platforms that were still figuring out what they were.
He added “vendor update issue” to his status report and tried to remember what he had planned to accomplish that day before the emergency derailed everything.
The cloud cost meeting happened at 2 PM.
Marcus had been dreading it since the monthly report landed in his inbox. The AI infrastructure costs were running forty percent over projection, and he couldn’t explain why.
“It’s the experimentation workloads,” the infrastructure lead offered. “The team is spinning up GPU instances for model training, running experiments, then leaving resources allocated when they’re done.”
“Why aren’t they shutting things down?”
“Because the Quartzvane documentation says to keep inference endpoints warm for optimal response times. Everyone’s keeping everything warm. All the time.”
“At $2,000 a day in unused compute?”
The infrastructure lead shrugged. It wasn’t his budget to defend. He was just reporting what the systems showed.
Marcus pulled up the cost breakdown: GPU instances, data transfer, storage for model artifacts, logging and monitoring. Each line item was reasonable in isolation, but together they added up to a variance that would require explanation.
“We need to implement cost governance,” he said. “Automatic shutdown policies, budget alerts, resource tagging.”
“That’ll take engineering time.”
“I know.”
Engineering time they didn’t have. Engineering time that was currently being consumed by vendor updates that broke things, data quality issues that never got resolved, and the endless overhead of keeping systems running that were never quite ready for production.
He approved the cost overrun and made a note to mention it to David before the finance team did. Bad news traveled better when it came from the source.
Sarah appeared in his doorway at 4:30 with a paper in her hand.
“I found something you should see,” she said.
Marcus waved her in. The stress ball was somewhere on his desk, buried under documents he hadn’t finished reviewing. He didn’t bother looking for it.
Sarah handed him the paper. “Governance-First AI: How Structure Enables Speed.” The same title he had seen in his email weeks ago, now printed and highlighted with annotations in Sarah’s precise handwriting.
“I’ve been tracking citations,” she said. “This research is getting picked up by a lot of enterprise publications. The findings are consistent across industries.”
Marcus skimmed the abstract, with its discussion of dedicated AI leadership, governance frameworks, and structure enabling speed. The words felt both obvious and impossible.
“What are you suggesting?”
“I’m not suggesting anything. I’m showing you what the research says.” She paused. “The organizations that succeed are the ones that treat AI as a distinct discipline, not a side project for existing technology leaders.”
“We can’t just create a new executive position.”
“I know. But we also can’t keep doing what we’re doing and expecting different results.”
She left the paper on his desk and returned to her office. Marcus read the abstract again, more carefully this time.
The research described companies that had hired Chief AI Officers. Dedicated leaders whose only job was AI outcomes. People who woke up every morning thinking about governance, strategy, and execution. People who didn’t have to squeeze AI attention between cloud migrations and security audits and board presentations.
People who weren’t him.
He put the paper on the pile with the others and turned back to the cost report he still had to finish.
Linda’s meeting request arrived with a subject line that told him everything: “Quartzvane Enhancement Request – Pricing Discussion.”
He took the call in his office with the door closed.
“They want $150,000,” Linda said. “For a feature we assumed was included.”
“What feature?”
“Native integration with our ERP system. When we signed the contract, they said they had connectors for all major enterprise platforms. Turns out ‘connectors’ means basic data export. Real integration, the kind that actually works with our workflows, is a premium add-on.”
Marcus felt the familiar weight settle on his shoulders. “What do we get for $150,000?”
“Custom development. Six-month implementation. And, I quote, ‘ongoing support at our standard enterprise rates.’”
“Which are?”
“Another $40,000 a year.”
He ran the math. The Quartzvane contract was already $800,000 for two years. Add the integration enhancement, add the ongoing support, add the cost overruns from unexpected complexity, and they were approaching the total budget he had requested for the entire AI initiative.
And they still didn’t have a single system in full production.
“Can we build it ourselves?” he asked.
“The integration team says they could do it in four months. But that’s four months they’re not spending on other priorities. And they’re already underwater from the authentication change.”
“What do you recommend?”
Linda was quiet for a moment. When she spoke, her voice had an edge he didn’t hear often. “I recommend we acknowledge that I bought the wrong solution. I evaluated thirty-two vendors and picked the one with the best demo. The demos lied. The contracts didn’t protect us. And now we’re locked in for eighteen more months with a platform that’s costing more to integrate than it would have cost to build from scratch.”
“You didn’t have a way to know that.”
“That’s the problem, Marcus. Neither of us had a way to know. We didn’t have the expertise to evaluate AI vendors. We applied the same frameworks we use for ERP systems and security tools, and those frameworks don’t work for this.”
He didn’t have an argument. She was right.
“Tell them we’ll consider the enhancement and get back to them,” he said. “I need to think about this.”
“Sure. But Marcus? They know we need this integration. They know we’re locked in. The price isn’t going to go down.”
Priya was still at her desk at 9 PM.
Marcus found her there on his way out, her monitors showing the same data quality dashboards he had seen in her presentations months ago. The numbers were different now. Worse.
“You’re here late,” he said.
“I’m always here late.” She didn’t look up from her screen. “The procurement AI recommended ordering 200 units of a component that was recalled three years ago. The recall notice was in a PDF that never got ingested into the structured data. The AI didn’t know.”
“Did anyone catch it?”
“I caught it. At 7 PM. After I finished fixing the other three data issues I found this week.”
Marcus leaned against the doorframe. The engineering floor was empty except for the two of them, the overhead lights dimmed to save energy after hours.
“How bad is it?”
“The data? The same as I said six months ago: fifteen years of accumulated inconsistencies, missing fields, conflicting information, products that don’t exist, products with wrong specifications, documentation that contradicts the database.” She finally looked at him. “The AI reads all of it and makes predictions with the confidence of a system that doesn’t know its inputs are garbage.”
“What would it take to fix?”
“The data? A dedicated team. Six to twelve months. A mandate to pause AI deployments until the foundation is solid.” She turned back to her screen. “None of which we have.”
“I’m sorry.”
“Don’t be sorry. Be honest. With David. With the board. With yourself.” She started typing again. “We’re not behind because the technology doesn’t work. We’re behind because we’re trying to build a house on quicksand and wondering why the walls keep cracking.”
Marcus wanted to argue. He wanted to point to the progress they had made, the models that were working, the efficiency gains that existed even if they were smaller than promised.
But standing in the dim light of an empty engineering floor, watching Priya clean up data errors that would generate new AI failures by the end of the week, he couldn’t find the words.
The fourth engineer left on a Thursday.
Kevin Rodriguez, the infrastructure specialist who had been managing the Quartzvane deployment. His exit interview was brief and polite, the kind of conversation that happened when someone had already made peace with their decision.
“I’m going to Strathmore,” he said. “They’ve got an AI team of twenty people with dedicated roles, clear ownership, and no context switching.”
“Strathmore.” The competitor. The one Marcus had heard about on the radio, the one announcing success while Thornfield was still trying to figure out the basics.
“They’ve got a Chief AI Officer,” Kevin continued. “Someone whose whole job is making AI work. Not a CTO trying to fit it in between everything else.”
Marcus nodded. There was nothing to say that he hadn’t already said to Daniel, to Maria, to James. The pattern was clear now, visible in departure after departure.
“Good luck,” he said.
Kevin shook his hand and left. The fourth engineer in eleven months. The fourth set of skills trained on Thornfield’s budget and deployed elsewhere.
Marcus sat in his office after Kevin was gone, staring at the empty chair across his desk. The stress ball was in his hand, but he wasn’t squeezing it—just holding it.
He thought about the paper Sarah had given him. About the research claiming that as many as 95% of AI pilots stalled before production. About the structural problem that couldn’t be solved with more effort or better planning.
He almost said it aloud: “I’ve been solving this like an infrastructure problem.”
But he didn’t. Not quite. The recognition was forming, but the words weren’t ready yet.
David’s calendar hold appeared on Friday morning: “Check-in: AI Initiative Status.”
The subject line was neutral. The timing was not. David had started scheduling these conversations more frequently, the casual drop-ins replaced by formal meetings that left documentation trails.
They met in David’s office at 3 PM. The CEO’s space was larger than Marcus’s, but David had a way of making it feel like a conversation rather than a summons.
“Talk to me,” David said. “Not the board version. The real version.”
Marcus considered his options. He could give the optimistic spin, the progress narrative that emphasized achievements over obstacles. He had been doing that for months, and it was getting harder to believe.
“We’re stuck,” he said.
David waited.
“The chatbot incident set us back three months, the vendor integration is more complex than anyone estimated, we’ve lost four engineers to competitors who can offer them pure AI work, and the data foundation isn’t ready for the models we’re trying to build on top of it.” He paused. “And I don’t have a clear path to production that I believe in.”
“What would make it different?”
The question echoed Sarah’s paper, Jennifer’s warning from the first board meeting, and everything Marcus had been avoiding since the pattern became clear.
“Focus,” he said. “Dedicated attention. Someone whose job is making AI work, not someone trying to fit it in around everything else.”
“Someone like a Chief AI Officer?”
Marcus looked at David sharply. “Jennifer’s been talking to you.”
“Jennifer talks to everyone. That’s her job.” David leaned back in his chair. “She sent me a research paper last month. Something about governance-first AI.”
“I’ve seen it.”
“What do you think?”
“I think the research is right. I think we’ve been approaching this wrong. And I think I don’t know how to fix it without changing something fundamental about how we’re structured.”
David was quiet for a long moment. The afternoon sun slanted through his windows, casting long shadows across a desk covered with the artifacts of running a manufacturing company.
“The board meeting is in eight weeks,” he said finally. “Jennifer will ask questions. Hard ones. We need answers.”
“I know.”
“Not the ‘we’re making progress’ answers. Real answers. About what went wrong and what we’re going to do differently.”
Marcus nodded. The conversation felt different than the ones before. Less about managing perceptions and more about confronting reality.
“I’ll have something,” he said. “Something honest.”
“That’s all I’m asking.”
Marcus left David’s office and walked back to his own. The engineering floor was busy with the usual Friday energy, people wrapping up tasks before the weekend, conversations about plans and projects and the work that never quite got finished.
He closed his door and pulled up the board presentation template. The slides from the last meeting were still there, filled with projections that hadn’t materialized and timelines that had already slipped.
He deleted them all and started with a blank page.
Eight weeks. He had eight weeks to figure out what he was going to say to Jennifer. Eight weeks to articulate a problem that was easier to feel than to name.
On his desk, the paper Sarah had given him sat on top of the pile. One phrase, circled in her precise annotations, stood out: “pilot purgatory.”
Someone had named it. The thing they had been living for sixteen months. The pattern that explained everything.
He just hadn’t been ready to use the word until now.
To be continued…
What happens next:
The board meeting arrives. Marcus finally names the problem—$1.4 million, eighteen months, and zero systems in full production. Jennifer asks the question that changes everything: “If you could start over, knowing what you know now, what would you do differently?” Chapter 7 follows the moment when pilot purgatory finally breaks, and the path forward becomes clear.
Part 7 publishes February 11, 2026.
The 7 Patterns That Kill AI Initiatives
This chapter reveals seven patterns that consistently predict AI failure:
- The Attention Trap: 10% of executive attention cannot produce 100% of AI transformation requirements
- The Vendor Lock-In: Demos that work become contracts that constrain
- The Cost Creep: Each reasonable line item adds up to unreasonable variance
- The Talent Exodus: Engineers leave for companies with dedicated AI structures
- The Data Quicksand: 15 years of inconsistencies can’t support AI predictions
- The Infrastructure Problem Fallacy: AI isn’t like ERP migration; it requires different leadership
- The Recognition Gap: The pattern is easier to feel than to name
Why we wrote this
Scott Weiner is the AI Lead at NeuEon, Inc., where he helps organizations navigate the complexities of AI adoption and digital transformation. This story draws from patterns observed across dozens of enterprise AI initiatives.
Erwann Couesbot is the CEO of FlipThrough.ai, specializing in AI strategy for professional services. His conversations with technology leaders inspired many of the dynamics explored in this narrative.
Reading the series for the first time? Start with Part 1: The Mandate
Missed Part 2? Read Chapter 2: Foundations
Missed Part 3? Read Chapter 3: Procurement
Missed Part 4? Read Chapter 4: Departure
Missed Part 5? Read Chapter 5: The Incident
Want to read the complete story?
Have your own AI transformation story? We’d love to hear it. Connect with Scott on LinkedIn or reach out to NeuEon at neueon.com/contact.
