74% of companies that have adopted AI report no measurable value from that investment. That's BCG's finding after surveying more than 1,000 executives across 59 countries (2024). Not because AI doesn't work. Because most businesses pick the wrong partner.
Most AI partners sell a platform or a methodology. They start with the tool, not your problem. That's exactly where things break down. Here's what to look for — and the warning signs you should recognize immediately.
Key takeaways
- 74% of companies get no measurable value from AI investments (BCG, 2024)
- 30% of GenAI pilots are abandoned after proof of concept (Gartner, 2024)
- BCG: 70% of implementation effort should go to people and processes, not technology
- The right AI partner starts with your problem — and tests manually before building anything
Why do most companies get nothing from AI?
Only 28% of AI projects meet their expected return on investment, according to a Gartner survey of 782 executives in late 2025. 20% fail outright. The rest stall somewhere in between — something got built, but nobody uses it.
NTT DATA put the number even higher: 70 to 85% of GenAI deployments fail to deliver expected outcomes. That's not bad luck. It's a pattern.
BCG investigated why. The answer was simpler than expected: companies put too much effort into technology and too little into the people and processes around it. Successful implementations followed a 70-20-10 split — 70% people and processes, 20% data and technology, 10% algorithms. Most failed implementations did the opposite.
That split explains why many AI partners don't deliver. They hand over the technology, leave you to figure out how it lands in your organization, and send an invoice.
What a good AI partner does differently
A good partner doesn't start with a demo of their platform. They start with questions. Which processes take the most time? Where do most errors happen? What would change if that problem was solved?
Only once the problem is clear does the question of tooling come up. Sometimes the answer isn't an AI solution at all. Sometimes it's a simple automation using existing software. That's also an answer you should expect from a good partner.
How I approach it: Before a single line of code gets written, I test the process manually. I do it myself, with the AI tool, in the client's actual work environment. If it doesn't work smoothly by hand, we'd be building on a broken foundation. Most agencies skip that step.
Manual testing sounds obvious. But it saves weeks of building the wrong thing. Most pilot projects that fail were never tested in the real work environment of the business. They were tested in a demo environment, with clean data, by people who already knew the tool.
A good partner is also honest about timelines. Deloitte's 2025 research found the average payback period for AI investments sits between two and four years. Not the seven to twelve months that vendors prefer to quote. If someone tells you your investment pays back within a year, ask what assumption sits behind that number.
Five questions to ask before you commit
The OECD's December 2025 report on AI adoption by SMEs found that 43% of small businesses cite lack of technical maturity as a barrier, and 35% point to a shortage of skills and data literacy. That uncertainty makes it hard to evaluate a partner. These five questions cut through it.
1. How do you start a project?
The right answer: with an analysis of the problem, not a presentation of the solution. If the first meeting is a product demo, that tells you everything.
2. Can you give an example of a project that didn't work?
Every honest agency has those examples. What you want to hear is what went wrong and what they learned from it. If everything always succeeds, something doesn't add up.
3. What do you need from our team?
AI implementations require something from your people. Who's the point of contact? How many hours per week? If a partner says you barely need to be involved, remember BCG's finding: 70% of the effort belongs with your people and processes.
4. How do we measure success?
Good partners want to agree on this before they start, not after. They suggest a baseline measurement. If this question doesn't get a concrete answer, it'll be hard to judge later whether it worked.
5. What happens if it doesn't work?
Can you stop the engagement? Are there contractual commitments for a year or more? What's the exit scenario? A partner who won't discuss this builds their business model on your dependency.
Red flags to recognize immediately
Gartner predicted in 2024 that 30% of GenAI initiatives would be abandoned after the proof-of-concept phase. Poor data quality, escalating costs, and unclear business value were the three most cited reasons. Many of those projects could have been stopped during the selection phase if the client had spotted these signals.
They lead with a platform pitch. If the first sentence in the conversation is about their own platform, their own methodology, or their own AI model, your processes will be shaped around their tool — not the other way around.
No references in your sector. AI implementation in a logistics company is different from implementation in a professional services firm. Processes, data, terminology — everything differs. Ask specifically for clients in your industry, not a general client list.
Guarantees on results. "Minimum 30% time savings, guaranteed" is a sales promise. Results depend on data quality, human adoption, and how well the problem was defined. Any honest partner says that.
Long-term contracts before the pilot. A serious partner is willing to start with a small project and no long-term obligation. If a year-long contract lands on the table before the first phase, the business model likely isn't built around your results.
What to expect in the first month
Good partnerships start small. One process, one problem, one measurement. Not the whole organization at once. If your partner comes out of the first meeting with a six-month project plan, the scope is bigger than what you actually know at that point.
After the first month, you should know: does the approach work in your real environment? Are you actually saving time on the chosen process? Does the working relationship feel right in terms of communication and pace?
If the answer to any of those is no, you want the option to stop or adjust without major contractual consequences. That flexibility in the first phase isn't a weakness of the partner — it's a sign of confidence in the approach.
What I see in companies that start well: They pick one administrative process that costs them at least five hours a week, they measure those five hours before starting, and they accept that the first version might only save three. Then they optimize. Companies that expect everything to work perfectly from day one stop the fastest.
Want to know what AI can do for your business?
I run a free scan of your website and give you an honest view of which processes have real potential. No sales pitch, no platform demo.
Book a free strategy call
Frequently asked questions about choosing an AI partner
How do you know if an AI agency has SME experience?
Ask specifically for clients with fewer than 50 employees in a comparable sector. Large agencies typically work with enterprise clients and apply that same approach to smaller businesses — resulting in processes that are too heavy and costs that are too high. Also ask who you'd actually work with: a consultant or an account manager who passes things on?
Is hiring an AI partner more expensive than doing it yourself?
That depends on the time you have internally. The OECD's 2025 SME report found that 43% of small businesses cite lack of technical maturity as a barrier. If you don't have that expertise in-house, doing it yourself costs more time than bringing in a partner — especially one who works per process rather than per year.
Do you need a long-term contract with an AI partner?
No. A pilot or first project shouldn't require a long-term commitment. Contracts longer than three months are hard to justify for a first engagement. Serious partners let their work speak for itself and don't want to hold onto a client who isn't satisfied.
What if AI turns out not to work for my business?
That's also a valid answer. Not every business and not every process is ready for AI implementation. Data quality, process clarity, and team buy-in all influence whether implementation has a realistic chance of success. A good partner also says "not yet" when that's more honest than starting a project that won't work.
Conclusion
The question isn't whether AI works. That's settled. The question is whether you find the right partner to help you implement it properly. That means someone who starts with your problem, tests manually before building, is honest about timelines and costs, and doesn't profit from your dependency.
74% get no value. But 26% do. That 26% started with the right questions.
Read next: AI implementation for SMEs: an honest step-by-step guide that actually works