Something actually changed
Not the hype, which was already everywhere in 2024. What changed in late 2025 is different. AI tools crossed a threshold. They went from "mostly works" to "almost always does exactly what you want." Developers who use them every day started reporting this on their own, independently of each other.
One developer now writes 95% of his code from his phone. Another hasn't typed code manually since December 2025. Those aren't predictions. They're reports from people doing it.
That changes the question. It's no longer whether to use AI. The question now is harder: how do you lead a mixed team where people and AI agents are working side by side, mostly doing the right thing, but occasionally needing you to step in?
This article pulls together what 60+ research papers and reports say about that question, adds a layer of personality analysis, and ends with 32 conclusions worth knowing.
A lot of activity. Less actual progress.
Almost everyone is using AI. Most organizations are deploying AI agents. And yet the majority of CEOs say they haven't seen a real productivity gain. That gap between adoption and results is where the real conversation starts.
Sources: Microsoft Work Trend Index, BCG AI Adoption Report, World Economic Forum 2025/26
That last number is the most immediate problem. 84% of executives expect AI agents on their team. Only 26% of workers have had any training. The agents are coming. The preparation is not.
AI makes you faster. Not the team.
The individual numbers are real. People produce more. They write faster, research faster, think through more options. A two-year study of a software company found gains of around 50% in individual output. That's not nothing.
But the same study found something else. The team dynamics stayed exactly the same. The same accountability gaps. The same communication problems. The same trust issues between colleagues. AI made everyone faster. It didn't make them work better together.
What got better
- 50% more output per person
- 40% better on creative tasks
- Faster research and writing
- More analytical depth
What stayed the same
- Who owns decisions
- How people communicate
- Trust between colleagues
- Collaboration bottlenecks
The team problems you had before AI are still there. They just move faster now.
67% trust AI more than their own colleagues
That's a real finding from a 2025 survey across multiple countries. It matters because trust is one of the main levers that determines whether AI helps a team or creates problems. Too much trust in AI output, and people stop checking what it produces. Too little trust, and they spend more time verifying than they save.
Too much trust
- Accepting AI output without questioning it
- Gets worse under time pressure
- Output feels right before it is right
- Mistakes don't get caught
Too little trust
- Checking everything, eliminating the gains
- Setting the bar too high for AI to actually help
- Not using AI in areas it would help most
- Same old bottlenecks, different justification
Both patterns appear in the same organization, in different people. The organizations getting ahead are working on getting the balance right, not setting a policy for or against.
Skills you stop using, you lose
Doctors who used AI to help with diagnoses performed measurably worse when the AI was removed. Essay writers couldn't remember what they'd written minutes after finishing AI-assisted work. This isn't a theory. It's what happens in practice, and it builds slowly without anyone noticing.
14% of heavy AI users already experience measurable cognitive fatigue. Junior employees stop developing when their senior colleagues use AI without them. These things don't show up in productivity metrics.
Culture is 70% of the result
BCG looked at ten thousand employees across eleven countries. Their conclusion: 70% of what determines whether AI transformation actually works is people, culture, and process. The technology itself is 30%. Most organizations get this backwards.
The question for a founder isn't which AI tools to adopt. It's: what kind of team do we need to be for any of this to actually change something?
How you work with AI is how you already work
The way someone uses AI is not new behavior. It's an extension of patterns they already have with people, with information, and under pressure. The same blind spots show up. The same strengths show up. Just faster, and at larger scale.
Looking at 16 personality types from the PPA© model, four clear patterns emerge. One per quadrant. Each with its own way of over-using AI and one specific thing that goes wrong.
Four quadrants. Four things that go wrong.
The trust range across personality types is wide
Some types accept AI output with almost no critical eye. Others question everything. The types that actually get good results with AI sit in the middle. Not blindly trusting, not paranoid. They check what matters and move on.
What each quadrant actually needs to work on
Vision types (Identify, Enable, Imagine, Unite): Learn to commit to a direction before jumping to the next idea. AI gives you more options. You need fewer, not more. This is the same work with or without AI.
Strategy types (Activate, Form, Realize, Decide): The measure of good strategic thinking is not how sophisticated the plan is. It's how many people can pick it up and run with it. AI makes the gap between you and the rest of the team wider, not smaller.
Action types (Reveal, Excite, Act, Adapt): Speed is only worth it if the team can absorb what it learns. Slow down the review cycle, not the build cycle. The problem is not that you build too fast. The problem is that you don't stop to notice what the speed is telling you.
Structure types (Manage, Control, Sustain, Secure): The skill is knowing when to defend the current system and when to challenge it. AI cannot make that judgment. That one stays with you.
The leadership work is the same. The speed changed.
Here's what the research and the personality data say together: working on someone's usual pattern is working on their AI pattern. They are the same thing. If a Vision type learns to commit to a direction before jumping to the next idea, that holds whether they're working with a person or an AI model.
You're not managing a new relationship between your people and technology. You're managing the same dynamics you've always managed, now with something that accelerates them.
The founders getting ahead of this are not doing it with better tools. They're doing it by understanding their team more clearly, and acting on what they see.
AI makes the gaps in your team bigger
A team that's already short on commercial energy gets even shorter on it. A team that's already bad at challenging ideas gets worse at it. AI amplifies whatever's already dominant. The blind spots you had before are the same blind spots, moving faster.
Who covers the four quadrants in your team is more important now, not less.
Your culture decides what AI does here
A team that avoids hard conversations doesn't get more direct because you add AI. The avoidance gets faster, smoother, and harder to name. Teams with low trust produce AI-assisted work with the same accountability gaps and communication problems as before.
Founders who haven't taken a clear look at their team's dynamics will find AI revealing it for them, usually at an inconvenient speed. The tools don't create the problems. They just make them visible faster.
Culture is 70% of the transformation. That's BCG's number across ten thousand employees. It keeps getting ignored because it's harder to measure than a tool rollout.
Four things people won't hand over
Across all 16 personality types, four things stay irreplaceable. Every type holds at least one of these as a boundary. As a leader, those boundaries are worth paying attention to. They usually point at something real about what matters to that person.
What the research actually says
Short titles. One sentence each. No filler.
01. No change: 89% of CEOs saw no real productivity gain from all this AI adoption.
02. Collective gap: AI improves individual performance. It does nothing for how teams work together.
03. Amplification: AI makes existing team patterns faster, bigger, and harder to see.
04. Same failure: Your AI failure mode is your human failure mode. They are the same thing.
05. Social role: AI performs the emotional and relational role of a teammate. It doesn't replace it.
06. Culture first: What AI does in your company depends entirely on your culture.
07. 70/30: Culture and process are 70% of the result. The technology is 30%.
08. Problem first: When AI does the execution, how clearly you defined the problem is everything.
09. Preparation gap: 84% of executives expect AI agents. 26% of workers have had any training for it.
10. Who owns it: When AI is involved, nobody agrees on who is responsible when something goes wrong.
11. Trust gap: 67% of workers trust AI output more than their own colleagues.
12. Deskilling: Skills AI handles for you consistently, you gradually stop being able to do yourself.
13. Cognitive load: 14% of heavy AI users now experience measurable cognitive fatigue.
14. Junior impact: Junior employees stop developing when senior colleagues use AI without them.
15. Team diversity: AI amplifies the dominant quadrant. Missing perspectives get worse, faster.
16. Echo chamber: Vision types use AI to confirm what they already believe.
17. Monument: Strategy types with AI build work that nobody else can follow or continue.
18. Speed trap: Action types adopt AI fastest and learn least from the output.
19. Wrong fix: Structure types use AI to perfect systems that should actually be changed.
20. Human line: Every person holds something they will not hand over to AI.
21. Motivation: AI collaboration improves output but reduces intrinsic motivation over time.
22. Workload gap: 96% of C-suite expect AI to reduce workload. 77% of employees say it grew.
23. Leader stress: 71% of leaders report more stress, not less, navigating AI transformation.
24. Less cohesion: Teams working with AI show lower cohesion than teams working without it.
25. False signal: AI output that looks good can hide the fact that the team isn't actually aligned.
26. Agent ratio: Jensen Huang projects 100 people managing 10,000 AI agents in large organizations.
27. Usage gap: 75% of leaders use AI regularly. Only 50% of frontline employees do.
28. Definition: When AI executes, how precisely you defined what you wanted determines everything.
29. Influence: As technical skills become commodities, the ability to influence and persuade gets scarcer.
30. Governance: Most organizations build AI governance after the fact, not as part of the infrastructure.
31. Review cost: 37-40% of the time AI saves gets spent checking and correcting its output.
32. The real work: The technology moves fast. The human dynamics are the same as they've always been.
If this raises questions about your team
I work with scale-up founders and CEOs on leadership, team dynamics, and culture. This is the kind of work I do day to day. If you want to think through what this means for your situation, let's talk.
Paul Musters is the founder of emaho. He works with scale-up founders and their teams on leadership, team dynamics, and culture.
April 2026