
How to get employees to use Microsoft 365 Copilot

Deploying Copilot is straightforward. Getting people to use it, and to keep using it, is where most organisations struggle. The problem is not the tool. It is the approach.

If your Microsoft 365 Copilot adoption is lower than you expected, you are not alone. Across UK organisations, the consistent pattern after a standard rollout is that somewhere between 10 and 25 percent of licensed users are genuinely using Copilot by the third month. The rest were briefly curious, tried it without much structure or support, produced underwhelming results, and quietly stopped.

The reason this happens is not that Copilot is a weak product or that your employees are resistant to change. It is that the standard approach to deploying enterprise software has never reliably produced lasting behaviour change, and there is no reason to expect it to work any better for AI tools.

Why the default approach fails

The typical Copilot rollout follows a sequence that feels reasonable: deploy licences, run a training session or record a demo, send communications encouraging people to try it, track usage in the admin dashboard, report back to leadership.

Each step makes logical sense in isolation. Collectively, they fail because they treat adoption as an information problem. The assumption is that people are not using Copilot because they do not know how, and that providing knowledge and access will resolve it.

But getting employees to use a new tool consistently is not an information problem. It is a behaviour change problem. And behaviour change requires fundamentally different conditions than information transfer.

A person who leaves a training session knowing how to use Copilot will, in most cases, not use it the following week. Not because they were not paying attention, but because they had no specific task to use it on when they got back to their desk, no deadline that required them to try, no accountability to anyone if they did not, and no feedback mechanism to tell them whether their first attempt produced a useful result.

The training session creates awareness. It does not create a habit. Everything needed to turn awareness into sustained behaviour is left to chance, and chance produces 10 to 25 percent adoption.

What habit formation actually requires

The conditions for a new behaviour to become automatic are well understood from the research on habit formation. For Copilot specifically, three things are non-negotiable.

A recurring cue. A habit needs something that reliably triggers it. For Copilot, the most effective cue is a weekly challenge: a named, specific task to attempt on a specific week, using real work the participant is already doing. Without a concrete prompt, the default behaviour of not using Copilot has nothing competing with it.

Accountability. People sustain new behaviours at much higher rates when their participation is visible to peers. A leaderboard that updates weekly and is visible to a small group creates the mild social pressure that keeps engagement alive through weeks three to six, when novelty has faded but habit has not yet formed. This is the period where unstructured programmes collapse.

Progression. If the challenge level does not increase as skill develops, people plateau at their first successful level of use and stay there. Week-one behaviour - basic email drafting, simple document summarisation - is not the same as the week-nine behaviour that actually changes how someone works. A structured progression from foundational skills to advanced applications is what builds genuine depth of capability.

The nine-week structure that works

A nine-week programme with one weekly challenge is long enough to build a genuine habit and short enough to maintain momentum. Each challenge should be:

  • Completable in under thirty minutes
  • Applied to real work the participant is actually doing, not sample data or hypothetical scenarios
  • Progressively more complex than the previous week
  • Submitted to a shared channel where it is visible to a small pod of peers

The progression matters. Weeks one to three cover the foundations: prompting and querying, drafting in Outlook, summarising documents in Word. Weeks four to six build depth: Excel data analysis, Teams meeting intelligence, PowerPoint deck drafting. Weeks seven to nine push toward advanced application: custom instructions, multi-step workflows, and an introduction to Copilot agent creation for repetitive tasks.

By week nine, participants have attempted Copilot across nine different task types, in their real work, with feedback from peers. That produces something a training day cannot: an accurate, experience-based understanding of where Copilot genuinely helps and where it does not. That understanding is what drives spontaneous use: the unreported, habitual use that does not show up in the admin dashboard but represents the real productivity gain.

The facilitator role

A structured programme needs one person to manage the cadence. Their job is not facilitation in the training sense - they do not need to be a Copilot expert, and they do not need to teach. Their job is logistics: posting the weekly challenge on Monday, sending a midweek nudge on Wednesday, updating the leaderboard, and running a thirty-minute share-back conversation at the end of each two-week sprint.

The share-back is the most valuable thirty minutes in the programme. When participants talk to each other about what they tried, what worked, and what surprised them, learning consolidates in a way that individual practice alone does not produce. The facilitator's job is to create the conditions for that conversation, not to lead it.

One to two hours per week for nine weeks is the total facilitator commitment. That is manageable for an EA, a chief of staff, an L&D team member, or a line manager who wants to lead the programme for their own team.

What changes at the end

The goal of a nine-week programme is not a 100 percent completion rate or a perfect leaderboard. It is a cohort of people who have formed a genuine Copilot habit - who reach for the tool without being prompted, for specific tasks where they now know it helps.

Evidence from structured adoption programmes consistently shows habitual use rates of 75 to 90 percent among programme completers, compared to 10 to 25 percent from unstructured deployments. Self-reported time savings in the range of 30 to 60 minutes per day are common for people who complete the full programme.

Those figures also produce a better financial case for the licence investment. Our article on how to measure Copilot ROI walks through the calculation; the headline point is that moving from 20 to 75 percent adoption typically turns a licence spend that is not yet breaking even into one that demonstrably does.
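To make that arithmetic concrete, here is a minimal sketch of the break-even comparison. Every figure in it is an illustrative assumption, not Microsoft pricing or benchmarked data: a £25 per user per month licence cost, a £30 fully loaded hourly staff cost, roughly 10 minutes saved per day by casual users in an unstructured rollout, and 30 minutes per day by habitual users who complete a structured programme.

```python
# Illustrative break-even sketch for Copilot licence spend.
# All figures are assumptions for illustration, not real pricing or benchmark data.

LICENCE_COST_PER_MONTH = 25.0   # assumed licence cost per user (GBP)
HOURLY_STAFF_COST = 30.0        # assumed fully loaded hourly cost (GBP)
WORKING_DAYS_PER_MONTH = 21

def monthly_net_value(licensed_users: int, adoption_rate: float,
                      minutes_saved_per_day: float) -> float:
    """Net monthly value of the licence spend at a given adoption rate."""
    cost = licensed_users * LICENCE_COST_PER_MONTH
    active_users = licensed_users * adoption_rate
    hours_saved = active_users * (minutes_saved_per_day / 60) * WORKING_DAYS_PER_MONTH
    return hours_saved * HOURLY_STAFF_COST - cost

# Unstructured rollout: 20% adoption, light casual use (assumed 10 min/day).
print(f"Unstructured: {monthly_net_value(100, 0.20, 10):+,.0f} GBP/month")

# Structured programme: 75% adoption, habitual use (assumed 30 min/day).
print(f"Structured:   {monthly_net_value(100, 0.75, 30):+,.0f} GBP/month")
```

With these assumed inputs, 100 licences at 20 percent adoption sit slightly underwater, while the same licences at 75 percent adoption return many multiples of their cost. The specific numbers will differ for every organisation; the shape of the result is the point.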

For guidance on running a structured pilot before committing to a wider rollout, see our step-by-step guide to running a Microsoft 365 Copilot pilot programme.

If you are not sure where your current adoption stands or what is driving the gap, the free Copilot diagnostic takes five minutes and gives you a clear picture of where to focus.

The Copilot Bootcamp Kit gives you everything needed to run this programme: nine weeks of challenges, a facilitator guide, and a leaderboard tracker. One person, no consultants, set up in a weekend.

Get the kit