There's a moment, somewhere in the second year of legal practice, that experienced lawyers tend to describe in similar terms. The rote work: the tenth indemnity clause of the month, the three days of document review distilled into a page-long summary, the draft returned covered in red. Somewhere in that repetitive friction, something shifts. Instincts develop that nobody taught explicitly: the ability to sense where a contract is thin, where a case's narrative breaks, where the facts are being papered over. That knowing didn't come from a lecture. It came from the work itself, done badly, then less badly, over enough time that the pattern recognition became automatic.
That process, a slow and uncomfortable apprenticeship baked into the structure of junior legal work, is now being disrupted faster than almost anyone is acknowledging. A new report from the 8am Legal Industry consultancy, drawing on data from 1,300 legal professionals, found that 69% of legal practitioners now use AI tools for work, more than double last year's figure of 31%. Firms with 20 or more lawyers have adopted general-purpose AI at a rate of 58%. The productivity gains are real: 61% of respondents say AI increases their output, and nearly a quarter save six or more hours per week. These numbers keep climbing.
What's absent from those productivity figures is any accounting of what's being lost. Writing in Bar & Bench, Indian legal commentator Ananya Ghosh makes the case with uncomfortable clarity. The junior lawyer drafting an indemnity clause from scratch wasn't just completing a task. She was building a felt understanding of why clause structures work the way they do, what happens when they're wrong, where the risk concentrates. A junior lawyer who reviews an AI's draft instead is doing something cognitively different. She may be doing it at a higher level of abstraction, catching logical errors rather than grammatical ones, but whether that different kind of engagement builds the same instincts over time remains genuinely unknown.
The mechanism Ghosh identifies is productive friction: the discomfort of not knowing, of having to sit with a problem before an answer becomes available, of building an argument from raw research rather than from an AI's synthesised summary. That friction, she argues, is not a bug in the apprenticeship model. It's how legal judgment forms. Remove the friction and you don't just save time; you remove the training signal. Whether AI-assisted review work eventually builds the same depth of professional instinct as production work is an open empirical question, and the legal profession appears to be running the experiment without having thought through what a bad result would look like.
The ABA Journal's 2026 Legal Industry Report frames this moment as a crossroads for a profession that is, ironically, better trained to anticipate risk than to embrace change. The report notes that individual lawyers are adopting AI faster than their firms are, meaning many are using these tools without institutional guidance, ethical frameworks, or supervisory oversight. The access-to-justice dimension adds another layer of complexity: AI tools could, in principle, help close the enormous gap between legal need and legal service, giving underserved clients access to high-quality document drafting and research. But those benefits flow primarily to clients who can already afford the technology, and the profession hasn't yet worked out how to direct AI's productivity gains toward the people who most need legal help.
There's a parallel here to other knowledge professions where AI is being adopted most enthusiastically. Software engineering, medical residency, financial analysis: all are fields where the junior-level rote work is tedious by design, and where the tedium serves a purpose that the people assigning it rarely articulate explicitly. When a senior developer makes a junior engineer write boilerplate code from scratch rather than scaffolding it, part of the reason is that writing boilerplate teaches you to read it. The same logic applies to legal drafting. Whether AI tools can preserve that training effect while removing the drudgery is the right question to be asking. The profession doesn't yet know the answer, and in the meantime the scaffold is coming down.