
For a while, AI felt like something happening around legal: tools being tested, pilots being run, vendors pitching new capabilities. But now the questions are landing squarely inside the function. General counsel are asking how AI is being used, CEOs are asking what legal is doing about it, and legal operations (legal ops) teams are often the ones expected to make sense of it all.
At the same time, something else is happening in parallel. Legal ops itself is evolving: senior roles that had been paused or flattened in the past few years are coming back with more seniority and more strategic expectations. The mandate is changing from “implement and manage” to something closer to “design and lead.”
There is definitely more expectation to contribute strategically. What used to be a function focused on execution—implementing tools, managing vendors, keeping things running—is now being asked to think about direction. As Legal Operations Leader Mary O’Carroll put it in her March 25, 2026, interview, the role is evolving toward questions such as: “How do we think about our resources? And what direction do we take? And who is our ultimate client? Is it just the legal department? Is it our internal stakeholders? Is it our external customers, clients, or whatever the organization does? And as we start to think about moving up that chain, the role can become something more than just tactical.”
That combination—AI pressure on one side, and a shift in role on the other—is creating a unique challenge and opportunity for legal ops. It’s not simply about adopting AI responsibly or putting guardrails in place. The real challenge is figuring out how legal work should be structured when AI is part of the system.
It requires rethinking workflows that have been built over years (sometimes decades) and doing it while the ground is still moving underneath you. It requires stepping out of the day-to-day and looking at the function from a higher vantage point, understanding what leadership cares about, what the business needs, and how legal fits into that.
Legal ops teams have always been strong at “getting stuff done,” and that’s part of what makes this moment so interesting. They are used to seeing what needs to happen, organizing around it, and executing effectively. That skillset doesn’t go away with AI, but it does get stretched in new directions. Because once AI enters a workflow, the question is no longer just how to execute a task efficiently—it becomes a question of whether that task should exist in the same way at all.
Decisions that used to be implicit suddenly need to be made explicit:
What should be automated versus reviewed manually?
Where should human judgment sit, and where is it unnecessary?
Who owns the output, especially when AI has contributed to it?
These are not edge cases; they are becoming everyday questions. And they don’t have clean, one-size-fits-all answers.
This is where the tension starts to show up for legal ops. The function is being pulled out of its operational comfort zone and into something more strategic, but without the luxury of stepping away from the day-to-day. Strategy, in this context, isn’t about long-term planning in isolation; it’s about making structural decisions in real time, inside active workflows. That’s a very different kind of work, and one that most teams are still figuring out how to approach.
“It is definitely a mindset pivot. It is a leadership style pivot. It is a communications style pivot. So there [are] all these things that happen as you move up in your career that we have to move away from what’s comfortable to us, which again is getting stuff done.” - Mary O’Carroll
It’s easy to frame AI as the source of complexity, but in practice, it’s more of a spotlight. Legal teams have always dealt with fragmented workflows, inconsistent processes, and knowledge that lives in too many places. Those issues were manageable when work moved at a slower pace and relied more heavily on individual expertise. But once AI introduces speed and scale, those inconsistencies become much harder to ignore.
As one legal leader noted, AI allows organizations to execute their guidelines at scale across everything they do—risk management, outside counsel, legal work, and even people management. That kind of scale sounds like a benefit, and it is, but it also forces a level of consistency that many teams haven’t fully built yet. If different teams interpret workflows differently, or if there are no clear review points, scaling that system just amplifies the confusion.
That’s why governance is suddenly showing up as a priority. Not because legal teams didn’t care about risk before, but because the existing ways of working don’t hold up under scale. AI is making that visible in a way that’s hard to ignore, and legal ops is right in the middle of that realization.
“Over the coming year, legal departments will rapidly accelerate their responsible use of AI tools to improve efficiency, optimize processes, and maximize the talents of their legal professionals. These tools will graduate from ‘nice to haves’ to necessary elements of any high-functioning department.” - Bruce Byrd, executive vice president and general counsel at Palo Alto Networks
There’s another (more empowering) layer on top of this, and it’s one that doesn’t get talked about as much. Thanks to purpose-built AI platforms like Regology, which bring the latest global regulations and the tools to narrow down, interpret, and analyze legal content under one roof, in-house legal teams are starting to operate with more control over their work, particularly in how they engage with outside counsel. That doesn’t mean eliminating reliance entirely, but it does mean becoming more deliberate—choosing the right firms for the right matters (e.g., labor law firms, IP law firms, etc.), setting clearer expectations, and managing those relationships more actively.
AI is playing a role in that shift by giving internal teams more access to information and more confidence in their own analysis. It doesn’t replace external expertise, but it does change the dynamic. Legal teams are coming into conversations better prepared, with more context, and with a clearer sense of what they need. That leads to more pointed discussions and, in many cases, less default dependence.
As responsibility moves inward, so does accountability. When decisions are made internally, there’s an expectation that they can be explained and justified. That includes how AI was used, what inputs were considered, and where human judgment was applied. Governance, in this context, isn’t about limiting usage—it’s about making decisions visible and defensible.
If the first phase of AI was about accelerating work, the second phase is all about delegating it—Agentic AI. There are now solutions, like Regology’s Agentic Workflows, that don’t just assist with tasks but actually execute them in a more autonomous way. These systems can be trained and operate with a level of independence that begins to resemble how we think about employees.
That changes the nature of the questions legal ops needs to answer. It’s no longer just about whether a tool is appropriate—it’s about what work is being handed off, how that work is supervised, and where accountability ultimately sits.
Most legal teams are not fully set up for that model yet, and that’s where the gap becomes clear. Governance, at this stage, is less about setting boundaries around tools and more about understanding how work flows through a system that includes both humans and AI. That’s a more complex problem, and one that requires a deeper level of operational design.
“The discipline that legal ops brings, like technology evaluation, workflow standardization, vendor management, metrics and compliance monitoring, maps to capabilities required for AI governance and will protect in-house from mistakes that cost their organization.” - Jenny Hamilton, Corporate Compliance Insights
This is where the conversation often drifts into frameworks and policies, but that’s not where legal ops teams are feeling the pressure. The challenge isn’t defining what good governance should look like—it’s figuring out how to make it real inside the way work actually happens. Documents and guidelines don’t solve that on their own.
In practice, governance shows up in the structure of workflows. It’s reflected in how knowledge is organized, how decisions are reviewed, and how outputs move from one stage to another. It becomes visible in whether teams are aligned on how to use tools, whether there’s clarity around ownership, and whether there’s a record of what was done and why. These are operational details, but they’re what make governance tangible.
It can look something like this:
Anchor roles in what actually happens: if ownership doesn’t map to real actions, it won’t hold.
Instead of writing broad rules upfront, map where AI is already being used and, for each use case, how the work actually flows. Governance should follow the flow of work, not sit above it.
Don’t govern “ChatGPT vs. Copilot.” Govern what the output is used for:
Low impact: internal summaries, first drafts, knowledge retrieval.
Moderate impact: contract language suggestions, workflow automation, internal analysis.
High impact: legal advice, investigations, regulatory interpretation.
Externally impactful: anything customer-facing, regulator-facing, or rights-affecting.
The same tool can sit in all four categories. The use case determines the control.
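The tiering above can be sketched as a simple lookup in which controls attach to the use case, not the tool. This is an illustrative sketch only; the tier names, control flags, and function are hypothetical, not part of any specific product or framework.

```python
# Hypothetical impact tiers, mirroring the four categories above.
IMPACT_TIERS = {
    "low": ["internal summaries", "first drafts", "knowledge retrieval"],
    "moderate": ["contract language suggestions", "workflow automation", "internal analysis"],
    "high": ["legal advice", "investigations", "regulatory interpretation"],
    "external": ["customer-facing", "regulator-facing", "rights-affecting"],
}

# Controls escalate with the tier, not with the tool that produced the output.
REQUIRED_CONTROLS = {
    "low": {"human_review": False, "audit_log": True},
    "moderate": {"human_review": True, "audit_log": True},
    "high": {"human_review": True, "audit_log": True, "named_owner": True},
    "external": {"human_review": True, "audit_log": True, "named_owner": True, "signoff": True},
}


def controls_for(use_case: str) -> dict:
    """Return the controls for a use case, defaulting to the strictest tier."""
    for tier, examples in IMPACT_TIERS.items():
        if use_case in examples:
            return REQUIRED_CONTROLS[tier]
    # Unknown or unclassified use cases get the strictest treatment.
    return REQUIRED_CONTROLS["external"]
```

The point of the default branch is the same as the prose: a use case no one has classified yet should be treated as externally impactful until someone decides otherwise.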
This is where most governance breaks. Every approved use case should clearly answer what work is being handed off, how it is supervised, and where accountability ultimately sits. If you can’t answer these, you don’t have governance—you have assumptions.
“Human-in-the-loop” is too vague. Be specific: introduce mandatory validation at the points where AI output actually shapes a decision. Governance works when review is placed at decision points, not added as a blanket rule.
If people are bypassing approved tools, it’s usually because the unapproved path is faster or easier. The fix is to make the sanctioned route the path of least resistance. The safest system is the one people actually use.
Every AI tool should be evaluated as part of the governance model, not in isolation. Governance doesn’t stop at your boundary—it includes your vendors.
A governance framework should leave behind a clear trail: a record of what was done, by whom, and why. This is what turns governance into something you can defend in an audit or investigation.
Finally, legal ops should track whether these controls are working in practice. If you’re not measuring it, you can’t improve it—or justify it.
What’s becoming clear is that legal ops is uniquely positioned to take this on, but only if it leans into the shift from execution to design. That means moving beyond implementing tools and optimizing individual processes, and instead focusing on how those pieces fit together into a coherent workflow system. It requires a more intentional approach to how work is structured, how decisions are made, and how information flows across the function.
This also means accepting that the work is ongoing. There isn’t a final state where governance is “done,” because the environment itself is still evolving. Tools are changing, expectations are shifting, and new use cases are emerging all the time. As one leader noted, roadmaps are being rewritten constantly—not because they were wrong, but because the context keeps changing. That kind of adaptability is becoming part of the role.
Legal ops teams that embrace this are starting to move into a different position within the organization. They’re not just supporting the function—they’re shaping how it operates. And in doing so, they’re defining what AI governance actually looks like in practice.