Influence Without Authority

 

Leading AI Modernization Across Boundaries

 

ADVISORY ARTICLE

By Madhusudan Bangalore Nagaraja

Texas, USA


Many of the leaders most responsible for AI modernization in government hold no formal authority over the agencies, vendors, or systems they depend on. Programme delivery leads, technical advisors, and cross-agency coordinators are accountable for outcomes they cannot mandate. This brief answers a practical question: How do you lead an AI modernization programme when you cannot tell people what to do?

The Accountability Gap

The accountability gap—the distance between what a leader is responsible for delivering and the formal authority they hold to direct it—is a defining feature of large technology programmes. Consider a cross-agency AI data integration programme. The technical lead is accountable for delivery but has no authority over the three agencies supplying the data. Each agency operates under its own governance rules, risk appetite, and legal obligations. When a decision is needed—say, whether to delay a milestone to resolve a data quality problem—the technical lead cannot simply decide. They must create the conditions under which all parties reach a defensible conclusion together. That is the accountability gap in practice.

This gap widens during AI adoption, where unfamiliar technology amplifies stakeholder anxiety and resistance. Without a clear mechanism for reaching shared decisions, transformation stalls—not because solutions are unavailable, but because the right people cannot agree on the risks they are collectively accepting.

Decision Transparency as the Primary Lever

In the absence of formal authority, the most powerful tool available is decision transparency: making the rationale behind every key choice explicit, visible, and consistent across stakeholders. When a leader states the trade-off clearly (delay two weeks to fix a data quality issue, or meet the statutory deadline and accept that risk), the decision is no longer personal. It belongs to everyone in the room. Disagreement shifts from who is right to what risk the group is collectively accepting. That shift is what makes progress possible without formal control.

Figure 1 illustrates the three behaviours through which decision transparency operates in practice. Together they enable aligned stakeholders, unblocked decisions, and accountable AI delivery—even where authority is fragmented.

To read the entire article, see the link below.

How to cite this article: Nagaraja, M. B. (2026). Influence Without Authority: Leading AI Modernization Across Boundaries, PM World Journal, Vol. XV, Issue V, May. Available online at https://pmworldjournal.com/wp-content/uploads/2026/05/pmwj164-May2026-Nagaraja-Influence-Without-Authority.pdf


About the Author


Madhusudan Bangalore Nagaraja

Texas, USA

 

Madhusudan Bangalore Nagaraja, PMP, SAFe 6 RTE, PMI-ACP, IEEE Senior Member, is a Technical Delivery Manager at eSystems Inc. in Irving, Texas, USA. He leads AI adoption and digital modernization programmes across public-sector, banking, and healthcare environments. Madhusudan has published peer-reviewed research on agentic AI and serves on the advisory committee for PMI Infinity, PMI's flagship AI tool. His practice centres on accountable governance and people-centred transformation in regulated, high-stakes delivery contexts. Madhusudan can be contacted at madhunagaraja@ieee.org and on LinkedIn at linkedin.com/in/madhusudannagaraja