Executive summary
The U.S. Department of the Treasury recently released new guidance to strengthen cybersecurity and risk management for AI in the financial sector. The initiative marks a shift in how regulators expect financial institutions to govern AI. Through six coordinated deliverables, regulators are signaling that AI risk should be embedded in existing risk and compliance frameworks, rather than treated as a standalone technology issue. As financial services companies, banks and credit unions accelerate their AI adoption, they should align their AI governance frameworks with this guidance to improve AI controls and decision-making across the enterprise.
Banking AI oversight is moving from concepts to execution
As financial institutions scale AI across functions, they need tested and provable governance to back it up. Grant Thornton’s 2026 AI Impact Survey of banking leaders found that the lack of centralized, tested governance is holding banks back from measurable AI performance. Just 18% said they were fully confident in their ability to pass an independent review of their AI controls in the next 90 days, and 50% said governance and compliance barriers limit AI performance.
By aligning their AI governance with the new Treasury guidance, financial institutions can strengthen controls and drive measurable AI results.
With the publication of the Financial Sector AI Deliverable Reference and Application Guide (PDF), Treasury and industry participants provided six interrelated resources designed to support practical, regulator‑informed AI risk management. Collectively, these deliverables signal that regulators no longer view AI as an emerging or isolated technology concern, but as a core component of operational, compliance and consumer-protection oversight. Financial institutions should align their AI governance framework to this guidance accordingly.
Resources outline how financial institutions can strengthen AI risk management
Each of the six Treasury‑supported deliverables addresses a specific AI risk consideration, but they are intentionally designed to work together. Taken as a whole, they span the full AI risk lifecycle — from shared understanding and governance foundations to operational controls and emerging threat response.
AI governance-focused deliverables for financial institutions
- AI Lexicon (PDF): Defines key AI-related terms drawn from industry standards and government resources, focusing on frequently used terms that carry a specific meaning in the context of AI use in financial services.
- Financial Services AI Risk Management Framework: Operationalizes the National Institute of Standards and Technology's AI Risk Management Framework specifically for financial services, providing a scalable, adaptable approach for institutions to conduct assessments, address gaps and prioritize mitigation efforts to develop a more resilient control posture across various stages of AI adoption.
- Identity and Authentication deliverables: Mitigating AI-Powered Attacks Against Identity and Authentication (PDF) and Recommendations for Policy Makers (PDF) outline three attack sources (deepfake-driven social engineering and impersonation, synthetic identity creation, and AI agents as attack surrogates) and provide a maturity model for controls to mitigate AI-powered attacks.
- AI and Explainability in Finance: Explainability Challenges, Practices and Recommendations (PDF): Outlines best practices for financial institutions to fulfill explainability objectives as they develop, implement and support AI capabilities.
- Data Nutrition Labeling (PDF): Provides a structured approach for evaluating the quality of data used in AI solutions in the financial sector.
- AI-Enhanced Fraud (PDF): Provides guidance for AI fraud education and awareness programs, along with incident response and reporting considerations.
At the center of Treasury’s AI guidance is the Financial Services AI Risk Management Framework (FS AI RMF), which provides a maturity‑based approach for evaluating how AI risks are governed across an organization and translating high‑level AI principles into actionable governance and control expectations. The framework is anchored by the AI Adoption Stage Questionnaire, which helps institutions identify where AI is being used across the enterprise and assess governance maturity.
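The FS AI RMF is narrative guidance and prescribes no data formats, but the assessment flow it describes (inventory AI use cases, score governance maturity, flag gaps against a target level) can be sketched in code. Everything below is a hypothetical illustration: the dimension names, the four-level scale and the example use case are assumptions for this sketch, not part of the framework itself.

```python
# Hypothetical sketch of a questionnaire-style maturity assessment.
# The maturity scale and governance dimensions are illustrative only.
from dataclasses import dataclass

MATURITY_LEVELS = {1: "Initial", 2: "Defined", 3: "Managed", 4: "Optimized"}


@dataclass
class UseCaseAssessment:
    name: str
    business_unit: str
    scores: dict  # governance dimension -> maturity level (1-4)

    def overall(self) -> float:
        """Average maturity across all scored dimensions."""
        return sum(self.scores.values()) / len(self.scores)

    def gaps(self, target: int = 3) -> list:
        """Dimensions scoring below the target maturity level."""
        return [dim for dim, score in self.scores.items() if score < target]


# Example entry an institution might record for one AI use case
fraud_model = UseCaseAssessment(
    name="Transaction fraud scoring",
    business_unit="Payments",
    scores={"oversight": 3, "explainability": 2,
            "data_quality": 3, "third_party": 1},
)

print(f"Overall maturity: {fraud_model.overall():.2f}")
print("Below target:", fraud_model.gaps())
```

In practice, rolling these per-use-case records up by business unit is what turns a one-time questionnaire into the enterprise-wide governance view the framework envisions.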
The remaining Treasury‑supported deliverables reinforce the FS AI RMF by addressing specific risk areas such as explainability, fraud, identity and data transparency. The AI and Explainability in Finance resource emphasizes transparency and accountability in customer‑impacting decisions, positioning explainability and data transparency as governance and oversight issues rather than purely technical concerns. In parallel, the AI‑Enhanced Fraud and Identity and Authentication materials underscore how threat actors are using AI to enable impersonation, automate attacks and exploit identity controls — reinforcing the need for AI risk assessments that address both internal use cases and external threat vectors.
What Treasury guidance means for banking AI governance
For financial institutions, Treasury’s initiative provides clearer signals about supervisory expectations. AI risk is expected to be governed through existing frameworks, integrated into enterprise risk assessments and supported by documented oversight and controls. Institutions that lack this level of governance face heightened regulatory and reputational risk. By contrast, organizations that can demonstrate tested, well‑designed AI controls are better positioned to support reliable AI outcomes, stronger model performance, and sustained competitive differentiation.
In working with institutions on AI governance efforts, teams are increasingly using Treasury‑supported frameworks as a way to bring structure to AI risk conversations that were previously fragmented. An assessment‑driven approach helps organizations understand where AI exists, evaluate governance maturity and align AI oversight with established compliance and risk management processes.
“What we’re increasingly seeing is that AI risk assessments grounded in Treasury’s guidance help institutions understand where AI exists, how well it’s governed and what needs to evolve,” said Oliver Dennison, Grant Thornton Regulatory Compliance Solutions Partner. “Organizations that take this approach aren’t just managing regulatory risk, they’re creating the conditions for more consistent AI performance and competitive differentiation.”
AI governance actions financial institutions can take now
Treasury’s AI guidance suggests that the focus has shifted from whether institutions are using AI to how effectively that use is governed. As AI adoption expands across third‑party relationships and customer‑facing activities, scalable and integrated oversight will become increasingly important.
“In our work with financial institutions, we see the biggest challenges arise when AI governance remains conceptual rather than operational,” said Leslie Watson‑Stracener, Grant Thornton Regulatory Compliance Solutions Partner. “Treasury’s guidance gives institutions a clear framework to assess where AI is being used, evaluate governance maturity and embed AI risk into existing risk and compliance programs — enabling more consistent, regulator‑aligned oversight as AI adoption scales.”
Contacts:

Leslie Watson-Stracener
Partner, Regulatory Compliance Solutions, Risk Advisory Services
Grant Thornton Advisors LLC
Dallas, Texas
Leslie Watson-Stracener is a Partner and Regulatory Compliance Capability Leader at Grant Thornton Advisors LLC.

Oliver Dennison
Partner, Regulatory Compliance Solutions, Risk Advisory Services
Grant Thornton Advisors LLC
Charlotte, NC
Oliver delivers integrated services to global banking clients in Grant Thornton's Financial Services Advisory Practice, with a focus on regulatory changes, risk and compliance.

Erin
Partner, Risk Advisory Services
Grant Thornton Advisors LLC
Charlotte, North Carolina
Erin is a partner in the Forensic Advisory Services practice located in Charlotte, NC, with over 12 years of experience.

Manager, Regulatory Compliance Solutions, Risk Advisory Services
Grant Thornton Advisors LLC
Content disclaimer
This Grant Thornton Advisors LLC content provides information and comments on current issues and developments. It is not a comprehensive analysis of the subject matter covered. It is not, and should not be construed as, accounting, legal, tax, or professional advice provided by Grant Thornton Advisors LLC. All relevant facts and circumstances, including the pertinent authoritative literature, need to be considered to arrive at conclusions that comply with matters addressed in this content.
Grant Thornton Advisors LLC and its subsidiary entities are not licensed CPA firms.
For additional information on topics covered in this content, contact a Grant Thornton Advisors LLC professional.