
Media & Entertainment insights: 2026 AI Impact Survey Report

 

Innovation is accelerating; commercial value must keep up

 

Eighty-seven percent of media and entertainment boards have approved major AI investments. Fifty-four percent say their frontline workforce — writers, editors, production staff — is where adoption most urgently needs support.

 

This report explains where media companies lead other industries in AI adoption, where they fall short, and what organizations must do to move from leadership commitment to execution excellence.

 

M&E boards committed early, front lines still finding their way

 

M&E boards have moved faster on AI than almost any other sector, according to our latest AI Impact survey. Eighty-seven percent have approved major AI investments. The harder work is what comes next: translating that commitment into performance where content is made, rights are managed, and audience relationships are built. That distance between board authorization and frontline execution in media and entertainment carries specific commercial consequences that most organizations have not fully accounted for.

 

The media and entertainment-specific data from Grant Thornton’s 2026 AI Impact Survey of 950 business leaders shows a sector that is innovating quickly but governing unevenly. M&E leaders report real gains from AI in creativity, speed and quality. However, commercial returns are still building and accountability frameworks are lagging behind adoption.

 

 
 

Board engagement does not reach the work

 
87%

of M&E boards have approved major AI investments vs. 75% overall

54%

say frontline employees need the most AI adoption support vs. 37% overall

 

M&E organizations outperform the full survey sample on every board-level governance indicator. Eighty-seven percent of boards approved major AI investments, versus 75% overall. Sixty-eight percent integrated AI risk and opportunity into ongoing oversight, versus 54% overall. The question is whether that strength reaches the studios, streaming platforms and production operations where consequential AI decisions are made every day.

 
 

In M&E, AI reaches directly into core commercial assets: content libraries that generate licensing and streaming revenue, talent agreements that determine how a performance can be used, and rights structures that govern what a studio can sell and to whom. Writers, editors, post-production staff, and ad operations teams work inside those assets every day.

 

Fifty-four percent of M&E respondents identified frontline employees as the part of the organization most in need of AI adoption support — the highest industry-specific rate in our survey and 17 points above the market average. Forty-eight percent cite talent and upskilling as the primary scaling barrier, 13 points above the market average.

 
 

Frontline adoption in M&E is primarily an implementation problem, not an upskilling problem. Writers, editors and production staff need to understand what AI does with the content they touch, not just how to operate the tools. When AI draws from protected material, it creates rights exposure; when its output violates a union agreement or a talent contract, the organization is liable regardless of which vendor supplied the model.

 

That understanding requires role-specific workflow design. Sarbanes-Oxley, the federal law governing financial disclosure, approval controls and audit trails for public companies, is effectively the sector's only broad statutory compliance framework. M&E's real governance exposure lies instead in intellectual property and rights protection, where failures surface as civil lawsuits, lost revenue and reputational damage.

 

“Every media executive I speak with is asking about AI. The ones asking the right questions are not asking whether to use it. They are asking who owns the outcome when it does something they did not expect. In an industry built on rights, talent relationships and audience trust, that question determines whether AI creates value or erodes it.”

Deborah Newman

Media & Entertainment Industry Leader,

Grant Thornton Advisors LLC

 
 

Accountability does not come with the vendor contract

 
32%

of organizations in our full survey are primarily buying AI solutions rather than building them

20%

are primarily building AI solutions this year

 

Across our full survey, nearly one in three organizations is primarily buying rather than building AI solutions. Among M&E respondents, directional data from finance-function leaders suggests a buy rate at or above that level. The industry logic is sound: external tools are improving faster than any organization can replicate internally, and buying accelerates deployment significantly.

 

Increasingly, M&E organizations are going further — acquiring companies with proven AI-trained production capabilities to deploy faster and at a greater scale than internal development allows. Industry analysts project more than $80 billion in M&E M&A activity in 2026, with AI capabilities at the top of acquisition wish lists. The approach accelerates production workflows and improves output quality without requiring organizations to rebuild from scratch.

 
 

That speed comes with accountability obligations that buy-first strategies don't automatically resolve. When third-party tools surface unlicensed content or draw from protected archives, deploying companies have typically carried that responsibility, not the vendor. Recent litigation across the industry is testing where that accountability ends — with media companies pursuing legal action against AI vendors over unlicensed content use. Early licensing partnerships between publishers and AI platforms show that organizations moving proactively on accountability are finding workable frameworks — and a stronger foundation for scaling AI with confidence.

 

Most M&E organizations have not yet built the control architecture that buy-first deployment requires: vendor frameworks that specify what third-party AI can do with content, monitoring that detects when outputs create rights exposure, and documented evidence that governance is operating at the workflow level.

 

Governance and compliance failures are the leading cause of AI underperformance in our full survey. In M&E, the failure surfaces through rights agreements, talent contracts and union structures that AI workflows can breach without anyone noticing until revenue is at risk. Organizations that have deployed AI tools quickly without establishing accountability boundaries are risking unintentional intellectual property breaches that could result in costly lawsuits.

 
 

 

Ready to talk? We’re ready to listen.

 

Request a meeting

 

Agentic AI is scaling faster than the controls

 
17%

of M&E organizations have fully integrated agentic AI into enterprise workflows vs. 9% overall

34%

cite regulatory or compliance uncertainty as an agentic AI concern vs. 44% overall

 

Seventeen percent of M&E organizations report agentic AI fully integrated into enterprise workflows, nearly double the full survey rate. Governance of what those agentic systems do — and what happens when they behave unexpectedly — has not kept pace with deployment.

 

Production studios are already deploying AI-driven tools that operate with significant autonomy: scheduling workflows and generating scene alternatives. The most visible example is AI performers: the AI-created “actress” Tilly Norwood is being positioned for roles that would previously have required a human talent contract and the rights structure that comes with it. SAG-AFTRA has accused the character's creator of using union members' work to train it without consent.

 
 

The concern in M&E differs from the full survey in a telling way. Regulatory and compliance uncertainty tops agentic AI concerns across all sectors at 44%. In M&E, it registers at 34% because the sector operates under lighter statutory regulation than banking or insurance, where formal compliance regimes dominate. When agentic workflows automate content recommendation, ad placement or streaming personalization at scale, the risk of undetected brand harm or rights violations scales with the speed of deployment. In M&E, the governance mechanism is contractual and reputational, meaning failures are slower to surface and harder to contain.

 
 

Build now: M&E’s governance window is narrowing

 

The M&E organizations building durable AI performance are precise about where AI runs, who reviews its outputs before they reach an audience, and how innovation connects to the streaming metrics, advertising yield and production economics that fund the business.

 
Key steps
  1. Establish accountability for AI that touches rights-sensitive assets.

    For every AI tool operating inside content production, streaming recommendation, ad placement, or talent management workflows, define who owns the outcome when it behaves unexpectedly. In M&E, the accountability architecture has to match the commercial value of the assets at risk.

  2. Build workflow design, not just training programs.

    Fifty-four percent of M&E respondents say frontline employees are where AI adoption support is most urgently needed. Training alone will not address that. Define which roles use AI in which tasks, what decisions require human review before outputs reach production, and what appropriate use looks like for writers, editors and post-production staff working inside rights-sensitive content.

  3. Connect AI activity to the metrics that fund the business.

    M&E leads in accelerated innovation but trails on revenue growth and cost reduction. That gap closes only as AI-driven restoration of content libraries, faster post-production workflows and streaming personalization convert into commercial outcomes. Identify the specific levers where AI operates and build the instrumentation that makes the connection visible and fundable.

 

For media and entertainment companies pursuing growth through acquisition, AI governance cannot be an afterthought or a centralized policy exercise. Grant Thornton works with M&E organizations and active M&A teams to assess how AI operates inside rights-sensitive workflows managing high-value assets, where accountability breaks down, and which controls must scale alongside newly acquired capabilities.

 

Our teams help leaders translate AI ambition into execution-ready controls, integration plans and value metrics that travel with the deal, not trail behind it. To understand where your organization’s AI exposure and opportunity truly sit, request a meeting with our M&E specialists for a targeted assessment or roadmap aligned with your deal strategy.

 

Methodology

 

Between Feb. 23 and March 18, 2026, Grant Thornton surveyed 950 business leaders, a group restricted to CFOs, CIOs/CITOs, COOs, and VPs, department heads and directors who report directly to the C-suite. The media and entertainment-specific subgroup comprised 100 respondents.

 


 
 

Content disclaimer

This content provides information and comments on current issues and developments from Grant Thornton Advisors LLC and Grant Thornton LLP. It is not a comprehensive analysis of the subject matter covered. It is not, and should not be construed as, accounting, legal, tax, or professional advice provided by Grant Thornton Advisors LLC and Grant Thornton LLP. All relevant facts and circumstances, including the pertinent authoritative literature, need to be considered to arrive at conclusions that comply with matters addressed in this content.

For additional information on topics covered in this content, contact a Grant Thornton professional.

Grant Thornton LLP and Grant Thornton Advisors LLC (and their respective subsidiary entities) practice as an alternative practice structure in accordance with the AICPA Code of Professional Conduct and applicable law, regulations and professional standards. Grant Thornton LLP is a licensed independent CPA firm that provides attest services to its clients, and Grant Thornton Advisors LLC and its subsidiary entities provide tax and business consulting services to their clients. Grant Thornton Advisors LLC and its subsidiary entities are not licensed CPA firms.

 

 

Tax professional standards statement

This content supports Grant Thornton Advisors LLC’s marketing of professional services and is not written tax advice directed at the particular facts and circumstances of any person. It is not, and should not be construed as, accounting, legal, tax, or professional advice provided by Grant Thornton Advisors LLC. If you are interested in the topics presented herein, we encourage you to contact a Grant Thornton Advisors LLC tax professional. Nothing herein shall be construed as imposing a limitation on any person from disclosing the tax treatment or tax structure of any matter addressed herein.

The information contained herein is general in nature and is based on authorities that are subject to change. It is not, and should not be construed as, accounting, legal, tax, or professional advice provided by Grant Thornton Advisors LLC. This material may not be applicable to, or suitable for, the reader’s specific circumstances or needs and may require consideration of tax and nontax factors not described herein. Contact a Grant Thornton Advisors LLC tax professional prior to taking any action based upon this information.

 

Changes in tax laws or other factors could affect, on a prospective or retroactive basis, the information contained herein; Grant Thornton Advisors LLC assumes no obligation to inform the reader of any such changes. All references to “Section,” “Sec.,” or “§” refer to the Internal Revenue Code of 1986, as amended.

Grant Thornton Advisors LLC and its subsidiary entities are not licensed CPA firms.
