Upcoming · United States · Law · Effective: 2025-06-01

SEC AI Disclosure Requirements

Emerging SEC requirements for public companies to disclose material AI risks, AI governance practices, and AI-related impacts in securities filings.

📋 Overview

The US Securities and Exchange Commission (SEC) has been increasingly focused on artificial intelligence disclosures by public companies, driven by the materiality of AI-related risks and opportunities for investors. While there is no single "SEC AI Disclosure Rule," the SEC has been using existing disclosure frameworks, enforcement actions, and proposed rulemaking to establish expectations for how public companies should disclose AI-related matters.

The SEC's interest in AI disclosures operates on multiple fronts. First, the existing principles-based disclosure framework requires companies to disclose material risks, including those related to AI. SEC Chair Gary Gensler emphasised that companies should not engage in "AI washing" — making misleading claims about AI capabilities — and the SEC has brought enforcement actions against companies for misleading AI-related statements.

In March 2024, the SEC charged two investment advisers with making false and misleading statements about their purported use of AI in investment processes. These enforcement actions signal that the SEC treats AI claims as material statements subject to anti-fraud provisions of the securities laws.

The SEC has also indicated that existing disclosure requirements under Regulation S-K already capture many AI-related disclosures. Item 105 (Risk Factors) requires disclosure of material AI risks, including cybersecurity risks associated with AI, risks of AI model failure, regulatory risks from emerging AI laws, and competitive risks from AI disruption. Item 101 (Description of Business) may require disclosure of material AI use in business operations. Item 103 (Legal Proceedings) requires disclosure of AI-related litigation.

The SEC's cybersecurity disclosure rules, adopted in 2023, also bear on AI governance: AI systems are exposed to cybersecurity risks, and material incidents involving AI systems may trigger disclosure obligations under those rules.

For companies using AI in financial services, the SEC's July 2023 proposal on predictive data analytics (including proposed Exchange Act Rule 15l-2 for broker-dealers, with a parallel rule for investment advisers) would require firms to eliminate or neutralise conflicts of interest associated with using predictive data analytics and AI in investor interactions. While this proposal faced significant industry opposition and its future is uncertain, it reflects the SEC's focus on AI's impact in financial services.

The SEC's examination priorities have also included AI governance, with the Division of Examinations indicating that it will review firms' AI disclosures, AI governance practices, and AI-related risk management as part of routine examinations.

For public company compliance professionals, the practical impact is clear: AI-related risks, governance, and use must be evaluated for materiality and disclosed appropriately in periodic filings. Boards should have oversight of AI strategy and risks, and companies should ensure that any public claims about AI capabilities are accurate and not misleading.

The intersection of AI disclosure with ESG reporting is also notable, as investors increasingly view responsible AI as a component of good governance. AI governance disclosures may become a standard element of annual reports and proxy statements.

⚖️ Key Requirements

1. Disclose material AI-related risks in SEC filings (10-K Risk Factors)
2. Ensure AI-related claims in public statements are accurate and not misleading
3. Disclose material use of AI in business operations where relevant
4. Report AI-related legal proceedings and regulatory actions
5. Comply with cybersecurity disclosure rules for AI-related incidents
6. Address AI governance in corporate governance disclosures
7. Evaluate and disclose conflicts of interest in AI-driven financial services (if applicable)
8. Establish board oversight of AI strategy, risks, and governance
9. Maintain internal controls over AI-related financial reporting and disclosures
10. Monitor SEC guidance and rulemaking on AI-specific disclosure requirements

📅 Key Dates & Timeline

  • 2023: SEC signals increased focus on AI disclosures and 'AI washing'
  • July 2023: SEC proposes rules on predictive data analytics/AI in financial services
  • March 2024: SEC brings first enforcement actions for misleading AI claims
  • 2024: SEC examination priorities include AI governance review
  • 2025: Further SEC guidance on AI disclosure expectations anticipated
  • 2025–2026: Potential finalisation of AI-related financial services rules

🏢 Who It Affects

  • All SEC-reporting public companies using or developing AI
  • Broker-dealers and investment advisers using AI in investor interactions
  • Companies making public claims about AI capabilities
  • Boards of directors with oversight of AI strategy and risk
  • CFOs and disclosure committees responsible for SEC filings
  • Investor relations teams communicating AI strategy to the market

Frequently Asked Questions

Is there a specific SEC rule requiring AI disclosure?

There is no single AI-specific disclosure rule yet. However, existing principles-based disclosure requirements (Regulation S-K) already require disclosure of material AI risks and uses. The SEC has also brought enforcement actions for misleading AI claims under existing anti-fraud provisions.

What is 'AI washing' and why does the SEC care?

AI washing refers to companies making exaggerated or misleading claims about their use of AI to attract investors. The SEC views this as a material misrepresentation that can mislead investors and has brought enforcement actions against firms for false AI claims.

Do boards need to oversee AI?

While there is no explicit SEC requirement for board-level AI oversight, good governance practice and investor expectations increasingly demand it. AI strategy and risks are material matters that fall within the board's oversight responsibilities.
