Emerging SEC requirements for public companies to disclose material AI risks, AI governance practices, and AI-related impacts in securities filings.
The US Securities and Exchange Commission (SEC) has been increasingly focused on artificial intelligence disclosures by public companies, driven by the materiality of AI-related risks and opportunities for investors. While there is no single "SEC AI Disclosure Rule," the SEC has been using existing disclosure frameworks, enforcement actions, and proposed rulemaking to establish expectations for how public companies should disclose AI-related matters.
The SEC's interest in AI disclosures operates on multiple fronts. First, the existing principles-based disclosure framework requires companies to disclose material risks, including those related to AI. SEC Chair Gary Gensler emphasised that companies should not engage in "AI washing" — making misleading claims about AI capabilities — and the SEC has brought enforcement actions against companies for misleading AI-related statements.
In March 2024, the SEC charged two investment advisers with making false and misleading statements about their purported use of AI in investment processes. These enforcement actions signal that the SEC treats AI claims as material statements subject to anti-fraud provisions of the securities laws.
The SEC has also indicated that existing disclosure requirements under Regulation S-K already capture many AI-related disclosures. Item 105 (Risk Factors) requires disclosure of material risks, which for many companies now include cybersecurity risks associated with AI, risks of AI model failure, regulatory risks from emerging AI laws, and competitive risks from AI-driven disruption. Item 101 (Description of Business) may require disclosure of material AI use in business operations, and Item 103 (Legal Proceedings) requires disclosure of material AI-related litigation.
The SEC's cybersecurity disclosure rules, adopted in 2023, also bear on AI governance: AI systems are exposed to cybersecurity risks, and material incidents involving AI systems may trigger disclosure obligations under those rules.
For companies using AI in financial services, the SEC proposed rules in July 2023 (Exchange Act Rule 15l-2 for broker-dealers, with a parallel rule under the Investment Advisers Act) that would require firms to eliminate or neutralise conflicts of interest associated with using predictive data analytics and AI in investor interactions. While the proposal faced significant industry opposition and its future is uncertain, it reflects the SEC's focus on AI's impact in financial services.
The SEC's examination priorities have also included AI governance, with the Division of Examinations indicating that it will review firms' AI disclosures, AI governance practices, and AI-related risk management as part of routine examinations.
For public company compliance professionals, the practical impact is clear: AI-related risks, governance, and use must be evaluated for materiality and disclosed appropriately in periodic filings. Boards should have oversight of AI strategy and risks, and companies should ensure that any public claims about AI capabilities are accurate and not misleading.
The intersection of AI disclosure with ESG reporting is also notable, as investors increasingly view responsible AI as a component of good governance. AI governance disclosures may become a standard element of annual reports and proxy statements.
Disclose material AI-related risks in SEC filings (10-K Risk Factors)
Ensure AI-related claims in public statements are accurate and not misleading
Disclose material use of AI in business operations where relevant
Report AI-related legal proceedings and regulatory actions
Comply with cybersecurity disclosure rules for AI-related incidents
Address AI governance in corporate governance disclosures
Evaluate and disclose conflicts of interest in AI-driven financial services (if applicable)
Establish board oversight of AI strategy, risks, and governance
Maintain internal controls over AI-related financial reporting and disclosures
Monitor SEC guidance and rulemaking on AI-specific disclosure requirements
There is no single AI-specific disclosure rule yet. However, existing principles-based disclosure requirements (Regulation S-K) already require disclosure of material AI risks and uses. The SEC has also brought enforcement actions for misleading AI claims under existing anti-fraud provisions.
AI washing refers to companies making exaggerated or misleading claims about their use of AI to attract investors. The SEC views this as a material misrepresentation that can mislead investors and has brought enforcement actions against firms for false AI claims.
While there is no explicit SEC requirement for board-level AI oversight, good governance practice and investor expectations increasingly demand it. AI strategy and risks are material matters that fall within the board's oversight responsibilities.
