ISO/IEC 42001:2023 is the world's first international management system standard for artificial intelligence, published in December 2023 by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). It provides a structured framework for organizations to establish, implement, maintain, and continually improve an AI Management System (AIMS).
The standard follows the harmonized ISO management system structure (Annex SL/Annex L) and covers:

- organizational context and stakeholder requirements
- leadership commitment and AI policy
- planning for AI risks and opportunities
- resource management, including AI competence
- operational planning and control of AI systems
- performance evaluation and monitoring
- continual improvement of AI management
Key areas addressed include AI risk management (identifying, assessing, and treating risks specific to AI systems), governance and accountability (assigning clear ownership of AI-related decisions), transparency (documenting AI system purpose, data usage, and decision logic), and responsible AI practices across the full system lifecycle.
**Certification process:** Organizations seeking ISO 42001 certification follow a structured path: defining the scope of the AI Management System (which AI systems or business units are covered), conducting a gap analysis against the standard's requirements, implementing the required controls and documentation, undertaking an internal audit, and then engaging an accredited third-party certification body to perform a two-stage external audit. Certification is typically valid for three years with annual surveillance audits.
**Who needs it:** ISO 42001 is relevant to any organization developing, deploying, or operating AI systems — including technology vendors, financial institutions, healthcare providers, government agencies, and professional services firms. It is particularly valuable for organizations in regulated industries where demonstrating responsible AI governance to customers, regulators, and auditors is essential.
**Relationship to ISO 27001:** ISO 42001 is designed to complement ISO 27001 (Information Security Management Systems), not replace it. ISO 27001 addresses the confidentiality, integrity, and availability of information assets broadly. ISO 42001 addresses the specific governance, risk, and accountability challenges introduced by AI systems — such as model bias, opacity, data drift, and unintended outputs. Organizations with ISO 27001 certification can integrate ISO 42001 into their existing management system, leveraging shared controls and documentation.
**Australian relevance:** ISO 42001 aligns closely with emerging Australian AI governance requirements. The Australian Government's Voluntary AI Safety Standard (2024) and the Department of Industry, Science and Resources' responsible AI guidance share ISO 42001's emphasis on risk management, human oversight, transparency, and accountability. Australian organizations adopting ISO 42001 are well positioned as AI-specific regulation develops domestically, and the standard offers a credible framework for supporting compliance with the Privacy Act's emerging requirements around transparency in automated decision-making.
ISO 42001 also complements the NIST AI Risk Management Framework and the EU AI Act: its controls map readily onto the NIST AI RMF functions, and its management-system approach can provide supporting evidence for EU AI Act obligations such as the quality management system required of providers of high-risk AI systems. For organizations operating across multiple jurisdictions, it offers a common governance baseline rather than a substitute for any single regime.