existing cybersecurity frameworks and guidance to protect against the most common and impactful threats, tactics, techniques, and procedures. Visit CISA's Cross-Sector Cybersecurity Performance Goals for more information on the CPGs, including additional recommended baseline protections.

Secure the deployment environment

Organizations typically deploy AI systems within existing IT infrastructure. Before deployment, they should ensure that the IT environment applies sound security principles, such as robust governance, a well-designed architecture, and secure configurations. For example, ensure that the person responsible and accountable for AI system cybersecurity is the same person responsible and accountable for the organization's cybersecurity in general [CPG 1.B]. The security best practices and requirements for IT environments apply to AI systems as well. The following best practices are particularly important to apply to AI systems and the IT environments in which the organization deploys them.

Manage deployment environment governance

• If an organization outside of IT is deploying or operating the AI system, work with the IT service department to identify the deployment environment and confirm it meets the organization's IT standards.
• Understand the organization's risk level and ensure that the AI system and its use are within the organization's overall risk tolerance and within the risk tolerance for the specific IT environment hosting the AI system. Assess and document applicable threats, potential impacts, and risk acceptance. [3], [4]
• Identify the roles and responsibilities of each stakeholder and how they are held accountable for fulfilling them; identifying these stakeholders is especially important if the organization manages its IT environment separately from its AI system.
• Identify the IT environment's security boundaries and how the AI system fits within them.
• Require the primary developer of the AI system to provide a threat model for their system.