New DOJ Compliance Program Guidance Addresses AI Risks, Use of Data Analytics
The Criminal Division of the U.S. Department of Justice (DOJ) recently updated its Evaluation of Corporate Compliance Programs (ECCP) policy document, which prosecutors rely on to evaluate the effectiveness of compliance programs when determining whether to bring charges, assess monetary penalties, or require certain ongoing compliance obligations, such as under a corporate integrity agreement. Notably, the Sept. 23, 2024, revisions to the ECCP require prosecutors to evaluate whether a company’s compliance program includes safeguards to ensure that its deployment of new technologies, including artificial intelligence (AI), will not result in “deliberate or reckless misuse” that violates criminal laws or the company’s Code of Conduct.
Background
The Justice Manual (formerly known as the United States Attorneys’ Manual) describes specific factors prosecutors should consider in exercising their enforcement discretion against a business organization in the event of a criminal investigation. One key factor is the effectiveness of the company’s corporate compliance program. The ECCP, for its part, sets forth the criteria prosecutors use to evaluate whether a program is effective. The ECCP is important both as a reference document for the design and implementation of corporate compliance programs and as a guidance document for understanding how DOJ makes prosecutorial decisions. As a general matter and at minimum, businesses involved in the healthcare industry should ensure their compliance programs meet the criteria set forth in the ECCP.
Assessment of Risks from Emerging Technologies/AI
In March 2024, Deputy Attorney General Lisa Monaco directed the DOJ Criminal Division to incorporate the assessment of risks presented by new technologies, including AI, into the ECCP. Following this directive, the ECCP now includes criteria that prosecutors are to use in assessing whether a company has adequate controls in place to mitigate the risks associated with the use of AI. Federal prosecutors should ask:
- Does the company have a process for identifying and managing emerging internal and external risks that could potentially impact the company’s ability to comply with the law, including risks related to the use of new technologies?
- How does the company assess the potential impact of new technologies such as AI on its ability to comply with criminal laws?
- Is management of risks related to use of AI and other new technologies integrated into broader enterprise risk management (ERM) strategies?
- What is the company’s approach to governance regarding the use of new technologies such as AI in its commercial business and compliance program?
- How is the company curbing any potential negative or unintended consequences resulting from the use of technologies, in both its commercial business and compliance program?
- How is the company mitigating the potential for deliberate or reckless misuse of technologies, including by company insiders?
- To the extent that the company uses AI and similar technologies in its business or as part of its compliance program, are controls in place to monitor and ensure its trustworthiness, reliability and use in compliance with applicable law and the company’s code of conduct?
- Do controls exist to ensure that the technology is used only for its intended purposes?
- What baseline of human decision-making is used to assess AI? How is accountability over use of AI monitored and enforced?
- How does the company train its employees on the use of emerging technologies such as AI?1
The criteria above collectively demonstrate that a company that uses AI as a core part of its business operations should have specific policies and procedures in its compliance program to ensure its AI software is deployed properly and continues to function as intended. These would include policies and procedures for auditing AI performance, training employees on the proper use of AI, and detecting misuse of AI or any other new technologies.
Data Resources and Access
The September 2024 revisions to the ECCP also include new criteria relating to the company’s use of data analytics in its compliance program. For example, the updated ECCP asks prosecutors to assess whether compliance personnel have access to relevant data systems from which they can properly monitor the effectiveness of the company’s compliance program. Additional questions in the updated ECCP include the following:
- Is the company appropriately leveraging data analytics tools to create efficiencies in compliance operations and measure the effectiveness of components of compliance programs?
- How is the company managing the quality of its data sources?
- How is the company measuring the accuracy, precision, or recall of any data analytics models it is using?2
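The accuracy, precision, and recall the ECCP references are standard metrics for classification models, such as a tool that flags transactions for compliance review. As a minimal illustrative sketch (the function name, labels, and example data below are hypothetical, not drawn from the ECCP), they can be computed from the counts of true/false positives and negatives:

```python
def classification_metrics(actual, predicted):
    """Return (accuracy, precision, recall) for binary 0/1 labels.

    precision = of the items the model flagged, the share truly positive
    recall    = of the truly positive items, the share the model flagged
    """
    pairs = list(zip(actual, predicted))
    tp = sum(1 for a, p in pairs if a == 1 and p == 1)  # true positives
    fp = sum(1 for a, p in pairs if a == 0 and p == 1)  # false positives
    fn = sum(1 for a, p in pairs if a == 1 and p == 0)  # false negatives
    tn = sum(1 for a, p in pairs if a == 0 and p == 0)  # true negatives

    accuracy = (tp + tn) / len(pairs)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return accuracy, precision, recall

# Hypothetical example: 1 = transaction flagged as suspicious
actual    = [1, 1, 1, 0, 0, 0, 0, 1]
predicted = [1, 1, 0, 0, 0, 1, 0, 1]
acc, prec, rec = classification_metrics(actual, predicted)
# Here acc, prec, and rec each come out to 0.75.
```

A compliance team tracking these numbers over time could document, for instance, how often its monitoring tool misses true violations (low recall) versus how often it generates false alarms (low precision), which speaks directly to the ECCP's question about model performance.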
The new criteria convey an expectation on the part of DOJ that companies use data tools as part of their compliance efforts and that they provide their compliance staff access to all data necessary to implement the compliance program.
Takeaways
As Monaco stated, “Fraud using AI is still fraud.” The revised ECCP confirms that DOJ is taking seriously the potential risks associated with the misuse of AI and that it will scrutinize corporate compliance programs to ensure these risks are addressed and mitigated. In addition, the revised ECCP clarifies that DOJ expects 1) compliance staff to have access to the relevant data sources necessary to carry out their responsibilities under the compliance program and 2) companies to devote sufficient resources, including data analytics tools, to their compliance efforts.
Notes:
1 ECCP pp. 3-4.
2 ECCP p. 13.