A Tale of Two Regulators: The SEC and FCA Address AI Regulation for Private Funds – The National Law Review


2023’s excitement for generative artificial intelligence (AI) prompted the SEC to respond on multiple fronts – stump speeches, rulemaking, new exam priorities and sweeps and previewing potential enforcement actions. SEC Chair Gary Gensler raised concerns regarding potential conflicts and investor harm resulting from the proliferation of AI and warned that an AI-caused financial crisis is nearly unavoidable absent regulation. The SEC adopted a number of initiatives in 2023 to respond to these perceived risks. 

In the UK, the Financial Conduct Authority (the FCA) will be watching firms’ use of AI closely but largely sees its existing regulatory regime as fit for purpose, with enforcement action in AI-related matters likely to be taken under the Senior Managers and Certification Regime and the new Consumer Duty.

These regulatory efforts will be accelerated in 2024, and fund managers should therefore be aware that:

  • The SEC has proposed rules to address the potential conflicts of interest presented by the use of AI technologies in the securities industry. 

The proposed Predictive Analytics Rules would require broker-dealers and registered investment advisers to eliminate or neutralize the effect of certain conflicts of interest associated with their use of AI and other technologies – “Covered Technologies” – in any “Investor Interaction.” Both terms are very broadly defined, leading many to question the scope of the proposed rule. “Covered Technology” includes a broad range of technology used to “guide” investment-related behavior and appears to extend well beyond technology involving AI. Further, the requirement to eliminate or neutralize the conflict does not permit firms to mitigate the conflict through disclosure, which appears to be inconsistent with the SEC’s traditional disclosure-centered approach to conflicts of interest.

Proposed in July 2023, the rules’ comment process closed in October 2023, with a final vote tentatively targeted for April 2024 (although the final rules, if adopted in their current form, may be challenged in court). 

  • The SEC’s Division of Examinations launched a wide-ranging AI sweep in August 2023 focusing on how private fund advisers use AI and on potential AI-linked conflicts of interest, covering topics ranging from AI-related marketing and disclosures, to AI models and techniques used to manage client portfolios or make investment decisions, to supervisory procedures and controls governing the use of AI. The Division’s Examination Priorities for 2024 include an express focus on AI and other emerging technologies, and the Division established a specialized team to better understand such emerging issues and risks.
  • AI clearly is top of mind for the Division of Enforcement in 2024, which will apply traditional tools and theories to potential enforcement actions involving AI. As noted by Chair Gensler on a number of occasions, if a human is using AI to defraud investors, they will “likely be hearing from the SEC.” Gensler has further warned businesses against “AI washing,” or making misleading AI-related claims – similar to the greenwashing claims that have been a focus of recent SEC enforcement actions. Just last month the SEC announced its first-ever settlements with two investment advisers for making false and misleading statements about their use of AI in providing investment advice.
  • In the UK, the FCA recently emphasized that regulated firms are – and remain – responsible for their own operational resilience, and must be alert to the security, data and service risks posed by all “frontier technology.” The FCA does not appear to see AI as posing new risks to the financial services sector, but it does see AI as likely to accelerate and amplify existing challenges to stability, consumer protection, data protection and market integrity.
  • The FCA will soon regulate “critical third parties” (CTPs) – the providers of critical technologies, including AI, to authorized financial services entities, whose failure would severely impact the regulated entities (and their customers) that rely on their technology and services. The necessary legislation has been passed in the form of the Financial Services and Markets Act 2023, and the consultation on the implementing regulation closes on 15 March 2024.
  • In addition, providers of AI systems in the UK and the US will of course need to ensure compliance with non-financial services focused regimes.

Given the hype surrounding AI and regulators’ natural inclination to focus on emerging risks so as not to be caught flat-footed, investment advisers looking to utilize AI should be mindful of the SEC’s stated concerns. It will be even more important to address such matters as the accuracy of AI-related disclosures and conflicts related to the use of AI. In the UK, the FCA will inevitably also be watching firms’ use of AI closely, albeit largely within the existing regulatory framework. Firms will be required to update their existing policies, procedures and risk tools to ensure careful assessment of risks and compliance.

It is not all risk and threat. The FCA publicly recognizes the potential benefits of all types of AI in financial services, running an AI sandbox for firms to test the latest innovations and appreciating the opportunity AI presents to improve productivity, customer service, outcomes and regulatory compliance, if the right guardrails are in place. 

Whether operating under the regulatory regime of the UK, the US or both, it will be especially important for fund managers to stay abreast of evolving developments in the regulation of AI.

Joshua M. Newville, Todd J. Ohlms, Robert Pommer, Seetha Ramachandran, Robert Sutton, Jonathan M. Weiss, Julia Alonzo, William D. Dalsen, Isaiah D. Anderson, James Anderson, Julia M. Ansanelli, Adam L. Deming, Adam Farbiarz, Reut N. Samuels and Hena M. Vora also contributed to this article.
