
The US Securities & Exchange Commission Targets AI on Multiple Fronts (Part 1 of 3)

The U.S. Securities & Exchange Commission (“SEC”) is highly focused on regulating the use of artificial intelligence (“AI”) by financial services providers. A recent Mintz client alert discussed newly proposed rules addressing conflicts of interest in the use of AI, predictive data analytics, and similar technologies by broker-dealers and investment advisers (the “PDA Conflicts Rules”). If adopted as proposed, construed broadly, and enforced strictly, the PDA Conflicts Rules could fundamentally alter the regulatory landscape for advisers and broker-dealers and have a substantial impact on AI service providers.

The PDA Conflicts Rules are still in the proposal stage, but they represent just one aspect of the SEC's expanding AI regulatory efforts. As discussed in this note, SEC Chair Gary Gensler has highlighted specific risks associated with AI in recent public statements. Two future notes will discuss an SEC “sweep” examination of private fund advisers, focused on key risks, that appears to be underway, as well as the potential for AI-related SEC guidance and enforcement actions.

Chair Gensler’s Speech

Part of the SEC’s recent focus on AI may reflect Chair Gensler’s view that the technology “will be at the center of future . . . financial crises.” Shortly before making this statement to the New York Times, Chair Gensler expressed a similar sentiment in a speech to the National Press Club, noting that AI could “play a central role in the after-action reports of a future financial crisis.” In that speech, Chair Gensler highlighted the following concerns about AI, echoes of which resurfaced about a week later in the release proposing the PDA Conflicts Rules:

  1. Data Privacy and Intellectual Property. Chair Gensler noted that AI systems rely on enormous and diverse datasets, which raises concerns about data ownership, control, and rights. Related concerns include the privacy of personal details contained in those datasets and the misuse of intellectual property that may have been collected without authorization.
  2. Financial Stability. Chair Gensler expressed concern that AI could intensify existing financial vulnerabilities. For example, if many large financial institutions come to rely on a handful of AI models or data sources, their trading systems could act simultaneously on negative news or erroneous market signals. Simultaneous selling of a publicly traded security by large financial institutions in response to AI-driven signals could damage the US financial system, with the damage amplified by the global financial system's intrinsic interconnectedness.
  3. Deception. The growing use of AI also creates opportunities for its misuse for deceptive purposes. Chair Gensler pointed to AI-generated deepfake images, fabricated stories, and other false information to illustrate this concern. He also emphasized that, regardless of the tool or method, fraudulent practices remain illegal under the federal securities laws, and that there is always a human programmer or other individual behind any AI program.
  4. AI Bias. Chair Gensler noted that AI models are intricate and are fed by vast datasets, which can make their decision-making processes inscrutable. The inability to understand an AI system’s reasoning raises concerns about equitable outcomes, especially if the data underlying predictive tools reflect historical prejudices or if the tools unintentionally discriminate on the basis of protected characteristics. Not knowing why an AI system takes certain actions may also result in inadvertent conflicts of interest, as discussed further below and in the SEC release proposing the PDA Conflicts Rules.
  5. Potential for Conflicts of Interest. Chair Gensler voiced concern that advisers or broker-dealers using AI applications could intentionally or unintentionally prioritize their own interests over those of their clients. This concern about adviser and broker-dealer conflicts of interest served as a central justification for the PDA Conflicts Rules, as further discussed in a recent Mintz client alert.

While the SEC may lack authority to issue regulations addressing all of the concerns above, understanding Chair Gensler’s views may help advisers and broker-dealers contextualize future regulatory and enforcement activity by the agency.

"AI may play a central role in the after-action reports of a future financial crisis."

Tags

artificial intelligence, broker-dealer, investment adviser, financial regulation, investment management, private funds, venture capital, private equity, sec, securities and exchange commission