AI and the SEC

The SEC has a proposed regulation pending, anticipated for adoption before the end of this year, that would require brokers and advisers to eliminate data analytics that put the firm’s interests ahead of the interests of investors.  Separately, speaking this week at Yale Law School, SEC Chair Gensler also admonished these firms to beware of AI programs that might lead to spoofing or front-running.

Front-running executes a trade for the firm’s account (based, in this case, presumably on AI analytics) before offering that trade to clients, while spoofing can distort the market by placing buy or sell orders for futures contracts and then withdrawing them before execution.

The SEC’s message is that firms need to test AI trading programs fully, to ensure that what they generate is not only factual (not a hallucination) but also works in the first instance to benefit customers rather than the firm’s proprietary trading, and preserves a market driven by fair practices.  The requirements here, however, concern the discharge of fiduciary obligations; they do not address the purposeful release of false data (clearly a fraud) and relate more closely to avoiding negligence or recklessness on the part of the broker or adviser.  It is not certain, though, that committing either of these SEC sins will in fact harm clients or other marketplace investors, an issue that Gensler himself believes will be resolved by the courts in a manner he described as “unpredictable.”

And none of this gets to court unless the SEC brings suit, so … you be the judge as to where this is heading.