We are probably still years away from Wall Street being overrun by actual robots. Nonetheless, artificial intelligence (AI) tools are being integrated, often divisively, into all aspects of society—from the classroom to the courtroom. Many broker-dealers have also implemented AI-assisted analytics and technology. Indeed, over the last several years, many firms have made investing more accessible and user-friendly through “robo-advisers.” Few question the “pros” of AI, but many remain concerned about the risks. The SEC is no different, nor is it any less divided. Here, the SEC has homed in on conflicts of interest that may arise through the use of AI.
On July 26, the Securities and Exchange Commission (SEC) proposed a regulation under the Securities Exchange Act of 1934 and the Investment Advisers Act of 1940 to combat what it sees as conflicts of interest arising from the use of predictive data analytics by broker-dealers and investment advisers. In strong language, the proposed rule seeks “to eliminate, or neutralize the effect of, certain conflicts of interest associated with broker-dealers’ or investment advisers’ interactions with investors through these firms’ use of technologies that optimize for, predict, guide, forecast, or direct investment-related behaviors or outcomes.” The proposed regulation would require broker-dealers and investment advisers to take steps to address potential conflicts of interest arising from predictive data analytics and similar technologies that interact with investors, in order to prevent firms from placing their own interests ahead of investors’ interests.