Regulators must quickly find a way to manage risks posed to financial stability by the concentration of power in artificial intelligence platforms, the chair of the US Securities and Exchange Commission has urged.
Gary Gensler told the Financial Times that without swift intervention it was “nearly unavoidable” that AI would trigger a financial crisis within a decade.
Shaping AI regulation would be a tough test for US regulators, the SEC chair said, as potential risks cut across financial markets and stem from models crafted by tech companies that sit outside the remit of Wall Street watchdogs.
“It’s frankly a hard challenge,” Gensler said. “It’s a hard financial stability issue to address because most of our regulation is about individual institutions, individual banks, individual money market funds, individual brokers; it’s just in the nature of what we do. And this is about a horizontal [matter whereby] many institutions might be relying on the same underlying base model or underlying data aggregator.”
The SEC in July proposed a rule addressing potential conflicts of interest in predictive data analytics, but it focused on individual models deployed by broker dealers and investment advisers.
Even if current measures were updated, “it still doesn’t get to this horizontal issue . . . if everybody’s relying on a base model and the base model is sitting not at the broker dealer, but it’s sitting at one of the big tech companies”, Gensler said. “And how many cloud providers [which tend to offer AI as a service] do we have in this country?”
He added: “I’ve raised this at the Financial Stability Board. I’ve raised it at the Financial Stability Oversight Council. I think it’s really a cross-regulatory challenge”.
Regulators worldwide are grappling with how to police AI, as tech groups and their models are not naturally captured by specific watchdogs. The EU has moved quickly, drafting tough measures over the use of AI in a groundbreaking law that is set to be fully approved by the end of the year. The US, however, is reviewing the technology to determine which aspects of it require new regulation and what is subject to existing laws.
Wall Street has already adopted AI in a number of ways, from robo-advising and account-opening processes to brokerage apps.
But Gensler is concerned that widespread reliance on the same data models could lead to herd behaviour that would undermine financial stability and unleash the next crisis.
“I do think we will in the future have a financial crisis . . .[and] in the after action reports people will say ‘Aha! There was either one data aggregator or one model . . . we’ve relied on’. Maybe it’s in the mortgage market. Maybe it’s in some sector of the equity market,” Gensler said.
AI’s powerful “economics of networks” makes it “nearly unavoidable,” he added, predicting that a crisis could happen as soon as the late 2020s or early 2030s.
Lawmakers and regulators in Washington have heightened scrutiny of AI, raising concerns around market stability, data protection and antitrust. The Federal Trade Commission in July launched a review of ChatGPT maker OpenAI looking at consumer harm and data security. Antitrust agencies have warned that AI’s structural dependence on scale could lead to tech monopolies.
Gensler, who has tackled concentration in capital markets to promote efficiency, believes that AI could generate competition issues in that area. “Might this lead to more concentration of market makers?” he said.
The SEC is also finalising a long-awaited rule, proposed in March 2022, that would require public companies to disclose their direct emissions and emissions derived from energy they purchase, known respectively as scope 1 and scope 2. Under the proposal, scope 3 emissions, a broad measurement that includes products a business buys from third parties, would need to be reported only if they were deemed “material” or part of companies’ climate targets.
Gensler declined to comment on whether scope 3 disclosures, which were welcomed by investors but slammed by corporate America, would be included in the rule’s final version.
However, he said he would do “the right thing by the American public by doing a rule that’s within the law and is sustained by the courts [and] thinking that through based on the comments [the SEC has received], based on the economics, based on trying to bring some consistency to what’s already happening”.
He noted that in 2021, 55 per cent of companies in the Russell 1000 already disclosed scope 1 and scope 2 emissions.
The climate proposal has angered Republican lawmakers and attorneys-general, two dozen of whom have threatened to sue the SEC on the basis it is overstepping its authority, a claim Gensler denies. The chair’s active rulemaking agenda is facing other legal challenges, including a lawsuit from a coalition of private equity, venture capital and hedge fund groups seeking to block sweeping new measures for private fund managers.