The Muskoverse is a cacophonous place. Competing for our attention today there’s the informational fog of war, some antisemitic tropes, “terminal monkeys” and a seemingly ever-growing cast of oddly named children.
It takes something big to cut through all this noise, or at least something bombastic: something like a 66-page research report from Morgan Stanley analyst Adam Jonas and six of his colleagues.
Yes! Ha ha ha . . . Yes!
Investors have long debated whether Tesla is an auto company or a tech company. We believe it’s both, but see the biggest value driver from here being software and services revenue. The same forces that have driven AWS to reach 70% of AMZN total EBIT can work at Tesla, in our view, opening up new addressable markets that extend well beyond selling vehicles at a fixed price. The catalyst? Dojo, Tesla’s custom supercomputing effort in the works for the past 5 years. Version 12 of Tesla’s full self driving system (OTA by year-end) and Tesla’s next AI day (early 2024) are worth watching.
We believe that Dojo can add up to $500bn to Tesla’s enterprise value, expressed through a faster adoption rate in Mobility (robotaxi) and Network Services (SaaS). The change drives our PT increase to $400 vs. $250 previously. We upgrade to Overweight and make Tesla our Top Pick.
Maybe you’ve not heard of Tesla Dojo. Maybe you’ve been relying on things like Tesla’s 2022 annual report and its 10-K filing, neither of which mentions Dojo even once.
Morgan Stanley applies a wider frame of reference.
Here’s the gist of the argument. Teslas “are sensor encrusted robots making life and death decisions in highly unpredictable environments and situations.” Their next-gen proprietary brain will be the Dojo chip, being developed in-house by Tesla for the specific purpose of ingesting lots of data.
Whereas ordinary chipmakers have to think about whether their new silicon will still be able to run Apache Spark and FIFA 23, Tesla’s GPU team has had a Mr Miyagi-like focus on advanced driver assist systems. Single-purpose specialisation is what will make its supercomputers superior, doing to AI what mining ASICs did to crypto:
With a highly experienced semiconductor team, Tesla has built a custom AI ASIC chip, that, due to its core function of processing vision-based data for autonomous driving use cases, can operate more efficiently (energy consumption, latency) than the leading cutting-edge general-purpose chips on the market (NVIDIA’s A100s and H100s), potentially at a fraction of the cost.
Tesla is not the first tech player to attempt to build a custom silicon system in-house, but given the company’s deep understanding of ADAS (pioneer in the EV market), vast network of data that is constantly increasing (400k FSDs on the road already collecting data from 300+ million miles traveled), a world class design team, and expansive resources, in addition to the underlying need to diversify away from over-reliance on NVDA, we believe Dojo may prove competitive in its customized solution.
Evidence for Dojo’s potential typically comes from Tesla presentations, such as the company’s 2021 and 2022 AI Days where it was first announced, and the Q2 call where Musk said they’d be spending north of $1bn on project R&D over the following year.
Citing these updates, Morgan Stanley presumes the Dojo chip will deliver performance six times better than Nvidia’s current-but-one generation of GPU boxes, the A100, at less than the current $200k-per-unit cost of a single Nvidia box. Fewer boxes might also cut energy use, because they’ll need less cooling.
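For a sense of what those claims imply, here’s a back-of-envelope sketch in Python using the note’s figures (six times the performance, at under $200k a box). The Dojo unit cost below is our placeholder assumption, not a number from the report:

```python
# Back-of-envelope cost-per-performance comparison, using the figures the
# Morgan Stanley note cites (6x an A100-class box's performance, sub-$200k
# per unit). The Dojo unit cost is a placeholder assumption, not a number
# from the note.

A100_BOX_COST = 200_000      # dollars per Nvidia box, per the note
A100_BOX_PERF = 1.0          # normalise the A100 box's performance to 1

DOJO_PERF_MULTIPLE = 6.0     # "six times better", per the note
DOJO_BOX_COST = 150_000      # assumed: anything under $200k fits the claim

a100_cost_per_perf = A100_BOX_COST / A100_BOX_PERF
dojo_cost_per_perf = DOJO_BOX_COST / (A100_BOX_PERF * DOJO_PERF_MULTIPLE)

print(f"A100 box: ${a100_cost_per_perf:,.0f} per unit of performance")
print(f"Dojo box: ${dojo_cost_per_perf:,.0f} per unit of performance")
print(f"Implied cost advantage: {a100_cost_per_perf / dojo_cost_per_perf:.1f}x")
```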
Even this analysis has to add a note of caution about Tesla’s record of meeting targets, or even just setting them consistently. The forecast for Dojo reaching 100 exaflops (an exaflop being one quintillion floating point operations per second) given at Tesla’s second-quarter update implied a compute cost nearly twice as good as the one presented at the 2022 AI Day, for example.
Such number fudging, “when considered holistically, may bring to light some definitional inconsistencies as well as a wide range of investor interpretations”, says Morgan Stanley, with a politeness some investors might not consider to be earned.
Joe Moore, Morgan Stanley’s semiconductor analyst, gets a useful cameo halfway through the note to explain what AI silicon actually means. He starts by describing a market in two distinct segments: training, the time-consuming and power-intensive building of models from data; and inference, the process of calling on those models to do things.
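For intuition only, here’s a toy sketch of that split (a made-up logistic regression in Python, nothing to do with Tesla’s actual stack): training grinds through the data many times; inference is a single cheap pass with the frozen weights.

```python
import numpy as np

# Toy illustration of the two segments Moore describes: "training" makes many
# expensive passes over data to fit a model; "inference" is one cheap forward
# pass that uses the fitted model. Data and model here are entirely made up.

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                  # fake sensor features
y = (X @ rng.normal(size=8) > 0).astype(float)  # fake labels

# --- Training: iterative, data-hungry, compute- and power-intensive ---
w = np.zeros(8)
for _ in range(500):                            # many full passes over the data
    p = 1 / (1 + np.exp(-X @ w))                # predictions
    w -= 0.1 * X.T @ (p - y) / len(y)           # gradient step

# --- Inference: one cheap forward pass with the frozen weights ---
new_sample = rng.normal(size=8)
prob = 1 / (1 + np.exp(-new_sample @ w))
print(f"P(class 1) = {prob:.3f}")
```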
ASICs for AI are coming soon, with Google, Amazon, Microsoft and Meta all having announced their own projects. But Nvidia has proven very difficult to shift. Making a lot of money from gaming cards, a high-volume business, means Nvidia can iterate its existing designs regularly and see an immediate return from the $8bn a year it spends on R&D.
Competitors are doing everything from scratch, and any technological lead they find tends to be temporary, because Nvidia is never far behind. Inference chips, the boring workhorses of AI where performance mostly means efficiency, are likely to see a bit of competition, assuming customers can get comfortable with software compatibility. Training is a different story.
Building a machine-learning model has a huge upfront cost, so task optimisation and specialism matter. Among Nvidia’s challengers only Google, an AI pioneer that invented the transformer architecture back in 2017, has understood customer needs well enough to carve out a niche in AI world-building.
Can Tesla do the same? Moore rates the company’s in-house chip team highly; it was built by industry superstar Jim Keller between his arrival in January 2016 and his departure for Intel in April 2018. Moore also notes that Tesla has an edge over many less-well-funded start-ups because it knows both hardware and software. The end product “may prove competitive in its tailored use case”, he says:
Tesla is not competing to make a better chip. Tesla is optimizing for a single purpose that can in turn drive an improved total output, at greater efficiency and lower cost. NVIDIA used the demanding performance of gaming to develop the world’s most powerful GPU chips. Can Tesla use the demands of autonomous cars/FSD to become a global leader in custom AI chips?
Yes! say Jonas et al. Yes it can! Look!
And look!
And look!
That’s Tesla on an enterprise value of 28.3 times 2025 ebitda, per Morgan Stanley forecasts, which is more expensive than Nvidia’s 25.3 times ebitda. “We believe growth in the latter half of the decade via Dojo synergies justifies the valuation,” the team says:
As Tesla begins to unlock Dojo synergies in the back half of the decade and beyond 2030, we expect to see meaningful EBITDA margin expansion. We forecast Network Services to deliver a 65% EBITDA margin, and that it will represent 62% of Tesla’s total EBITDA in 2040. We can thus imply a 35% total company EBITDA margin in FY40e, up from 15% in FY23e and 24% FY30e.
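The quoted figures do at least hang together. A quick consistency check (our arithmetic on the note’s numbers, not Morgan Stanley’s own workings) implies the rest of the business would be running at roughly a 20 per cent ebitda margin in that 2040 scenario:

```python
# Back-of-envelope check on the FY40 mix Morgan Stanley sketches: if Network
# Services earns a 65% EBITDA margin and supplies 62% of group EBITDA, what
# margin on the rest of the business is consistent with a 35% group margin?
# Pure arithmetic on the quoted figures; no forecast of our own.

group_margin = 0.35       # FY40e total company EBITDA margin, per the note
ns_margin = 0.65          # Network Services EBITDA margin, per the note
ns_ebitda_share = 0.62    # Network Services share of total EBITDA, per the note

group_ebitda = 1.0                                         # normalise to 1
group_revenue = group_ebitda / group_margin                # ~2.86
ns_revenue = ns_ebitda_share * group_ebitda / ns_margin    # ~0.95
rest_revenue = group_revenue - ns_revenue                  # ~1.90
rest_margin = (1 - ns_ebitda_share) * group_ebitda / rest_revenue

print(f"Implied EBITDA margin on everything else: {rest_margin:.0%}")  # ~20%
```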
And Edward Stanley, Morgan Stanley’s head of equity strategy, is drafted in to deliver a separate note on Tesla Dojo’s possible contribution to Bessembinderism:
As we have written before, just 2.3% of all equities have generated $73trn of net shareholder returns over 30 years. Finding plausible, scalable Moonshots that have not been fully discounted by the markets should be of interest to all investors. We believe Tesla’s project Dojo could tick all the boxes. [ . . . ]
Although Dojo is still early in its development, we believe that if the Moonshot is successful, its applications longer-term can extend beyond the auto industry. Dojo is designed to process visual data that can lay the foundation for vision-based AI models such as robotics, healthcare and security. In our view, once Tesla makes headway on autonomy and software, third party Dojo services can offer investors the next leg of Tesla’s growth story. Humanoid robotics and the potential to displace workers and social workers was one of our original and adjacent Moonshots but which would be substantially accelerated by Dojo’s success.
Jonas has previously hung Tesla advice on all sorts of stuff, including battery production, charging stations and insurance. AI, however, hadn’t held the same appeal, with the note accompanying his June 22, 2023 downgrade to “equal weight” explaining:
While we understand why Tesla gets a serious mention in an AI conversation, we believe a re-rating on this theme is in the realm of the non-disprovable bull case. Autonomous driving and generative AI still remain, in our view, two very different technological disciplines. While the market may want to dream on the AI theme, we’d prepare to wake up to the sound of a blaring car horn.
Beep beep.
Further reading:
Tesla’s Springfield Gorge trajectory (FTAV)
The best of Morgan Stanley’s Adam Jonas (FTAV)