GP Bullhound's weekly review of the latest news in public markets.
This week’s update covers the LLM market landscape, Workday and ASMI’s capital markets days, and Huawei’s launch event.
Market: September volatility eased this week – the market remains in wait-and-see mode ahead of Q3 numbers while still digesting the higher-for-longer comments from the US Fed.
Portfolio: We made no significant changes to the portfolio this week.
AI arms race – LLMs, data, chips – what really matters?
Lots of AI news this week – Amazon announced that it would invest in Anthropic; the Meta Connect event showcased its Llama LLM; OpenAI announced new ChatGPT features; and OpenAI is reportedly raising money from SoftBank’s Masa Son to build the “iPhone of AI” with Jony Ive. That meant we spent a lot of time thinking about the AI landscape – large language models, AI business models, and what it all means for chips, cloud providers and the stocks we own. And, importantly, where in the value chain we should think about investing (short answer: chips).
Anthropic is a foundational model company – its best-known competitor is OpenAI. Both own and operate large language models. The relationship between the cloud providers (Amazon AWS, Microsoft Azure, Google GCP) and the large language model businesses (like OpenAI and Anthropic) is twofold: the cloud providers want to offer their customers access to LLMs, and the LLM businesses need the compute capacity and chips of the cloud providers to train their models and then to run them (inference).
Microsoft is perhaps the most clear-cut: it has its partnership with, and investment in, OpenAI – meaning it generates revenue from OpenAI/ChatGPT via its cloud business, and Microsoft Azure’s customers have access to OpenAI. Microsoft can also implement OpenAI into its own products (we think Copilot is still predominantly based on OpenAI). OpenAI, in turn, gets critical access to Microsoft’s compute capacity – and to the current hottest commodity: Nvidia GPUs.
Meta has its own LLM, Llama, trained on Nvidia GPUs (after its extraordinary couple of years of capex spend, Meta is very likely one of the biggest owners of Nvidia chips). And to the extent that Meta has been able to apply its AI capability to its extensive existing ads business (to mitigate the impact of Apple’s App Tracking Transparency, or ATT), it has been one of the biggest AI beneficiaries in the short term.
Llama is open source – available for free for commercial and research use, except that any company with more than 700 million monthly active users needs to request a license from Meta, which we believe each of Microsoft, Amazon and Google has now done.
Google is undoubtedly the most advanced in terms of its own chips. Its TPU (Tensor Processing Unit) architecture is the most competitive with Nvidia’s GPUs. Google also has its own LLMs – the basis for Bard – which it integrates into its Google services.
Amazon develops its own chips, Trainium and Inferentia. However, their performance metrics are still far behind Nvidia’s GPUs and Google’s TPUs, and we know Amazon is still doing everything it can to get its hands on Nvidia GPUs. What makes the Amazon/Anthropic deal interesting is that, from an LLM perspective, AWS has so far primarily sold itself as being LLM agnostic – customers and developers just bring their datasets to any of the pre-trained models available on AWS. CEO Andy Jassy said the following on the earnings call back in August:
“This is what our service Bedrock does and offers customers all of these aforementioned capabilities with not just one large language model but with access to models from multiple leading large language model companies like Anthropic, Stability AI, AI21 Labs, Cohere and Amazon’s own developed large language models called Titan. If you think about these first two layers I’ve talked about, what we’re doing is democratising access to generative AI, lowering the cost of training and running models, enabling access to large language model of choice instead of there only being one option, making it simpler for companies of all sizes and technical acumen to customise their own large language model and build generative AI applications in a secure and enterprise-grade fashion, these are all part of making generative AI accessible to everybody and very much what AWS has been doing for technology infrastructure over the last 17 years.”
It’s worth noting that OpenAI and Llama are absent from that list – Amazon doesn’t have direct access to OpenAI (despite the name, OpenAI is not open source) – but Amazon announced yesterday that Llama would now be available on Bedrock (since Amazon exceeds the 700 million monthly active user threshold, we assume it has signed a licensing deal/revenue share with Meta).
There are still many unknowns around large language models – how many will there be, and will there be a winner? One assumption might be that the underlying data a model has been trained on is a competitive advantage, and so Google and Meta might have an edge over OpenAI with their proprietary data. In reality, we’re not so sure this is the case – even these vast proprietary data sets are likely lost in the sea of internet history that LLMs are trained on.
Much more likely, companies gain a competitive advantage by effectively letting a pre-trained model access their own proprietary data (whether that’s enterprise data or customer data). That’s how we think about the likes of Microsoft, Salesforce, ServiceNow et al. being able to offer AI services to their customers.
For Anthropic and the other standalone LLM companies, ultimate success will likely depend on whether they become the underlying source models for other apps. And that’s where the open source LLMs – like Llama – are likely to benefit in the short term: they can be applied to a whole swathe of new use cases and systems, simply because they’re free.
Meta’s motivation for making its model available for free (rather than explicitly monetising it, as OpenAI does with ChatGPT) isn’t totally clear – it could ultimately be supporting its future competitors. But there are benefits: Meta’s business is built on content, and Llama will ultimately produce a lot of content; and if its hardware business (the VR headset, which relaunched this week) comes back into focus, owning the developers’ LLM of choice (can Meta be to AI what Apple is to smartphones?) will surely be valuable.
The other point is that Llama 2 likely leads to total commoditisation of LLMs – Llama 2 is as good as GPT-3.5, though not as good as GPT-4. Llama 2, however, is free. This means the whole LLM space is likely to experience pricing pressure.
In that context, OpenAI’s reported $90bn valuation this week (roughly 80x revenue) requires it to be a consumer app, not just an LLM. And Amazon’s investment in Anthropic (up to $4bn, at an undisclosed valuation) may well be more about (1) guaranteeing Anthropic’s availability on Bedrock and (2) running LLMs on Amazon end devices (like Alexa) – while for Anthropic it is much more about access to cheap money, compute capacity and chips, and perhaps finding the use case that enables mass distribution.
For our portfolio, commoditisation of LLMs isn’t bad – it ultimately means we’ll likely get to feasible new revenue and business models much more quickly, which will drive more cloud compute and more sustainable investment in AI chips. We own Microsoft, Amazon and Google, which we think should see more demand in their (highly profitable) cloud services businesses. We also own Nvidia, the company most obviously downstream of that capex (and TSMC and the semicap equipment makers, which also benefit).
There is upside for the enterprise software companies able to effectively connect their own vast proprietary data sets to pre-trained LLMs and offer AI tools and features that drive ARPU growth. And on Jony Ive’s “iPhone of AI”: it goes without saying that any mass-adopted AI consumer product, which we assume would be built with some hefty smartphone GPUs, would be very positive indeed for TSMC and semicap equipment.
A long intro this week – onto the newsflow.
Enterprise spend still experiencing headwinds
Portfolio view: On Workday, which we own, a mid-term target reset is never great (and a reminder of new management teams’ tendency to reset expectations), but taking a step back, it is still compounding earnings at a rate in the 20%+ range. We still view Workday as one of the sticky platform businesses able to add AI features to its product and drive continued, sustained growth in customers (Workday continues to gain share from legacy on-premise software) and ARPU. Accenture is a late-cycle business rolling over (after two years of double-digit growth) – and while we view it as a long-term quality compounder (it continues to deliver margin growth and strong FCF), we think there are better opportunities elsewhere in tech.
Memory market bottoming but still painful write-downs to get through
Portfolio view: This memory downturn is the worst the market has seen in over 10 years. While much of this is the result of extraordinary circumstances (pandemic, inflation), some of it is also the nature of the industry – we don’t own any memory players in the portfolio as the commoditised nature and reliance on each player staying rational on supply means that for us it doesn’t match with our sustainable return on invested capital process.
There is limited new news on capex for our semicap equipment exposure (Lam Research is particularly exposed to memory spend) – we continue to expect a relatively weak year for memory spend, but we increasingly see upside risk of a stabilisation in DRAM spend, helped by the AI build-out (and the larger die sizes of HBM). The commentary around a bottoming of the inventory correction and the cycle is a clear positive for the broader semis industry. Things might not be improving, but hopefully they’ve stopped worsening.
Semicap debate continues, but long-term drivers remain intact
Portfolio view: We own ASML, Lam Research, Applied Materials and KLA – all peers of ASM International (each with very high market share in its stage of the semiconductor manufacturing process). There are clearly moving parts in semicap and capex spend – capex is by nature cyclical, and semicap is exposed to those trends. However, we continue to believe in the long-term growth drivers of the industry: the AI and LLM build-out we discussed above needs more, faster, bigger GPUs and CPUs – a long-term structural driver – as does the tech leadership race between Intel, TSMC and Samsung, which we discussed last week. Today, Intel will livestream its European fab opening – the first European fab to use ASML’s EUV tools in mass production. And while spend can be cyclical in the short term, these are all businesses that have shown an ability to generate strong FCF and profitability in a downturn – which is key in our investment process.
Huawei is back – what happens to China’s smartphone market shares?
Portfolio view: We own a small position in Apple, where Huawei represents a credible threat in its home market (also to the extent that the political situation creates a local bias). Another new EV entrant will likely mean the aggressive price competition continues. We own semi suppliers into electric vehicles – Infineon and NXP – which we view as benefiting from the move to EV (semiconductor content significantly increases) and, unlike the auto OEMs, have the pricing power to sustain high levels of margin/returns over time.
Meta Connect – what a difference a year makes
Portfolio view: We don’t own Meta, given the headwinds we perceive from new competition in digital advertising (though we recognise it has done a great job of applying AI to its advertising business to mitigate the ATT impact and restore growth). As discussed above, Llama as an open source LLM is interesting (and might make Meta’s end devices interesting too). It remains unclear whether Meta can be to AI what Apple is to smartphones. For now, we watch from the sidelines.
Amazon Prime ads – a slight offset to content wars
Portfolio view: We sold Disney earlier this year and don’t own Netflix. We remain sceptical about the long-term returns in streaming, and it’s hard not to view the presence of Amazon, Apple and YouTube as a negative for both the path to profitability and the sustainable level of profitability. Advertising revenue is an upside, but it will take time to ramp and will be only a minimal offset to ballooning content costs. Our only real exposure to streaming is now through Sony – the perfect non-streaming streaming play, benefiting from content inflation in its Pictures business without running a cash-flow-negative streaming platform.
For more information about the latest trends and forecasts, please visit our official Tech Thoughts page.
We provide investors with access to category-leading technology companies, globally. Our assets under management have a total value of more than $1bn, and our limited partners include institutions, family offices and entrepreneurs. Learn more about our funds here.