Opportunities with hyperscalers

By Mike Glöckner, Analyst in the Research Team at DJE Kapital AG 

The main driver behind the strong share price performance of hyperscalers such as Microsoft and Amazon has been artificial intelligence (AI). Hyperscalers are investing heavily in AI: their estimated direct purchases from NVIDIA amount to USD 33 billion in 2023, USD 57 billion in 2024 and USD 71 billion in 2025 (source: Bloomberg, JP Morgan). In addition, they have generated initial AI revenues in the billions and provide a platform on which virtually any company worldwide can experiment with AI.

In a recent study, Goldman Sachs assumes that more than one trillion US dollars will be invested in AI infrastructure (servers, chips, data centres, energy supply) in the coming years. This assumes, however, that generative AI companies will transform entire industries and society as a whole. CIO surveys by Morgan Stanley and others have concluded that AI has become the top technology topic in the corporate world, and this is where most investment is being channelled. At the same time, it is widely recognised that not all AI projects will succeed, and the first signs of disillusionment are emerging. Market observers do not expect the first AI successes to go into production until after 2025. It has become clear that AI is rarely delivered as a turnkey solution that is immediately ready for use; it usually involves very demanding IT development work, and AI specialists are extremely scarce.

NVIDIA takes centre stage 

Without question, the chip company NVIDIA is at the centre of attention and of AI development. NVIDIA currently holds a market share of more than 90% in the chips (GPUs) used for AI. The company began developing its programming platform CUDA before 2010, around twelve years ahead of its competitors, so that customers can control and exploit its graphics processing units (GPUs) as effectively as possible. On top of this platform, NVIDIA has built a library ecosystem that has grown considerably over the years to support as many use cases as possible. This widens the already considerable lead over the competition and entrenches NVIDIA's GPUs even further in the market. Not only does NVIDIA launch new, improved chip architectures at a high frequency; these are also easy for developers in all sectors to program. Over five million developers use CUDA for AI programming. Switching to other chips requires reprogramming, which entails considerable migration effort. NVIDIA is driving AI development not only in large language models (LLMs), but also in areas such as medicine and genetics (the search for new drugs), self-driving cars, weather forecasting, personal digital assistants, sales advisors, robotics for industrial production and humanoid robots (where it also holds a high market share).

It feels as though anything is possible with AI, but it is almost impossible to predict when breakthroughs will occur, and in which field. OpenAI achieved its breakthrough in the field of language at the beginning of 2023. NVIDIA will continue to play a leading role in global AI development thanks to its enormous size, research lead and growing moats. When valuing the company, however, it is important to look closely at entry timing and at when caution is warranted.

How much further can the AI rally go? 

The AI rally has corrected twice since last year: by -8.7% in autumn 2023 and by -13.7% in spring 2024. In hindsight, both corrections proved to be good entry opportunities for investments in companies with a large share of AI sales. A further correction is conceivable now that prices and valuations have soared. Should this occur, entry opportunities are likely to arise again for various stocks.  

AI infrastructure - the boom within the boom 

During the AI boom, the segment's infrastructure, which tends to operate in the background, has also benefited. This includes chip designers such as Broadcom, AI server suppliers, component testers, PCB manufacturers, memory chip manufacturers and chip packaging specialists. It also includes semiconductor equipment suppliers such as ASML, the networking sector and, of course, the chip companies that generate major AI sales (NVIDIA, TSMC). Most recently, the boom has expanded to companies focused on energy and data centre cooling, as the sharp increase in AI servers means that ever larger data centres with growing energy requirements are being built. Although the chips are becoming more energy-efficient per computation, their power density is rising, so they require increasingly powerful cooling.

AI applications still with weak sales 

The situation is more modest among companies that use AI to offer end customers real added value. Software solutions such as AI co-pilots, AI assistants for tax software or programmes for process optimisation have not yet reported any major increases in turnover. There are, of course, also numerous start-ups working on AI. Experience shows, however, that most start-ups eventually fail; only a few make it. Even the hyperscalers, which together have invested over USD 100 billion in AI in 2023 and 2024, have so far hardly been able to generate more than a single-digit percentage of their turnover from the AI business.

Hyperscalers are still in a race for the best AI offering, the best AI model, the best pre-trained models and the best-functioning AI chat or assistant. This is a very costly competition. The question is whether hyperscalers will sooner or later be forced to scale back their investments because revenues are not yet materialising as expected. It is also possible that the race towards "artificial general intelligence" (AGI) will overshadow everything and justify the immense investments. AGI is understood to mean intelligence on a par with that of humans. The size of large language models is currently doubling roughly every six months (currently around two trillion parameters), which improves the accuracy of the results. According to Bank of America, however, there is still a considerable way to go before a structure comparable to the human brain, with its 100 trillion synapses, i.e. connections between nerve cells, is achieved.
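The scale of that gap can be illustrated with a rough back-of-the-envelope calculation. This is a simplified sketch, not a forecast: it assumes the six-month doubling continues unchanged and treats model parameters as loosely comparable to synapses, which is a strong simplification.

```python
import math

def doublings_needed(current: float, target: float) -> float:
    """Number of doublings required to grow from `current` to `target`."""
    return math.log2(target / current)

# Figures from the text: ~2 trillion model parameters today,
# ~100 trillion synapses in the human brain.
current_params = 2e12
brain_synapses = 1e14

n = doublings_needed(current_params, brain_synapses)
years = n * 0.5  # one doubling roughly every six months (assumption)

print(f"{n:.1f} doublings, i.e. roughly {years:.1f} years")
# → 5.6 doublings, i.e. roughly 2.8 years
```

Even under these generous assumptions, several more doublings lie ahead, and the comparison says nothing about whether parameter count alone yields brain-like capability.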

Caution with excessive valuations 

Overall, a great deal is currently happening on the development and infrastructure side of AI, and enormous sums are being invested. On the other hand, disproportionately little of this has yet reached customers and end users. The same was true at the start of almost all major structural developments, whether the build-out of the railway network or the internet. From an investor's point of view, it is important to keep checking how far AI development has progressed, who benefits from it, and how long this can continue without sufficient revenue on the customer side. It cannot be ruled out that the AI boom has produced inflated valuations and a bubble. If it bursts, it will probably primarily affect the former AI high-flyers. Yet the field of opportunities opened up by AI seems too large to ignore. Cautious participation, based on a thorough review of individual investments and combined with partial exits and profit-taking, is advisable. We are most likely at the beginning of a new technology decade, and investment in AI has ramped up faster than in previous structurally transformative technologies. What has not changed, however: some stocks may by now have run too hot, pricing in a development that reality does not yet justify.

 

Note: Marketing communication - All information published here is for your information only and does not constitute investment advice or any other recommendation. The statements contained in this document reflect the current assessment of DJE Kapital AG, which may change at any time without prior notice. All statements have been made with due care, based on the state of knowledge at the time of preparation. However, no guarantee or liability can be assumed for their correctness and completeness.