
Under the Banyan Tree - China and global AI rollout

HSBC Global Viewpoint
Herald van der Linde is joined by Head of Asia Technology Research, Frank Lee, and Head of China Internet Research, Charlene Liu, to discuss the pace of AI development in China and whether Chinese tech giants can rival the key US players. Disclaimer: https://www.research.hsbc.com/R/61/SKDBKS2. To stay connected and access free-to-view reports and videos from HSBC Global Research, follow us on LinkedIn https://www.linkedin.com/feed/hashtag/hsbcresearch/ or click here: https://www.gbm.hsbc.com/insights/global-research.

Hosted on Acast. See acast.com/privacy for more information.

Transcript

Introduction to HSBC Global Viewpoint Podcast

00:00:02
Speaker
Welcome to HSBC Global Viewpoint, the podcast series that brings together business leaders and industry experts to explore the latest global insights, trends, and opportunities.
00:00:13
Speaker
Make sure you're subscribed to stay up to date with new episodes.
00:00:16
Speaker
Thanks for listening.
00:00:17
Speaker
And now onto today's show.
00:00:24
Speaker
This is a podcast from HSBC Global Research, available on Apple Podcasts and Spotify.
00:00:30
Speaker
However you're listening, analyst certifications, disclosures and disclaimers must be viewed on the link attached to your media player.

Focus on Asian Markets and AI Rollout in China

00:00:46
Speaker
Welcome to Under the Banyan Tree, where we put Asian markets and economics in context.
00:00:51
Speaker
I'm Herald van der Linde, Head of Asian Equity Strategy here at HSBC.
00:00:56
Speaker
On today's podcast, we're going to bring you up to date with where we are in terms of the artificial intelligence (AI) rollout.
00:01:03
Speaker
But more importantly, we're going to focus on China's role within it.
00:01:07
Speaker
How do Chinese tech companies plan to make money from AI?
00:01:10
Speaker
And are they really in a position to rival US firms in this space?
00:01:14
Speaker
Joining me to help answer those questions are Frank Lee, Head of Asia Technology Research, who joins us from Taiwan, and Charlene Liu, who leads our China Internet Research team from Singapore.
00:01:25
Speaker
Let's get the conversation started right here, under the banyan tree.
00:01:35
Speaker
Frank, Charlene, thanks for coming on to the podcast.
00:01:38
Speaker
Thanks, Herald.
00:01:38
Speaker
Thanks for having us.

Stages of AI Industry Development
00:01:39
Speaker
Frank, we want to get a good understanding of what is happening in AI and, in particular, how the Chinese are stacking up versus the international players.
00:01:48
Speaker
Now, before we answer that question, I really want to take a step back.
00:01:52
Speaker
In the past, you've said you see the development of the AI industry in three different stages.
00:01:59
Speaker
Can you remind me a little bit what those stages were and where we are?
00:02:03
Speaker
Sure, Herald.
00:02:04
Speaker
Right now, I would say, you know, everyone's investing in AI, especially all the cloud providers.
00:02:10
Speaker
So what we kind of think right now is that this year is really the first stage.
00:02:14
Speaker
And the first stage is really about investing in the large language models and how you train them.
00:02:21
Speaker
And what's driving this training of the large language models is basically the usage of what they call GPUs.
00:02:29
Speaker
GPUs are graphics processing units, right?
00:02:32
Speaker
So, for example, in a video game, you see something on your screen, a car that's driving or something like that.
00:02:38
Speaker
That's something that's processed by those particular kind of chips, right?
00:02:41
Speaker
Yeah, that's right, Herald.
00:02:42
Speaker
So actually, if you think about the GPU, it started off really being used heavily by gamers because the GPU chip allowed you to enhance and improve your gaming functionality.
00:02:54
Speaker
But then what they've discovered is that this GPU, because it offers very high-performance computing,
00:03:00
Speaker
has been used now for data centers, and it's also very functional to be used in AI servers, given the computing power that you need.
00:03:08
Speaker
And that explains why, for example, a company like Nvidia, originally a video gaming chip maker, is so prominent in making these chips, right?
00:03:16
Speaker
Yeah, I mean, I think right now, as far as stage one, NVIDIA in particular has almost a monopoly in terms of its dominance this year, because it's really the sole provider of these GPUs for AI servers this year.
00:03:30
Speaker
So stage one is really: you're building out the chips and you're feeding the data into the servers so that the AI can use it.
00:03:39
Speaker
What is stage two and three then?

The Future of AI Technology: Inference and Edge AI

00:03:41
Speaker
Yeah, so I think next year is going to be interesting because you might probably start to see the beginnings of stage two, which we think is going to be more about an inference stage.
00:03:50
Speaker
So if you think about it, this year, everyone is building up their large language models, which need to be trained.
00:03:55
Speaker
Next year, you're going to start putting in the inference capability because ultimately to use AI, you have to be able to infer from it.
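The training-versus-inference split Frank describes can be seen in miniature in any machine learning workflow: training iteratively updates model weights and is compute-heavy, while inference is just a forward pass through the finished model. A toy NumPy sketch of that distinction (a deliberately tiny linear model, not how large language models are actually built):

```python
import numpy as np

# Toy model: one linear layer. "Training" (stage one) updates the
# weights with gradient steps; "inference" (stage two) is just a
# cheap forward pass through the trained weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # training data
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                          # targets

w = np.zeros(3)                         # model weights
for _ in range(500):                    # training loop: compute-heavy
    grad = X.T @ (X @ w - y) / len(X)   # gradient of mean squared error
    w -= 0.1 * grad                     # weight update

def infer(x):
    """Inference: a single forward pass, no weight updates."""
    return x @ w

print(infer(np.array([1.0, 1.0, 1.0])))  # ≈ 1.0 - 2.0 + 0.5 = -0.5
```

The asymmetry in cost between the loop and the single matrix product is, at vastly larger scale, why training demands top-end GPUs while inference can run on simpler, cheaper hardware.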
00:04:03
Speaker
Right.
00:04:04
Speaker
And when it comes to inference, there's going to be potentially more possibilities.
00:04:08
Speaker
You know, the chip that's potentially going to be used in inference is called an application-specific IC, an ASIC chip.
00:04:16
Speaker
So an ASIC chip is something that is really specific to doing a particular action, right?
00:04:20
Speaker
So if I need AI to, for example, generate maps in a map-making program, there's a particular chip that allows those processes to run.
00:04:29
Speaker
And that's what you're referring to, right?
00:04:31
Speaker
Yeah, I mean, typically ASIC chips are not as complex as the GPU and they have very specific functions.
00:04:36
Speaker
And that's also why if you look at the cloud providers, especially a lot of the US cloud providers, they are working on developing their own chip in-house as well.
00:04:45
Speaker
So while there is nothing they can do about the first stage, which is about training and very dependent on NVIDIA, there is increasingly a push now to see if they can develop some of these ASIC chips in-house for their own use as you go into stage two.
00:04:59
Speaker
And the third stage, the last stage, what is that?
00:05:03
Speaker
Yeah, the third stage, I call this the blue sky scenario, right?
00:05:07
Speaker
And the reason why I call it the blue sky scenario is that, while stage one and stage two are very positive for the AI supply chain, it doesn't really help the overall hardware supply chain that much, because the demand for AI servers,
00:05:24
Speaker
in terms of number of units, isn't that big, even though they take up a lot of value and are very expensive.
00:05:31
Speaker
I mean, ultimately, you know, smartphones and PCs are much bigger markets in terms of units.
00:05:37
Speaker
And as we all know, smartphones and PCs have been challenging markets, but this is where edge AI comes in.
00:05:41
Speaker
Edge AI, the idea is basically to provide AI functionality directly on your local device.
00:05:49
Speaker
So imagine if you will, that you can actually do AI functionality on smaller scale language models directly on your phone without the use of the cloud.
00:05:59
Speaker
So that ultimately could drive, I think, you know, a big replacement market again in smartphones and PCs.
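One reason the smaller on-device models Frank mentions are plausible is weight compression: quantizing weights from 32-bit floats to 8-bit integers cuts memory roughly four-fold. A simplified symmetric-quantization sketch (an illustration of the idea, not a production edge AI scheme):

```python
import numpy as np

# Simplified symmetric int8 quantization: store weights as int8
# plus a single float scale, then dequantize at inference time.
rng = np.random.default_rng(1)
weights = rng.normal(size=10_000).astype(np.float32)  # "model" weights

scale = np.abs(weights).max() / 127.0        # map max magnitude to 127
q = np.round(weights / scale).astype(np.int8)

dequantized = q.astype(np.float32) * scale   # approximate reconstruction

print("float32 bytes:", weights.nbytes)      # 40000
print("int8 bytes:   ", q.nbytes)            # 10000 (4x smaller)
print("max error:", np.abs(weights - dequantized).max())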
00:06:05
Speaker
Okay, so I want to move to Charlene now, but just to quickly recap: basically, we now need chips that can ingest, process, and store a lot of information.
00:06:16
Speaker
We're going to a second stage whereby, using that data and specialty chips, you could say, you run all sorts of different applications on it.
00:06:23
Speaker
And later on, it's really, you're getting it onto a completely new iPhone or mobile phone.
00:06:29
Speaker
That's a later stage.
00:06:31
Speaker
That's, broadly speaking, the kind of development that you're looking at, right?
00:06:34
Speaker
Yes, that's right.

AI Monetization Strategies and Challenges in China

00:06:35
Speaker
Now, Charlene, you're looking at some of the Internet companies in China and clearly they've come out and have all sorts of monetization strategies.
00:06:43
Speaker
They want to make money out of this, right?
00:06:45
Speaker
Broadly speaking, what is the approach that these companies take in China?
00:06:50
Speaker
Sure.
00:06:52
Speaker
I think from our perspective, it probably comes down to three key monetization avenues for generative AI technology.
00:07:00
Speaker
The first one being AI computing service.
00:07:04
Speaker
And what it does is it supports and empowers AI applications from the large language models training to various AI embedded software services operations.
00:07:15
Speaker
One such example would be a photo editing app.
00:07:19
Speaker
And, you know, it has the intention, for example, to speed up its image generation processing ability.
00:07:27
Speaker
And what they could do is they can then buy GPU usage from one of the cloud players, call it AliCloud.
00:07:35
Speaker
and pay for such services.
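Charlene's first avenue is essentially metered compute: the cloud provider rents out GPU capacity and bills by usage. A toy sketch of that pay-per-use billing model (the rate and the scenario are invented for illustration, not actual AliCloud pricing):

```python
# Hypothetical pay-per-use GPU billing: the cloud provider meters
# GPU-hours and charges a flat per-hour rate. All numbers are made up.
GPU_RATE_PER_HOUR = 2.50  # hypothetical $/GPU-hour

def compute_bill(gpu_hours: float, gpus: int = 1) -> float:
    """Cost of renting `gpus` GPUs for `gpu_hours` hours each."""
    return round(gpu_hours * gpus * GPU_RATE_PER_HOUR, 2)

# e.g. a photo app fine-tuning an image model on 8 GPUs for 12 hours
print(compute_bill(12, gpus=8))  # 240.0
```

The same metering idea extends to the other avenues: model-as-a-service platforms typically bill per API call or per token rather than per GPU-hour, but the revenue model is usage-based either way.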
00:07:37
Speaker
Second would be model as a service.
00:07:41
Speaker
What it does is it enables developers and enterprises to deploy AI models.
00:07:48
Speaker
They can also design, tweak, and customize these models if they need to do so.
00:07:53
Speaker
And obviously, internet platforms like Baidu, Alibaba,
00:07:57
Speaker
Tencent, etc. would charge a fee for these kinds of services.
00:08:01
Speaker
Last but not least, it's AI-generated content, AIGC applications.
00:08:07
Speaker
And this refers to how content generated by AI is being used.
00:08:11
Speaker
and helps existing businesses run more efficiently and, obviously, you know, in turn creates new revenue opportunities, so to speak.
00:08:19
Speaker
And I think one example is improving how online audiences are targeted, and therefore attracting more ad spending.
00:08:28
Speaker
Okay, so broadly speaking, either you make money out of it by saying, listen, I have the computing capabilities, you can rent that from me.
00:08:35
Speaker
Or companies just say, listen, we have all of this.
00:08:38
Speaker
Now I know exactly who and when to target to get them to buy a concert ticket or whatever that may be, right?
00:08:45
Speaker
So those are the models.
00:08:46
Speaker
Are these Chinese companies adopting all of these models or are they focusing on something else?
00:08:51
Speaker
And how do they differ when you look at, say, international companies?
00:08:56
Speaker
Sure.
00:08:56
Speaker
I mean, all these avenues that we laid out are ways that I think all platforms are monetizing to a certain extent.
00:09:04
Speaker
Obviously, some could be stronger in certain avenues than others.
00:09:09
Speaker
But if we were to kind of stack them against the overseas players, I think one sort of obvious bottleneck for Chinese players is the gap in GPU capabilities and inventories.
00:09:21
Speaker
Overseas players have far bigger inventories than their Chinese counterparts.
00:09:25
Speaker
So if we were to kind of look at these three avenues from an AI computing power standpoint, then that would be a gap.
00:09:32
Speaker
And on model service, I think it's fair to say that both overseas and Chinese players offer a very similar range of services, but overseas players are one step ahead in terms of monetization.
00:09:45
Speaker
And in terms of AIGC applications, I would say that global peers are probably leading Chinese players in terms of addressable user base and also in terms of software as a service monetization and integration of large language models into existing products.
00:10:03
Speaker
And what about regulation, Charlene?

Regulatory Landscape and Chip Dependency
00:10:04
Speaker
What sort of differences do we see there between China and the AI names in the US and Europe?
00:10:10
Speaker
I think, you know, governments are all moving very rapidly, but at slightly different speeds.
00:10:18
Speaker
On the China front, China made the first announcement about its regulatory framework back in April 2023 and started implementing it by August 2023.
00:10:30
Speaker
And by September, they had already approved 12 large language models.
00:10:34
Speaker
Hmm.
00:10:36
Speaker
And on the Europe side, the EU AI Act was approved in May this year.
00:10:43
Speaker
And on US lawmakers' side, they're also seeking public comments around potential accountability measures for AI systems.
00:10:51
Speaker
Obviously, we see a few common threads emerging around the respective proposals made by Chinese and European regulators, probably centered on accuracy and objectivity among the focus areas.
00:11:05
Speaker
Yeah, I can imagine that lawmakers in particular are all over the place in terms of what kind of conclusions people can draw from the information that AI is accessing.
00:11:15
Speaker
So, broadly speaking, there are regulatory changes.
00:11:18
Speaker
You say that some of the Chinese companies struggle to actually get the chips that they require now in order to store all that data and get AI running.
00:11:28
Speaker
And that comes back to Frank, right?
00:11:29
Speaker
Frank, how do the Chinese companies stack up in your hardware view on AI?
00:11:35
Speaker
Yeah, that's a good question, Herald.
00:11:37
Speaker
Especially now, as we look at the AI infrastructure and hardware development, it's still very much one-sided, in the sense that most of the large language models today in China are all being driven by U.S. chips, right?
00:11:53
Speaker
U.S. companies providing that core chip that goes into it.
00:11:57
Speaker
Now, I think there's going to be more debate about what development's going to look like going forward now that they...
00:12:05
Speaker
I mean, what you're referring to is basically that America is trying to limit access of these chips to China, right?
00:12:11
Speaker
Because it wants to slow the development of the AI industry in China down and so that America as a nation, say, or Europe, stay ahead of the game.
00:12:20
Speaker
Yeah, so I think that's definitely a key issue.
00:12:23
Speaker
But aren't the Chinese making their own chips, with new companies emerging on that front?
00:12:27
Speaker
Well, the thing is, when it comes to the process technology, the node technology, there's a big gap between what the Chinese are able to do and what they currently depend on.
00:12:37
Speaker
So that's why I said earlier that pretty much all the chips going into these large language models are provided by U.S. companies.
00:12:47
Speaker
So basically what we're seeing is we're in the early stages of a build-out of AI.
00:12:51
Speaker
The model over the years will shift away from just feeding AI with a lot of data and towards making use of that data and finding all sorts of applications.
00:13:01
Speaker
The Chinese, like other markets, are trying to monetize on this in various ways, as Charlene has indicated.
00:13:07
Speaker
But the problem for them is that they're still very much dependent on supply of Western chips.
00:13:11
Speaker
And to a certain extent, that is...
00:13:13
Speaker
limited or maybe more difficult for them to obtain, right?
00:13:16
Speaker
And that slows down the development and the ability to monetize on this as well.
00:13:21
Speaker
Yes, I think that's a good way of putting it, Herald.
00:13:25
Speaker
Okay, well, with that then, thanks, Charlene and Frank, and hope to discuss this matter at some point in time in the future again.
00:13:31
Speaker
Thank you again.
00:13:32
Speaker
Thanks, Herald.
00:13:33
Speaker
Well, we're going to have to wrap things up here, folks.
00:13:36
Speaker
Another very interesting discussion here under the banyan tree.
00:13:39
Speaker
And we thank you, as always, for joining us.
00:13:42
Speaker
Remember to check out our sister podcasts, The Macro Brief and The ESG Brief, available wherever you get your podcasts.
00:13:48
Speaker
We'll be back again, same time, next week.
00:14:12
Speaker
Thank you for joining us at HSBC Global Viewpoint.
00:14:16
Speaker
We hope you enjoyed the discussion.
00:14:18
Speaker
Make sure you're subscribed to stay up to date with new episodes.