A few months ago, Qualcomm (NASDAQ: QCOM) quietly demonstrated an AI milestone that many completely missed: It ran Stable Diffusion, a popular text-to-image generative AI model, entirely on an Android smartphone.
Apple (NASDAQ: AAPL) and other device makers have been moving AI models to the "network edge" too -- meaning the AI runs on-device rather than in the cloud. But running a full text-to-image model on a smartphone is a first, and it could solve a real pain point that is already cropping up with this new batch of generative AI services like Stable Diffusion and ChatGPT. Here's why Qualcomm is worth paying attention to right now.
'Training day' is over... now the real work starts
To understand Qualcomm's accomplishment, first consider the two steps involved in new generative AI text-to-image services like Stable Diffusion, or DALL-E 2 from OpenAI, the same organization responsible for ChatGPT.
The first step is called training. Reduced to its most basic definition, generative AI is simply an algorithm that takes user input and "creates" an output based on past experience. In the case of Stable Diffusion, that output is a new image based on a user's text description. But to get the algorithm to accomplish this feat, it first needs to be trained on a massive amount of data. Again, in the case of Stable Diffusion, the data fed to the algorithm are human-created images with accompanying text descriptions.
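As a rough illustration of what that training data looks like, here is a minimal Python sketch. The images and captions below are made-up placeholders; a real training set contains millions of such pairs.

```python
# Minimal sketch of the kind of training data a text-to-image model learns
# from: pairs of human-created images and their text descriptions.
# The images here are blank placeholders standing in for real photos and art.
from PIL import Image

training_pairs = [
    (Image.new("RGB", (512, 512)), "a fluffy cat sitting on a windowsill at sunset"),
    (Image.new("RGB", (512, 512)), "a golden retriever catching a frisbee in a park"),
]

for image, caption in training_pairs:
    # During training, the model repeatedly sees (image, caption) pairs and
    # learns how text descriptions map onto visual content.
    print(image.size, "->", caption)
```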
Once it has parsed through this information, the AI algorithm is ready to go to work. This second step is called inference. Using its past experience with text descriptions of images, the AI algorithm can infer a new image based on the unique text input provided by the user. In Qualcomm's recent on-device demonstration, it used the prompt "Super cute fluffy cat warrior in armor, photorealistic, 4K, ultra detailed, vray rendering, unreal engine."
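To make that concrete, here is a minimal sketch of the same inference step using Hugging Face's open-source diffusers library, the conventional way of running Stable Diffusion on a desktop or cloud GPU rather than Qualcomm's on-device stack. The checkpoint name and the assumption of a CUDA-capable GPU are illustrative choices, not details from Qualcomm's demo.

```python
# Sketch of Stable Diffusion inference with the open-source "diffusers"
# library. Assumes a CUDA GPU and the publicly released v1.5 checkpoint.
import torch
from diffusers import StableDiffusionPipeline

# Load a fully trained Stable Diffusion model (training already happened).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# Inference: the model turns a text prompt into a brand-new image.
prompt = ("Super cute fluffy cat warrior in armor, photorealistic, 4K, "
          "ultra detailed, vray rendering, unreal engine")
image = pipe(prompt).images[0]
image.save("cat_warrior.png")
```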
This second step, inference, is where most of the computing work is done over the long term. Sure, training a new generative AI model requires incredibly expensive computing systems designed by the likes of Nvidia and consumes lots of electricity. But that training process happens once. Inference occurs countless times, as users may prompt an AI model for years on end -- consuming far more electricity on powerful cloud-based chips over the model's lifetime. This is why on-device inference is so important.
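A quick back-of-the-envelope calculation, using purely hypothetical numbers, shows why inference tends to dominate a model's lifetime cost:

```python
# Toy comparison with hypothetical numbers, only to illustrate the point
# that recurring inference costs dwarf a one-time training bill.
training_cost = 10_000_000      # one-time training cost in dollars (hypothetical)
cost_per_query = 0.01           # cloud inference cost per prompt in dollars (hypothetical)
queries_per_day = 10_000_000    # daily usage (hypothetical)
years = 3

inference_cost = cost_per_query * queries_per_day * 365 * years
print(f"one-time training:  ${training_cost:,.0f}")
print(f"lifetime inference: ${inference_cost:,.0f}")  # roughly $110 million over 3 years
```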
Moving the work from the cloud to the 'edge'
As generative AI starts to take off, there are already growing concerns about the massive cost (electricity consumption and ongoing computing hardware purchases) required to operate it in the cloud. But what if, once trained, generative AI models could be run directly on an ultra-power-efficient device like a smartphone?
That's exactly what Qualcomm did. It shrank the Stable Diffusion model to a size that allows the algorithm to be stored and run directly on the phone. This was an early demo, but if Qualcomm can keep improving its "AI Studio" (a suite of tools for developers to bring their AI models to the "edge," that is, directly to a user device), there would be no more sending a text prompt over the internet to a remote data center for inference, then sending the result back over the web to the user.
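The article doesn't detail exactly how Qualcomm shrank the model, but one common technique for this kind of compression is quantization: storing a model's weights as 8-bit integers instead of 32-bit floats. The sketch below applies generic PyTorch dynamic quantization to a toy network purely to illustrate the idea; it is not Qualcomm's AI Studio workflow.

```python
# Generic illustration of one model-shrinking technique: post-training
# dynamic quantization in PyTorch. Not Qualcomm's tooling -- just a sketch
# of the trade-off: lower-precision weights, smaller and cheaper to run.
import os
import torch
import torch.nn as nn

# A small stand-in network; a real diffusion model is vastly larger.
model = nn.Sequential(
    nn.Linear(768, 3072),
    nn.ReLU(),
    nn.Linear(3072, 768),
)

# Convert the Linear layers' float32 weights to int8.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def size_mb(m: nn.Module) -> float:
    """Approximate on-disk size of a model's weights in megabytes."""
    torch.save(m.state_dict(), "_tmp_weights.pt")
    size = os.path.getsize("_tmp_weights.pt") / 1e6
    os.remove("_tmp_weights.pt")
    return size

print(f"original model:  {size_mb(model):.1f} MB")   # ~19 MB in float32
print(f"quantized model: {size_mb(quantized):.1f} MB")  # roughly a quarter of that
```

In practice, fitting a full diffusion model onto a phone involves much more than this (quantizing every layer, compiling for the phone's AI accelerator, and so on), but the basic trade of numerical precision for size and speed is the same.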
Besides the obvious performance improvement and cost savings, bringing AI inference on-device also has privacy and security benefits, since user data would never travel across the internet. Indeed, some AI models already operate this way for obvious reasons, like the biometric fingerprint or face scans used to unlock a phone. Uploading that kind of personal information to the cloud would be risky.
What's the upside for Qualcomm?
Qualcomm's breakthrough has significance beyond its smartphone business -- still its bread-and-butter at roughly 80% of revenue. Qualcomm is in the process of diversifying its processor business into new realms. For example, it is working with Microsoft and laptop manufacturers to bring its ultra-energy-efficient mobile chips to laptops, a market dominated by the power-hungry designs of Intel and AMD.
This is a clear response to Apple's work bringing larger versions of its iPhone processors to the MacBook family, creating a similarly high-performance experience on laptops to the one users have come to love on smartphones. If Qualcomm succeeds, it could disrupt a very large market that Intel and AMD chips have controlled for decades. Being able to efficiently run AI models directly on a laptop could be quite the selling point. Qualcomm has said many more Windows-based laptops powered by its Snapdragon processors will be available in 2024.
Qualcomm is also making a concerted push into automotive chips. While its demonstration of Stable Diffusion won't have a direct impact on cars, it nonetheless illustrates the company's growing prowess in helping developers with on-device AI -- in this case, the "device" being a vehicle.
While Apple has been doing on-device AI inference for years, and talking about doing even more, Qualcomm may have actually beaten the iPhone maker to the punch in this new era of generative AI. Qualcomm currently trades at a cheap price, and its core smartphone business is at or near a cyclical bottom. At just 12 times trailing-12-month earnings, or 19 times free cash flow, this top chip stock could be one of the best AI plays to buy right now for the long haul.
Nicholas Rossolillo has positions in Advanced Micro Devices, Apple, Nvidia, and Qualcomm. The Motley Fool has positions in and recommends Advanced Micro Devices, Apple, Microsoft, Nvidia, and Qualcomm. The Motley Fool recommends Intel and recommends the following options: long January 2023 $57.50 calls on Intel and long January 2025 $45 calls on Intel. The Motley Fool has a disclosure policy.
"Smartphone" - Google News
June 10, 2023 at 04:00AM
https://ift.tt/t2ocpRS
Has Qualcomm Already Beaten Apple to the Punch in Smartphone AI? - The Motley Fool
"Smartphone" - Google News
https://ift.tt/vzrV9nP
https://ift.tt/f25gFRH
Bagikan Berita Ini
0 Response to "Has Qualcomm Already Beaten Apple to the Punch in Smartphone AI? - The Motley Fool"
Post a Comment