Credit Suisse presented research on the potential of ChatGPT at the 26th Asian Investment Conference, held in Hong Kong from March 21 to 23, 2023. Credit Suisse's global sector research teams released a thematic research study that delves into ChatGPT, the adjacent AI use cases that may arise by industry/sector, the AI hardware supply chain that will support it, and the organizations most likely to profit from rapidly rising AI model deployments.
According to the research, the OpenAI program registered 100 million users in just two months, outpacing TikTok and Instagram in growth. The AI chatbot gained one million users in only five days after its November 30, 2022 launch, 57 million by December 2022, and 100 million by January 2023 – far quicker than social media sites reached those milestones.
According to the paper, generative artificial intelligence is expected to be transformational with further fine-tuning and regulation. The report views ChatGPT's technological innovations, the progression of generative/conversational AI (of which ChatGPT is a product), and Bing AI (among other apps/services) as broadly transformative, and today primarily as productivity, cost-cutting, and efficiency tools rather than revenue-generating tools in most industries.
ChatGPT's capacity to generate and verify code in different programming languages would dramatically increase the speed of innovation for software applications, it added. Other sectors that require experts to search for or validate facts would also see a real-time benefit, as ChatGPT is already a useful tool for several productivity use cases, such as idea or content generation.
According to the paper, ChatGPT's LLMs will be fine-tuned over time, with the next major milestone being GPT-4, an LLM with far more parameters than GPT-3's 175 billion.
Even though corporations and organizations around the world have already adopted ChatGPT (and other contemporary generative AI technologies), there are hazards associated with ChatGPT, as well as limitations of the GPT LLMs. As a result, the researchers believe that laws and regulations are required for AI development, particularly for ChatGPT, given its potential influence on society. Although there are no existing rules specific to ChatGPT, relevant discussions are taking place on how to ensure that the impact of recent advancements is responsible and managed, according to the report.
While it is clear that the information technology industry is the primary beneficiary of ChatGPT, the report found that within the technology industry, 30% of all new code is generated with AI assistance via tools such as ChatGPT and Copilot – a testament to the technology's value proposition and a significant productivity accelerator.
Importantly, the AI hardware and semiconductor supply chain is expected to benefit as well. According to the paper, AI models are compute-heavy during training and remain compute-intensive at inference, when users draw on compute resources (for example, by entering a prompt into ChatGPT's prompt box).
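To give a sense of that inference cost, a common rule of thumb (an illustration, not a figure from the Credit Suisse report) is that a dense transformer performs roughly 2 floating-point operations per parameter for each token it generates. A minimal sketch, assuming GPT-3's published 175-billion-parameter count:

```python
def inference_flops(n_params: float, n_tokens: int) -> float:
    """Rough forward-pass cost: ~2 FLOPs per parameter per generated token."""
    return 2 * n_params * n_tokens

# A GPT-3-scale model (175 billion parameters) producing a 500-token reply
# performs on the order of 1.75e14 floating-point operations.
flops = inference_flops(175e9, 500)
print(f"{flops:.2e}")  # 1.75e+14
```

Every user prompt triggers work of this scale, which is why sustained adoption translates directly into demand for accelerators and the surrounding hardware supply chain.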
Within the next five years, there will almost certainly be a large scaling-up effort to handle AI workloads and the resources needed to meet development objectives. The paper also analyses numerous sub-categories of the AI hardware and semiconductor supply chains that may be off the radar of most investors. In a fast-expanding AI environment, the Asia/Europe tech supply chain for AI/ChatGPT will be a crucial facilitator.
In the paper, the Credit Suisse global tech team summarised the supply chain implications and company-level benefits of ChatGPT's rapid adoption and its potential to further boost AI ecosystem deployment.
While the new ChatGPT workloads are not yet offsetting macro weakness enough to generate upside in supply chain orders, the report stated that focused investments, coupled with the acceleration of AI, have the potential to produce over-indexed growth during the industry slump. In the medium term, the adoption of AI services and their industrial use cases for revenue generation and cost/capex savings could feed into a new hardware and semiconductor cycle that sustains innovation and improvements.
The CS study also discussed how AI computing and memory might benefit the semiconductor industry. AI training and inference are computation-intensive operations that should drive semiconductor breakthroughs in computing, storage, and data transport. From 2019 to 2024E, the data center compute TAM, including accelerators, has grown at a 14% CAGR, greatly surpassing CPU server growth at a 2% CAGR.
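Because a CAGR compounds multiplicatively, those two rates diverge sharply over the 2019–24E window. A quick sketch of the arithmetic (the 14% and 2% rates are from the report; the cumulative multiples are illustrative):

```python
def cagr_factor(rate: float, years: int) -> float:
    """Cumulative growth multiple implied by a constant annual growth rate."""
    return (1 + rate) ** years

# 14% vs 2% CAGR over 2019-24E (five years of compounding):
accel = cagr_factor(0.14, 5)  # ~1.93x cumulative growth for compute TAM
cpu = cagr_factor(0.02, 5)    # ~1.10x cumulative growth for CPU servers
```

In other words, accelerator-inclusive compute spend nearly doubles over the period while CPU servers grow only about 10% in total, which is the shift in mix the report highlights.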
Furthermore, in semiconductors, AI can increase prospects for server memory for the memory leaders (with mobile now accounting for 40% of industry bits), for power management chips on AI boards, for network switch ICs and ASICs, and for IC design services.
Advancements in compute intensity will also benefit the supply chain and act as a strong driver for leading-edge silicon, with AI now replacing mobile as a significant driver of innovation in both advanced manufacturing and high-end package integration.
Edited and proofread by Nikita Sharma