Micron and Qualcomm to Accelerate Generative AI at the Edge for Flagship Smartphones

Delivering the world’s highest mobile performance of 9.6 Gbps, Micron LPDDR5X is now sampling for Snapdragon 8 Gen 3

Micron Technology, Inc. (Nasdaq: MU), announced today that it is now shipping production samples of its low-power double data rate 5X (LPDDR5X) memory — the industry’s only 1β (1-beta) mobile-optimized memory — for use with Qualcomm Technologies, Inc.’s latest flagship mobile platform, Snapdragon® 8 Gen 3. Running at the world’s fastest speed grade of 9.6 gigabits per second (Gbps), Micron LPDDR5X provides the mobile ecosystem with the fast performance needed to unlock generative artificial intelligence (AI) at the edge. Enabled by its innovative, industry-leading 1β process node technology, Micron LPDDR5X also delivers advanced power-saving capabilities for mobile users.

“Generative AI is poised to unleash unprecedented productivity, ease of use, and personalization for smartphone users by delivering the power of large language models to flagship mobile phones,” said Mark Montierth, corporate vice president and general manager of Micron’s Mobile Business Unit. “Micron’s 1β LPDDR5X combined with Qualcomm Technologies’ AI-optimized Snapdragon 8 Gen 3 Mobile Platform empowers smartphone manufacturers with the next-generation performance and power efficiency essential to enabling revolutionary AI technology at the edge.”

As the industry’s fastest mobile memory, offered in speed grades up to 9.6 Gbps, Micron’s LPDDR5X provides over 12% higher peak bandwidth¹ than the previous generation, a gain that is critical for enabling AI at the edge. The Snapdragon 8 Gen 3 allows powerful generative AI models to run locally on flagship smartphones, unlocking a new generation of AI-based applications and capabilities. On-device AI also improves network efficiency and reduces the energy requirements and expense of cloud-based solutions, which require back-and-forth data transfer to and from remote servers.
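For readers who want to sanity-check the bandwidth figure, the short Python sketch below works through the arithmetic. The 8.533 Gbps prior speed grade and the 64-bit bus width are assumptions made for illustration; neither is stated in the announcement.

# Back-of-the-envelope check of the "over 12%" bandwidth claim.
# Assumptions (not stated in the announcement): the previous generation
# topped out at 8.533 Gbps per pin, and the package uses a 64-bit bus.

PREV_PIN_RATE_GBPS = 8.533   # assumed prior top LPDDR5X speed grade, Gbps per pin
NEW_PIN_RATE_GBPS = 9.6      # speed grade cited in the announcement
BUS_WIDTH_BITS = 64          # hypothetical bus width for a flagship phone

def peak_bandwidth_gb_per_s(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in gigabytes per second."""
    return pin_rate_gbps * bus_width_bits / 8

prev_bw = peak_bandwidth_gb_per_s(PREV_PIN_RATE_GBPS, BUS_WIDTH_BITS)
new_bw = peak_bandwidth_gb_per_s(NEW_PIN_RATE_GBPS, BUS_WIDTH_BITS)
uplift_pct = (new_bw - prev_bw) / prev_bw * 100

print(f"Previous-generation peak bandwidth: {prev_bw:.1f} GB/s")
print(f"9.6 Gbps peak bandwidth:            {new_bw:.1f} GB/s")
print(f"Uplift: {uplift_pct:.1f}%")  # roughly 12.5%, consistent with "over 12%"

Under these assumptions the uplift comes out at about 12.5%, which lines up with the "over 12%" figure quoted above.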

“To date, powerful generative AI has mostly been executed in the cloud, but our new Snapdragon 8 Gen 3 brings revolutionary generative AI use cases to users’ fingertips by enabling large language models and large vision models to run on the device,” said Ziad Asghar, senior vice president of product management at Qualcomm Technologies, Inc. “Our collaboration with Micron to pair the industry’s fastest mobile memory, its 1β LPDDR5X, with our latest Snapdragon mobile platform opens up a new world of on-device, ultra-personalized AI experiences for smartphone users.”

Built on Micron’s industry-leading 1β process node, LPDDR5X delivers the industry’s most advanced power-saving capabilities, including enhanced dynamic voltage and frequency scaling (DVFS) core techniques, offering a nearly 30% power improvement² and the flexibility to tailor power and performance to the workload. These power savings are especially crucial for energy-intensive, AI-fueled applications, enabling users to reap the benefits of generative AI with prolonged battery life.
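As a rough illustration of why dynamic voltage and frequency scaling saves power, the sketch below applies the standard CMOS dynamic-power relationship P ≈ C·V²·f. The capacitance, voltage, and frequency values are placeholders chosen for the example, not Micron figures, and the computed reduction is not the cited 30% improvement.

# Minimal sketch of the dynamic-power relationship behind DVFS:
# P_dynamic ≈ C * V^2 * f. The values below are illustrative placeholders,
# not Micron measurements.

def dynamic_power_watts(c_eff_farads: float, voltage_v: float, freq_hz: float) -> float:
    """Approximate CMOS switching power in watts."""
    return c_eff_farads * voltage_v ** 2 * freq_hz

full_speed = dynamic_power_watts(c_eff_farads=1e-9, voltage_v=1.05, freq_hz=4.8e9)
scaled_back = dynamic_power_watts(c_eff_farads=1e-9, voltage_v=0.90, freq_hz=3.2e9)

print(f"Power at full speed: {full_speed:.2f} W")
print(f"Power scaled back:   {scaled_back:.2f} W")
print(f"Reduction: {(1 - scaled_back / full_speed) * 100:.0f}%")
# Lowering voltage and frequency together cuts power super-linearly,
# which is why DVFS-style techniques help extend battery life under AI workloads.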

Offered in capacities up to 16 gigabytes and providing the industry’s highest performance and lowest power consumption, Micron’s LPDDR5X delivers unprecedented support for on-device AI, accelerating generative AI’s capabilities at the edge.

