Kioxia, a global leader in memory solutions, recently unveiled a prototype large-capacity flash memory module aimed at handling large artificial intelligence (AI) models. The development is part of the “Post-5G Information and Communication Systems Infrastructure Enhancement R&D Project” run by Japan’s national research agency, NEDO. The prototype module offers a capacity of 5 terabytes (TB) and a bandwidth of 64 gigabytes per second (GB/s).
To overcome the conventional trade-off between memory capacity and bandwidth, Kioxia devised a module configuration in which flash memory devices are connected in a daisy chain. By combining high-speed transceiver technology supporting bandwidths of 128 gigabits per second (Gbps) with techniques that boost flash memory performance, the company improved both its memory controllers and the modules themselves.
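As a rough sanity check on those figures, the sketch below (an illustration, not part of Kioxia’s announcement) converts the module’s 64 GB/s bandwidth into gigabits per second and divides by the quoted 128 Gbps transceiver rate; the resulting link count of four ignores protocol and encoding overhead, which the announcement does not specify.

```python
# Back-of-envelope check of the figures quoted above; the link count is an
# inference for illustration, not a number published by Kioxia.

MODULE_BANDWIDTH_GBPS = 64 * 8     # 64 GB/s expressed in gigabits per second
TRANSCEIVER_RATE_GBPS = 128        # per-transceiver rate quoted in the announcement

# Minimum number of 128 Gbps links needed to reach 64 GB/s,
# ignoring protocol and encoding overhead.
min_links = MODULE_BANDWIDTH_GBPS / TRANSCEIVER_RATE_GBPS
print(f"64 GB/s = {MODULE_BANDWIDTH_GBPS} Gb/s ≈ {min_links:.0f} × 128 Gb/s links")
```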
The practical application of this memory module is expected to accelerate digital transformation by bringing Internet of Things (IoT), big data analytics, and advanced AI processing to post-5G/6G Mobile Edge Computing (MEC) servers and other applications.
In the era of post-5G/6G networks, the demand for higher speeds, lower latency, and greater device connectivity is driving the adoption of MEC servers, which process data closer to users and so improve the performance of real-time applications. As sophisticated AI applications such as generative AI spread, these servers increasingly require memory modules with larger capacity and higher bandwidth.
Against this backdrop, Kioxia focused on leveraging flash memory to increase memory module capacity and bandwidth. By building a prototype memory module with a capacity of 5 TB and a bandwidth of 64 GB/s, the company has demonstrated the feasibility and effectiveness of its approach.
Kioxia’s achievements in this project include the adoption of daisy-chain connections between controllers and memory boards, enabling high-capacity, high-bandwidth memory modules. The use of 128 Gbps PAM4 transceivers, together with technologies that enhance flash memory performance, has further extended the capabilities of these modules.
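For context, the short sketch below works out the symbol rate implied by a 128 Gbps PAM4 link; the two-bits-per-symbol relationship is a general property of PAM4 signaling rather than a figure disclosed in the announcement.

```python
# Illustrative PAM4 arithmetic, not a detail from Kioxia's announcement:
# PAM4 uses four amplitude levels (2 bits per symbol), so a 128 Gb/s link
# runs at 64 GBaud before any line-coding overhead.

BITS_PER_PAM4_SYMBOL = 2           # log2(4 amplitude levels)
LINK_RATE_GBPS = 128

symbol_rate_gbaud = LINK_RATE_GBPS / BITS_PER_PAM4_SYMBOL
print(f"{LINK_RATE_GBPS} Gb/s PAM4 ≈ {symbol_rate_gbaud:.0f} GBaud symbol rate")
```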
Looking ahead, Kioxia plans to commercialize the outcomes of this research swiftly to align with emerging market trends such as generative AI. The company aims to capitalize on these developments to drive innovation in IoT, big data analysis, and advanced AI processing at the edge.
Ray Sharma, an Industry Analyst with extensive experience in mobile broadband technologies, underscores the significance of Kioxia’s advancements in memory solutions. He emphasizes the impact these developments can have on advancing network management and enabling cutting-edge technologies like generative AI.