State-Independent Low Resistance Drift SiSbTe Phase Change Memory for Analog In-Memory Computing Applications
HY Cheng*
Zhi-Lun Liu*
Amlan Majumdar*
Alexander Grun*
Asit Ray*
Jeff Su*
Malte J. Rasch
Fabio Carta*
Lynne Gignac*
Christian Lavoie*
Cheng-Wei Cheng*
M Bright Sky*
HL Lung*
* External authors
VLSI 2024
2024
Abstract
We developed a SiSbTe-based phase-change memory (PCM) that exhibits state-independent resistance drift (ν ≈ 0.04) at 65°C over the entire analog conductance range. We evaluated this PCM for in-memory computing (IMC) applications by simulating the performance of a BERT model with the IBM Analog Hardware Acceleration Kit (AIHWKit). Both drift and data retention depend on the amount of A-type dopant in the SiSbTe material, so finding a trade-off between the two is essential to deliver a balanced material that can handle IMC workloads without sacrificing performance. The fabricated SiSbTe PCM devices maintain BERT accuracy (<2% loss) for more than 7 days at 65°C and pass data retention at 85°C for 48 hours, demonstrating a good balance between the two metrics.
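The drift coefficient ν quantifies the standard power-law drift of PCM, where conductance decays as G(t) = G0·(t/t0)^(−ν) after programming. The sketch below is a minimal illustration of this power law, not the paper's actual simulation flow; the conductance levels and reference time t0 are hypothetical, while ν = 0.04 and the 7-day window come from the abstract.

```python
import numpy as np

# Power-law conductance drift: G(t) = G0 * (t / t0) ** (-nu),
# where nu is the drift coefficient and t0 the reference read time.
def drifted_conductance(g0, t, nu, t0=1.0):
    """Conductance after drifting for time t (seconds), referenced to t0."""
    return g0 * (t / t0) ** (-nu)

NU = 0.04                      # state-independent drift coefficient (from the paper)
T0 = 1.0                       # reference time after programming in seconds (assumed)
SEVEN_DAYS = 7 * 24 * 3600.0   # inference window from the paper, in seconds

# Hypothetical analog conductance levels spanning the programmable range (uS).
levels_us = np.linspace(1.0, 25.0, 5)

for g0 in levels_us:
    g7 = drifted_conductance(g0, SEVEN_DAYS, NU, T0)
    print(f"G0 = {g0:5.1f} uS -> G(7d) = {g7:5.2f} uS "
          f"({100 * (1 - g7 / g0):.1f}% decay)")
```

Because ν is the same for every level, all conductances shrink by the same factor (about 41% after 7 days in this sketch), so a single global rescaling of the analog matrix-vector product, such as the GlobalDriftCompensation scheme available in AIHWKit, can largely restore accuracy. This is what makes state-independent drift attractive for IMC compared with state-dependent drift, where each level decays at a different rate.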