Fast and robust analog in-memory deep neural network training

Malte J. Rasch

Fabio Carta*

Omobayode Fagbohungbe*

Tayfun Gokmen*

* External authors

Nature Communications

2024

Abstract

Analog in-memory computing is a promising future technology for efficiently accelerating deep learning networks. While using in-memory computing to accelerate the inference phase has been studied extensively, accelerating the training phase has received less attention, despite its arguably much larger compute demand. Some analog in-memory training algorithms have been suggested, but they either require a significant amount of auxiliary digital compute (accumulating the gradient in digital floating-point precision, which limits the potential speed-up) or depend on nearly perfect programming of reference conductance values to establish an algorithmic zero point. Here, we propose two improved algorithms for in-memory training that retain the same fast runtime complexity while removing the requirement of a precise zero point. We further investigate the limits of the algorithms in terms of conductance noise, symmetry, retention, and endurance, which narrows down the device material choices adequate for fast and robust in-memory deep neural network training.
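The kind of analog in-memory training discussed in the abstract can be explored in simulation. Below is a minimal sketch, assuming the open-source IBM aihwkit simulator: it trains a small analog layer with an analog-aware SGD optimizer on simulated resistive devices. The device and optimizer choices (ConstantStepDevice, AnalogSGD) are illustrative defaults, not the specific improved algorithms proposed in the paper.

# Minimal analog in-memory training sketch (assumes the open-source aihwkit package).
# The device/optimizer choices are illustrative, not the paper's exact algorithms.
from torch import Tensor
from torch.nn.functional import mse_loss

from aihwkit.nn import AnalogLinear
from aihwkit.optim import AnalogSGD
from aihwkit.simulator.configs import SingleRPUConfig
from aihwkit.simulator.configs.devices import ConstantStepDevice

# Toy data: two 4-dimensional inputs and their 2-dimensional targets.
x = Tensor([[0.1, 0.2, 0.4, 0.3], [0.2, 0.1, 0.1, 0.3]])
y = Tensor([[1.0, 0.5], [0.7, 0.3]])

# A single analog fully connected layer whose weights live on simulated
# resistive crossbar devices (here a simple constant-step update model).
model = AnalogLinear(4, 2, bias=True,
                     rpu_config=SingleRPUConfig(device=ConstantStepDevice()))

# Analog-aware SGD: weight updates are applied as pulsed device updates
# rather than accumulated in digital floating point.
optimizer = AnalogSGD(model.parameters(), lr=0.1)
optimizer.regroup_param_groups(model)

for epoch in range(100):
    optimizer.zero_grad()
    loss = mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
    print(f"Epoch {epoch}: loss = {loss:.6f}")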

Related Publications

Towards Exact Gradient-based Training on Analog In-memory Computing

NeurIPS, 2024
Zhaoxian Wu*, Tayfun Gokmen*, Malte J. Rasch, Tianyi Chen*

Analog in-memory accelerators present a promising solution for energy-efficient training and inference of large vision or language models. While inference on analog accelerators has been studied recently, the analog training perspective remains under-explored. Recent studies …

Analog AI as a Service: A Cloud Platform for In-Memory Computing

SSE, 2024
Kaoutar El Maghraouir*, Kim Tran*, Kurtis Ruby*, Borja Godoy*, Jordan Murray*, Manuel Le Gallo-Bourdeau*, Todd Deshane*, Pablo Gonzalez*, Diego Moreda*, Hadjer Benmeziane*, Corey Liam Lammie*, Julian Büchel*, Malte J. Rasch, Abu Sebastian*, Vijay Narayanan*

This paper introduces the Analog AI Cloud Composer platform, a service that allows users to access Analog In-Memory Computing (AIMC) simulation and computing resources over the cloud. We introduce the concept of Analog AI as a Service (AAaaS). AIMC offers a novel approach…

State-Independent Low Resistance Drift SiSbTe Phase Change Memory for Analog In-Memory Computing Applications

VLSI, 2024
HY Cheng*, Zhi-Lun Liu*, Amlan Majumdar*, Alexander Grun*, Asit Ray*, Jeff Su*, Malte J. Rasch, Fabio Carta*, Lynne Gignac*, Christian Lavoie*, Cheng-Wei Cheng*, M Bright Sky*, HL Lung*

We developed a phase-change memory (PCM) with SiSbTe material that showed state-independent resistance drift (v~0.04) at 65°C over the entire analog conductance range. We evaluated this PCM for In-Memory Compute (IMC) applications, simulating the performance of BERT model w…
