STOCHASTIC DATA-DRIVEN HARDWARE RESILIENCE TO EFFICIENTLY TRAIN INFERENCE MODELS FOR STOCHASTIC HARDWARE IMPLEMENTATIONS

Citation Author(s):
Bonan Zhang, Lung-Yen Chen, Naveen Verma
Submitted by:
Bonan Zhang
Last updated:
10 May 2019 - 12:07am
Document Type:
Presentation Slides
Document Year:
2019
Presenters:
Bonan Zhang
Paper Code:
3653
 

Machine-learning algorithms are being employed in an increasing range of applications, spanning high-performance and energy-constrained platforms. It has been noted that the statistical nature of these algorithms can open up new opportunities for throughput and energy efficiency, by moving hardware into design regimes not limited to deterministic models of computation. This work aims to enable high accuracy in machine-learning inference systems whose computations are substantially affected by hardware variability. Previous work has overcome this by training inference-model parameters for a particular instance of variation-affected hardware. Here, training is instead performed for the distribution of variation-affected hardware, eliminating the need for instance-by-instance training. The approach is referred to as Stochastic Data-Driven Hardware Resilience (S-DDHR), and it is demonstrated for an in-memory-computing architecture based on magnetoresistive random-access memory (MRAM). S-DDHR successfully addresses different samples of stochastic hardware, which would otherwise suffer degraded performance due to hardware variability.
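The core idea of training for a *distribution* of variation-affected hardware, rather than one instance, can be illustrated with a minimal sketch. The following is not the authors' implementation: the logistic-regression task, the multiplicative-conductance-error model, and the noise level `SIGMA` are all illustrative assumptions. Each training step draws a fresh variation sample, so the learned weights must work across the hardware distribution, not on a single chip.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable task standing in for an inference workload.
X = rng.normal(size=(200, 8))
true_w = rng.normal(size=8)
y = (X @ true_w > 0).astype(float)

SIGMA = 0.2  # assumed std-dev of multiplicative device variation (illustrative)

def sample_variation(shape):
    # Each call models one instance of variation-affected hardware:
    # a multiplicative conductance error on every stored weight.
    return 1.0 + SIGMA * rng.normal(size=shape)

def forward(w, X, variation):
    # Matrix-vector multiply as computed by the perturbed hardware.
    z = X @ (w * variation)
    return 1.0 / (1.0 + np.exp(-z))

# S-DDHR-style training loop: a new variation sample every step.
w = np.zeros(8)
lr = 0.5
for _ in range(300):
    v = sample_variation(w.shape)
    p = forward(w, X, v)
    # Cross-entropy gradient, chained through the perturbed weights w * v.
    grad = (X.T @ (p - y)) * v / len(y)
    w -= lr * grad

# Evaluate on several unseen "chips", i.e. fresh variation samples.
accs = [((forward(w, X, sample_variation(w.shape)) > 0.5) == y).mean()
        for _ in range(20)]
print(float(np.mean(accs)))
```

Because the gradient is averaged over many sampled hardware instances during training, accuracy on fresh variation samples stays high without ever retraining for a specific instance, which is the contrast with instance-by-instance training drawn in the abstract.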
