Self-similar Magneto-electric Nanocircuit Technology for Probabilistic Inference Engines
Abstract
Probabilistic graphical models are powerful mathematical formalisms for machine learning and reasoning under uncertainty that are widely used for cognitive computing. However, they cannot be employed efficiently for large problems (with variables on the order of 100,000 or more) on conventional systems, due to inefficiencies resulting from layers of abstraction and the separation of logic and memory in CMOS implementations. In this paper, we present a magneto-electric probabilistic technology framework for implementing probabilistic reasoning functions. The technology leverages Straintronic Magneto-Tunneling Junction (S-MTJ) devices in a novel mixed-signal circuit framework that computes directly on probabilities while enabling persistent in-memory computation. Initial evaluations of the Bayesian likelihood estimation operation occurring during Bayesian network inference indicate up to 127x lower area, 214x lower active power, and 70x lower latency compared to an equivalent 45nm CMOS Boolean implementation.
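For context, the likelihood estimation operation referenced above can be sketched in standard Bayesian network notation; the symbols below (X_i, Pa(X_i), E) are generic placeholders and are not taken from the paper itself.

% A minimal sketch, assuming the standard Bayesian network factorization.
% The joint distribution factorizes over the network structure:
\[
  P(X_1, \dots, X_n) \;=\; \prod_{i=1}^{n} P\!\left(X_i \mid \mathrm{Pa}(X_i)\right)
\]
% Likelihood estimation for observed evidence E = e marginalizes out the
% remaining variables, i.e., repeated multiply-accumulate operations on
% probabilities, the kind of computation the proposed S-MTJ circuits target:
\[
  P(E = e) \;=\; \sum_{X \setminus E} \;\prod_{i=1}^{n} P\!\left(X_i \mid \mathrm{Pa}(X_i)\right)
\]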