Probabilistic graphical models such as Bayesian Networks (BNs) are powerful cognitive-computing formalisms with many similarities to human cognition, and they have a multitude of real-world applications. Emerging-technology circuit paradigms that leverage physical equivalence, i.e., operating directly on probabilities rather than introducing layers of abstraction, have shown promise in raising the performance and overall efficiency of BNs, enabling networks with millions of random variables. While previous BNs of up to hundreds of nodes have been shown to require only single-digit bit precision without affecting application outcomes, the significantly larger number of variables requires the computational precision to be scaled up to correctly support BN operations. We introduce a new computational circuit fabric based on mixed-signal magneto-electric computations that operates with physical equivalence and supports probabilistic computations with a new approximate circuit style. In this fabric, precision scaling impacts area logarithmically rather than linearly, offering a much lower power and performance cost than prior directions. Results show a 30x area reduction at 0.001 precision compared with the prior direction, while maintaining three orders of magnitude in benefits over 100-core processor implementations.