Author(s): Faiyaz Elahi Mullick, Supriyo Bandyopadhyay, Rob Baxter, Tony J. Ragucci, and Avik W. Ghosh

Modern neural networks require enormous server infrastructure, as parameter count and memory demands grow with data richness. In contrast, biological brains learn and process information using limited memory and power through extensive reuse of neural representations, encoding shared features across related concepts (such as horses and zebras) while adding only a small number of synaptic connections to capture variations. This study implements a mathematical model of the neocortex that economizes on neuronal usage for image classification, and proposes a magnetic hardware platform to realize key aspects of its functionality.

[Phys. Rev. Applied 24, 064047] Published Wed Dec 17, 2025
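The representation-reuse idea in the abstract can be made concrete with a toy sketch. The snippet below is purely illustrative and is not the authors' neocortical model or hardware: it assumes a shared feature map reused by related concepts (horse, zebra), with each concept adding only a small block of extra weights on top. All names and dimensions (`W_shared`, `W_delta`, `D_DELTA`, and so on) are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
D_IN, D_SHARED, D_DELTA = 64, 32, 4

# Shared "backbone": features reused by all related concepts
# (e.g., the body plan common to horses and zebras).
W_shared = rng.normal(scale=0.1, size=(D_SHARED, D_IN))

# Tiny per-concept additions: a few extra "synapses" capture
# what distinguishes one concept from another (e.g., stripes).
W_delta = {
    "horse": rng.normal(scale=0.1, size=(D_DELTA, D_SHARED)),
    "zebra": rng.normal(scale=0.1, size=(D_DELTA, D_SHARED)),
}

def represent(x: np.ndarray, concept: str) -> np.ndarray:
    """Concatenate the reused shared code with a small concept-specific code."""
    shared = np.tanh(W_shared @ x)               # reused across all concepts
    delta = np.tanh(W_delta[concept] @ shared)   # few added connections
    return np.concatenate([shared, delta])

x = rng.normal(size=D_IN)
print(represent(x, "zebra").shape)  # (36,) = 32 shared + 4 concept-specific

# Parameter accounting: adding a new concept costs only
# D_DELTA * D_SHARED = 128 weights here, versus D_SHARED * D_IN = 2048
# for a fresh feature extractor trained from scratch.
```

Under these assumptions, each new related concept grows the parameter count by a small constant block rather than duplicating the whole network, which is the economy the abstract attributes to biological representation reuse.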