Between novel AI paradigms and realistic simulations of part of the brain
Deep learning models have made impressive strides in the rapidly evolving field of artificial intelligence. These traditional artificial neural networks, however, function very differently from biological brains. Although they excel at certain tasks, they are far less efficient, flexible, and energy-conscious than organic brain systems, and they require large datasets and consume a lot of energy.
In parallel to the much-hyped quantum computing architectures, other opportunities are emerging. One of these is neuromorphic computing. In reality, neuromorphic computing has been around for a long time. The basic idea is to build both hardware and software with more direct inspiration from biological neural systems. Artificial neural networks claim to be inspired by biological neural networks, but in reality they borrow very little from them. Neuromorphic hardware mimics more biological phenomena, such as event-driven responses, protein and chemical interactions, and the sparse, highly parallel nature of real brains. The goal is to achieve even more efficient computation and, thanks to that sparsity, improvements in energy efficiency. Early implementations like Intel’s Loihi and IBM’s TrueNorth have…
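To make the event-driven idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic spiking unit that neuromorphic chips implement in silicon. All parameter values (time constant, threshold, input current) are illustrative assumptions, not figures from any specific hardware:

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Sketch of a leaky integrate-and-fire neuron.

    The membrane potential leaks toward rest while integrating input;
    the neuron emits a discrete event (spike) only when the potential
    crosses threshold -- this is the event-driven, sparse activity
    neuromorphic hardware exploits for energy efficiency.
    """
    v = 0.0
    spike_times = []
    for t, i_in in enumerate(input_current):
        # Leaky integration: decay toward rest, driven by the input current.
        v += (dt / tau) * (-v + i_in)
        if v >= v_thresh:
            spike_times.append(t)  # output only happens on these events
            v = v_reset            # reset after firing
    return spike_times

# A constant supra-threshold input yields sparse, regularly spaced spikes
# rather than a dense activation at every timestep.
spike_times = simulate_lif([1.5] * 100)
```

Unlike a conventional artificial neuron, which produces an output at every forward pass, this unit stays silent most of the time; downstream computation only needs to run when a spike arrives.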