
What If AI Worked More Like the Human Brain? ft. Chris Eliasmith of Applied Brain Research


At CES 2026, we sat down with Chris Eliasmith, CTO of Applied Brain Research, to discuss how brain-inspired AI is enabling fast, low-power voice interfaces that run directly on edge devices. Drawing on research modeling the hippocampus, his team developed new neural network architectures that significantly improve efficiency and accuracy for tasks like speech recognition and text-to-speech. These advances allow devices such as AR glasses, robots, and wearables to respond to voice commands in under 300 milliseconds, creating interactions that feel natural and conversational. Eliasmith also explains the tradeoffs between model size, accuracy, and power consumption, and how running AI at the edge can reduce costs and reliance on the cloud. He ultimately envisions a future where complete AI agents run locally on small devices, making technology simpler and more accessible for everyday users.

🎧 Episode Highlights:
● [01:59] Introducing ultra-low-power voice AI at the edge
● [03:27] Why 300ms latency is critical for natural conversations
● [09:06] Brain-inspired neural networks modeled after the hippocampus
● [15:02] Tiny AI chips for AR glasses, robotics, and wearables
● [20:25] Cutting cloud costs with local speech processing
● [27:54] The future of full AI agents running at the edge

🔑 Key Takeaways:
● By modeling neural networks after how parts of the brain, such as the hippocampus, process time-based information, researchers can build AI systems that achieve higher accuracy with far fewer parameters. This approach allows models to process speech and other signals more efficiently, making advanced AI practical even on small, resource-constrained devices.
● For voice interfaces to feel natural, responses must arrive within roughly 300 milliseconds, the same timing humans expect in conversation. Designing AI systems that meet this latency requirement changes how models are built and deployed, pushing developers to prioritize real-time performance rather than relying on slower cloud-based processing.
● Low-power AI that operates directly on devices reduces reliance on internet connectivity, lowers operational costs, and improves responsiveness. As models become efficient enough to run locally, entire AI agents could operate on wearables, robotics platforms, and AR devices, simplifying technology and making intelligent interfaces accessible to more users.

👤 Guest Spotlight: Chris Eliasmith
Chris Eliasmith is the Director of the Centre for Theoretical Neuroscience at the University of Waterloo and holds the Canada Research Chair in Theoretical Neuroscience. He is also the CTO and co-founder of Applied Brain Research, where he works on low-power AI technologies for machine learning, robotics, and edge computing. Eliasmith is the co-inventor of the Neural Engineering Framework, the Nengo software platform, and the Semantic Pointer Architecture, and the author of How to Build a Brain (Oxford University Press) and Neural Engineering (MIT Press).

Stay Connected:
● https://www.softeq.com/
● https://www.linkedin.com/in/techris/
● https://www.linkedin.com/in/chris-eliasmith/
● https://www.linkedin.com/company/applied-brain-research/