#326 Zuzanna Stamirowska: Inside Pathway's Post-Transformer Architecture Designed for Memory and On-the-Fly Learning

This episode is sponsored by tastytrade.

Trade stocks, options, futures, and crypto on one platform with low commissions and zero commission on stocks and crypto. Built for traders who think in probabilities, tastytrade offers advanced analytics, risk tools, and an AI-powered Search feature.

Learn more at https://tastytrade.com/



This episode dives into why Pathway's Baby Dragon Hatchling (BDH) might mark the beginning of the post-transformer era in AI.

Zuzanna Stamirowska, Pathway's CEO and co‑author of BDH, explains why today's transformer-based LLMs hit a wall on long-horizon reasoning, how memory and synaptic plasticity are built directly into BDH's architecture, and what that means for continual learning, hallucinations, and "generalization over time."

The conversation ranges from complexity science and brain-inspired computation to practical implications for real-world, small-data, and safety‑critical applications.
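To make the "synaptic plasticity as memory" idea concrete, here is a minimal sketch of a Hebbian-style update, where connection strengths change during inference so the network's state carries a trace of what it has seen. This is a hypothetical illustration written for this summary (the function `hebbian_step` and all parameters are invented here), not Pathway's actual BDH implementation.

```python
import numpy as np

def hebbian_step(synapses, pre, post, lr=0.01, decay=0.001):
    """Strengthen synapses between co-active neurons, with slow decay.

    Hypothetical illustration of Hebbian plasticity, not BDH's real update rule.
    """
    synapses += lr * np.outer(post, pre)  # "neurons that fire together, wire together"
    synapses *= (1.0 - decay)             # forgetting: old associations slowly fade
    return synapses

rng = np.random.default_rng(0)
n = 8
synapses = np.zeros((n, n))

# Feed a stream of activations; the synaptic state accumulates memory on the fly,
# rather than being frozen after training as in a standard transformer.
for _ in range(100):
    pre = rng.random(n)
    post = np.tanh(synapses @ pre)  # activity shaped by associations learned so far
    synapses = hebbian_step(synapses, pre, post)

print(synapses.round(3))
```

The key contrast with transformers, as discussed in the episode, is that here the learned state persists and evolves across inputs instead of resetting with each context window.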

Stay Updated:

Craig Smith on X: https://x.com/craigss

Eye on A.I. on X: https://x.com/EyeOn_AI


(00:00) The Core Problem: Why Today's AI Lacks Memory

(03:16) Pathway's Mission to Bring Memory Into AI

(04:53) Zuzanna's Background in Complexity Science

(10:30) Why Transformers Reset Like "Groundhog Day"

(14:34) The Brain-Inspired Dragon Hatchling Architecture

(23:59) How the Network Learns and Builds Connections

(37:38) Performance vs Transformers on Language Tasks

(49:37) Productizing the Technology With NVIDIA and AWS

(54:23) Can Memory Solve AI Hallucinations?
