#541: Monty - Python in Rust for AI



When LLMs write code to accomplish a task, that code has to actually run somewhere. And right now, the options aren't great. Spin up a sandboxed container and you're paying a full second of cold-start overhead plus the complexity of another service. Let the LLM loose on your actual machine and... well, you'd better be watching. On this episode, I sit down with Samuel Colvin, creator of Pydantic (now at 10 billion downloads), to explore Monty, a Python interpreter written from scratch in Rust, purpose-built to run LLM-generated code. It starts in microseconds, is completely sandboxed by design, and can even serialize its entire state to a database and resume later. We dig into why this deliberately limited interpreter might be exactly what the AI agent era needs.

Episode sponsors
Talk Python Courses
Python in Production

Links from the show
Guest Samuel Colvin: github.com
CPython: github.com
IronPython: ironpython.net
Jython: www.jython.org
Pyodide: pyodide.com
monty: github.com
Pydantic AI: pydantic.dev
Python AI conference: pyai.events
bashkit: github.com
just-bash: github.com
Narwhals: narwhals-dev.github.io
Polars: pola.rs
Strands Agents: aws.amazon.com
Running Pydantic's Monty Rust sandboxed Python subset in WebAssembly: simonwillison.net
RustPython: github.com
Valgrind: valgrind.org
CodSpeed: codspeed.io

Watch this episode on YouTube: youtube.com
Episode #541 deep-dive: talkpython.fm/541
Episode transcripts: talkpython.fm
Theme Song: Developer Rap 🥁 Served in a Flask 🎸: talkpython.fm/flasksong

---== Don't be a stranger ==---
YouTube: youtube.com/@talkpython
Bluesky: @talkpython.fm
Mastodon: @talkpython@fosstodon.org
X.com: @talkpython
Michael on Bluesky: @mkennedy.codes
Michael on Mastodon: @mkennedy@fosstodon.org
Michael on X.com: @mkennedy
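To give a feel for the snapshot-and-resume idea discussed in the episode, here is a toy sketch. This is not Monty's actual API (which isn't shown here); it's a hypothetical mini-interpreter whose entire state is plain data, so execution can be paused, serialized (e.g. into a database), and resumed later, exactly the property the description attributes to Monty.

```python
import json

# Hypothetical illustration only, NOT Monty's real API: a toy interpreter
# whose whole state (program counter + variables) is a JSON-serializable
# dict, so it can be persisted mid-run and picked up again later.
PROGRAM = [
    ("set", "x", 1),
    ("add", "x", 2),
    ("add", "x", 39),
]

def run(state, steps):
    """Execute up to `steps` instructions, mutating and returning state."""
    for _ in range(steps):
        if state["pc"] >= len(PROGRAM):
            break
        op, name, value = PROGRAM[state["pc"]]
        if op == "set":
            state["vars"][name] = value
        elif op == "add":
            state["vars"][name] += value
        state["pc"] += 1
    return state

state = run({"pc": 0, "vars": {}}, 2)    # run part of the program...
snapshot = json.dumps(state)             # ...serialize the entire state...
resumed = run(json.loads(snapshot), 10)  # ...and resume from it later
print(resumed["vars"]["x"])              # → 42
```

Because nothing in the state refers to live OS resources, the snapshot can live in a database row between steps; a real sandboxed interpreter designed around this constraint gets the same portability.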