The Doomer's Error: Why AGI Is An Incoherent Concept


What is the strongest anti-AGI case, the argument that exposes the fallacies behind the belief that AGI is a viable goal, and behind the AI doomerism that often follows from believing AGI will soon arrive? Princeton professor Arvind Narayanan recently made a statement that we feel deserves amplification: for real-world problems, machines face some of the same fundamental limits and challenges that humans face.

Listen to Luba and Eric unpack, explore, and expound. #noAGI
