Eliezer Yudkowsky On OpenAI, GPT-4 and the End-Game of AGI
"Pausing AI Developments Isn't Enough. We Need to Shut it All Down"
Hey Guys,
So I was intrigued by the key LessWrong figure, E.Y., this weekend. I listened to a podcast he did with Bankless not too long ago, and he sounded entirely different there. So what changed? GPT-4 came out.
In an op-ed for TIME, AI theorist Eliezer Yudkowsky argued that pausing AI research isn't enough.
The moment Eliezer Yudkowsky has been waiting for came in 2023, and yet he didn't seem exactly prepared for it.
Timestamps
0:43 - GPT-4
23:23 - Open sourcing GPT-4
39:41 - Defining AGI
47:38 - AGI alignment
1:30:30 - How AGI may kill us
2:22:51 - Superintelligence
2:30:03 - Evolution
2:36:33 - Consciousness
2:47:04 - Aliens
2:52:35 - AGI Timeline
3:00:35 - Ego
3:06:27 - Advice for young people
3:11:45 - Mortality
3:13:26 - Love
Subscribe to The Nvidia Patterns to keep reading this post and get 7 days of free access to the full post archives.