Welcome to the alpha release of TYPE III AUDIO.
Expect very rough edges and very broken stuff—and regular improvements. Please share your thoughts.
Readings from the AI Safety Fundamentals: Governance course.
Observing from afar, it’s easy to think there’s an abundance of people working on AGI safety. Everyone on your timeline is fretting about AI risk, and it seems like there is a well-funded EA-industrial-complex that has elevated this to their main issue. Maybe you’ve even developed a slight distaste for it all—it reminds you a bit too much of the woke and FDA bureaucrats, and Eliezer seems pretty crazy to you.
That’s what I used to think too, a couple of years ago. Then I got to see things more up close. And here’s the thing: nobody’s actually on the friggin’ ball on this one!
There’s no secret elite SEAL team coming to save the day. This is it. We’re not on track.
Source:
https://www.forourposterity.com/nobodys-on-the-ball-on-agi-alignment/
Narrated for AI Safety Fundamentals by Perrin Walker of TYPE III AUDIO.