

[Week 3] “AGI Ruin: A List of Lethalities” by Eliezer Yudkowsky

AGI Safety Fundamentals: Alignment

Readings from the AI Safety Fundamentals: Alignment course.

https://agisafetyfundamentals.com


I have several times failed to write up a well-organized list of reasons why AGI will kill you. People come in with different ideas about why AGI would be survivable, and want to hear different obviously key points addressed first. Some fraction of those people are loudly upset with me if the obviously most important points aren't addressed immediately, and I address different points first instead.

Having failed to solve this problem in any good way, I now give up and solve it poorly with a poorly organized list of individual rants. I'm not particularly happy with this list; the alternative was publishing nothing, and publishing this seems marginally more dignified.

Crossposted from the LessWrong Curated Podcast by TYPE III AUDIO.

Share feedback on this narration.