"And someday when the descendants of humanity have spread from star to star they won't tell the children about the history of Ancient Earth until they're old enough to bear it and when they learn they'll weep to hear that such a thing as Death had ever once existed"
Eliezer Yudkowsky is an American AI safety researcher and writer who co-founded the Machine Intelligence Research Institute (MIRI). He is known for his work on rationality and AI alignment, and has published essays and technical writing on long-term risks from artificial intelligence.