The point is that even if the chances of [extinction by AGI] are extremely slim
the chances aren't zero. i don't buy the idea that the "probability" of some made-up cataclysmic event is worth treating like any other number, because technically you can't guarantee that a unicorn won't fart AGI into existence which in turn starts converting our bodies into office equipment
It's kind of like with the Trinity nuclear test. Scientists were almost 100% confident that it wouldn't cause a chain reaction that sets the entire atmosphere on fire
if you had done just a little bit of googling instead of repeating something you heard in Oppenheimer, you would know this was basically never put forward as a serious possibility (archive link)
which is actually a fitting parallel for "AGI", now that i think about it
EDIT: Alright, well this community was a mistake...
if you're going to walk in here and diarrhea AGI Great Filter sci-fi nonsense onto the floor, don't be surprised if no one decides to take you seriously
...okay it's bad form but i had to peek at your bio
Sharing my honest beliefs, welcoming constructive debates, and embracing the potential for evolving viewpoints. Independent thinker navigating through conversations without allegiance to any particular side.
seriously do all y'all like. come out of a factory or something
a thought on this specifically:
we're right back to "you're holding it wrong" again, i see
i'm definitely imagining Google re-whipping up their "Big Data" sales pitches in response to Gemini being borked or useless. "oh, see your problem is that you haven't modernized and empowered yourself by dumping all your databases into a (our) cloud native synergistic Data Sea, available for only $1.99/GB"