I found that idea interesting. Will we consider it the norm in the future to have a "firewall" layer between news and ourselves?
I once wrote a short story where the protagonist received news of a friend's death, but it was intercepted by his AI assistant, which said: "When you have time, there is emotional news that does not require urgent action, which you will need to digest." I feel it could become the norm.
EDIT: For context, Karpathy is a very famous deep learning researcher who just came back from a two-week break from the internet. I don't think he is talking about politics there, but it applies quite a bit.
EDIT2: I find it interesting that many reactions here are (IMO) missing the point. This is not about shielding yourself from information you may be uncomfortable with, but about tweets specifically designed to elicit reactions, which are becoming a plague on Twitter due to its new incentives. It is about the difference between presenting news in a neutral way and presenting it as "incredibly atrocious crime done to CHILDREN and you are a monster for not caring!". The second one feels a lot like an exploit of emotional backdoors, in my opinion.
Hüman brain just liek PC, me so smort.
It's definitely an angle worth considering when we talk about how the weakest link in any security system is its human users. We're not just "not immune" to propaganda, we're ideological petri dishes filled with second-hand agar agar.
Perhaps we can establish some governmental office for truth that decides whether any shitpost can be posted without the sterilization and lobotomization of the poster
Or maybe some kind of "community value" score for people with the right thinking
Counterpoint: only allow elected governing bodies to own or control media outlets, platforms, and critical communications infrastructure