Their mistake is not grokking contrition. An apology ought either to be contrite or to justify why contrition is impossible.
To be explicit, contrition is the part of an apology where the apologizing party promises to change something. Without contrition, apologies are worthless, since they do not amend any social contract.
What the author proposes instead is indeed "Machiavellian" and "hacking social APIs"; we should recognize it as a form of deceit, a lie. They are clearly more interested in appearing decent than in improving society, and should be marked as confidence scammers.
And indeed, the other crucial piece is that... apologizing isn't a protocol with an expected reward function. I can just, not accept your apology. I can just, feel or "update my priors" however I like.
We apologize and care about these things because of shame, which we have to regulate, in part through our actions and perspectives.
Why people feel the way they do and act the way they do makes total sense when ~~one finally confronts one's own vulnerabilities~~ sorry, builds an API and RL framework.
True, there's value. But I think if you try to measure that value, it disappears.
A good postmortem puts the facts on the table and leaves the team to evaluate options. I don't think any good postmortem should include apologies or ask people to settle social conflicts directly. One of the best tools a postmortem has is "we're going to work around this problem by reducing our dependency on personal relationships."