sheetzoos

joined 3 months ago
[–] [email protected] 16 points 3 days ago (1 children)

Invaders out of Ukraine.

[–] [email protected] 1 points 1 week ago* (last edited 1 week ago) (1 children)

"A straw man fallacy occurs when someone distorts or exaggerates another person's argument"

They distorted my argument by making shit up. That's called a straw man fallacy.

You think you're saying a lot, but you've said nothing.

[–] [email protected] 1 points 1 week ago (3 children)

I am not a corporate apologist. I never said I was a corporate apologist. My post history backs up the fact that I am not a corporate apologist. There's nothing "flimsy" about this. It's clear cut if you're willing to objectively look at the logic of the arguments presented.

I'm not using that one point to discredit their entire post. I posted two examples and stated their wall of text was so full of false statements that I wasn't interested in debating every single point with someone who already had their mind made up.

[–] [email protected] 1 points 1 week ago* (last edited 1 week ago) (5 children)

Did you not read my previous post? The first point I refuted is a strawman argument. They created a position I do not hold to make it easier to attack.

If you don't believe this to be a strawman argument, please explain your logic.

[–] [email protected] 2 points 2 weeks ago (7 children)

Many of their points are factually incorrect. The first point I refuted is a strawman argument. They created a position I do not hold to make it easier to attack.

[–] [email protected] 5 points 2 weeks ago* (last edited 2 weeks ago) (9 children)

Dissecting his wall of text would take longer than I'd like, but I would be happy to provide a few examples:

  1. I have "...corporate-apologist principles".

Though wolfram claims to have read my post history, he seems to have completely missed my many posts hating on TSLA, robber barons, Reddit execs, etc. I completely agree with him that AI will be used for evil by corporate assholes, but I also believe it will be used for good (just like any other technology).

  2. "...tools are distinctly NOT inherently neutral. Consider the automatic rifle or the nuclear bomb" "HOWEVER, BOTH the automatic rifle and the nuclear bomb are tools, and tools have a specific purpose"

Tools are neutral. They have more than one purpose. A nuclear bomb could be used to warm the atmosphere of another planet to make it habitable. Not to mention any weapon can be used to defend humanity or to attack it. Tools might be designed with a specific purpose in mind, but they can always be used for multiple purposes.

There are a ton of invalid assumptions about machine learning as well, but I'm not interested in wasting time on someone who believes they know everything.

[–] [email protected] 2 points 2 weeks ago (25 children)

You've made many incorrect assumptions and set up several strawman fallacies. Rather than try to converse with someone who is only looking to feed their confirmation bias, I'll suggest you continue your learning by looking up the Dunning-Kruger effect.

[–] [email protected] 9 points 2 weeks ago (30 children)

Every technology is a tool, safe or unsafe depending on the user.

Nuclear technology can be used to kill every human on earth. It can also be used to provide power and warmth for every human.

AI is no different. It can be used for good or evil. It all depends on the people. Vilifying the tool itself is a fool's argument that has been used since the days of the printing press.

[–] [email protected] 4 points 2 weeks ago* (last edited 2 weeks ago) (38 children)

People are constantly getting upset about new technologies. It's a good thing they're too inept to stop these technologies.

[–] [email protected] 15 points 2 weeks ago (1 children)

New technologies are not the issue. The problem is billionaires will fuck it up because they can't control their insatiable fucking greed.

[–] [email protected] 1 points 3 weeks ago

If this is a reference to Asimov's novels, kudos! Though I believe that in his books, humans would fill the glass to the brim to test whether someone was a robot, because only a machine wouldn't spill a drop.
