this post was submitted on 21 Sep 2023
296 points (97.1% liked)

Science Memes


Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.



top 19 comments
[–] [email protected] 33 points 1 year ago* (last edited 1 year ago) (2 children)

Dude on the right is correct that perturbed gradient descent with threshold functions and backprop feedback was implemented before most of us were born.

The current boom is an embarrassingly parallel task meeting an architecture designed to run that kind of task.
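
For anyone curious how old-school that lineage is, here's a minimal sketch, assuming nothing beyond NumPy: a single soft-threshold (sigmoid) unit trained by plain gradient descent on a toy problem. It's the single-unit version of the recipe, not a full backprop network, and the data and numbers are made up for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: learn logical OR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=2)  # small random ("perturbed") initialization
b = 0.0
lr = 0.5

for _ in range(2000):
    p = sigmoid(X @ w + b)           # forward pass through the threshold unit
    grad = p - y                     # error signal fed back (dLoss/dz for cross-entropy)
    w -= lr * (X.T @ grad) / len(X)  # gradient descent step on the weights
    b -= lr * grad.mean()

print((sigmoid(X @ w + b) > 0.5).astype(int))  # [0 1 1 1]
```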

[–] [email protected] 17 points 1 year ago

> The current boom is an embarrassingly parallel task meeting an architecture designed to run that kind of task.

Plus organizations outside of the FAANGs having hit critical mass on data that's actually useful for mass-comparison multiple-correlation analyses, and data-as-a-service platforms making things seem sexier to management in those organizations.

[–] [email protected] 11 points 1 year ago (1 children)

Random, but why are "embarrassing" or similar adjectives so often used to describe a parallel program? What's embarrassing about it?

[–] [email protected] 14 points 1 year ago (1 children)

"Embarrassingly parallelizable" is just the term for a process that can be perfectly paralleled.

[–] [email protected] 6 points 1 year ago (3 children)

rather odd choice of adjective though

[–] [email protected] 13 points 1 year ago

I think the usage implies it's so easy to parallelize that any competent programmer should be embarrassed if they weren't running it in parallel. Whereas many classes of problems can be extremely complex or impossible to parallelize, and running them sequentially would be perfectly acceptable.
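
Concretely: every item can be processed on its own, so the parallel version is just the obvious split across workers. A minimal Python sketch, where the `score` function is a made-up stand-in for per-item work:

```python
from multiprocessing import Pool

def score(item: int) -> int:
    # Stand-in for independent per-item work (e.g., scoring one sample).
    return item * item

if __name__ == "__main__":
    items = list(range(100_000))
    with Pool() as pool:                  # one worker process per CPU core by default
        results = pool.map(score, items)  # no coordination needed: tasks share nothing
    print(results[:5])  # [0, 1, 4, 9, 16]
```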

[–] [email protected] 2 points 1 year ago

It's commonly used in some corners of computer science.

[–] [email protected] 1 points 1 year ago

It's in the same spirit as the phrase "an embarrassment of riches". So a bit of an archaic usage.

[–] [email protected] 22 points 1 year ago (2 children)

So you're saying memes about AI don't belong in Science communities, because it's more of an art? 🤔

[–] [email protected] 19 points 1 year ago (1 children)

Any good scientist is a good artist.

[–] [email protected] 7 points 1 year ago (1 children)

That's what my senior year professor stated as well.

[–] [email protected] 3 points 1 year ago
[–] [email protected] 7 points 1 year ago* (last edited 1 year ago) (1 children)

I think AI belongs in the same box as electronics, which is just black magic with a cover story to make it seem like science.

You know how electronics sometimes release smoke when misused? Yea, that smoke is actually what makes the component work, and it's a hell of a chore to put the smoke back in once it has escaped!

Or do you know that planes actually "fall up into the sky" when they have enough speed? Ever looked at a jet fighter? Yea, the wings are actually just decoys.

Perhaps the memes would be better placed in witchymemes, or maybe we should make a new community for real, black magic, magus bullshit, "science".

(Disclaimer: I'm an engineer; the examples above are jokes we like to tell each other. I know how that stuff actually works, and I'm obligated to make this disclaimer so you don't realize that most engineers actually are modern-day mages.)

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago)

Man, I don't know. I had an introductory lecture on ML and we were told about some kernel stuff, where your data gets mapped into a feature space that could be infinite dimensional, and some math lets you find a separation there without ever computing coordinates in that space, because of your kernel function.

That isn't some black-box art form, that is clearly black magic.
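
The spell is at least checkable, though. A minimal NumPy sketch (my own toy example, not from that lecture): the degree-2 polynomial kernel gives the same number as an honest dot product in an explicitly built 6-D feature space. The RBF kernel plays the same trick with an infinite-dimensional one.

```python
import numpy as np

def phi(v):
    # Explicit feature map matching the degree-2 polynomial kernel in 2-D.
    x1, x2 = v
    return np.array([x1**2, x2**2,
                     np.sqrt(2) * x1 * x2,
                     np.sqrt(2) * x1,
                     np.sqrt(2) * x2,
                     1.0])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

lhs = (x @ y + 1) ** 2       # kernel trick: cheap, stays in the input space
rhs = phi(x) @ phi(y)        # same value via the explicit 6-D feature map
print(np.isclose(lhs, rhs))  # True (both are 25.0)
```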

[–] [email protected] 9 points 1 year ago

The only areas of machine learning that I expect to live up to the hype are areas where somewhat noisy input and output doesn't ruin the usability, like image and audio processing and generation, or where you have to validate the output anyway, like the automated copy-paste from Stack Exchange. Anything that requires actual specificity and factuality straight from the output, like the language models attempting to replace search engines (or worse, professional analysis), will for the foreseeable future be tainted with hallucinations and misinformation.

[–] [email protected] 6 points 1 year ago

They reached the right end pretty quickly. One of the reasons I gave up on ML rather fast. Hyperparameter tuning is really, really random.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

There is truth in this, but it isn't as true as some people seem to think. It's true that trial and error is a real part of working in ML, but it isn't just luck whether something works or not. We do know why some models work better than others for many tasks, there are cases where manual hyperparameter tuning is worthwhile, there has been a lot of progress in the last 50 years, and so on.
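
For instance, the "random" part of tuning is usually just random search over configurations, which is an entirely ordinary loop. A minimal sketch, with a made-up validation function standing in for an actual train-and-evaluate run:

```python
import random

random.seed(42)

def validate(lr: float, depth: int) -> float:
    # Made-up stand-in for "train a model with these hyperparameters
    # and return its score on held-out data".
    return -abs(lr - 0.01) * 10 - abs(depth - 6) * 0.01

best_score, best_cfg = float("-inf"), None
for _ in range(50):
    cfg = {"lr": 10 ** random.uniform(-4, -1),  # log-uniform learning rate
           "depth": random.randint(2, 12)}      # uniform integer depth
    score = validate(**cfg)
    if score > best_score:
        best_score, best_cfg = score, cfg

print(best_cfg)  # best configuration found in 50 random draws
```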

[–] [email protected] 2 points 1 year ago

Maybe we should try sacrificing a farm animal to ML. If we're getting into the realm of magic, there are established practices going back thousands of years.