It's a good thing I killed my CPU attempt after 2 hours then!
I would have definitely died in that sandstorm!
An unofficial home for the advent of code community on programming.dev!
Advent of Code is an annual Advent calendar of small programming puzzles for a variety of skill sets and skill levels that can be solved in any programming language you like.
I don't know shit about stuff like this, but it's interesting. What does this mean?
An extremely watered-down version for a non-tech person: let's say you want to hack an account's password. Trying every combination of characters one after another, like they do in the movies, is a type of brute-force algorithm. It's generally the least efficient way to solve a problem.
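For the curious, a minimal sketch of what that looks like in code. Everything here (the target "password", the alphabet, the length cap) is made up for illustration:

```python
import itertools
import string

def brute_force(target, alphabet, max_len=4):
    """Try every combination of characters, shortest first,
    until one matches the target."""
    for length in range(1, max_len + 1):
        for attempt in itertools.product(alphabet, repeat=length):
            guess = "".join(attempt)
            if guess == target:
                return guess
    return None  # not found within max_len

# Tries "a", "b", ..., "aa", "ab", ..., until it hits "abc".
print(brute_force("abc", string.ascii_lowercase))
```

The cost explodes exponentially with length, which is exactly why brute-forcing long inputs takes "trillions of iterations".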
Now, instead of using a CPU, they're using a GPU. A GPU is the processor on a graphics card: it calculates and renders graphical stuff. That's not the CPU's job - it is simply not good at it, for a bunch of reasons.
Now, a CPU is very fast at a single task but can only work on a handful of things at once, while a GPU is slower per task but can calculate in bulk across thousands of cores. This is the same reason some people use GPUs for mining cryptocurrency (though for that they have since been superseded by FPGAs and ASICs).
Calculating on a GPU is not as straightforward as using a CPU. Now, I did not read the post closely, but this Reddit guy apparently used GLSL to calculate stuff? I am not really sure about the specifics of how, or why it was used. Last I remember using GLSL, you can't simply use it to print a log to the screen. Maybe they're using OpenGL? Maybe Vulkan?
One way to get the data is to render to a (hidden) surface/canvas. It's just bytes to the computer, so just dump the result data in the display buffer. Then you take a "screenshot" and interpret the RGBA values as data.
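A sketch of that last step, reading results back as raw bytes. The pixel buffer here is faked with Python's `struct` module rather than an actual screenshot, and the packed values are made up, but the idea is the same: each 4-byte RGBA pixel is reinterpreted as a 32-bit integer result.

```python
import struct

# Pretend this is the RGBA pixel buffer read back from a hidden render
# target ("screenshot"). Each pixel is 4 bytes: R, G, B, A. Here we've
# packed two 32-bit unsigned integers as two pixels of fake data.
pixels = struct.pack("<II", 1234567, 42)

# The "screenshot" is just bytes, so reinterpret each 4-byte pixel
# as a 32-bit result value instead of a colour.
values = [struct.unpack_from("<I", pixels, offset)[0]
          for offset in range(0, len(pixels), 4)]
print(values)  # [1234567, 42]
```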
Graphics Programmer here.
More likely you would just write data to a buffer (basically an array of whatever element type you want) rather than a render target, and then read it back to the CPU. DirectX, Vulkan, etc. all have APIs to upload/download data to/from the GPU quite easily, and CUDA makes it even easier, so a simple compute shader or CUDA kernel that writes to a buffer would make the most sense for general-purpose computation like an Advent of Code problem.
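The buffer-based pattern described above can be mimicked on the CPU to show the shape of it. This is only a sequential mock-up: the "kernel" (squaring) and the data are arbitrary stand-ins, and on a real GPU each element would be handled by its own thread instead of a loop.

```python
def kernel(i, inp, out):
    # On a GPU, thousands of these invocations run in parallel,
    # each identified by its thread/invocation index i.
    out[i] = inp[i] * inp[i]

input_buffer = [1, 2, 3, 4]          # "uploaded" to the GPU
output_buffer = [0] * len(input_buffer)

for i in range(len(input_buffer)):   # a GPU dispatch does this in parallel
    kernel(i, input_buffer, output_buffer)

print(output_buffer)  # [1, 4, 9, 16]  -- "downloaded" back to the CPU
```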
Thank you!
Brute force what? Part 2 of what?
Erm, do you know where you are?
I can't speak for the other guy, but I definitely don't!
OP is talking about the Advent of Code day 8 part 2 solution. I can't give you details of the problem (because that's just too long), but you're not supposed to brute-force the solution because it would take trillions of iterations to get to the right combination.
But the guy in the picture did it anyway.
I forgot to add: you're in the Advent of Code community, so that alone should be enough context to understand OP's post. The questions the other guy asked are the equivalent of going to a Formula 1 community and asking, "Who the fuck is Toto Wolff, and what does Mercedes have to do with anything?"
What does my auntie Mercedes have to do with anything? I'd like to know too, you know?