this post was submitted on 09 Dec 2023
59 points (100.0% liked)
Advent Of Code
An unofficial home for the advent of code community on programming.dev!
Advent of Code is an annual Advent calendar of small programming puzzles for a variety of skill sets and skill levels that can be solved in any programming language you like.
AoC 2023
Solution Threads
| M | T | W | T | F | S | S |
|---|---|---|---|---|---|---|
|   |   |   |   | 1 | 2 | 3 |
| 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| 11 | 12 | 13 | 14 | 15 | 16 | 17 |
| 18 | 19 | 20 | 21 | 22 | 23 | 24 |
| 25 |   |   |   |   |   |   |
Rules/Guidelines
- Follow the programming.dev instance rules
- Keep all content related to advent of code in some way
- If what you're posting relates to a specific day, put the year and day number in brackets at the front of the post title (e.g. [2023 Day 10])
- When an event is running, keep solutions in the solution megathread to avoid the community getting spammed with posts
Credits
Icon base by Lorc under CC BY 3.0 with modifications to add a gradient
you are viewing a single comment's thread
An extremely watered-down version for a non-tech person: let's say you want to crack an account's password. Trying every possible combination of characters one after another, like they do in the movies, is a type of brute-force algorithm. It is considered the least efficient form of problem-solving.
Now, instead of using a CPU, they're using a GPU. A GPU is the chip on your graphics card - it calculates and renders graphical stuff. That's not the CPU's job - the CPU is simply not good at it, for a bunch of reasons.
Now, a CPU has a few very fast cores, so it can only work on a handful of calculations at a time. A GPU's individual cores are slower, but there are thousands of them, so it can calculate in bulk. This is the same reason some people use GPUs for mining cryptocurrency (though GPUs have since been superseded there by FPGAs and ASICs).
Calculating on a GPU is not as straightforward as using a CPU. Now, I did not read the post closely, but this Reddit guy apparently used GLSL to calculate stuff? I am not really sure about the specifics of how, or why it was used. Last I remember using GLSL, you can't simply use it to print a log to the screen. Maybe they're using OpenGL? Maybe Vulkan?
Thank you!
One way to get the data is to render to a (hidden) surface/canvas. It's just bytes to the computer, so just dump the result data in the display buffer. Then you take a "screenshot" and interpret the RGBA values as data.
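A toy CPU-side illustration of that trick, assuming the shader packed one 32-bit result into each RGBA8 pixel (everything here is hypothetical stand-in code, not from the post):

```python
import struct

# Pretend the GPU wrote one 32-bit result per pixel into an RGBA8 surface.
results = [42, 1000, 123456]

# "Render": each uint32 becomes four bytes, one per R, G, B, A channel.
framebuffer = b"".join(struct.pack("<I", v) for v in results)

# "Screenshot" readback: walk the pixels and reassemble the integers
# from their four channel bytes.
decoded = [struct.unpack_from("<I", framebuffer, i)[0]
           for i in range(0, len(framebuffer), 4)]
```

The point is just that an RGBA pixel is 4 bytes of storage; whether those bytes mean a color or a number is entirely up to whoever reads them back.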
Graphics Programmer here.
More likely you would just write the data to a buffer (basically an array of whatever element type you want) rather than a render target, and then read it back to the CPU. DirectX, Vulkan, etc. all have APIs to upload/download to/from the GPU quite easily, and CUDA makes it even easier, so a simple compute shader or CUDA kernel that writes to a buffer would make the most sense for general-purpose computation like an Advent of Code problem.
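A rough CPU-side analogy of that data flow, with a made-up `fake_kernel` standing in for the compute shader (this sketches the buffer-write/readback pattern only, it is not real GPU code):

```python
from array import array

# Hypothetical "kernel": each output element is computed independently,
# like one compute-shader thread per index.
def fake_kernel(i: int) -> float:
    return i * i + 0.5

n = 8

# The "storage buffer": a flat array of float32, the element type we chose.
buffer = array("f", (fake_kernel(i) for i in range(n)))

# "Download" to the CPU: grab the raw bytes and reinterpret them as floats,
# the way you would after a buffer readback.
readback = array("f", buffer.tobytes())
```

No pixel formats or screenshots involved - the buffer is just typed memory, which is why this route is cleaner than abusing a render target.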