this post was submitted on 20 Aug 2024
48 points (96.2% liked)
Sysadmin
Clicked the link, started reading... closed the window when I read "Netdata also incorporates A.I. insights for all monitored data".
~~Eesh. Yeah, that’s a nope from me, dawg.~~
Actually, it’s all self-hosted. Granted, I haven’t looked at the code in detail, but building NNs to help efficiently detect and capture stuff is actually a very appropriate use of ML. This project looks kinda cool.
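The kind of ML the comment above alludes to, using a model to flag unusual patterns in monitored metrics, doesn't have to be anything exotic. A minimal sketch of the idea (this is an illustrative rolling z-score detector, not Netdata's actual algorithm, and the window/threshold values are made up):

```python
# Minimal sketch of unsupervised anomaly detection on a metric stream,
# in the spirit of what ML-assisted monitoring tools do.
# Window size and threshold are illustrative, not any tool's defaults.
from collections import deque
from statistics import mean, stdev

class RollingAnomalyDetector:
    """Flag samples more than `k` standard deviations from a rolling mean."""

    def __init__(self, window: int = 60, k: float = 3.0):
        self.history = deque(maxlen=window)  # recent samples only
        self.k = k

    def observe(self, value: float) -> bool:
        """Return True if `value` looks anomalous given recent history."""
        is_anomaly = False
        if len(self.history) >= 2:
            mu = mean(self.history)
            sigma = stdev(self.history)
            # Flag values far outside the recent distribution.
            if sigma > 0 and abs(value - mu) > self.k * sigma:
                is_anomaly = True
        self.history.append(value)
        return is_anomaly

detector = RollingAnomalyDetector(window=30, k=3.0)
# Steady CPU-usage-like samples, then a sudden spike.
samples = [50.0, 51.0, 49.0, 50.5, 49.5] * 6 + [95.0]
flags = [detector.observe(v) for v in samples]
# Only the final spike is flagged.
```

Nothing here phones home: the model is just summary statistics over a sliding window kept in memory, which is also why it's easy to audit this class of approach if the code is open.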
Machine learning might be marketed as all fine and dandy, but I'm not letting a self-training monitoring system run loose on my production server under any circumstances.
Not to mention that for it to be useful I'd have to give it at least a year of logs, which is both impossible and pointless: the system running a year ago is not remotely the same as the one running today. Even if not a single piece of our own code had changed (which of course it did), the OS, applications, and processes have been continually updated by system updates and security patches.
So, no.
That’s why I put in the caveat about looking at the code. If you can’t grok what’s going on, that’s fine, but someone who does get it and can comfortably assert that no sketchy “phone home” shit is going on can and should use stuff like this, if they’re so inclined.
This limited-scope, ML-trained analysis is actually where "AI" excels, e.g. "computer vision" in specific medical scenarios.
If the training data is available, yes; in this case, no chance.
You don't think they could get training data from friendly customers using their service?