It's been on the experimental branch for a while now.
These are the people that complain to their ISP when their game 'lags' on their wirelessly connected computer several rooms away from the router.
Can you run something like iperf3 or OpenSpeedTest between the server and the client to prove it's a network throughput issue?
Do you have a network switch you can add, to avoid switching through your router (if it is indeed bad)?
Have you ensured you aren't unknowingly using wifi at either end?
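If iperf3 isn't available on both ends, the same idea can be sketched in a few lines of Python: push a known amount of data over a TCP socket and time it. This is a crude stand-in, not a replacement for iperf3 (no parallel streams, no UDP, no jitter stats), and the port and payload size below are arbitrary placeholders.

```python
import socket
import threading
import time

PAYLOAD_MB = 16        # amount of data to push; keep small for a quick check
CHUNK = 64 * 1024

def run_server(port, ready):
    """Accept one connection and drain everything the client sends."""
    srv = socket.socket()
    srv.bind(("0.0.0.0", port))
    srv.listen(1)
    ready.set()
    conn, _ = srv.accept()
    while conn.recv(CHUNK):
        pass
    conn.close()
    srv.close()

def run_client(host, port):
    """Send PAYLOAD_MB of zeros; return (bytes sent, Mbit/s)."""
    total = PAYLOAD_MB * 1024 * 1024
    buf = b"\x00" * CHUNK
    sock = socket.create_connection((host, port))
    sent = 0
    start = time.monotonic()
    while sent < total:
        sock.sendall(buf)
        sent += len(buf)
    sock.close()
    elapsed = time.monotonic() - start
    return sent, (sent * 8 / 1_000_000) / elapsed

if __name__ == "__main__":
    ready = threading.Event()
    threading.Thread(target=run_server, args=(5201, ready), daemon=True).start()
    ready.wait()
    # In practice, run run_server() on the server box and point the
    # client at its LAN IP instead of loopback.
    sent, mbps = run_client("127.0.0.1", 5201)
    print(f"sent {sent} bytes at {mbps:.0f} Mbit/s")
```

If this shows full line rate between two wired machines but tanks from the wireless client, the problem is the wifi hop, not the server.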
NGINX is a bit more hands-on than some other options, but it's mature, configurable, and there's a huge amount of information out there for setting it up for various use cases.
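For reference, a typical media-server reverse proxy is only a short server block. This is a minimal sketch, not the commenter's actual config: the hostname, certificate paths, and upstream port (8096, Jellyfin's default) are all placeholders to adapt.

```nginx
server {
    listen 443 ssl;
    server_name media.example.com;

    # placeholder certificate paths (e.g. from certbot)
    ssl_certificate     /etc/letsencrypt/live/media.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/media.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8096;   # the media server on the LAN
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # allow websocket upgrades (remote control, live status)
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```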
In my case, it's what I set up when I was first getting into this, and it works, so I don't want to go through setting up anything else.
Thanks for the insightful and helpful comment.
Unraid is great and I have been using it for over a decade now, but a paid OS on a 2-bay NAS seems excessive.
I can't say I care as much as I used to, since encoding has gotten quite good, but I have also gotten better at seeing (a.k.a. worse at being distracted by) compression artifacts. So while I am less of a perfect-remux-rip supremacist, I'm also more sensitive to bad encodes; it's a double-edged sword.
I still seek out the highest-quality versions of things I personally care about, but I don't seek them out for absolutely everything like I used to. I recently saved 12TB by running a slight compression pass on my non-4K movie library, turning (for example) a 30GB 1080p Blu-ray remux into a 20GB high-bitrate H.265 encode. That made more room for full-fat 4K Blu-ray files for the things I care about, and for the few 1080p full remuxes I want to keep: rarities, titles that aren't as good in their 4K releases, or ones where the 4K release was drastically different (like the LOTR 4Ks having poor dynamic range, or the colours being changed for the Matrix, etc.), which I may encode in the future to save more space again.

I know I can compress an 80GB UHD Blu-ray file down to 60GB with zero noticeable loss, and that's as far as I need to go. I don't need to go down to 10GB like some release groups try to do; at that level of compression you might as well be at 1080p.
I can't go as low as a low-bitrate 720p movie these days, as I'm very close to a large screen, so they tend to look quite poor: soft edges, banded gradients, motion artifacts, poor sound, etc. But if I were on a smaller screen, or watching movies on a phone like I used to, I probably wouldn't care as much.
Another side to my choice to compress is that I have about 10 active Plex clients at the moment. Previously they were mostly getting transcoded feeds (mostly from remux sources), but now most of them are getting a better-quality encode (slow CPU encode vs. fast GPU stream) direct to their screens. So while I've compressed a decent chunk of the library, my clients are getting better-quality feeds from it.
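The kind of re-encode described above (remux in, high-bitrate H.265 out, audio and subtitles untouched) maps to a fairly standard ffmpeg invocation. The commenter's exact settings aren't given, so the CRF and preset below are illustrative assumptions for a near-transparent libx265 encode, and the filenames are placeholders.

```python
import shlex

def hevc_encode_cmd(src, dst, crf=16, preset="slow"):
    """Build an ffmpeg command line for a high-quality H.265 re-encode.

    CRF around 16 with a slow preset keeps the result visually close to
    the remux source while still shrinking the file substantially.
    Audio and subtitle streams are stream-copied, not re-encoded.
    """
    return [
        "ffmpeg", "-i", src,
        "-map", "0",              # keep all streams, not just the first video/audio
        "-c:v", "libx265",
        "-crf", str(crf),
        "-preset", preset,
        "-c:a", "copy",           # pass audio through untouched
        "-c:s", "copy",           # pass subtitles through untouched
        dst,
    ]

cmd = hevc_encode_cmd("movie.remux.mkv", "movie.x265.mkv")
print(shlex.join(cmd))
```

Building the argument list in code (rather than a shell string) avoids quoting bugs with filenames containing spaces, which matter a lot when batch-processing a whole library.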
I use Plexamp for that; Jellyfin does it too. You can assign libraries per user quite easily.
So for 3 users you might have 4 libraries: one per user, then a shared library they all have access to.
I have complete ROM sets for a couple of platforms in my archive; they're available on SLSK, but I don't have a huge amount of bandwidth available.
Sad to see the old giants like Vimm's finally being attacked after all these years.
I'm quite fond of One MoЯn Time and No ones MoЯn, actually the whole UnmoЯnables album.
Soulseek has been getting hammered too.
Honestly, I still can't figure out how to configure a network interface properly without using the old Control Panel.