this post was submitted on 14 Jan 2024
200 points (94.6% liked)

Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ


For instance, say I search for "The Dark Knight" on my Usenet indexer. It returns a list of uploads and tells me where to get them from my Usenet provider. I can then download the pieces, stitch them together, and verify that the result is, indeed, The Dark Knight. All of this costs me only a few dollars a month.
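To make the search step concrete, here's a rough sketch of what querying an indexer looks like in Python. It assumes a Newznab-style API (the interface most NZB indexers expose); the indexer URL and API key are placeholders, not a real service.

```python
# Rough sketch: searching a Newznab-compatible indexer. The base URL and API
# key are placeholders; results come back as an RSS feed whose items link to
# .nzb files.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

INDEXER_URL = "https://indexer.example.com/api"  # hypothetical indexer
API_KEY = "your-api-key"                         # placeholder

params = urllib.parse.urlencode({
    "t": "search",           # free-text search
    "q": "The Dark Knight",  # what you're looking for
    "apikey": API_KEY,
})

with urllib.request.urlopen(f"{INDEXER_URL}?{params}") as resp:
    feed = ET.parse(resp)

for item in feed.iterfind(".//item"):
    # Each item describes one upload; the link points at its NZB file.
    print(item.findtext("title"), "->", item.findtext("link"))
```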

My question is, why can't copyright holders do this as well? They could follow the same process and then send takedown requests for each individual article that makes up the movie. We already know they try to catch people torrenting, so why don't they do this too?

I can think of a few reasons, but they all seem pretty shaky.

  1. The content is hosted in countries where they don't have to comply with takedown requests.

It seems unlikely to me that literally all of it is hosted in places like this. Plus, the providers wouldn't be able to operate at all in countries like the US without facing legal repercussions.

  2. The copyright holders feel the upfront cost of indexer and provider access is greater than the cost of people pirating their content.

This also seems fishy. It's cheap enough for me as an individual to do this, and if Usenet weren't an option, I'd have to pay for three or more streaming services to watch everything I currently do. They'd break even on this scheme even if the only person it cut off were me.

  3. They do actually do this, but it's on a scale small enough for me not to care.

The whole point of doing this would be to make Usenet a non-viable option for piracy. If I don't care about it because it happens so rarely, then what's the point of doing it at all?

[–] [email protected] 20 points 10 months ago (7 children)

Off topic, but is there any tutorial on how to do this Usenet thing? Feel free to contact me on Matrix; it's on my profile.

[–] [email protected] 47 points 10 months ago

You'll need 3 things:

A Usenet client such as SABnzbd. This is the equivalent of a torrent client like qBittorrent.

An NZB indexer such as NZBGeek. Again, this is equivalent to a torrent indexer, but for NZB files.

And finally, a Usenet provider such as FrugalUsenet. This is where you actually download articles from. (There are other providers listed in the photo in my other comment here.)

Articles are individual posts on Usenet servers. NZB files contain lists of articles that together make up the files you want. Extra recovery articles are also included, so if some articles are lost (taken down due to DMCA/NTD requests) they can be rebuilt from the remaining data. Your Usenet client handles reading the NZB file, downloading the articles from each of your configured providers, then decompressing, rebuilding lost data, and finally stitching it all together into the files you wanted.
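If it helps to see the "list of articles" part concretely, here's a rough sketch of what an NZB file contains, assuming a hypothetical example.nzb on disk. It just prints each file and the message-IDs of the articles that make it up; the repair and decompression steps the client does aren't shown.

```python
# Rough sketch: an NZB file is just XML listing, for each file, the Usenet
# articles (message-IDs) that the client fetches and reassembles.
import xml.etree.ElementTree as ET

NS = {"nzb": "http://www.newzbin.com/DTD/2003/nzb"}

tree = ET.parse("example.nzb")  # hypothetical NZB file
for file_el in tree.iterfind("nzb:file", NS):
    print(file_el.get("subject"))
    for seg in file_el.iterfind("nzb:segments/nzb:segment", NS):
        # Each segment is one article on the news servers.
        print(f"  article {seg.text} ({seg.get('bytes')} bytes)")
```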

[–] [email protected] 16 points 10 months ago (1 children)

I'd like to know too, but people are so cryptic about it every time this shit is brought up that I'm overwhelmed before I even begin. So I just stick to the tried and true methods I know.

[–] [email protected] 14 points 10 months ago

Find a Usenet provider. A quick web search and some reading should get you to the right place. I’m not sure if any good free servers are available anymore, but there’s probably one that’s cheap enough.

Looks like https://sabnzbd.org/ is a free and open-source Windows/macOS/Linux client that can download the files. I haven't tried it, but it's highly rated on alternativeto.net.

[–] [email protected] 1 points 10 months ago

Install the *arr programs you want to manage your libraries: Radarr (for movies), Sonarr (for TV shows), Lidarr (for music).

Install NZBGet (for downloading the files)

Sign up to a Usenet provider.

Sign up to an indexer like NZBGeek, NZBFinder, etc.

Put your Usenet provider details into NZBGet under the News-Server section (see the sketch after these steps for what those details amount to).

Put your indexer details into the indexer settings in *arr programs.

Put your NZBGet details into the Download Clients settings section in the *arr programs.

That's pretty much the gist of it. Then you can just search for and add the content you're after in Radarr/Sonarr/Lidarr, and it will go looking.
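For what it's worth, the News-Server details NZBGet asks for are just an NNTP login: host, port, SSL on/off, username, password, and connection count. Here's a rough sketch that checks those credentials by hand (placeholder host and login, standard NNTP-over-TLS port 563). Not something you'd normally do manually, but it shows what the settings mean.

```python
# Rough sketch: the News-Server settings boil down to an NNTP login.
# Host, username, and password below are placeholders.
import socket
import ssl

HOST = "news.example-provider.com"  # placeholder provider hostname
PORT = 563                          # standard NNTP-over-TLS port
USER = "your-username"
PASSWORD = "your-password"

context = ssl.create_default_context()
with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as conn:
        reader = conn.makefile("rb")
        print(reader.readline().decode().strip())  # greeting, e.g. "200 ..."

        conn.sendall(f"AUTHINFO USER {USER}\r\n".encode())
        print(reader.readline().decode().strip())  # "381" = password required

        conn.sendall(f"AUTHINFO PASS {PASSWORD}\r\n".encode())
        print(reader.readline().decode().strip())  # "281" = authenticated

        conn.sendall(b"QUIT\r\n")
```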

If you need a hand I'm happy to help.

[–] [email protected] -5 points 10 months ago

There are about 100 tutorials that'll come up in a Google search.