this post was submitted on 11 Jul 2023
355 points (96.3% liked)

Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ

My best list of free ChatGPT and other models. Requirement: no signups.

[–] [email protected] 92 points 1 year ago* (last edited 1 year ago) (4 children)

You don't need to pirate OpenAI. I've built the AI Horde so y'all can use it without any workarounds or shenanigans, and you can use your PCs to help others as well.

Here's an LLM client you can run directly in your browser: https://lite.koboldai.net
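
For anyone who prefers scripting over the browser client, here's a rough sketch of what calling the AI Horde text API directly might look like. The endpoint paths, header name, and payload fields below are recalled from the public v2 API docs and may have drifted, and the anonymous key "0000000000" gets lower priority than a registered one.

```python
# Minimal sketch of an AI Horde text request (assumption: the v2 async
# endpoints and field names below still match the current API docs).
import time
import requests

API_BASE = "https://aihorde.net/api/v2"
HEADERS = {"apikey": "0000000000"}  # anonymous key; register for kudos/priority

# Submit an asynchronous text generation job
payload = {
    "prompt": "Explain what the AI Horde is in one paragraph.",
    "params": {"max_length": 120, "max_context_length": 1024},
}
job = requests.post(f"{API_BASE}/generate/text/async", json=payload, headers=HEADERS).json()
job_id = job["id"]

# Poll until a volunteer worker picks up and finishes the job
while True:
    status = requests.get(f"{API_BASE}/generate/text/status/{job_id}", headers=HEADERS).json()
    if status.get("done"):
        break
    time.sleep(5)

print(status["generations"][0]["text"])
```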

[–] [email protected] 9 points 1 year ago (1 children)

I had an interesting result.

I asked a simple question, like I did with all the other AIs, and got "airoboros-65B-gpt4-1.4-GPTQ for 13 kudos in 369.6 seconds". It was a bit of a wait, but I understand why.

It gave me, word for word, a comment on what I assume is a blog post by someone named Melissa. The topic was related, but just barely.

Which LLM do you recommend for questions about a subject? I looked in the FAQ to see if there was a guide to the choices.

[–] [email protected] 7 points 1 year ago (1 children)

Unfortunately I'm not an expert in LLMs, so I don't know. I suggest you contact the KoboldAI community and they should be able to point you in the right direction.

[–] [email protected] 3 points 1 year ago

Thank you. Will do.

I kept playing with it and tried the scenarios, and was getting closer.

[–] [email protected] 4 points 1 year ago

Just tested. Thanks for building and sharing!

[–] [email protected] 2 points 1 year ago (1 children)

Aren't KoboldAI models on par with GPT-3? Why not just use ChatGPT then?

AI Horde looks dope for image generation though!

[–] [email protected] 7 points 1 year ago (1 children)

Kobold is a program for running local LLMs. Some seem on par with GPT-3, but normally you're going to need a very beefy system just to run them slowly.

The benefit is rather clear: less centralized and free from strict policies. But GPT-3 is also miles away from GPT-3.5. Exponential growth ftw. I have yet to see anything as good and fast as ChatGPT.
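
To give a concrete idea of what "running a local LLM through Kobold" looks like from code, here's a small sketch of querying a locally running KoboldAI-compatible server (e.g. koboldcpp) over its HTTP API. The port and field names are assumptions based on common defaults and may differ for your setup.

```python
# Sketch: query a locally running KoboldAI-compatible server (e.g. koboldcpp).
# Assumes a model is already loaded and served on localhost:5001 and that the
# /api/v1/generate endpoint accepts these field names.
import requests

payload = {
    "prompt": "Write a short pirate shanty about sailing the high seas.",
    "max_length": 100,    # tokens to generate
    "temperature": 0.7,   # sampling randomness
}
resp = requests.post("http://localhost:5001/api/v1/generate", json=payload)
resp.raise_for_status()
print(resp.json()["results"][0]["text"])
```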

[–] [email protected] 3 points 1 year ago (1 children)

I've always wondered how it's possible. No way they've got some crazy software optimisations that nobody else can replicate, right? They've gotta just be throwing a ridiculous amount of compute power at every request?

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

Well, there are two things.

First, there is speed, for which they do indeed rely on many thousands of very high-end industrial Nvidia GPUs, and since the $10 billion investment from Microsoft they have likely expanded that capacity. I've read somewhere that ChatGPT costs about $700,000 a day to keep running.

There are a few other tricks and caveats here though, like decreasing the quality of the output when there is high load.

For that quality of output they do deserve a lot of credit, because they train the models really well and continuously manage to improve their systems to produce even higher-quality and more creative outputs.

I don't think GPT-4 is the biggest model out there, but it does appear to be the best that is available.

I can run a small LLM at home that is much, much faster than ChatGPT... that is, if I want to generate some unintelligent nonsense.

Likewise, there might be a way to redesign GPT-4 to run on a consumer graphics card with high-quality output… if you don't mind waiting a week for a single character to be generated.

I actually think some of the open-source, locally runnable LLMs like LLaMA, Vicuna, and Orca are much more impressive if you judge them on quality versus power requirements.
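
As a rough illustration of how easy these locally runnable models are to drive from code, here's a sketch using the llama-cpp-python bindings. The model filename is just a placeholder for whatever quantized LLaMA/Vicuna-style weights you have downloaded, and the parameters are illustrative, not tuned.

```python
# Sketch: run a quantized LLaMA/Vicuna-style model locally via llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="models/vicuna-13b.q4_0.bin",  # placeholder: your quantized weights
    n_ctx=2048,     # context window
    n_threads=8,    # CPU threads; tune for your machine
)

out = llm(
    "Q: Which open local models give the best quality per watt?\nA:",
    max_tokens=128,
    stop=["Q:"],
    temperature=0.7,
)
print(out["choices"][0]["text"].strip())
```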

[–] [email protected] 1 points 1 year ago (1 children)

Checking it out. How come I can't paste my API key into the field on the options tab? I gotta type it out?

[–] [email protected] 1 points 1 year ago (1 children)

The embedded browser version is just a demo. Just download and run the local executable and it should work normally.

[–] [email protected] 2 points 1 year ago

oh ok, cool! Thanks!

This project looks really interesting.