this post was submitted on 01 Feb 2024
40 points (95.5% liked)

LocalLLaMA

2207 readers

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago
top 8 comments
[–] [email protected] 7 points 7 months ago (2 children)

It's not open source, it's weights-available (for now). As of now there's nothing you can do with it publicly, because it lacks a license and is known to be stolen.

[–] [email protected] 6 points 7 months ago

I'm sure that's not that big of a deal to some people. For example, I'm mainly using LLMs in my Home Assistant instance.
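
For anyone curious, here's a minimal sketch of what that kind of setup can look like, assuming the model is served behind an OpenAI-compatible endpoint such as llama.cpp's server. The URL, port, model name, and prompt below are placeholders I made up, not anything specific to this thread:

```python
# Hypothetical example: asking a locally hosted model to phrase a
# smart-home announcement. Assumes a llama.cpp (or similar) server
# exposing an OpenAI-compatible /v1/chat/completions endpoint on
# localhost:8080; adjust the URL and model name for your own setup.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder model name
        "messages": [
            {"role": "system", "content": "You are a concise home assistant."},
            {"role": "user", "content": "The living room is 17°C and the heating just turned on. Announce this."},
        ],
        "max_tokens": 64,
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])
```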

[–] [email protected] 4 points 7 months ago

So what I'm hearing is I can use it but just make sure I don't tell anyone

[–] [email protected] 4 points 7 months ago (1 children)

Has anyone tried to see how censored it is yet?

[–] [email protected] 3 points 7 months ago (1 children)

Definitely. It has some alignment, but it won’t straight up refuse to do anything. It will sometimes add notes saying that what you’ve asked is kinda maybe against the law, but will produce a great response regardless. It’s a 70b, so running it locally is kind of a challenge, but for those who can run it, there is simply no other LLM you can run at home that gets even close. It follows instructions amazingly well, it’s very consistent, and it barely hallucinates. There is some special Mistral sauce in it for sure, even if it’s “just” a llama2-70b.

[–] [email protected] 1 points 5 months ago

The GGUF q2_K quant works quite well IMO; I've run it with 12GB of VRAM + 32GB of RAM.
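
Roughly what that looks like with the llama-cpp-python bindings, in case it helps anyone. The file name and layer count below are just illustrative; how many layers actually fit on a 12GB card depends on the quant and context size, so treat this as a sketch rather than a recipe:

```python
# Illustrative sketch: loading a q2_K GGUF with partial GPU offload via
# llama-cpp-python. The model path and n_gpu_layers value are examples;
# tune n_gpu_layers until the offloaded layers fit in 12GB of VRAM,
# with the remaining layers of the 70b staying in system RAM.
from llama_cpp import Llama

llm = Llama(
    model_path="./model-70b.q2_K.gguf",  # placeholder file name
    n_ctx=4096,        # context window; larger contexts use more memory
    n_gpu_layers=30,   # number of transformer layers offloaded to the GPU
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain why quantization helps here."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```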

[–] [email protected] 2 points 7 months ago (1 children)

How does one leak open source material?

[–] [email protected] 1 points 7 months ago

Not all open source is created in the light.

I’ve no idea what happened in this case. But it is extremely common for companies to take closed-source developments and release them as open source after they go through an internal process.