this post was submitted on 08 Nov 2023
38 points (95.2% liked)

Actually Useful AI

If youโ€™re just wanting to run LLMs quickly on your computer in the command line, this is about as simple as it gets. Ollama provides an easy CLI to generate text, and thereโ€™s also a Raycast extension for more powerful usage.

top 6 comments
[โ€“] [email protected] 4 points 1 year ago (1 children)

There's also GPT4All, which has the same concept but comes with a convenient GUI rather than running on the command line. I had some fun with Mistral-7B, but honestly the weaker models are too dumb to be useful.

[โ€“] [email protected] 1 points 1 year ago

Oh nice! Thanks for sharing

[โ€“] [email protected] 2 points 1 year ago (1 children)
[โ€“] [email protected] 6 points 1 year ago

Canโ€™t speak to that much because I havenโ€™t reviewed the code myself, but itโ€™s open-source and everything runs locally on your machine without network requests

[โ€“] [email protected] 1 points 1 year ago

Anyone try running this on WSL?

[โ€“] [email protected] 1 points 5 months ago

When running models locally, I presume the models are trained and the weights and such are exported as a "model" — for example, Meta's Llama model.

Do these models get updated, with new versions released? I don't quite understand.