[-] [email protected] 0 points 19 hours ago

I like how all the big media just happened to publish articles about "peaceful protests" to pacify the protesters: "please don't be violent, everyone; the police can't shoot you all in the leg with rubber bullets".

The violent protests are the successful ones. Just ask every country, ever.

[-] [email protected] 1 points 19 hours ago

Damn, I should have ended the post with /s for people like you.

[-] [email protected] -1 points 19 hours ago

See, here's the thing: why would anyone want to host ALL the stuff on one Pi? That's not what they were designed for. Ollama on a Pi? Are you out of your mind? I'd run the biggest model I can on a modern GPU, not some crappy old computer or a Pi... Right tool, right job. And why is dropping containers "less secure"? Do you mean "less cool"? Less easy to deploy? But you're not deploying it, you're installing it. You sound like a complete newb, which is fine, but just take a step back and get some more experience. A Pi is a tool for a purpose, not the be-all and end-all. Using an old laptop is not going to save the world, and arguing that it's just better than a Pi (or a similar alternative) is just dumb. Use a laptop for all I care; I'm not the boss of you.

As for an arr stack, I'm really disappointed with the software and don't use it; those who do have way too much time on their hands to set it up, let alone make use of it!

[-] [email protected] 0 points 23 hours ago

I can self-host whatever I want on a Pi Zero. But I do have some 30 years of experience, so I can probably do things some won't understand or bother with.

[-] [email protected] 5 points 1 day ago

I'm sure Silicon Valley are stepping on each other, vying to get their hands on these super-cheap laptops for their 24/7 AI training.

[-] [email protected] 1 points 1 day ago* (last edited 1 day ago)

It's even worth pointing out that you can disable various parts of the Pi so it needs even less juice.

[-] [email protected] 1 points 1 day ago

> Pis are ARM-based, which still to this day limits the scope of their applicability.

Untrue.

> Also, you should absolutely inspect a laptop before buying. Many, if not most, old laptops will run just fine for the next few years.

Until the battery needs replacing (which costs more than a Pi), a key on the keyboard dies, etc.

[-] [email protected] 0 points 1 day ago

Please be specific rather than lumping all 'Raspberry Pis' together. Different models have way different characteristics.

[-] [email protected] 2 points 1 day ago

This is generally not true. A small server running on an old Pi has hardly any draw when idling. It costs literally pennies to run for the whole year.
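
If you want a rough number, here's a back-of-the-envelope sketch in Python; the idle wattage and tariff are assumptions, so plug in your own model and rate:

```python
# Rough annual running cost for an idle Pi home server.
# Both figures below are assumptions, not measurements; adjust to taste.
idle_watts = 2.0       # an older Pi idling; a Pi Zero sits well under 1 W
price_per_kwh = 0.30   # assumed electricity tariff in $ per kWh

kwh_per_year = idle_watts * 24 * 365 / 1000
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year")
# ~17.5 kWh/year -> roughly $5 a year with these numbers; far less for a Pi Zero
```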

[-] [email protected] 1 points 1 day ago

But... that's so uncool...

[-] [email protected] 9 points 1 day ago* (last edited 1 day ago)

I dislike posts like this. Technology moves quickly. Pis are great for hobby electronics where you need a little computer. Want a cheap computer to run a few things 24/7, and you know what you're doing? Pi it is. You don't need to run containers on a Pi, because you have the skills to install the dependencies manually. They cost pennies to run 24/7.

I think of Pis as beefed-up calculators. I have made lots of money using a Pi Zero running code I needed to run 24/7. Code I developed myself.

An old laptop with outdated parts takes up lots of space, weighs a lot, and has components like the fan, keyboard, and trackpad that will most likely die soon and need replacing. That's an extra concern you don't want.

Someone below saying to use an old laptop if you're living with your parents and don't pay the electricity bill is a bit lame. Do your part for the world; someone will be paying for it.

Ultimately, use what you want, but if you're just starting out with servers, use a virtual machine on your computer and log in to it. You can dick about with it as much as you want and reset it back to a working state in seconds.

[-] [email protected] 3 points 2 days ago

But its website is in Chinese. Also, what's the GitHub?

61
submitted 3 days ago* (last edited 3 days ago) by [email protected] to c/[email protected]

I was looking back at some old Lemmy posts and came across GPT4All. Didn't get much sleep last night as it's awesome, even on my old (10-year-old) laptop with a Compute 5.0 Nvidia card.

Still, I'm after more. I'd like image generation with the result viewable in the conversation, and if it generates Python code, the ability to run it (I'm using Debian and have a default Python env set up). Local file analysis would also be useful. CUDA Compute 5.0 / Vulkan compatibility is needed too, with the option to use some of the smaller models (1-3B, for example). A local API would also be nice for my own Python experiments.

Is there anything that can tick all the boxes, even if I have to scoot across models for some of the features? I'd prefer a desktop client application over a Docker container running in the background.
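
For the local API bit, this is the kind of thing I mean: a minimal sketch assuming the app exposes an OpenAI-compatible endpoint locally (GPT4All's built-in API server does something along these lines; the port and model name below are placeholders, not guaranteed defaults):

```python
# Minimal sketch: chat with a locally hosted, OpenAI-compatible API.
# Assumes the `openai` package is installed and a local server is listening;
# the port (4891) and model name are placeholders, adjust for your setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4891/v1",  # local endpoint, nothing leaves the machine
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="some-small-1b-model",          # whichever 1-3B model you have loaded
    messages=[{"role": "user", "content": "Summarise this conversation in one line."}],
)
print(response.choices[0].message.content)
```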

12
submitted 3 days ago* (last edited 3 days ago) by [email protected] to c/[email protected]

I'm watching some retro television and this show is wild! Beauty contests with 16-year-old girls (though at the time, it was legal for 16-year-olds to pose topless for newspapers), old racist comedians from working men's clubs doing their routines, Boney M, English singers of the time, and happy dance routines!

vid

