[-] bazinga@discuss.tchncs.de 5 points 22 hours ago

Hope he loses. Don't give them a platform

[-] bazinga@discuss.tchncs.de 3 points 1 day ago

And again, hope he loses, don't give populists a platform!

[-] bazinga@discuss.tchncs.de 1 points 1 day ago

Hope he loses

[-] bazinga@discuss.tchncs.de 1 points 2 days ago

Hope Orban loses

[-] bazinga@discuss.tchncs.de 20 points 2 days ago

Hope Orban loses

[-] bazinga@discuss.tchncs.de 1 points 2 days ago

Hope Orban loses

[-] bazinga@discuss.tchncs.de 1 points 2 days ago

Hope he loses big time

[-] bazinga@discuss.tchncs.de 0 points 2 days ago

Hope he loses

19
submitted 3 days ago* (last edited 1 day ago) by bazinga@discuss.tchncs.de to c/meshtastic@mander.xyz

Hi, I am on GrapheneOS and don't want to use Google Play. Is there any way to make the map work on Android? I have seen "manage custom tile sources", but do you know how it works?

EDIT: Is there no option to use OpenStreetMap instead?
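For reference (my own note, not from the thread): apps with a "custom tile source" option usually expect an XYZ URL template, where the app substitutes the zoom level and tile coordinates into placeholders. The standard OpenStreetMap raster tile endpoint looks like this:

```
https://tile.openstreetmap.org/{z}/{x}/{y}.png
```

Here `{z}` is the zoom level and `{x}`/`{y}` are the tile column and row. Whether the Meshtastic app accepts exactly this placeholder syntax is worth checking in its documentation, and OSM's public tile servers have a usage policy that applies to heavy use.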

[-] bazinga@discuss.tchncs.de 11 points 4 days ago

Good news, everyone: the BBC uses suspicious sources for journalism but figured it out eventually. They probably haven't learned anything for the future and will continue using X as a source.

[-] bazinga@discuss.tchncs.de 7 points 6 days ago* (last edited 6 days ago)

I agree. I also think that there is nothing good in for-profit AI corporations. I can recommend the book "Empire of AI". However, I personally think self-hosting and having full control over its use is a bit different.

27
submitted 6 days ago* (last edited 6 days ago) by bazinga@discuss.tchncs.de to c/selfhosted@lemmy.world

I realize I need to upgrade my little NUC to something bigger for faster inference with larger Llama models. I want something you can still keep on your living room's TV bench, so no monster rack please, but that also has the necessary muscle when needed for Llama. Budget doesn't matter right now; I want to understand what's good and what's out there. Thanks

EDIT: Wow, thanks for the inspiration. Guess I need to look a bit into "how to stuff a huge graphics card into a mini box". To clarify what I want with it: I want to build a responsive personal assistant. I am dreaming of models bigger than 8B, with good tool calling for things like memory, web search, etc.; no coding, image generation, or video generation required. Image recognition would be good but not a must. Regarding footprint: no monster ;) Something you can have in your living room that could be wife-approved, so no big gaming rig with exhaust pipes and stuff. It needs to look good ;)
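As a rough back-of-the-envelope for sizing hardware (my own sketch, not from the thread): weight memory is roughly parameter count times bytes per parameter at the chosen quantization width, plus some overhead for the KV cache and runtime buffers. The numbers below are a coarse heuristic, not a spec.

```python
def estimate_vram_gb(params_billion: float,
                     bytes_per_param: float = 0.5,
                     overhead_gb: float = 2.0) -> float:
    """Rough VRAM estimate for running an LLM locally.

    bytes_per_param: ~0.5 for 4-bit quantization, ~2.0 for fp16.
    overhead_gb: flat allowance for KV cache and runtime buffers
    (an assumption; real overhead varies with context length).
    """
    return params_billion * bytes_per_param + overhead_gb

# An 8B model at 4-bit fits comfortably in an 8 GB GPU;
# a 70B model at 4-bit needs workstation-class memory.
print(estimate_vram_gb(8))    # 6.0
print(estimate_vram_gb(70))   # 37.0
```

The practical upshot for a "living room" build is that the model size target mostly dictates the GPU memory class you need, before any other spec.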


bazinga

0 post score
0 comment score
joined 3 weeks ago