You can run an LLM on a phone (I tried it myself once with llama.cpp), but even with the simplest model I could find it was generating maybe one word every few seconds while pegging the CPU at 100%. The quality is terrible, and your battery wouldn’t last an hour.
A study from 1989 doesn’t apply to modern plants built 35 years later; it really doesn’t make sense to extrapolate it like this.
I would rather do that than indirectly kill a bunch of unwilling people, yeah.
what about the best motherfucking website?
Lemmy, but Twitter instead of Reddit.
Yeah, in my experience Lemmy has way fewer but more active users compared to Reddit.
Because I don’t want to be reliant on someone else’s servers, and I can set it up exactly how I want.
As for how I pay for the server: I use a free OCI VPS, so… I don’t.
A Very Polish Christmas by Sabadu.