![](https://slrpnk.net/pictrs/image/00be6193-b3ab-400b-9374-f640eb98cedf.jpeg)
![](https://programming.dev/pictrs/image/8140dda6-9512-4297-ac17-d303638c90a6.png)
Perhaps BeautifulSoup for scraping data to fill your arrays…
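A minimal sketch of that idea, assuming the third-party `beautifulsoup4` package is installed; the HTML snippet and the table layout here are invented purely for illustration:

```python
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

# A stand-in for a fetched page (in practice you'd get this via requests/urllib).
html = """
<table>
  <tr><td>1.2</td><td>3.4</td></tr>
  <tr><td>5.6</td><td>7.8</td></tr>
</table>
"""

soup = BeautifulSoup(html, "html.parser")

# Walk each table row and convert the cell text into floats,
# giving a list of lists ready to feed into an array library.
rows = [[float(td.text) for td in tr.find_all("td")]
        for tr in soup.find_all("tr")]

print(rows)  # [[1.2, 3.4], [5.6, 7.8]]
```

From there, `rows` can be handed straight to whatever array machinery you're filling.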
Fermentation (kefir, yoghurt, cheese). Recently lactase.
Sounds like an excellent idea, I’d be surprised if it isn’t happening.
Sure, I was being mildly facetious, but pointing to a better pattern: the nature of Python means it is, barring some extreme development, always going to be an order of magnitude slower than compiled code. If you’re not going to write even a little C, then you need to look for already-written C / FORTRAN / (SQL for data) / whatever that you can adapt to reap those benefits. Perhaps a general understanding of C and a good knowledge of what your Python is doing is enough to get a usable result from an LLM.
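The “already-written C / FORTRAN you can adapt” point in practice, as a sketch (assuming NumPy is installed): the hand-rolled loop runs in the interpreter step by step, while `a.dot(b)` dispatches the same arithmetic to compiled C/BLAS routines.

```python
import numpy as np

# Pure-Python dot product: the interpreter executes bytecode
# for every single multiply-add.
def py_dot(a, b):
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

a = np.arange(1_000_000, dtype=np.float64)
b = np.ones_like(a)

# Same result, but np.dot hands the whole loop to compiled code.
assert abs(py_dot(a, b) - float(a.dot(b))) < 1e-3
```

Timing the two (e.g. with `timeit`) typically shows the compiled path winning by well over an order of magnitude, which is the whole argument in miniature.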
When you need speed in Python, after profiling, checking for errors, and making damn sure you actually need it, you code the slow bit in C and call it.
When you need speed in C, after profiling, checking for errors, and making damn sure you actually need it, you code the slow bit in Assembly and call it.
When you need speed in Assembly, after profiling, checking for errors, and making damn sure you actually need it, you’re screwed.
Which is not to say faster Python is unwelcome; just that, IMO, its focus is frameworking, prototyping, or bashing out quick and perhaps dirty things that work, and that’s a damn good thing.
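“Code the slow bit in C and call it” can be as lightweight as `ctypes` against an existing shared library, no extension module required. A sketch, assuming a Unix-like system where `find_library` can locate the C math library:

```python
import ctypes
import ctypes.util

# Locate and load the system C math library (assumes a Unix-like OS).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature so ctypes marshals correctly: double sqrt(double)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(2.0))  # calls the compiled C sqrt, ~1.41421356...
```

For your own hot loop you’d compile it to a `.so` with `cc -shared -fPIC` and load that instead; the calling pattern is identical.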
These sets of concentric shells contain a thin layer of positive mass tucked inside an outer layer of negative mass.
The next question, then, is how to possibly confirm or refute the shells Lieu has proposed through observations.
First, we take some negative mass… Oh, wait.
Still, fresh blood!
Dear God, hope you got my letter…
Narky, but real, updoot.
Everything old is new again (GIGO)
Give blood.
Very nice, now acti in my rc.
A little rabid, but entirely likely.
Not horribly far away, I guess.
And now I’m thinking of the comedian from Watchmen. Alan Moore, knows the score…
Craziest all week for me, guess I curate my feeds different…
Boo!
I see this a lot, but do you really think the big players haven’t backed up the pre-22 datasets? Also, synthetic (LLM-generated) data is routinely used in fine-tuning to good effect; it’s likely that architectures exist that can happily do primary training on synthetic data as well.
Ah, write-only code ;) I was an enjoyer pre-Python.