This is a great list! Thanks for sharing!
I see your point. Rereading the OP, it looks like I jumped to a conclusion about LLMs and not AI in general.
My takeaway still stands for LLMs. These models have grown enormously, with diminishing returns on each increase in size. But a Moore’s Law equivalent should apply to context sizes. Those have a long way to go.
What drastically better results are you thinking of?
Model sizes have already grown far beyond practical necessity, so Moore’s Law has little room to apply there. That is, models have become so huge that they’re performing at 99% of the capability they will ever reach.
Context size, however, has a lot farther to go. You can think of context size as “working memory,” whereas model size is more akin to “long-term memory.” The larger the context size, the more a model can understand beyond the scope of its original training in one go.
Image file format with excellent compression. It’s designed for web browsers, so what you’re probably running into is compatibility with other programs. It’s fairly easy to convert to GIF or JPEG, though.
I don’t usually recognize Lemmy users but I’ve run into your posts a lot these last few days. I assumed you were a bot but from interactions like these it’s clear you mean well. Love to see Lemmy growing from power users like yourself!