HN.zip

Context Is Software, Weights Are Hardware

14 points by maxaravind - 2 comments
maxaravind 5 mins ago
Author here.

I spent the last weekend thinking about continual learning. A lot of people assume we can solve long-term memory and learning in LLMs simply by extending the context length to infinity. I lay out a different perspective that challenges this assumption.

Let me know how you think about this.

adityaathalye 5 mins ago
> Let me know how you think about this.

Well, I think of every Large Language Model as if it were a spectacularly faceted diamond.

More along these lines in a recent-ish "thinking in public" attempt by yours truly, a lay programmer, to interpret what an LLM-machine might be.

Riff: LLMs are Software Diamonds

https://www.evalapply.org/posts/llms-are-diamonds/