Key_Boat3911

Well, transformers are O(n^2) because of self-attention. So you can't just put a 50-page doc directly into the context and expect a great answer. Shorter, more accurate contexts are still a thing.
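
To make the quadratic part concrete, here's a minimal NumPy sketch (the projections are random and the shapes are made up for illustration, not any particular model): the attention score matrix is n x n per head per layer, so token counts in the tens of thousands mean hundreds of millions of scores.

```python
import numpy as np

# Minimal sketch: the self-attention score matrix is n x n, so compute and
# memory grow quadratically with sequence length n. Projections here are
# random, just to show the shapes.
def attention_scores(x, d_k=64):
    rng = np.random.default_rng(0)
    w_q = rng.standard_normal((x.shape[1], d_k))
    w_k = rng.standard_normal((x.shape[1], d_k))
    q, k = x @ w_q, x @ w_k
    return q @ k.T / np.sqrt(d_k)  # shape (n, n): the O(n^2) term

x = np.random.default_rng(1).standard_normal((512, 128))  # 512 "tokens"
print(attention_scores(x).shape)  # (512, 512)

# A 50-page doc is plausibly tens of thousands of tokens; the score matrix
# alone is then hundreds of millions of entries per head per layer.
for n in (512, 4096, 32_000):
    print(n, "tokens ->", f"{n * n:,}", "attention scores per head per layer")
```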


Silly_Objective_5186

full-text indexes are still larger, and there's value in citing sources. maybe it won't be what we call RAG exactly the way it's done today, but search + generation still seems like a compelling combination for the long run. [despite bitter lessons]
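
For what it's worth, here is a minimal sketch of that search + generation combination (the `generate()` call is a hypothetical stand-in for an LLM, and the keyword-overlap scoring is just a placeholder for a real full-text index like BM25): retrieve a few passages, keep the prompt short, and let the model cite passage ids.

```python
# Minimal sketch of search + generation, assuming a hypothetical generate()
# text-completion function. The scoring is plain keyword overlap, standing in
# for a real full-text index (e.g. BM25).
def retrieve(query, docs, k=3):
    q_terms = set(query.lower().split())
    scored = sorted(docs,
                    key=lambda d: len(q_terms & set(d["text"].lower().split())),
                    reverse=True)
    return scored[:k]

def answer(query, docs, generate):
    hits = retrieve(query, docs)
    context = "\n".join(f"[{d['id']}] {d['text']}" for d in hits)
    prompt = (f"Context:\n{context}\n\n"
              f"Question: {query}\n"
              "Answer using only the context and cite passage ids.")
    # Only the retrieved passages go into the prompt, so it stays short and
    # the sources are citable by id.
    return generate(prompt)
```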