
Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention

Infinite attention?

Maomei

The key point of this new technique is that it allows "infinitely" long inputs. This sentence is short, but it should be eye-catching. For more, see the paper below:

Efficient Infinite Context Transformers with Infini-attention.
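To give a rough feel for how the paper pulls this off: Infini-attention processes the input segment by segment, carrying a fixed-size compressive memory across segments, reading it with a linear-attention-style lookup, and mixing that with ordinary local dot-product attention through a learned gate. The sketch below is only an illustration of that idea, not the authors' code; the function name, tensor shapes, the ELU+1 feature map, and the simple (non-delta-rule) memory update are all my assumptions for the example.

```python
import torch
import torch.nn.functional as F

def elu_plus_one(x):
    # Non-negative feature map sigma(x) = ELU(x) + 1, as in linear attention.
    return F.elu(x) + 1.0

def infini_attention_segment(q, k, v, memory, z, beta):
    """One segment of Infini-attention (illustrative sketch).

    q, k, v : (heads, seg_len, d) projections for the current segment
    memory  : (heads, d, d) compressive memory carried across segments
    z       : (heads, d, 1) normalization term carried across segments
    beta    : (heads, 1, 1) learned gate mixing memory and local attention
    """
    d = q.shape[-1]

    # 1) Retrieve from the compressive memory with a linear-attention lookup.
    sq = elu_plus_one(q)                                    # (h, n, d)
    a_mem = (sq @ memory) / (sq @ z + 1e-6)                 # (h, n, d)

    # 2) Ordinary causal dot-product attention within the segment.
    scores = (q @ k.transpose(-1, -2)) / d ** 0.5           # (h, n, n)
    mask = torch.triu(torch.ones(scores.shape[-2:], dtype=torch.bool), 1)
    scores = scores.masked_fill(mask, float("-inf"))
    a_dot = scores.softmax(dim=-1) @ v                      # (h, n, d)

    # 3) Update memory and normalizer with this segment's keys/values.
    sk = elu_plus_one(k)                                     # (h, n, d)
    memory = memory + sk.transpose(-1, -2) @ v               # (h, d, d)
    z = z + sk.sum(dim=-2, keepdim=True).transpose(-1, -2)   # (h, d, 1)

    # 4) Gate between long-term (memory) and local (dot-product) attention.
    g = torch.sigmoid(beta)
    out = g * a_mem + (1.0 - g) * a_dot
    return out, memory, z
```

Because the memory and normalizer have a fixed size regardless of how many segments have been consumed, the per-segment cost stays constant, which is what makes the "infinite" context claim plausible in practice.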


