Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention
Infinite attention?
The key point of this new technique is that it allows "infinitely" long inputs. That claim is short, but it should be eye-catching. For more, see below:
Efficient Infinite Context Transformers with Infini-attention.
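To make the "infinite context" claim concrete, here is a minimal NumPy sketch of the core idea: the model attends normally within each fixed-size segment, while a compressive memory (a single matrix plus a normalization vector) accumulates key-value information from all earlier segments via linear attention, so memory cost stays constant no matter how long the stream gets. The projection sizes, the scalar gate `beta`, and all variable names here are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def elu_plus_one(x):
    # Nonlinearity used for linear attention: sigma(x) = ELU(x) + 1 (always positive)
    return np.where(x > 0, x + 1.0, np.exp(x))

def infini_attention_stream(segments, d_key, d_value, beta=0.5, rng=None):
    """Process a stream of segments with a fixed-size compressive memory.

    Each segment gets (a) local causal softmax attention and (b) a retrieval
    from memory M, which accumulates sigma(K)^T V over all past segments.
    Memory stays O(d_key * d_value) regardless of total stream length.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    d_model = segments[0].shape[-1]
    # Hypothetical fixed projections, just for the sketch
    Wq = rng.standard_normal((d_model, d_key)) / np.sqrt(d_model)
    Wk = rng.standard_normal((d_model, d_key)) / np.sqrt(d_model)
    Wv = rng.standard_normal((d_model, d_value)) / np.sqrt(d_model)

    M = np.zeros((d_key, d_value))   # compressive memory
    z = np.zeros((d_key,))           # normalization term

    outputs = []
    for x in segments:               # x: (seg_len, d_model)
        Q, K, V = x @ Wq, x @ Wk, x @ Wv

        # (a) Local causal softmax attention within the segment
        scores = Q @ K.T / np.sqrt(d_key)
        mask = np.triu(np.ones_like(scores), k=1).astype(bool)
        scores[mask] = -np.inf
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        A_local = (w / w.sum(axis=-1, keepdims=True)) @ V

        # (b) Retrieval from compressive memory (linear attention over the past)
        sq = elu_plus_one(Q)
        A_mem = (sq @ M) / (sq @ z + 1e-6)[:, None]

        # Gate blends long-term memory and local context (scalar gate here)
        outputs.append(beta * A_mem + (1.0 - beta) * A_local)

        # Fold this segment's keys/values into memory before moving on
        sk = elu_plus_one(K)
        M = M + sk.T @ V
        z = z + sk.sum(axis=0)
    return np.concatenate(outputs, axis=0)
```

The point of the sketch: each new segment costs the same, because all history is compressed into `M` and `z` rather than a growing key-value cache.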