# llama-rmt-test

This is completely unoptimized, likely not even working, code for the Recurrent Memory Transformer paper: https://arxiv.org/pdf/2304.11062.pdf

just did it to play around

it does spit out actual content though

For context, here is that paper's abstract:

> This technical report presents the application of a recurrent memory to extend the context length of BERT, one of the most effective Transformer-based models in natural language processing. By leveraging the Recurrent Memory Transformer architecture, we have successfully increased the model's effective context length to an unprecedented two million tokens, while maintaining high memory retrieval accuracy. Our method allows for the storage and processing of both local and global information and enables information flow between segments of the input sequence through the use of recurrence. Our experiments demonstrate the effectiveness of our approach, which holds significant potential to enhance long-term dependency handling in natural language understanding and generation tasks as well as enable large-scale context processing for memory-intensive applications.
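
The gist, for anyone skimming: the input is chopped into segments, a small block of learned memory embeddings is prepended to each segment, and the memory positions of the output are carried over as the memory input for the next segment, so information can flow across segments through recurrence. Below is a minimal sketch of that loop in PyTorch; every name and size here is hypothetical and simplified, it is not this repo's actual code.

```python
# Minimal sketch of the RMT recurrence pattern: prepend memory tokens to each
# segment, run the transformer, and carry the updated memory slots forward.
# All class/parameter names are hypothetical, not from this repo or the paper's code.
import torch
import torch.nn as nn


class ToyRMT(nn.Module):
    def __init__(self, vocab_size=256, d_model=64, n_mem=4, seg_len=32):
        super().__init__()
        self.seg_len = seg_len
        self.embed = nn.Embedding(vocab_size, d_model)
        # learned initial memory embeddings, shared across sequences
        self.memory = nn.Parameter(torch.randn(n_mem, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):  # tokens: (batch, total_len)
        batch = tokens.size(0)
        mem = self.memory.unsqueeze(0).expand(batch, -1, -1)  # (batch, n_mem, d_model)
        n_mem = mem.size(1)
        logits = []
        # process the sequence one segment at a time, threading memory through
        for seg in tokens.split(self.seg_len, dim=1):
            x = torch.cat([mem, self.embed(seg)], dim=1)  # [memory ; segment tokens]
            y = self.encoder(x)
            mem = y[:, :n_mem]                  # updated memory feeds the next segment
            logits.append(self.head(y[:, n_mem:]))
        return torch.cat(logits, dim=1)


if __name__ == "__main__":
    model = ToyRMT()
    out = model(torch.randint(0, 256, (2, 128)))  # 128 tokens = 4 segments of 32
    print(out.shape)  # torch.Size([2, 128, 256])
```

The actual paper trains with gradients flowing back through segments (BPTT) and, for decoder-only models like LLaMA, also places write-memory tokens at the end of each segment; the loop above only shows the basic carry-over.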
