# mamba-hf

Implementation of the Mamba SSM with Hugging Face (`transformers`) integration.

## Usage

To use mamba-hf, follow these steps:

1. Clone the repository to your local machine:

   ```bash
   git clone https://github.com/LegallyCoder/mamba-hf
   ```

2. Open a terminal or command prompt and navigate to the script's directory:

   ```bash
   cd mamba-hf/src
   ```

3. Install the required packages:

   ```bash
   pip3 install -r requirements.txt
   ```
4. Create a new Python file in the script's directory and load the model:

   ```python
   from modeling_mamba import MambaForCausalLM
   from transformers import AutoTokenizer

   # Load the pretrained 130M-parameter Mamba model and its tokenizer
   # from the Hugging Face Hub.
   model = MambaForCausalLM.from_pretrained('Q-bert/Mamba-130M')
   tokenizer = AutoTokenizer.from_pretrained('Q-bert/Mamba-130M')

   text = "Hi"

   # Tokenize the prompt into a tensor of input IDs.
   input_ids = tokenizer.encode(text, return_tensors="pt")

   # Generate a continuation with beam search, disallowing repeated bigrams.
   output = model.generate(input_ids, max_length=20, num_beams=5, no_repeat_ngram_size=2)

   generated_text = tokenizer.decode(output[0], skip_special_tokens=True)

   print(generated_text)
   ```

Example output:

> Hi, I'm looking for a new job. I've been working at a company for about a year now.
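The `generate` call accepts the standard `transformers` generation arguments, so sampling-based decoding works as well. The sketch below reuses `model`, `tokenizer`, and `input_ids` from the example above; the sampling hyperparameters are illustrative assumptions, not values recommended by this repository.

```python
# A minimal sketch of sampling-based decoding with the same model.
# The hyperparameter values are assumptions chosen for illustration.
output = model.generate(
    input_ids,
    max_length=50,
    do_sample=True,    # sample from the distribution instead of beam search
    top_k=50,          # restrict sampling to the 50 most likely next tokens
    top_p=0.95,        # nucleus (top-p) sampling threshold
    temperature=0.8,   # soften or sharpen the next-token distribution
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```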

For more pretrained checkpoints, see the Mamba Models Collection on the Hugging Face Hub.
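Any checkpoint in that collection can be loaded the same way. In the sketch below, the model ID is a hypothetical placeholder; substitute a real ID from the collection.

```python
from modeling_mamba import MambaForCausalLM
from transformers import AutoTokenizer

# Hypothetical model ID used only for illustration; replace it with an
# actual checkpoint name from the Mamba Models Collection.
model_id = 'Q-bert/Mamba-370M'
model = MambaForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```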

## References and Credits

The Mamba architecture was introduced in [Mamba: Linear-Time Sequence Modeling with Selective State Spaces](https://arxiv.org/abs/2312.00752) by Albert Gu and Tri Dao.

Thanks to [johnma2006/mamba-minimal](https://github.com/johnma2006/mamba-minimal) for the simple implementation.

The official implementation is here: https://github.com/state-spaces/mamba/tree/main
