rllama/src
Latest commit: Mikko Juola 18ef805458 "Read parameters from model's JSON file instead of hard-coding them, make max sequence length configurable." (3 years ago)

benches/          First commit. LLaMA works now. It is not pretty but it does generate text from prompts. Yay. (3 years ago)
protomodels/      First commit. LLaMA works now. It is not pretty but it does generate text from prompts. Yay. (3 years ago)
embedding.rs      First commit. LLaMA works now. It is not pretty but it does generate text from prompts. Yay. (3 years ago)
lib.rs            First commit. LLaMA works now. It is not pretty but it does generate text from prompts. Yay. (3 years ago)
main.rs           First commit. LLaMA works now. It is not pretty but it does generate text from prompts. Yay. (3 years ago)
rllama_main.rs    Read parameters from model's JSON file instead of hard-coding them, make max sequence length configurable. (3 years ago)
tensor.rs         Make the output colored. This is essential to be taken seriously. (3 years ago)
token_sampler.rs  Add readme, make clippy happy. (3 years ago)
tokenizer.rs      Add readme, make clippy happy. (3 years ago)
transformer.rs    Read parameters from model's JSON file instead of hard-coding them, make max sequence length configurable. (3 years ago)
unpickler.rs      Add readme, make clippy happy. (3 years ago)