Chat with Meta's LLaMA models at home made easy

This repository is a chat example with LLaMA (arXiv) models running on a typical home PC. You will just need an NVIDIA video card and some RAM to chat with the model.

Conda Environment Setup Example for Windows 10+

Download and install Anaconda Python from https://www.anaconda.com and run the Anaconda Prompt.

conda create -n llama python=3.10
conda activate llama
conda install pytorch torchvision torchaudio pytorch-cuda=11.7 -c pytorch -c nvidia
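
Before moving on, it may help to confirm that the CUDA build of PyTorch was installed and can see your GPU. A minimal Python check (the version string in the comment is just illustrative):

import torch

print(torch.__version__)                   # e.g. 2.x.x+cu117
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # should print your NVIDIA GPU name
else:
    print("CUDA build of PyTorch not found - reinstall with pytorch-cuda=11.7")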

Setup

In a conda environment with PyTorch / CUDA available, run

pip install -r requirements.txt

Then, in this repository, run

pip install -e .
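
As a quick sanity check, you can verify that the editable install worked; this assumes the repository's package is importable under the name llama:

import llama
print(llama.__file__)   # should point into this repository's llama/ directory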

Download tokenizer and models

magnet:?xt=urn:btih:ZXXDAUWYLRUXXBHUYEMS6Q5CE5WA3LVA&dn=LLaMA

or

magnet:?xt=urn:btih:b8287ebfa04f879b048d4d4404108cf3e8014352&dn=LLaMA&tr=udp%3a%2f%2ftracker.opentrackr.org%3a1337%2fannounce
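
Once the download completes, a quick way to check that the tokenizer file is intact is to load it with SentencePiece (a minimal sketch; the ./LLaMA/tokenizer.model path is an assumption, adjust it to wherever you saved the files):

from sentencepiece import SentencePieceProcessor

sp = SentencePieceProcessor(model_file="./LLaMA/tokenizer.model")  # path is an assumption
print(sp.vocab_size())             # LLaMA's tokenizer has a 32000-token vocabulary
print(sp.encode("Hello, LLaMA!"))  # token ids for a sample string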