diff --git a/README.md b/README.md
index b8dc2fd..c7d02fb 100755
--- a/README.md
+++ b/README.md
@@ -2,6 +2,8 @@
 This repository is intended as a minimal, hackable and readable example to load [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai/) ([arXiv](https://arxiv.org/abs/2302.13971v1)) models and run inference by using only CPU. Thus requires no videocard, but 64 (better 128 Gb) of RAM and modern processor is required.
+At the moment, only 7B model inference is supported.
+
 ### Conda Environment Setup Example for Windows 10+
 Download and install Anaconda Python https://www.anaconda.com and run Anaconda Prompt
 ```