@@ -96,4 +96,4 @@ Simply comment those three lines in llama/generation.py to turn it into a generator
 python example.py ./model ./tokenizer/tokenizer.model
 ```
-Confirming that 30B model could generate code: https://github.com/randaller/llama-chat/issues/7
+Confirming that the 30B model is able to generate code: https://github.com/randaller/llama-chat/issues/7