mirror of https://github.com/novarobot/llama.cpp
Warn user if a context size greater than 2048 tokens is specified (#274)
LLaMA doesn't support context sizes greater than 2048 tokens, and going above that produces terrible results.
parent 6f61c18ec9
commit d7def1a752