Mention that `server` feature must be turned on to use the inference API.

master
Mikko Juola 3 years ago
parent 5e241722cb
commit 957a8f9f98

@@ -94,7 +94,12 @@ Use `rllama --help` to see all the options.
## Inference server
`rllama` can run in an inference server mode with a simple HTTP JSON API. You
need to enable the `server` feature for this.
```
cargo build --release --features server
```
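As a rough illustration of how a client might talk to the inference server once it is running, here is a sketch of a request against the JSON API. The port, endpoint path, and JSON field names below are assumptions made for the example, not the documented interface; check the command line flags listed below for the actual options.
```
# Hypothetical request to the inference server's HTTP JSON API.
# Port, path, and JSON fields are assumptions for illustration only.
curl -X POST http://127.0.0.1:8080/rllama/v1/inference \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "Hello, world!", "max_new_tokens": 32}'
```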
The command line flags for this are:
