From 957a8f9f98f91632b86a8dbf3d8dac28d4e5addb Mon Sep 17 00:00:00 2001
From: Mikko Juola
Date: Mon, 20 Mar 2023 19:02:10 -0700
Subject: [PATCH] Mention that the `server` feature must be turned on to use
 the inference API.

---
 README.md | 7 ++++++-
 1 file changed, 6 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 1a1b3ad..f707752 100644
--- a/README.md
+++ b/README.md
@@ -94,7 +94,12 @@ Use `rllama --help` to see all the options.
 
 ## Inference server
 
-`rllama` can run in an inference server mode with a simple HTTP JSON API.
+`rllama` can run in an inference server mode with a simple HTTP JSON API. You
+need to enable the `server` feature for this.
+
+```
+cargo build --release --features server
+```
 
 The command line flags for this are:
 
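For context, here is a minimal sketch of how code is typically gated behind a Cargo feature such as the `server` feature named in this patch. The module and function names below are illustrative assumptions for the sketch, not rllama's actual source.

```rust
// In Cargo.toml the feature would be declared under [features], e.g.:
//   [features]
//   server = []        # possibly enabling optional dependencies

// Compiled only when built with `--features server`.
#[cfg(feature = "server")]
mod inference_server {
    /// Hypothetical entry point; rllama's real server code differs.
    pub fn serve(port: u16) {
        println!("inference server listening on port {port}");
    }
}

fn main() {
    #[cfg(feature = "server")]
    inference_server::serve(8080);

    #[cfg(not(feature = "server"))]
    eprintln!(
        "built without the `server` feature; \
         rebuild with `cargo build --release --features server`"
    );
}
```

When the feature is off, the gated module is compiled out entirely, which is why a plain `cargo build --release` produces a binary without the inference API.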