From 5b29836ca0f0bfa9d711af5e75543ce3b9aab37e Mon Sep 17 00:00:00 2001
From: randaller
Date: Sun, 19 Mar 2023 18:41:42 +0300
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 145cb74..19f9a5d 100644
--- a/README.md
+++ b/README.md
@@ -186,7 +186,7 @@ python hf-inference-example.py
 To save CPU RAM or GPU VRAM memory, one may wish to enable Bfloat16 processing.
 
 ```
-# to save memory use bfloat16 on cpu
+# to save memory use bfloat16
 import torch
 torch.set_default_dtype(torch.bfloat16)
 ```
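The README snippet this patch edits can be exercised as a short standalone sketch (assuming PyTorch is installed; the `torch.ones` check below is illustrative and not part of the patched README):

```python
import torch

# to save memory use bfloat16; set_default_dtype changes the dtype
# of newly created floating-point tensors on CPU and GPU alike,
# which is why the patch drops the "on cpu" qualifier
torch.set_default_dtype(torch.bfloat16)

# new floating-point tensors now default to bfloat16
x = torch.ones(2, 2)
print(x.dtype)  # torch.bfloat16
```

Since bfloat16 uses two bytes per element instead of four for float32, model weights loaded after this call take roughly half the memory.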