From da6f35f276e931fcff2d88050800d8867a721931 Mon Sep 17 00:00:00 2001
From: PENG Bo <33809201+BlinkDL@users.noreply.github.com>
Date: Fri, 25 Mar 2022 06:13:51 +0800
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 7650c28..b2a7650 100644
--- a/README.md
+++ b/README.md
@@ -16,7 +16,7 @@ Write out the formulas for "token at pos 2" and "token at pos 3" and you will ge
 
 * a and b: EMAs of kv and k.
 * c and d: a and b combined with self-attention.
 
-kv / k is the memory mechanism. The token with high k can be remember for a long period, if W is close to 1 in the channel.
+kv / k is the memory mechanism. The token with high k can be remembered for a long duration, if W is close to 1 in the channel.
 
 The pseudocode (execution from top to bottom):
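
For context on the changed line, a minimal sketch of what "kv / k as the memory mechanism" computes, assuming a plain EMA recurrence with per-channel decay W (the names a, b, W, k, v follow the README; the constants and the exact recurrence below are illustrative, not the repo's actual formulas):

    # Minimal sketch (not the repo's code): per-channel EMAs a and b of
    # k*v and k, with decay W. out = a / b is a decayed weighted average
    # of past values v, weighted by their key strength k.
    import numpy as np

    T, C = 8, 4                  # sequence length, channels (illustrative)
    W = np.full(C, 0.99)         # decay per channel; near 1 => long memory
    k = np.random.rand(T, C)     # key strength per token (illustrative)
    v = np.random.randn(T, C)    # value per token (illustrative)

    a = np.zeros(C)              # running EMA of k*v
    b = np.zeros(C)              # running EMA of k
    for t in range(T):
        a = W * a + k[t] * v[t]
        b = W * b + k[t]
        out = a / b              # kv / k: a token with high k dominates out
                                 # for many steps when W is close to 1 in
                                 # that channel, matching the changed line

This sketch shows why the wording fix matters: a high-k token's contribution to both a and b decays only by the factor W per step, so it stays "remembered" in out for a long duration when W is close to 1.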