Update README.md

main
PENG Bo 4 years ago committed by GitHub
parent 5ac82f37be
commit d8234047e6

@@ -1,6 +1,6 @@
# The RWKV Language Model
## RWKV v2 RNN
## RWKV v2 RNN: Language is in O(1)
RWKV v2 is an RNN which can also be directly trained like a GPT transformer.
@@ -16,6 +16,8 @@ Write out the formulas for "token at pos 2" and "token at pos 3" and you will get
* a and b: EMAs of kv and k.
* c and d: a and b combined with self-attention.
kv / k is the memory mechanism. A token with a high k can be remembered for a long period, as long as W is close to 1 in that channel.
The pseudocode (executed from top to bottom):
![RWKV-v2-RNN](RWKV-v2-RNN.png)
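
For readers who prefer text to the image, here is a minimal numpy sketch of one step of the recurrence described above. It only follows the bullet points (a/b as per-channel EMAs of exp(k)·v and exp(k), c/d adding the current token's own contribution, W as the per-channel decay); the exact formulas, token mixing and projections are in RWKV-v2-RNN.png, so treat this as an illustration rather than the reference implementation.

```python
import numpy as np

def rwkv_v2_step(k, v, a, b, W):
    """One RNN step for a single token (illustrative, not the official code).

    k, v : (C,) key and value vectors for the current token
    a, b : (C,) running EMAs of exp(k) * v and exp(k) (the recurrent state)
    W    : (C,) per-channel decay in (0, 1); W close to 1 => long memory
    """
    ek = np.exp(k)
    c = a + ek * v          # EMA of kv plus the current token's "self-attention" term
    d = b + ek              # EMA of k plus the current token's weight
    out = c / d             # weighted average over past and current values
    a = W * a + ek * v      # decay the state, then add the current token
    b = W * b + ek
    return out, a, b
```

The state (a, b) has a fixed size per channel, which is presumably what the "O(1)" in the new heading refers to: each new token is generated from a constant-size state rather than a growing attention cache.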
