Update README.md

PENG Bo 4 years ago committed by GitHub
parent f45198b1db
commit 803397945b

@@ -79,6 +79,8 @@ kv / k is the memory mechanism. The token with high k can be remembered for a lo
**RWKV v2 is parallelizable because the time-decay of each channel is data-independent (and trainable)**. For example, in a usual RNN you can adjust the time-decay of a channel from, say, 0.8 to 0.5 (these are called "gates"), while in RWKV v2 you simply move the information from a W-0.8-channel to a W-0.5-channel to achieve the same effect.
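
A toy sketch of why this matters, not the repo's actual kernel: when the per-channel decay `w` never depends on the input, the recurrent state is a fixed weighted sum of past contributions, so every timestep can be computed at once instead of step by step (tensor names `w`, `k` are illustrative assumptions):

```python
import torch

T, C = 8, 4                       # toy sequence length and channel count
w = torch.rand(C)                 # data-independent, trainable per-channel decay in (0, 1)
k = torch.randn(T, C)             # per-timestep contributions

# Sequential RNN view: state[t] = w * state[t-1] + k[t]
state = torch.zeros(C)
seq_out = []
for t in range(T):
    state = w * state + k[t]
    seq_out.append(state)
seq_out = torch.stack(seq_out)

# Parallel view: since w is fixed per channel, out[t] = sum_{i<=t} w^(t-i) * k[i],
# so all timesteps can be built at once from a precomputed decay tensor.
t_idx = torch.arange(T)
powers = (t_idx[:, None] - t_idx[None, :]).float()        # (T, T): exponent t - i
decay = torch.where(powers[..., None] >= 0,               # causal mask: only i <= t
                    w ** powers.clamp(min=0)[..., None],  # (T, T, C) decay weights
                    torch.zeros(()))
par_out = torch.einsum('tic,ic->tc', decay, k)

assert torch.allclose(seq_out, par_out, atol=1e-5)
```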
========================================================================
### RWKV v2+ improvements (not yet uploaded to GitHub; used in the latest 1.5B run)
Use different trainable TimeMix factors for R / K / V in the SA and FF layers. Example:
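
The example itself is cut off by the diff truncation. As a stand-in, here is a minimal PyTorch sketch of what per-projection TimeMix could look like: the previous token is blended into the current one with a separate trainable mixing vector for each of R, K, and V (the names `time_shift`, `time_mix_r/k/v`, `receptance/key/value` are illustrative assumptions, not read from the diff):

```python
import torch
import torch.nn as nn

class TimeMix(nn.Module):
    """Sketch: mix token x[t] with x[t-1] using a separate
    trainable per-channel factor for each of R, K, V."""
    def __init__(self, n_embd: int):
        super().__init__()
        # shift the sequence one step back in time (pad front, crop end)
        self.time_shift = nn.ZeroPad2d((0, 0, 1, -1))
        # one trainable per-channel mixing vector per projection
        self.time_mix_r = nn.Parameter(torch.full((1, 1, n_embd), 0.5))
        self.time_mix_k = nn.Parameter(torch.full((1, 1, n_embd), 0.5))
        self.time_mix_v = nn.Parameter(torch.full((1, 1, n_embd), 0.5))
        self.receptance = nn.Linear(n_embd, n_embd, bias=False)
        self.key = nn.Linear(n_embd, n_embd, bias=False)
        self.value = nn.Linear(n_embd, n_embd, bias=False)

    def forward(self, x):                # x: (batch, time, channels)
        xx = self.time_shift(x)          # x shifted one step back in time
        xr = x * self.time_mix_r + xx * (1 - self.time_mix_r)
        xk = x * self.time_mix_k + xx * (1 - self.time_mix_k)
        xv = x * self.time_mix_v + xx * (1 - self.time_mix_v)
        return self.receptance(xr), self.key(xk), self.value(xv)
```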
