magnet:xt=urn:btih:b8287ebfa04f879b048d4d4404108cf3e8014352&dn=LLaMA&tr=udp%3a%2
First, you need to unshard the model checkpoints into a single file. Let's do this for the 30B model.
```
python merge-weights.py --input_dir D:\Downloads\LLaMA --model_size 30B
```
In this example, `D:\Downloads\LLaMA` is the root folder of the downloaded torrent with the weights.
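For context, the larger LLaMA checkpoints are split into several model-parallel shards (`consolidated.00.pth`, `consolidated.01.pth`, ...), and unsharding concatenates each parameter's shards back together along the dimension that was split. Below is a minimal, illustrative sketch of that idea using nested Python lists in place of tensors; the function names and the dim-0/dim-1 convention here are assumptions for illustration, not the actual API of `merge-weights.py`:

```python
# Illustrative sketch of unsharding model-parallel weights.
# Nested lists stand in for weight matrices; the real script
# works on torch checkpoint files.

def concat_dim0(shards):
    """Merge shards split along rows (dim 0): stack them vertically."""
    merged = []
    for shard in shards:
        merged.extend(shard)
    return merged

def concat_dim1(shards):
    """Merge shards split along columns (dim 1): join each row side by side."""
    return [sum((shard[i] for shard in shards), []) for i in range(len(shards[0]))]

# Two shards of one weight matrix:
shard_a = [[1, 2], [3, 4]]
shard_b = [[5, 6], [7, 8]]
print(concat_dim0([shard_a, shard_b]))  # 4 rows of width 2
print(concat_dim1([shard_a, shard_b]))  # 2 rows of width 4
```

Which dimension a given layer is split on depends on how that layer was parallelized, which is why a dedicated merge script is needed rather than a generic concatenation.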