# of course, this also applies to [python example.py] (see below)
```
### Enable multi-line answers
If you wish to stop generation not at the "\n" sign but at another marker, such as "User:" (which is also a good idea), make the following modification in llama/generation.py:
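The idea can be sketched as follows. This is a minimal, hypothetical decode loop (the function names `step_fn` and `decode_fn` are assumptions, not the actual llama/generation.py API): instead of breaking on "\n", the loop checks whether the decoded text ends with a custom stop sequence such as "User:".

```python
# A minimal sketch of stopping generation on a custom stop string.
# STOP_SEQUENCE, step_fn and decode_fn are illustrative assumptions;
# adapt the same check to the token loop in llama/generation.py.
STOP_SEQUENCE = "User:"  # change this to any marker you prefer

def generate(step_fn, decode_fn, max_tokens=256):
    """step_fn() yields the next token; decode_fn(tokens) returns text."""
    tokens = []
    for _ in range(max_tokens):
        tokens.append(step_fn())
        text = decode_fn(tokens)
        # Stop when the output ends with the stop sequence rather than "\n"
        if text.endswith(STOP_SEQUENCE):
            return text[: -len(STOP_SEQUENCE)]
    return decode_fn(tokens)
```

With a multi-character stop sequence the model can now emit newlines freely, and generation only halts when the chosen marker appears.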
### Typical generation with prompt (not a chat)
Simply comment out those three lines in llama/generation.py to turn it back into a plain generator.