JaumePrats committed
Commit fdd6b17 · verified · 1 Parent(s): 35501fd

Change add_bos_token to true

Otherwise the special tokens in the chat_template are not applied correctly in certain inference frameworks, such as llama.cpp.
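
A quick way to sanity-check the change (a minimal sketch, assuming the transformers library and an illustrative local path "./model" to this repository's files, which is not part of the commit): with add_bos_token set to true, plain tokenization prepends the BOS token, which is what frameworks that read the tokenizer config expect when applying the chat_template.

    from transformers import AutoTokenizer

    # "./model" is an illustrative local path to this repository's files.
    tokenizer = AutoTokenizer.from_pretrained("./model")

    # With add_bos_token: true, tokenizing with special tokens enabled
    # prepends the BOS token, so the ids match what the chat_template assumes.
    ids = tokenizer("Hello").input_ids
    print(ids[0] == tokenizer.bos_token_id)  # expected: True after this change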

Files changed (1): tokenizer_config.json (+1 -1)
tokenizer_config.json CHANGED
@@ -1,5 +1,5 @@
 {
-  "add_bos_token": false,
+  "add_bos_token": true,
   "add_eos_token": false,
   "add_prefix_space": true,
   "added_tokens_decoder": {