Commit b5ae068

committed
fix: rename tokenizer to rag_tokenizer so that all imports work correctly
1 parent 0a403f1 commit b5ae068

File tree

1 file changed: +9 -12 lines changed


deepdoc/depend/rag_tokenizer.py

Lines changed: 9 additions & 12 deletions
@@ -498,18 +498,15 @@ def naiveQie(txt):
     return tks


-tokenizer = RagTokenizer()
-tokenize = tokenizer.tokenize
-fine_grained_tokenize = tokenizer.fine_grained_tokenize
-tag = tokenizer.tag
-freq = tokenizer.freq
-loadUserDict = tokenizer.loadUserDict
-addUserDict = tokenizer.addUserDict
-tradi2simp = tokenizer._tradi2simp
-strQ2B = tokenizer._strQ2B
-
-# Backward compatibility alias
-rag_tokenizer = tokenizer
+rag_tokenizer = RagTokenizer()
+tokenize = rag_tokenizer.tokenize
+fine_grained_tokenize = rag_tokenizer.fine_grained_tokenize
+tag = rag_tokenizer.tag
+freq = rag_tokenizer.freq
+loadUserDict = rag_tokenizer.loadUserDict
+addUserDict = rag_tokenizer.addUserDict
+tradi2simp = rag_tokenizer._tradi2simp
+strQ2B = rag_tokenizer._strQ2B

 if __name__ == '__main__':
     tknzr = RagTokenizer(debug=True)
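The change applies the common module-level singleton pattern: one instance is created at import time and its bound methods are re-exported as module-level functions. A minimal sketch of that pattern, using a hypothetical `DemoTokenizer` in place of `RagTokenizer`:

```python
# Minimal sketch of the pattern in rag_tokenizer.py: a module-level singleton
# whose bound methods are re-exported as module-level functions.
# DemoTokenizer is a hypothetical stand-in for the real RagTokenizer.

class DemoTokenizer:
    def tokenize(self, txt):
        # Toy whitespace tokenizer; the real RagTokenizer is far richer.
        return txt.split()

# Naming the instance `rag_tokenizer` (matching the module's filename) means
# callers that do `from deepdoc.depend.rag_tokenizer import rag_tokenizer`
# get this instance, with a working `.tokenize` attribute.
rag_tokenizer = DemoTokenizer()
tokenize = rag_tokenizer.tokenize  # bound method re-exported as a function

print(tokenize("hello world"))  # prints ['hello', 'world']
```

Because `tokenize` is a bound method, callers can import and call it directly without ever touching the singleton.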
