Commit 05f7e9f

fix: get llm token usage add result type

1 parent: f46c257

File tree

1 file changed: +5, -0 lines changed


src/backend/bisheng/llm/domain/utils.py

Lines changed: 5 additions & 0 deletions
```diff
@@ -62,6 +62,11 @@ def parse_token_usage(result: Any) -> tuple[int, int, int, int]:
         output_token += tmp2
         cache_token += tmp3
         total_token += tmp4
+    elif isinstance(result, ChatGenerationChunk):
+        token_usage = result.generation_info.get('token_usage', {})
+        input_token, output_token, cache_token, total_token = get_token_from_usage(token_usage)
+    else:
+        logger.warning(f'unknown result type: {type(result)}')
     return input_token, output_token, cache_token, total_token
```
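To illustrate what this commit adds, here is a minimal, self-contained sketch of the new dispatch branch. It uses a hypothetical `FakeChunk` stand-in for langchain's `ChatGenerationChunk`, and the key names inside `token_usage` (`prompt_tokens`, `completion_tokens`, etc.) are assumptions about what `get_token_from_usage` reads; the real helper in `utils.py` may differ.

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class FakeChunk:
    """Hypothetical stand-in for langchain's ChatGenerationChunk,
    which carries provider metadata in `generation_info`."""
    generation_info: dict = field(default_factory=dict)

def get_token_from_usage(token_usage: dict) -> tuple[int, int, int, int]:
    # Assumed key names; the actual helper in utils.py may use others.
    input_token = token_usage.get('prompt_tokens', 0)
    output_token = token_usage.get('completion_tokens', 0)
    cache_token = token_usage.get('cache_read_input_tokens', 0)
    total_token = token_usage.get('total_tokens', input_token + output_token)
    return input_token, output_token, cache_token, total_token

def parse_token_usage(result: Any) -> tuple[int, int, int, int]:
    if isinstance(result, FakeChunk):
        # The branch added in this commit: pull `token_usage` out of
        # generation_info and delegate to get_token_from_usage.
        token_usage = result.generation_info.get('token_usage', {})
        return get_token_from_usage(token_usage)
    # Unknown result types fall through with zero counts, mirroring the
    # logger.warning branch in the commit.
    return 0, 0, 0, 0

chunk = FakeChunk(generation_info={'token_usage': {
    'prompt_tokens': 10, 'completion_tokens': 5, 'total_tokens': 15}})
print(parse_token_usage(chunk))  # (10, 5, 0, 15)
```

The point of the `elif` is defensive dispatch: streaming results arrive as generation chunks rather than full responses, so their usage counts live in `generation_info` and must be unpacked separately.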
