docs(ai-proxy): clarify logging options apply to structured access logs not error.log#13187
Merged
Baoyuantop merged 4 commits into apache:master, Apr 10, 2026
Conversation
…ogs, not error.log
Baoyuantop
reviewed
Apr 9, 2026
docs/en/latest/plugins/ai-proxy.md
Outdated
The `ai-proxy` Plugin simplifies access to LLM and embedding models by transforming Plugin configurations into the designated request format. It supports the integration with OpenAI, DeepSeek, Azure, AIMLAPI, Anthropic, OpenRouter, Gemini, Vertex AI, and other OpenAI-compatible APIs.

- In addition, the Plugin also supports logging LLM request information in the access log, such as token usage, model, time to the first response, and more.
+ In addition, the Plugin also supports appending LLM request information to structured access log entries, such as token usage, model, time to the first response, and more. These structured entries are consumed by logging plugins such as `http-logger` and `kafka-logger`, and are separate from the debug messages written to `error.log`
Contributor
Suggested change
- In addition, the Plugin also supports appending LLM request information to structured access log entries, such as token usage, model, time to the first response, and more. These structured entries are consumed by logging plugins such as `http-logger` and `kafka-logger`, and are separate from the debug messages written to `error.log`
+ In addition, the Plugin also supports appending LLM request information to structured access log entries, such as token usage, model, time to the first response, and more. These structured entries are consumed by logging plugins such as `http-logger` and `kafka-logger`, and are separate from the debug messages written to `error.log`.
kayx23
reviewed
Apr 9, 2026
The `ai-proxy-multi` Plugin simplifies access to LLM and embedding models by transforming Plugin configurations into the designated request format for OpenAI, DeepSeek, Azure, AIMLAPI, Anthropic, OpenRouter, Gemini, Vertex AI, and other OpenAI-compatible APIs. It extends the capabilities of [`ai-proxy`](./ai-proxy.md) with load balancing, retries, fallbacks, and health checks.

- In addition, the Plugin also supports logging LLM request information in the access log, such as token usage, model, time to the first response, and more.
+ In addition, the Plugin also supports appending LLM request information to structured access log entries, such as token usage, model, time to the first response, and more. These structured entries are consumed by logging plugins such as `http-logger` and `kafka-logger`, and are separate from the debug messages written to `error.log`.
Member
Contributor
Author
agreed. should we keep the first sentence from before, and just add the additional sentence explicitly mentioning the logging plugins and error.log?
Member
Sure, as long as it reads well.
The same idea applies to the parameter description as well. Please make sure they are adjusted accordingly. I think the intention of this PR is just to clarify error log is not affected.
kayx23
reviewed
Apr 10, 2026
@@ -74,8 +74,8 @@ In addition, the Plugin also supports logging LLM request information in the acc
| instances.options | object | False | | | Model configurations. In addition to `model`, you can configure additional parameters and they will be forwarded to the upstream LLM service in the request body. For instance, if you are working with OpenAI, DeepSeek, or AIMLAPI, you can configure additional parameters such as `max_tokens`, `temperature`, `top_p`, and `stream`. See your LLM provider's API documentation for more available options. |
| instances.options.model | string | False | | | Name of the LLM model, such as `gpt-4` or `gpt-3.5`. See your LLM provider's API documentation for more available models. |
| logging | object | False | | | Logging configurations. |
Member
The information should perhaps be added here (parent) if it's applicable to all the parameters within?
kayx23
reviewed
Apr 10, 2026
Co-authored-by: Traky Deng <trakydeng@gmail.com>
kayx23
approved these changes
Apr 10, 2026
shreemaan-abhishek
approved these changes
Apr 10, 2026
Baoyuantop
approved these changes
Apr 10, 2026

Description
Clarifies the behavior of `logging.summaries` and `logging.payloads` in the `ai-proxy` and `ai-proxy-multi` plugin documentation.

Previously, the documentation described these options ambiguously as "logs request/response ...", which implied they controlled all forms of logging, including `error.log`. This caused confusion for users who expected setting them to `false` to suppress all AI request-related log output.

The two options exclusively control whether LLM data is appended to structured access log entries consumed by logging plugins (e.g. `http-logger`, `kafka-logger`). They have no effect on debug messages written to `error.log`.

Which issue(s) this PR fixes:

Fixes #13118 (comment) - number 2
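To make the distinction concrete, a hedged sketch of a route configuration that enables the structured LLM logging alongside a consuming logging plugin might look like the following. Only `logging.summaries` and `logging.payloads` are the options discussed in this PR; the `provider`, `auth`, `options`, and `http-logger` endpoint values are illustrative assumptions, not taken from this change:

```json
{
  "uri": "/chat",
  "plugins": {
    "ai-proxy": {
      "provider": "openai",
      "auth": { "header": { "Authorization": "Bearer <key>" } },
      "options": { "model": "gpt-4" },
      "logging": {
        "summaries": true,
        "payloads": false
      }
    },
    "http-logger": {
      "uri": "http://127.0.0.1:9200/logs"
    }
  }
}
```

With a setup like this, token usage and model information would be appended to the structured entries shipped by `http-logger`; toggling `summaries` or `payloads` to `false` would trim those entries but, per this PR's clarification, would not suppress debug messages in `error.log`.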
Checklist
No tests needed here since it's a pure documentation fix.