
docs(ai-proxy): clarify logging options apply to structured access logs not error.log#13187

Merged
Baoyuantop merged 4 commits into apache:master from janiussyafiq:docs/ai-proxy-log-summary
Apr 10, 2026

Conversation

@janiussyafiq
Contributor

Description

Clarifies the behavior of logging.summaries and logging.payloads in the ai-proxy and ai-proxy-multi plugin documentation.

Previously, the documentation described these options ambiguously as "logs request/response ...", which implied they controlled all forms of logging, including error.log. This caused confusion for users who expected setting them to `false` to suppress all AI request-related log output.

The two options exclusively control whether LLM data is appended to structured access log entries consumed by logging plugins (e.g. http-logger, kafka-logger). They have no effect on debug messages written to error.log.
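
The clarified behavior can be illustrated with a minimal route sketch that enables `ai-proxy` logging and ships the structured entries through `http-logger`. This is indicative only: the endpoint URL, model name, and fields outside `logging` are placeholders, and the exact schema should be checked against the plugin documentation for your APISIX version.

```json
{
  "uri": "/anything",
  "plugins": {
    "ai-proxy": {
      "provider": "openai",
      "auth": {
        "header": { "Authorization": "Bearer <API_KEY>" }
      },
      "options": { "model": "gpt-4" },
      "logging": {
        "summaries": true,
        "payloads": false
      }
    },
    "http-logger": {
      "uri": "http://127.0.0.1:3000/log"
    }
  }
}
```

With this configuration, `http-logger` entries would include LLM summary fields but not full request/response payloads. Setting both options to `false` removes the LLM data from those structured entries only; any debug messages written to `error.log` are unaffected.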

Which issue(s) this PR fixes:

Fixes #13118 (comment) - number 2

Checklist

  • I have explained the need for this PR and the problem it solves
  • I have explained the changes or the new features added to this PR
  • I have added tests corresponding to this change
  • I have updated the documentation to reflect this change
  • I have verified that this change is backward compatible (If not, please discuss on the APISIX mailing list first)

No tests needed here since it's a pure documentation fix.

@dosubot dosubot bot added size:S This PR changes 10-29 lines, ignoring generated files. doc Documentation things labels Apr 9, 2026
@janiussyafiq janiussyafiq changed the title from "docs(ai-proxy): clarify logging options apply to structured access logs, not error.log" to "docs: clarify logging options apply to structured access logs, not error.log" Apr 9, 2026
@janiussyafiq janiussyafiq changed the title from "docs: clarify logging options apply to structured access logs, not error.log" to "docs(ai-proxy): clarify logging options apply to structured access logs not error.log" Apr 9, 2026
The `ai-proxy` Plugin simplifies access to LLM and embedding models by transforming Plugin configurations into the designated request format. It supports the integration with OpenAI, DeepSeek, Azure, AIMLAPI, Anthropic, OpenRouter, Gemini, Vertex AI, and other OpenAI-compatible APIs.

In addition, the Plugin also supports logging LLM request information in the access log, such as token usage, model, time to the first response, and more.
In addition, the Plugin also supports appending LLM request information to structured access log entries, such as token usage, model, time to the first response, and more. These structured entries are consumed by logging plugins such as `http-logger` and `kafka-logger`, and are separate from the debug messages written to `error.log`

Suggested change
In addition, the Plugin also supports appending LLM request information to structured access log entries, such as token usage, model, time to the first response, and more. These structured entries are consumed by logging plugins such as `http-logger` and `kafka-logger`, and are separate from the debug messages written to `error.log`
In addition, the Plugin also supports appending LLM request information to structured access log entries, such as token usage, model, time to the first response, and more. These structured entries are consumed by logging plugins such as `http-logger` and `kafka-logger`, and are separate from the debug messages written to `error.log`.

@Baoyuantop Baoyuantop requested a review from kayx23 April 9, 2026 08:44
The `ai-proxy-multi` Plugin simplifies access to LLM and embedding models by transforming Plugin configurations into the designated request format for OpenAI, DeepSeek, Azure, AIMLAPI, Anthropic, OpenRouter, Gemini, Vertex AI, and other OpenAI-compatible APIs. It extends the capabilities of [`ai-proxy`](./ai-proxy.md) with load balancing, retries, fallbacks, and health checks.

In addition, the Plugin also supports logging LLM request information in the access log, such as token usage, model, time to the first response, and more.
In addition, the Plugin also supports appending LLM request information to structured access log entries, such as token usage, model, time to the first response, and more. These structured entries are consumed by logging plugins such as `http-logger` and `kafka-logger`, and are separate from the debug messages written to `error.log`.
@kayx23 kayx23 Apr 9, 2026


The access log here meant:

[screenshot illustrating the access log]

I think this update has tweaked the original meaning a bit. The sentence could be updated to cover both the access logs and usage with loggers. WDYT?

Contributor Author


agreed. should we keep the first sentence from before, and just add the additional sentence explicitly mentioning the logging plugins and error.log?

Member

@kayx23 kayx23 Apr 9, 2026


Sure, as long as it reads well.

The same idea applies to the parameter description as well. Please make sure they are adjusted accordingly. I think the intention of this PR is just to clarify error log is not affected.

@@ -74,8 +74,8 @@ In addition, the Plugin also supports logging LLM request information in the acc
| instances.options | object | False | | | Model configurations. In addition to `model`, you can configure additional parameters and they will be forwarded to the upstream LLM service in the request body. For instance, if you are working with OpenAI, DeepSeek, or AIMLAPI, you can configure additional parameters such as `max_tokens`, `temperature`, `top_p`, and `stream`. See your LLM provider's API documentation for more available options. |
| instances.options.model | string | False | | | Name of the LLM model, such as `gpt-4` or `gpt-3.5`. See your LLM provider's API documentation for more available models. |
| logging | object | False | | | Logging configurations. |
Member


The information should perhaps be added here (parent) if it's applicable to all the parameters within?

Contributor Author


fixed

Co-authored-by: Traky Deng <trakydeng@gmail.com>
@kayx23 kayx23 requested a review from Baoyuantop April 10, 2026 04:23
@Baoyuantop Baoyuantop merged commit d091001 into apache:master Apr 10, 2026
7 checks passed


Development

Successfully merging this pull request may close these issues.

bug: ai plugins log summaries and payloads even when logging options are set to false
