fix(DATAGO-134114): replace hardcoded max_llm_calls_per_task with constant and increase value to 30 (#1471)
Conversation
✅ FOSSA Guard: Licensing
✅ FOSSA Guard: Vulnerability
Pull request overview
This PR aims to improve maintainability by centralizing the default “max LLM calls per task” value into a shared constant and using it as the fallback where the runtime config is read.
Changes:
- Added `DEFAULT_MAX_LLM_CALLS_PER_TASK = 20` to `common/constants.py`.
- Replaced hardcoded `20` fallbacks for `max_llm_calls_per_task` with `DEFAULT_MAX_LLM_CALLS_PER_TASK` in agent execution paths (SAC component, structured invocation handler, and A2A request handling).
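The refactoring pattern described above can be sketched as follows. This is a hypothetical illustration based on the PR description, not the actual diff; the `resolve_max_llm_calls` helper and its config shape are assumptions, and only the constant name and module path come from the PR.

```python
# common/constants.py (per the PR description)
DEFAULT_MAX_LLM_CALLS_PER_TASK = 20

# Before: each call site repeated the magic number as its fallback, e.g.
#     max_calls = config.get("max_llm_calls_per_task", 20)
# After: all call sites fall back to the shared constant instead.
def resolve_max_llm_calls(config: dict) -> int:
    """Read the per-task LLM call cap, defaulting to the shared constant."""
    return config.get("max_llm_calls_per_task", DEFAULT_MAX_LLM_CALLS_PER_TASK)
```

Centralizing the fallback means a future change to the default (such as the bump to 30 in this PR's title) touches one line instead of every call site.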
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| `src/solace_agent_mesh/common/constants.py` | Introduces the centralized default constant for max LLM calls per task. |
| `src/solace_agent_mesh/agent/sac/structured_invocation/handler.py` | Uses the shared constant as the `get_config` fallback when building `RunConfig`. |
| `src/solace_agent_mesh/agent/sac/component.py` | Uses the shared constant as the `get_config` fallback when re-triggering runs. |
| `src/solace_agent_mesh/agent/protocol/event_handlers.py` | Uses the shared constant as the `get_config` fallback for A2A task execution. |
enavitan
left a comment
Looks good. I wonder if we log these failures and at least try to understand where things went south; perhaps add a follow-up task to tweak prompts to reduce the failures on a case-by-case basis.




What is the purpose of this change?
How was this change implemented?
Key Design Decisions (optional - delete if not applicable)
How was this change tested?
Is there anything the reviewers should focus on/be aware of?