Conversation
Review Summary by Qodo

Append system prompts after conversation messages
Walkthrough

Description
• Reorder system message building to append after conversation messages
• Move instruction, knowledge, and samples to end of message list
• Ensure system prompts are appended after user/assistant conversation
• Apply consistent message ordering across chat and realtime providers

The resulting message order is sketched just below the diagram.

Diagram

```mermaid
flowchart LR
    A["Prepare Instructions<br/>and Functions"] --> B["Build Conversation<br/>Messages"]
    B --> C["Build System<br/>Messages"]
    C --> D["Build Function<br/>Tools"]
    D --> E["Get Final<br/>Prompt"]
```
File Changes

1. src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs
Code Review by Qodo
1. Samples/system appended after chat
```csharp
// Build system messages
if (!string.IsNullOrWhiteSpace(instruction))
{
    renderedInstructions.Add(instruction);
    messages.Add(new SystemChatMessage(instruction));
}

if (!string.IsNullOrEmpty(agent.Knowledges))
{
    messages.Add(new SystemChatMessage(agent.Knowledges));
}

var samples = ProviderHelper.GetChatSamples(agent.Samples);
foreach (var sample in samples)
{
    messages.Add(sample.Role == AgentRole.User ? new UserChatMessage(sample.Content) : new AssistantChatMessage(sample.Content));
}

// Render functions
if (options.WebSearchOptions == null)
{
    foreach (var function in functions)
    {
        if (!agentService.RenderFunction(agent, function, renderData))
        {
            continue;
        }

        var property = agentService.RenderFunctionProperty(agent, function, renderData);

        options.Tools.Add(ChatTool.CreateFunctionTool(
            functionName: function.Name,
            functionDescription: function.Description,
            functionParameters: BinaryData.FromObjectAsJson(property)));
    }
}
```
1. Samples/system appended after chat (🐞 Bug, ✓ Correctness)
PrepareOptions now appends the system instruction, knowledge, and few-shot samples after all conversation turns, so the final message list sent to OpenAI may no longer end with the latest user message; it often ends with a system or assistant-sample message instead. This can cause the model to continue a sample exchange, or otherwise respond off-context, compared with the previous ordering used elsewhere in the codebase.
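For contrast, a minimal sketch of the previously expected ordering, where system content and samples lead the prompt and the conversation comes last. The string data is illustrative only, not the real BotSharp code.

```csharp
using System;
using System.Collections.Generic;
using OpenAI.Chat;

// Expected ordering (illustrative data): system -> knowledge -> samples -> conversation.
var messages = new List<ChatMessage>
{
    new SystemChatMessage("You are a helpful assistant."),  // instruction
    new SystemChatMessage("Domain knowledge goes here."),   // agent.Knowledges
    new UserChatMessage("Hi"),                               // few-shot sample
    new AssistantChatMessage("Hello! How can I help?"),      // few-shot sample
    new UserChatMessage("What does BotSharp do?")            // latest user turn stays last
};

Console.WriteLine(messages[^1].GetType().Name); // UserChatMessage
```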
Agent Prompt
## Issue description
`PrepareOptions` in the OpenAI ChatCompletionProvider appends system/knowledge and sample (few-shot) messages after the conversation messages. This changes the sequence sent to `chatClient.CompleteChat(...)` and can cause the model to respond to trailing system/sample content instead of the latest user input.
## Issue Context
Other providers in this repo (e.g., AzureOpenAI) build messages in the order: system instruction -> knowledges -> samples -> conversation turns. The OpenAI provider should match this pattern unless there is a deliberate and validated behavioral reason.
## Fix Focus Areas
- src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[371-458]
- src/Plugins/BotSharp.Plugin.AzureOpenAI/Providers/Chat/ChatCompletionProvider.cs[382-419] (reference behavior)
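For reference (outside the copyable prompt above), a hedged sketch of how the reference ordering would feed the final CompleteChat call, using the OpenAI .NET SDK types seen in the diff. The client construction, the get_weather tool, and its parameter schema are illustrative assumptions, not BotSharp's actual wiring.

```csharp
using System;
using System.Collections.Generic;
using OpenAI.Chat;

// Illustrative client; the model name and API key lookup are assumptions.
var chatClient = new ChatClient("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// Reference ordering: system instruction -> knowledge -> samples -> conversation turns.
var messages = new List<ChatMessage>
{
    new SystemChatMessage("You are a helpful assistant."),  // instruction
    new SystemChatMessage("Domain knowledge goes here."),   // agent.Knowledges
    new UserChatMessage("Hi"),                               // few-shot sample
    new AssistantChatMessage("Hello! How can I help?"),      // few-shot sample
    new UserChatMessage("What does BotSharp do?")            // latest conversation turn last
};

var options = new ChatCompletionOptions();

// Function tools are registered on the options, mirroring the diff's use of
// ChatTool.CreateFunctionTool; the schema object here is a made-up example.
options.Tools.Add(ChatTool.CreateFunctionTool(
    functionName: "get_weather",
    functionDescription: "Get the weather for a city.",
    functionParameters: BinaryData.FromObjectAsJson(new
    {
        type = "object",
        properties = new { city = new { type = "string" } },
        required = new[] { "city" }
    })));

// The message list ends with the latest user turn, so the model answers it.
ChatCompletion completion = chatClient.CompleteChat(messages, options);
Console.WriteLine(completion.Content[0].Text);
```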