append system prompt #1295

Merged
iceljc merged 1 commit into SciSharp:master from iceljc:bugfix/append-system-instruction on Feb 19, 2026

Conversation

iceljc (Collaborator) commented Feb 19, 2026

No description provided.

@qodo-code-review

Review Summary by Qodo

Append system prompts after conversation messages

🐞 Bug fix


Walkthrough

Description
• Reorder system message building to append after conversation messages
• Move instruction, knowledge, and samples to end of message list
• Ensure system prompts are appended after user/assistant conversation
• Apply consistent message ordering across chat and realtime providers
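The reordered flow can be sketched in C# as follows. This is an illustrative sketch only, using the message types that appear in the diff later in this review; the `conversations` variable and loop shape are stand-ins, not the actual provider code:

```csharp
// Sketch of the new ordering in PrepareOptions (simplified; `conversations`
// is a stand-in for however the provider iterates prior turns).
var messages = new List<ChatMessage>();

// 1. Conversation messages (user/assistant turns) are built first.
foreach (var turn in conversations)
{
    messages.Add(turn.Role == AgentRole.User
        ? new UserChatMessage(turn.Content)
        : (ChatMessage)new AssistantChatMessage(turn.Content));
}

// 2. System messages (instruction, knowledges, samples) are appended after.
if (!string.IsNullOrWhiteSpace(instruction))
{
    messages.Add(new SystemChatMessage(instruction));
}

// 3. Function tools are rendered last into options.Tools.
```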
Diagram

```mermaid
flowchart LR
  A["Prepare Instructions<br/>and Functions"] --> B["Build Conversation<br/>Messages"]
  B --> C["Build System<br/>Messages"]
  C --> D["Build Function<br/>Tools"]
  D --> E["Get Final<br/>Prompt"]
```


File Changes

1. src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs 🐞 Bug fix +38/-35

Reorder system message building after conversation

• Moved system message building (instruction, knowledges, samples) from before the conversation messages to after them
• Reordered code blocks to build conversation messages first, then append system messages
• Function rendering moved to end of message preparation sequence
• Added clarifying comments for message building sections



2. src/Plugins/BotSharp.Plugin.OpenAI/Providers/Realtime/RealTimeCompletionProvider.cs 🐞 Bug fix +35/-31

Reorder system message building after conversation

• Moved system message building (instruction, knowledges, samples) from before the conversation messages to after them
• Reordered code blocks to build conversation messages first, then append system messages
• Function rendering moved to end of message preparation sequence
• Added clarifying comments for message building sections




@qodo-code-review

Code Review by Qodo

🐞 Bugs (1) 📘 Rule violations (0) 📎 Requirement gaps (0)



Action required

1. Samples/system appended after chat 🐞 Bug ✓ Correctness
Description
PrepareOptions now appends system instruction/knowledges and few-shot samples after all conversation
turns, so the final messages sent to OpenAI may no longer end with the latest user message (often
ending with a system or assistant-sample message instead). This can cause the model to continue a
sample exchange or otherwise respond off-context compared to the previous/expected ordering used
elsewhere in the codebase.
Code

src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[R422-457]

+        // Build system messages
+        if (!string.IsNullOrWhiteSpace(instruction))
+        {
+            renderedInstructions.Add(instruction);
+            messages.Add(new SystemChatMessage(instruction));
+        }
+
+        if (!string.IsNullOrEmpty(agent.Knowledges))
+        {
+            messages.Add(new SystemChatMessage(agent.Knowledges));
+        }
+
+        var samples = ProviderHelper.GetChatSamples(agent.Samples);
+        foreach (var sample in samples)
+        {
+            messages.Add(sample.Role == AgentRole.User ? new UserChatMessage(sample.Content) : new AssistantChatMessage(sample.Content));
+        }
+
+        // Render functions
+        if (options.WebSearchOptions == null)
+        {
+            foreach (var function in functions)
+            {
+                if (!agentService.RenderFunction(agent, function, renderData))
+                {
+                    continue;
+                }
+
+                var property = agentService.RenderFunctionProperty(agent, function, renderData);
+
+                options.Tools.Add(ChatTool.CreateFunctionTool(
+                    functionName: function.Name,
+                    functionDescription: function.Description,
+                    functionParameters: BinaryData.FromObjectAsJson(property)));
+            }
+        }
Evidence
In the OpenAI provider, conversation turns are added to messages first, then system/knowledge and
sample messages are appended afterward; this messages list is passed directly to
chatClient.CompleteChat(...), so the API receives the reordered sequence. The AzureOpenAI provider
still prepends instruction/knowledge/samples before conversation turns, indicating the intended
pattern in this repo is to place system/few-shot context before the live conversation (and keep the
latest user turn at the end).

src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[371-458]
src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[37-56]
src/Plugins/BotSharp.Plugin.AzureOpenAI/Providers/Chat/ChatCompletionProvider.cs[382-419]
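For contrast, the system-first ordering the reviewer points to in the AzureOpenAI provider can be sketched roughly as below. This assumes the same message types as the diff above and is an illustration of the ordering only, not the actual Azure provider code:

```csharp
// System/few-shot context first, live conversation last, so the final
// message in the list is the latest user turn.
if (!string.IsNullOrWhiteSpace(instruction))
{
    messages.Add(new SystemChatMessage(instruction));
}

if (!string.IsNullOrEmpty(agent.Knowledges))
{
    messages.Add(new SystemChatMessage(agent.Knowledges));
}

foreach (var sample in ProviderHelper.GetChatSamples(agent.Samples))
{
    messages.Add(sample.Role == AgentRole.User
        ? new UserChatMessage(sample.Content)
        : (ChatMessage)new AssistantChatMessage(sample.Content));
}

// Conversation turns are appended here, after the system/sample context.
```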

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`PrepareOptions` in the OpenAI ChatCompletionProvider appends system/knowledge and sample (few-shot) messages after the conversation messages. This changes the sequence sent to `chatClient.CompleteChat(...)` and can cause the model to respond to trailing system/sample content instead of the latest user input.

## Issue Context
Other providers in this repo (e.g., AzureOpenAI) build messages in the order: system instruction -> knowledges -> samples -> conversation turns. The OpenAI provider should match this pattern unless there is a deliberate and validated behavioral reason.

## Fix Focus Areas
- src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[371-458]
- src/Plugins/BotSharp.Plugin.AzureOpenAI/Providers/Chat/ChatCompletionProvider.cs[382-419] (reference behavior)

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools


iceljc merged commit 89eb73e into SciSharp:master on Feb 19, 2026
4 checks passed