
FunctionInvokingChatClient returns 400 Bad Request with OpenRouter Gemini 3 Preview models (reasoning_details not preserved) #7173

@vbandi

Description


When using FunctionInvokingChatClient with OpenRouter's Gemini 3 Preview models (google/gemini-3-flash-preview, google/gemini-3-pro-preview), tool call results fail with HTTP 400 Bad Request.

The root cause is that OpenRouter requires reasoning_details to be preserved and passed back when tool results are sent to Gemini 3 reasoning models. FunctionInvokingChatClient doesn't carry this field forward when it constructs the follow-up request that contains the tool result.

From OpenRouter's documentation:

> Reasoning Details must be preserved when using multi-turn tool calling.
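
To make the requirement concrete, here is a rough sketch of the shape the follow-up request needs to take (simplified and assumed from the documentation quote above, not an exact payload): the reasoning_details array returned on the assistant tool-call message must be echoed back unchanged on that same message when the tool result is sent.

```csharp
// Rough sketch of the follow-up request body (assumed, simplified shape).
// The essential part is that reasoning_details from the assistant's tool-call
// response is sent back verbatim alongside the tool call it belongs to.
var followUpRequest = new
{
    model = "google/gemini-3-flash-preview",
    messages = new object[]
    {
        new { role = "user", content = "What time is it?" },
        new
        {
            role = "assistant",
            tool_calls = new object[] { /* tool call exactly as returned by the model */ },
            reasoning_details = new object[] { /* copied verbatim from the model's response */ }
        },
        new { role = "tool", tool_call_id = "call_1" /* placeholder id */, content = "13:37:00" }
    },
    tools = new object[] { /* same tool definitions as in the first request */ }
};
```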

Reproduction Steps

Save and run this single-file .NET 10 script:

```csharp
#!/usr/bin/env dotnet run
#:package Microsoft.Extensions.AI@10.1.1-preview.1.25612.2
#:package Microsoft.Extensions.AI.OpenAI@10.1.1-preview.1.25612.2
#:package OpenAI@2.8.0

using System.ClientModel;
using System.ComponentModel;
using Microsoft.Extensions.AI;
using OpenAI;

var apiKey = Environment.GetEnvironmentVariable("OPENROUTER_API_KEY")
?? throw new InvalidOperationException("Set OPENROUTER_API_KEY environment variable");

var model = "google/gemini-3-flash-preview"; // FAILS with 400
// var model = "google/gemini-2.5-flash"; // WORKS

var options = new OpenAIClientOptions { Endpoint = new Uri("https://openrouter.ai/api/v1") };

var client = new OpenAI.Chat.ChatClient(model, new ApiKeyCredential(apiKey), options)
.AsIChatClient()
.AsBuilder()
.UseFunctionInvocation()
.Build();

var chatOptions = new ChatOptions { Tools = [AIFunctionFactory.Create(GetCurrentTime)] };
var messages = new List<ChatMessage> { new(ChatRole.User, "What time is it?") };

var response = await client.GetResponseAsync(messages, chatOptions);
Console.WriteLine(response.Text);

[Description("Gets the current time")]
static string GetCurrentTime() => DateTime.Now.ToString("HH:mm:ss");
```

Run with:
```bash
export OPENROUTER_API_KEY=your_key_here
dotnet run GeminiToolCallRepro.cs
```

Expected behavior

The tool call completes successfully and returns the time.

Actual behavior

ERROR: 400 - Service request failed. Status: 400 (Bad Request)

The error occurs after:

  1. Initial request sent with tool definition
  2. Model responds with tool call request (including reasoning_details)
  3. FunctionInvokingChatClient executes the tool and sends result back
  4. 400 error because reasoning_details wasn't preserved in the follow-up request (the request-logging sketch below can be used to confirm what is actually sent)
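
One way to verify this is to dump the raw request bodies that the OpenAI client sends. The sketch below attaches a logging policy to the System.ClientModel pipeline used by OpenAIClientOptions; the policy class name is made up for illustration and the output is plain console logging:

```csharp
using System.ClientModel.Primitives;
using OpenAI;

// In the repro above, attach the policy when constructing the client options:
var options = new OpenAIClientOptions { Endpoint = new Uri("https://openrouter.ai/api/v1") };
options.AddPolicy(new RequestLoggingPolicy(), PipelinePosition.PerCall);

// Diagnostic sketch: dumps each outgoing request body so the presence (or
// absence) of reasoning_details in the follow-up request can be checked.
sealed class RequestLoggingPolicy : PipelinePolicy
{
    public override void Process(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        LogRequest(message);
        ProcessNext(message, pipeline, currentIndex);
    }

    public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        LogRequest(message);
        await ProcessNextAsync(message, pipeline, currentIndex).ConfigureAwait(false);
    }

    private static void LogRequest(PipelineMessage message)
    {
        if (message.Request.Content is { } content)
        {
            using var buffer = new MemoryStream();
            content.WriteTo(buffer, CancellationToken.None);
            Console.WriteLine(System.Text.Encoding.UTF8.GetString(buffer.ToArray()));
        }
    }
}
```

With the policy in place, the second (post-tool-call) request can be inspected for the missing reasoning_details field.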

Workaround

Use non-reasoning Gemini models (google/gemini-2.5-flash, google/gemini-2.0-flash-001) or OpenAI models (openai/gpt-4o-mini).

Configuration

  • .NET 10.0 Preview
  • Microsoft.Extensions.AI 10.1.1-preview.1.25612.2
  • Microsoft.Extensions.AI.OpenAI 10.1.1-preview.1.25612.2
  • OpenAI 2.8.0
  • OpenRouter API (OpenAI-compatible endpoint)
  • Models affected: google/gemini-3-flash-preview, google/gemini-3-pro-preview
