OpenRouter - x-ai/grok-3-beta fails with tool call
# support
v
The error occurs because the `type` field is missing in your tool configuration. To fix this:
```json
{
  "tools": [
    {
      "type": "function",  // Must be one of: dtmf, endCall, transferCall, voicemail, output, function, etc.
      "function": {
        // Your function definition here
      }
    }
  ]
}
```
Ensure each tool in your configuration has a valid `type` field as shown above. Source: [Discord conversation about toolType error](https://discord.com/channels/1211482211119796234/1333880245681393756)
e
The type was already defined in the Vapi dashboard, and it works with other models but not with OpenRouter: x-ai/grok-3-beta.
```json
{
  "id": "ce22126c-f2e4-4d3c-8884-2ee755f97b8a",
  "createdAt": "2025-04-23T17:38:15.939Z",
  "updatedAt": "2025-04-23T17:47:56.750Z",
  "type": "function",
  "function": {
    "name": "fetch_knowledge",
    "strict": true,
    "description": "Fetch up to N thematically-relevant knowledge items from the coach’s knowledge-base",
    "parameters": {
      "type": "object",
      "properties": {
        "needs": {
          "description": "Desired content types e.g. story, principle, quote",
          "type": "array",
          "items": {
            "type": "string"
          }
        },
        "query": {
          "description": "The query that your subconscious will use to intuit what to pull from your knowledgebase, that will serve in this moment",
          "type": "string"
        },
        "max_items": {
          "description": "Maximum number of items to return (defaults to 1)",
          "type": "number"
        },
        "client_state": {
          "description": "Short description of the client’s current emotional / situational context.",
          "type": "string"
        }
      },
      "required": [
        "query"
      ]
    }
  },
  "orgId": "bb2057ad-fa86-43f7-ab31-93f5e8d69d6c",
  "server": {
    "url": "https://steadily-light-lobster.ngrok-free.app/api/tool-handler"
  }
}
```
k
This error occurs because toolType is missing in your tool config. Set toolType: "function" explicitly, and note that x-ai/grok-3-beta via OpenRouter may not fully support function calls; consider using a model like GPT-4 instead.
e
Thanks @User - the definition already has "type": "function"; is that not enough? OpenRouter and Grok-3-beta both support function calling. Also, this tool config was set up through the Vapi dashboard. Isn't this therefore a bug to fix?
k
The issue is on us, and we have created a ticket for it. It will be fixed by the team in their next sprint cycle.
e
Thank you. I tested it a bit more: sometimes it works, sometimes it doesn't, and sometimes it reads out the tool call.
@Kings_big💫 I tested it with OpenRouter google/gemini-2.5-flash-preview and it has the same error: Model request failed (attempt #1, reason: (toolType not set)). When is the next sprint cycle, please?
k
The dev work is done on this, and the change will be released to daily config by today.
e
amazing!
@Kings_big💫 looks like this is still not working. Instead of making a tool call, it outputs the tool call as part of its message. Call ID: d2aede8b-b740-4b9a-9678-0bd0c926c106 https://cdn.discordapp.com/attachments/1364914396958031882/1369682795500404746/Screenshot_2025-05-07_at_15.29.57.png?ex=681cc001&is=681b6e81&hm=a2c30c1a1f0b2ef910fb0fcd78555990a7b0c4f8f73af4800fb2f40c6973f7d6&
k
Looking into it
e
thank you
@Kings_big💫 any update on this please? Thank you. Another issue: even when I implement the tool on my server and the tool responds with the tool result, often the assistant will not say anything. For example, in this call the result came back and the assistant didn't say anything until I asked "Hello?". Call ID: 1a7dd052-0fb7-4d0f-8706-0bcaa211ae0e
```json
{
  "role": "tool_calls",
  "time": 1746639138100,
  "message": "",
  "toolCalls": [
    {
      "id": "call_41HnV5k9JyrSQFWxv2vEwhZl",
      "type": "function",
      "function": {
        "name": "fetch_knowledge",
        "arguments": "{\n  \"query\": \"JP Morgan on presence, being present, future orientation vs. present moment, how to be more present\",\n  \"needs\": [\"principle\", \"story\", \"quote\"],\n  \"max_items\": 3,\n  \"client_state\": \"Client wants to be more present, feels pulled to the future, seeking JP's wisdom or stories on presence.\"\n}"
      }
    }
  ],
  "secondsFromStart": 30.767
}
{
  "role": "bot",
  "time": 1746639138112,
  "source": "force-say",
  "endTime": 1746639138147,
  "message": "One moment",
  "secondsFromStart": 30.814
}
```
```json
{
  "name": "fetch_knowledge",
  "role": "tool_call_result",
  "time": 1746639155894,
  "result": "\n{\n  \"items\": [\n    {\n      \"type\": \"principle\",\n      \"content\": \"True freedom, the prerequisite for creation, arises paradoxically not from fighting limitations, but from *loving* them... Freedom isn't about achieving a future state; it's found in the present acceptance, like loving the falling inherent in flying, or hearing the music and daring to dance despite fear.\",\n      \"source\": \"I Will Set You Free (To Create What You Want)\",\n      \"relevance\": 0.9,\n      \"use_hint\": \"You could share this principle that true freedom and creation aren't future states but are found in present acceptance, aligning with the client's desire for presence.\"\n    }\n  ],\n  \"summary\": \"True freedom isn't a future state but found in present acceptance, even of limitations.\"\n}\n",
  "metadata": {},
  "toolCallId": "call_41HnV5k9JyrSQFWxv2vEwhZl",
  "secondsFromStart": 48.561
}
{
  "role": "user",
  "time": 1746639182929.995,
  "endTime": 1746639183249.995,
  "message": "Hello?",
  "duration": 320,
  "secondsFromStart": 77.258
}
```
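For context, a minimal sketch of how the /api/tool-handler endpoint can answer the tool-calls webhook, assuming Vapi's convention of replying with a results array keyed by toolCallId (matching the toolCallId and result fields in the log above); the webhook field names and the lookupKnowledge helper are assumptions and stand-ins, not the exact implementation:

```typescript
import express from "express";

const app = express();
app.use(express.json());

// Matches the server.url configured on the tool
// (https://steadily-light-lobster.ngrok-free.app/api/tool-handler).
app.post("/api/tool-handler", async (req, res) => {
  // Assumption: the pending calls arrive under message.toolCallList.
  const toolCalls: Array<{
    id: string;
    function: { name: string; arguments: string | Record<string, unknown> };
  }> = req.body?.message?.toolCallList ?? [];

  const results = await Promise.all(
    toolCalls.map(async (call) => {
      const args = (typeof call.function.arguments === "string"
        ? JSON.parse(call.function.arguments)
        : call.function.arguments) as { query: string; needs?: string[]; max_items?: number };

      // lookupKnowledge is a hypothetical stand-in for the real knowledge-base query.
      const payload = await lookupKnowledge(args.query, args.needs, args.max_items ?? 1);

      return {
        toolCallId: call.id,             // echo the id from the request
        result: JSON.stringify(payload), // string result, as in the tool_call_result log
      };
    })
  );

  // Once Vapi receives this payload, the assistant is expected to resume speaking.
  res.json({ results });
});

// Hypothetical helper; replace with the actual knowledge-base lookup.
async function lookupKnowledge(query: string, needs?: string[], maxItems = 1) {
  return { items: [], summary: `No results found for "${query}" (max ${maxItems})` };
}

app.listen(3000, () => console.log("tool-handler listening on :3000"));
```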
a
Hey Ethan, checking if this is resolved for you?
e
Hi @Shubham Bajaj, thanks for checking. It is not resolved:
1. The assistant doesn't speak after my server returns the tool result.
2. OpenRouter doesn't call the tool correctly.
i
I have the same issue, but with every OpenRouter LLM. I've tried different OpenRouter LLMs from OpenAI and Google, and it just didn't work. All of the JSON is consistent across my tests (it's a transient agent that connects to n8n); the only thing I change is the LLM. Tool calling works without an issue on gemini-2.0-flash, openai-4o, 4o-mini, 4.1, 4.1-mini, and 4.1-nano, but it never works when using an OpenRouter LLM. I've tried the same models with different providers, and for some weird reason the bot says "I'll call the tool" and completely shuts down: the bot stops answering, and there's no tool call in the logs when I hang up.
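For reference, the isolation test looks roughly like the sketch below; it assumes Vapi's transient-assistant shape (a model block with model.tools) and that "openrouter" is the provider id Vapi uses for OpenRouter. The field names are assumptions rather than the exact payload from these tests; only the provider/model pair changes between the working and failing runs:

```typescript
// Baseline that works: a native provider model.
const workingModel = { provider: "openai", model: "gpt-4o" };

// Failing case: same assistant, same tools, only the provider/model pair changes.
// "openrouter" as a provider id is an assumption about Vapi's naming.
const failingModel = { provider: "openrouter", model: "x-ai/grok-3-beta" };

// Transient assistant sent with the call; everything except `model` stays constant.
const assistant = {
  model: {
    ...workingModel, // swap in failingModel to reproduce the issue
    tools: [
      {
        type: "function", // explicitly set, as in the dashboard-created tool above
        function: {
          name: "fetch_knowledge",
          parameters: { type: "object", properties: {}, required: [] },
        },
      },
    ],
  },
};

console.log(JSON.stringify(assistant, null, 2));
```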
e
Thanks @IPlayThings. @Shubham Bajaj, I would definitely appreciate it if you can get this fixed!
a
Hey, could you please schedule a brief call at your convenience so we can discuss this matter in detail? Kindly use the following link to book a suitable time: .
e
Kyle, thanks for the offer. In this case, there's clearly an issue with the OpenRouter tool calls that needs to be fixed on Vapi's end, so I don't think a call will help here.
i
Still haven't found a single workaround
e
nudge @Shubham Bajaj 🙏
i
@Kings_big💫 any news on this?
a
For more stable tool calls, we recommend using OpenAI, Anthropic, or Grok.