Update Assistant Prompt via API
# support
Rublo
I understood the error; the correct data should be:
client.assistants.update(
  id: "e00731a7-a7a3-4471-8a4b-e04f99c8bc1e",
  name: update_assistant_version(assistant.name),
  model: {
    "provider" => "openai",
    "model" => assistant.model.member.model,
    "messages" => [{
      "role" => "assistant",
      "content" => escape_unescaped_characters(new_prompt),
    }],
  },
)
However, the prompt in the updated assistant is empty, even though the request looks fine when I inspect it in the API logs.
Shubham Bajaj
@Rublo We do not validate properties beyond the second level of nesting, so nested properties may be overwritten. You therefore need to provide the full prompt whenever you update the model property of your assistant.
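As a minimal sketch (reusing the client.assistants.update call shape from the snippet above, with the model name taken from the curl example later in the thread, and new_prompt standing in for your prompt string), the whole model object is sent in one call:
# Sketch only: the complete model object is sent at once - provider, model
# name, and the full messages array that carries the prompt - because nested
# properties that are left out may be overwritten rather than merged.
client.assistants.update(
  id: "e00731a7-a7a3-4471-8a4b-e04f99c8bc1e",
  model: {
    "provider" => "openai",
    "model"    => "gpt-4o",
    "messages" => [{
      "role"    => "system",   # the prompt is sent as a system message (see the curl example below)
      "content" => new_prompt, # your prompt string
    }],
  },
)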
Rublo
Hi @Shubham Bajaj, thank you so much for your reply. I did manage to update the prompt, but somehow I am still getting some weird chaining on it. Could you share an example request showing the best approach to update only the agent prompt? That would be of great help 🙏 Thanks in advance.
Shubham Bajaj
@Rublo Here's a curl request example to update the assistant prompt and other model properties.
curl -X PATCH https://api.vapi.ai/assistant/insert-your-assistant-id-here \
     -H "Authorization: Bearer insert-your-assistant-id-here" \
     -H "Content-Type: application/json" \
     -d '{
  "model": {
    "provider": "openai",
    "model": "gpt-4o",
    "messages": [
      {
        "role": "system",
        "content": "insert-your-system-prompt-here"
      }
    ],
    "toolIds": [],
    "maxTokens": 50,
    "temperature": 0,
    ...restOfModelConfig
  }
  ...restOfAssistantConfig
}'
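If you prefer to stay in Ruby rather than shell, here is a rough sketch using only the standard library (net/http and json). It assumes the same /assistant/:id path also supports GET, and that VAPI_API_KEY and ASSISTANT_ID are environment variables you set yourself; it fetches the current assistant, replaces only the prompt message, and PATCHes the full model object back so the other nested model properties are not lost.
# Sketch, not a verified implementation: read-modify-write of the assistant's
# model so that only the prompt changes and the rest of the model config survives.
require "net/http"
require "json"
require "uri"

api_key      = ENV.fetch("VAPI_API_KEY")      # your private API key (assumed env var)
assistant_id = ENV.fetch("ASSISTANT_ID")      # the assistant to update (assumed env var)
new_prompt   = "insert-your-system-prompt-here"

uri     = URI("https://api.vapi.ai/assistant/#{assistant_id}")
headers = { "Authorization" => "Bearer #{api_key}", "Content-Type" => "application/json" }

Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
  # Fetch the current assistant so the existing model config can be reused.
  assistant = JSON.parse(http.get(uri.path, headers).body)
  model     = assistant["model"] || {}

  # Replace only the prompt message; provider, toolIds, temperature, etc. stay as they are.
  model["messages"] = [{ "role" => "system", "content" => new_prompt }]

  # Send the full model object back in the PATCH body.
  response = http.patch(uri.path, { "model" => model }.to_json, headers)
  puts response.code
end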
Do let me know if you require further help.