Server-to-Server Messaging and Pricing Inquiry for...
# support
s
I would like to ask questions to the assistant I created and receive written responses — is this possible? I plan to do this as scheduled tasks in a server-to-server setup. Can I interact directly via the /messages endpoint from the server and get the responses? Also, how is this usage priced? Thank you
v
Hi there! 👋 I wasn't able to find specific information about using scheduled tasks with the `/messages` endpoint. However, I can help with information about server-to-server communication options with custom assistants and some general pricing insights.

According to the information I found, direct interaction with a custom assistant via the `/messages` endpoint from a server may not be fully supported or documented yet. The text chat endpoints for testing assistants are still under development, as mentioned in a [Discord thread](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1339978679119249469). However, there are several alternative approaches for server-to-server communication with your assistant:

1. **Server URLs**: You can set up [Server URLs](https://docs.vapi.ai/server-url) that enable your application to receive data and communicate with Vapi during conversations.
2. **Control URL for Message Injection**: According to a [Discord thread](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1339634926248525935), you can request a "control URL" when creating a room, which can be used to inject messages (system/user) into the conversation from your backend.
3. **OpenAI Chat Completions Integration**: As noted in the [changelog](https://docs.vapi.ai/changelog/2024/12/11), you can integrate your assistant with OpenAI's chat completions by specifying `messages` and an `assistantId`.

Regarding pricing, I found that:
- Usage and costs can be tracked on a per-assistant basis
- There's an API available for [querying usage](https://api.vapi.ai/api#/Analytics/AnalyticsController_query), grouped by assistant
- An end-of-call report webhook may include the cost of the call

Is there anything else you'd like to know about these server-to-server communication options?

Sources:
- [Discord thread about text chat endpoints](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1339978679119249469)
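For the usage-querying API mentioned above, a request body might look roughly like the sketch below. The endpoint URL comes from the link above, but the query field names (`queries`, `table`, `groupBy`, `operations`) are assumptions about the analytics schema, so verify them against the API reference before relying on this.

```python
import json

# Endpoint from the link above; payload shape below is an assumption.
ANALYTICS_URL = "https://api.vapi.ai/analytics"

def build_usage_query() -> dict:
    """Build a hypothetical usage query grouped by assistant."""
    return {
        "queries": [
            {
                "table": "call",               # assumed table name
                "groupBy": ["assistantId"],    # usage per assistant
                "operations": [
                    # assumed aggregation: total cost per assistant
                    {"operation": "sum", "column": "cost"},
                ],
            }
        ]
    }

# The JSON body you would POST (with an Authorization header):
print(json.dumps(build_usage_query(), indent=2))
```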
k
Yes, you can interact with your Vapi assistant in a server-to-server setup by initiating calls via the /calls endpoint and receiving responses through configured Server URLs.
Regarding pricing, Vapi charges 5 cents per minute plus the underlying provider costs.
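As a quick sanity check on that pricing model, the arithmetic is just the per-minute Vapi fee plus whatever the underlying providers bill; the helper below is only an estimator, and actual provider costs vary by model, voice, and transcriber.

```python
def estimate_call_cost(minutes: float, provider_cost: float,
                       vapi_rate: float = 0.05) -> float:
    """Estimate total call cost: Vapi's per-minute fee plus provider charges."""
    return round(minutes * vapi_rate + provider_cost, 4)

# e.g. a 10-minute call with $0.30 in provider costs:
# estimate_call_cost(10, 0.30) returns 0.8
```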
s
@Kings_big💫 thank you for your answer. After I create a call using the /calls method, how can I send messages to that call? Which API endpoint should I use? Could you please provide an example? I appreciate your help
k
After creating a call via the /calls endpoint, you can send messages to the call by making a POST request to its control URL with a payload that includes a `"type": "add-message"` field; set `"triggerResponseEnabled": true` if you want the assistant to respond.
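The message injection described above could be sketched roughly like this. The `"type": "add-message"` and `"triggerResponseEnabled"` fields come from the discussion here, but the exact control URL location in the call-creation response and the `message` object shape are assumptions, so check them against the live call control docs.

```python
import json
import urllib.request

# Hypothetical control URL returned when the call was created.
CONTROL_URL = "https://example.com/control/CALL_ID"

def build_add_message(content: str) -> dict:
    """Build the add-message payload described above (shape is assumed)."""
    return {
        "type": "add-message",
        "message": {"role": "user", "content": content},
        "triggerResponseEnabled": True,
    }

def send_message(control_url: str, content: str) -> int:
    """POST the message to the call's control URL; returns the HTTP status."""
    req = urllib.request.Request(
        control_url,
        data=json.dumps(build_add_message(content)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```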
s
Got it! @Kings_big💫 One last question. My goal is to have written interactions with the assistant. In the assistant panel, there’s a button labeled ‘Chat’ (Chat with Assistant) — I’d like to send messages and receive replies in that style (but server-to-server). For this, do I still need to create a call? Because when I chat using that button, I don’t see those conversations in the assistant’s call logs. Also, does it have to be an inbound or outbound call? Can’t it be a web-type call instead? Your guidance has already brought us to a great point — if we can clarify this as well, the team will be very grateful.
k
To chat with your assistant server-to-server the way the 'Chat' button does, you still need to create a call session via the API, even without a voice call, and handle the messages through your Server URLs.
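Creating that call session might look roughly like the following. The `https://api.vapi.ai` base URL and `assistantId` field appear in the discussion above, but any additional fields needed to mark the call as non-voice/web are not covered here, so treat this as a minimal sketch and consult the call-creation API reference.

```python
import json
import urllib.request

API_KEY = "YOUR_VAPI_API_KEY"  # placeholder, keep this server-side
CALLS_URL = "https://api.vapi.ai/call"  # assumed call-creation endpoint

def build_call_request(assistant_id: str) -> dict:
    """Minimal call-creation body; fields beyond assistantId are omitted."""
    return {"assistantId": assistant_id}

def create_call(assistant_id: str) -> dict:
    """Create a call session and return the parsed JSON response."""
    req = urllib.request.Request(
        CALLS_URL,
        data=json.dumps(build_call_request(assistant_id)).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```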