An error occurred in this call: pipeline-error-openrouter-llm-failed
# support
Harter
I wanted to use OpenRouter as a provider and set the model I wanted, but I couldn't; it was set to openai/gpt-3.5-turbo by default. I tried to update the model with an API call. The update went through, but now I'm getting an error: an error occurred in this call: pipeline-error-openrouter-llm-failed. Call ID: 9835c5bd-699f-4e64-827e-20eded73392b
Support.Voice-Agent
@Harter I'm sorry to hear that you're experiencing issues with updating the model. It seems there might be a misunderstanding, as "openrouter" is not listed as a provider option in the context provided. The available provider you can use is "openai," and there are several models under the OpenAI umbrella that you can choose from. To assist you further, could you please confirm if you intended to use "openai" as the provider and specify which model you were trying to update to? Once I have this information, I can guide you through the correct process to update the model.
Got it. Tagging @User for additional help.
Harter
@Support.Voice-Agent please help
Support.Voice-Agent
@Harter I'm here to help! It seems like you've encountered an error with the model update process. To better assist you, could you please provide me with the following details:
1. The exact model name you were trying to set when you encountered the error.
2. Any specific parameters or settings you used during the API call.
Once I have this information, I can offer more precise guidance on how to resolve the issue. If the problem persists, I may need to direct you to our support team for further assistance.
Harter
gryphe/mythomax-l2-13b
Support.Voice-Agent
@Harter Thank you for providing the model name. It seems like you're trying to use a custom model, `gryphe/mythomax-l2-13b`. If you've encountered an error after updating the model, there might be an issue with the compatibility of the model or with the API call you made. To resolve this, please ensure that:
1. The model `gryphe/mythomax-l2-13b` is compatible with the API you're using.
2. You have the correct permissions and access rights to use this model.
3. The API call is correctly formatted according to the documentation.
If you've checked all of the above and the issue still occurs, it might be best to reach out to the support team with the call ID you provided (`9835c5bd-699f-4e64-827e-20eded73392b`) so they can investigate the specific pipeline error you're encountering. Would you like me to guide you on how to contact support, or is there anything else I can assist you with right now?
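For reference, the model-update call discussed above might be assembled along the lines of the sketch below. Note that the endpoint URL, the `PATCH` verb, and the exact field names (`model.provider`, `model.model`) are assumptions for illustration, not confirmed details from this thread; check the provider's API documentation for the real shape of the request.

```python
import json

# Hypothetical request body for switching an assistant's LLM provider/model.
# Field names here are assumptions, not a confirmed API schema.
payload = {
    "model": {
        "provider": "openrouter",
        "model": "gryphe/mythomax-l2-13b",  # model name from this thread
    }
}

body = json.dumps(payload)
print(body)

# Sending it would then look roughly like (hypothetical endpoint and auth):
#   curl -X PATCH https://api.example.com/assistant/<assistant-id> \
#        -H "Authorization: Bearer $API_KEY" \
#        -H "Content-Type: application/json" \
#        -d "$body"
```

If a request like this succeeds but calls still fail with `pipeline-error-openrouter-llm-failed`, the problem is likely on the provider side (model availability, credits, or permissions) rather than in the update call itself.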
Harter
guide me on how to contact support
Support.Voice-Agent
@Harter Certainly! To contact support for further assistance with the error you're experiencing, you can follow these steps:
1. Visit the official support website or user portal for the service you're using (since you mentioned OpenRouter, please ensure you're on the correct website for the API provider).
2. Look for a 'Support', 'Help Center', or 'Contact Us' section on the website.
3. Choose the method of contact that suits you best. This could be a support ticket system, email address, live chat, or phone number.
4. When contacting support, provide them with the call ID (`9835c5bd-699f-4e64-827e-20eded73392b`) and a detailed description of the issue, including the error message `pipeline-error-openrouter-llm-failed` and the model you were trying to use (`gryphe/mythomax-l2-13b`).
5. If possible, include any relevant screenshots or logs that could help the support team diagnose the problem.
If you're unable to find the contact information or need further assistance navigating the support process, please let me know, and I'll do my best to assist you.
Harter
I failed to find the contact information
Support.Voice-Agent
@Harter I understand that finding the correct contact information can sometimes be challenging. Since I don't have access to external databases or the ability to provide direct links, I recommend the following general steps to locate the support contact information:
1. **Check the Documentation**: API providers often include support contact information in their official documentation or developer guides.
2. **Search Online**: Use a search engine with terms like "openrouter API support contact" or "gryphe/mythomax-l2-13b model support."
3. **Official Website**: Visit the official website of the API provider or the platform where you found the model. There might be a 'Contact' or 'Support' section in the website's footer or menu.
4. **Community Forums**: Community forums or discussion boards related to the API or model can sometimes provide guidance or a way to contact support.
5. **Social Media**: Check whether the API provider has a presence on social media platforms where they offer support or list contact information.
If you're still unable to find the contact information, consider reaching out in the community forums where developers discuss the API or model you're using; other users may have faced similar issues and can point you to support. Please let me know if you need further assistance with anything else.
Harter
connect a live person
Support.Voice-Agent
@Harter Please ask support for more info on this.