custom LLM without OpenAI
# support
r
I want to use a custom LLM in my Assistant without using OpenAI’s gpt-3.5-turbo-instruct model. Is that possible?
v
@Rick S. Absolutely, you can use a custom LLM with your Assistant. To do so, you'll need to provide the details of your custom LLM model, such as the provider, the URL endpoint for the API, and the model identifier. Once you have these details, you can configure your Assistant to use your custom LLM instead of the default OpenAI models. If you have the necessary information, I can assist you in setting this up. Would you like to proceed with configuring your Assistant with a custom LLM?
You can continue this conversation by mentioning me in the message.
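To make the bot's answer concrete, here is a minimal sketch of the kind of configuration it describes. The field names (`provider`, `url`, `model`) and values are illustrative assumptions, not any specific platform's schema:

```python
# Hypothetical assistant configuration pointing at a custom,
# OpenAI-compatible LLM endpoint. All names here are assumptions.
custom_llm_config = {
    "provider": "custom-llm",                # assumed provider key
    "url": "https://my-llm.example.com/v1",  # API endpoint of the custom model
    "model": "my-fine-tuned-model",          # model identifier the endpoint expects
}

def validate_config(config):
    """Check that the three details the bot asks for are all present."""
    required = ("provider", "url", "model")
    missing = [key for key in required if not config.get(key)]
    if missing:
        raise ValueError(f"missing config fields: {missing}")
    return True

print(validate_config(custom_llm_config))  # → True
```

The point is simply that the platform needs to know where to send requests (the URL), how to talk to it (the provider type), and which model to ask for.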
Got it. Tagging @User for additional help.
s
Yes, that's possible. Just make sure your custom endpoint follows OpenAI's messages request model (role/content entries) for the message history.
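For reference, the OpenAI-style request body mentioned above looks like this: a `model` field plus a `messages` list of `{"role", "content"}` entries carrying the conversation history. The helper and model name below are illustrative, not part of any specific SDK:

```python
# Sketch of an OpenAI-compatible chat request body. A custom LLM
# endpoint should accept "messages" in this role/content shape.
def build_chat_request(model, history, user_message):
    messages = list(history)  # copy so the caller's history is not mutated
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages}

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello! How can I help?"},
]
request_body = build_chat_request("my-custom-model", history, "What can you do?")
print(len(request_body["messages"]))  # → 4
```

Each prior turn is replayed as one entry in `messages`, with the newest user message appended last.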