@Rick S.
Absolutely, you can use a custom LLM with your Assistant. To do so, you'll need the details of your custom model: the provider, the URL endpoint for its API, and the model identifier. With those in hand, you can configure your Assistant to use your custom LLM instead of the default OpenAI models. If you have that information ready, I can help you set it up. Would you like to proceed with configuring your Assistant with a custom LLM?
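As a rough sketch, the three details above usually come together in a small configuration object. The field names and values below are illustrative assumptions, not a specific platform's API:

```python
# Hypothetical sketch of a custom-LLM configuration. Field names
# ("provider", "url", "model") are illustrative assumptions, not
# a specific platform's API.
custom_llm_config = {
    "provider": "custom-llm",                # assumed provider label
    "url": "https://my-llm.example.com/v1",  # your API endpoint
    "model": "my-fine-tuned-model",          # your model identifier
}

def validate_config(config: dict) -> bool:
    """Check that every required custom-LLM field is present and non-empty."""
    required = ("provider", "url", "model")
    return all(bool(config.get(key)) for key in required)

print(validate_config(custom_llm_config))  # → True
```

A check like this is worth running before saving the configuration, since a missing or empty endpoint URL is the most common reason a custom LLM fails to respond.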