Hey, we support streaming only, across all of our service providers (LLM, TTS, and STT alike). In your case, the information you have is incorrect: Custom LLM supports only streaming, OpenAI-compatible responses. If you send responses in a non-streaming mode, or in a format that is not OpenAI-compatible, requests will eventually fail. Let me know if you have more questions or need further help.
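To illustrate, here is a minimal sketch of what an OpenAI-compatible streaming response looks like on the wire: a sequence of `chat.completion.chunk` objects sent as server-sent events, terminated by a `[DONE]` sentinel. The helper names and the `id`/`model` values are hypothetical; the key point is the event shape your Custom LLM endpoint should emit.

```python
import json

def sse_chunk(content, model="custom-model", finish_reason=None):
    """Format one token as an OpenAI-compatible chat.completion.chunk SSE event."""
    chunk = {
        "id": "chatcmpl-example",  # hypothetical response id
        "object": "chat.completion.chunk",
        "created": 0,
        "model": model,
        "choices": [{
            "index": 0,
            # Non-final chunks carry the token in delta.content;
            # the final chunk has an empty delta and a finish_reason.
            "delta": {"content": content} if content is not None else {},
            "finish_reason": finish_reason,
        }],
    }
    return f"data: {json.dumps(chunk)}\n\n"

def stream_response(tokens):
    """Yield SSE events for each token, then the [DONE] sentinel."""
    for tok in tokens:
        yield sse_chunk(tok)
    yield sse_chunk(None, finish_reason="stop")
    yield "data: [DONE]\n\n"
```

Your endpoint would yield these events with a `Content-Type: text/event-stream` response; sending a single non-streamed JSON body instead is what causes the failures described above.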