I'm having a hard time integrating a custom LLM. The website documentation says the endpoint should return a JSON object, while chat support says it has to be a stream. Is there a single, tested, working example, in either Python or Next.js? I've tried several examples, but none of them actually work.
Does anyone have a working custom-llm integration?
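For reference, here is my current understanding of what the stream is supposed to look like, based on the OpenAI chat-completions streaming format that most "custom LLM" integrations seem to mimic. This is only a sketch of the response body, not a full server; the function name `sse_chunks`, the model name, and the chunk size are my own placeholders. If the platform really wants a stream, I assume it wants Server-Sent Events like this, each a `data: {...}` line ending with `data: [DONE]`:

```python
import json
import time
import uuid


def sse_chunks(text, model="my-custom-model", chunk_size=8):
    """Yield OpenAI-style chat.completion.chunk SSE lines for `text`.

    Assumption: the platform expects an OpenAI-compatible /chat/completions
    endpoint that streams Server-Sent Events (Content-Type: text/event-stream)
    rather than returning one JSON object.
    """
    completion_id = f"chatcmpl-{uuid.uuid4().hex}"  # placeholder id format
    created = int(time.time())
    # Emit the reply in small delta chunks, the way streaming completions do.
    for i in range(0, len(text), chunk_size):
        payload = {
            "id": completion_id,
            "object": "chat.completion.chunk",
            "created": created,
            "model": model,
            "choices": [{
                "index": 0,
                "delta": {"content": text[i:i + chunk_size]},
                "finish_reason": None,
            }],
        }
        yield f"data: {json.dumps(payload)}\n\n"
    # Final chunk carries finish_reason, then the [DONE] sentinel closes the stream.
    final = {
        "id": completion_id,
        "object": "chat.completion.chunk",
        "created": created,
        "model": model,
        "choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}],
    }
    yield f"data: {json.dumps(final)}\n\n"
    yield "data: [DONE]\n\n"
```

Wiring this into an actual HTTP handler (e.g. a FastAPI `StreamingResponse` or a Next.js route returning a `ReadableStream`) is the part where my attempts keep failing, so a known-good end-to-end example would be much appreciated. Is this roughly the right shape, or does the integration expect something else entirely?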