Custom LLM - Does anyone have a working custom-llm...
# support
b
I'm finding it quite difficult to integrate a custom LLM. The documentation on the website says a JSON object is needed, while chat support says a stream is required. Is there a single, simple, tested example, in either Python or NextJS, that actually works? I've tried multiple examples, but none of them function correctly. Does anyone have a working custom-llm integration?
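Both answers are usually true at once: each chunk of the stream is a JSON object, sent as a Server-Sent Events (SSE) line. The sketch below shows the widely used OpenAI chat-completions streaming shape, which most custom-LLM hooks mimic; the function name `sse_chunks` and the word-by-word splitting are illustrative assumptions, not any particular platform's API, so check your provider's docs for the exact fields it expects.

```python
import json
import time
import uuid


def sse_chunks(reply_text: str, model: str = "custom-llm"):
    """Yield an OpenAI-style streaming chat completion as SSE lines.

    Each yielded line is `data: {json}\n\n`, where the JSON object is a
    `chat.completion.chunk`; the stream ends with a `data: [DONE]` sentinel.
    """
    completion_id = f"chatcmpl-{uuid.uuid4().hex}"  # illustrative id format
    created = int(time.time())
    # Stream the reply piece by piece (word-by-word here for simplicity;
    # a real backend would forward tokens as the model produces them).
    for word in reply_text.split():
        chunk = {
            "id": completion_id,
            "object": "chat.completion.chunk",
            "created": created,
            "model": model,
            "choices": [
                {"index": 0, "delta": {"content": word + " "}, "finish_reason": None}
            ],
        }
        yield f"data: {json.dumps(chunk)}\n\n"
    # Final chunk carries the finish_reason, then the [DONE] sentinel.
    final = {
        "id": completion_id,
        "object": "chat.completion.chunk",
        "created": created,
        "model": model,
        "choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}],
    }
    yield f"data: {json.dumps(final)}\n\n"
    yield "data: [DONE]\n\n"


# Minimal usage: in a real server you would write these lines to an HTTP
# response with Content-Type: text/event-stream instead of printing them.
for line in sse_chunks("Hello from the custom LLM"):
    print(line, end="")
```

Wrapping this generator in a web framework's streaming response (e.g. FastAPI's `StreamingResponse` with `media_type="text/event-stream"`) gives you a `/chat/completions`-style endpoint that satisfies both descriptions: JSON objects, delivered as a stream.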
s
Here you go