When I try to use Perplexity as my LLM, the call closes
# support
g
Perplexity does not seem to work right. I put in my own Perplexity key, but when I try to use it with an assistant the call drops. Has anyone else had issues trying to use Perplexity with their own API key?
j
Hey Geo, mind sharing a call ID?
g
sure 1 sec.
c7da28f4-28a3-4a58-b46f-872d78c34e6b
An error occurred in this call: pipeline-error-perplexity-ai-llm-failed
I'm sure my Perplexity key is correct.
s
I'm seeing the same with pplx models. Ended reason: pipeline-error-perplexity-ai-llm-failed. Sample call IDs: efe5baae-ed99-4024-a27d-e152d28df1a8, 61a89fa8-5d18-404a-ac76-2b95760da742
j
Ah, are you trying to use their online models? Unfortunately they don't work through Vapi :/
g
Actually, I tried the sonar-medium-chat model, not an online model.
@jordan check your DMs 🙂
CallId: 7bb61c1f-4f4d-44df-90fc-b2599e430054
Perplexity still does not work
pipeline-error-perplexity-ai-llm-failed (sonar-medium-chat passed as the model parameter)
Any luck with Perplexity?
j
Hey, we're still looking into this one!
Fixed!
s
does it work with pplx online models also?
j
Unfortunately not, they aren't optimized for conversation
n
they actually do lol
try it!
g
OK, what are you guys doing to make it work?? lol, it's OK, tell the secret. What model are you putting in the model field? Or are you using it as a custom-llm?
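For anyone following along: a minimal sketch of what the assistant's model block might look like when pointing Vapi at Perplexity. The provider slug `perplexity-ai` is inferred from the error string in this thread, and `sonar-medium-chat` is the model mentioned above; the exact field names and where the API key is attached (e.g. via the provider-keys settings) are assumptions, so check the Vapi docs before copying this.

```json
{
  "model": {
    "provider": "perplexity-ai",
    "model": "sonar-medium-chat",
    "messages": [
      { "role": "system", "content": "You are a helpful voice assistant." }
    ]
  }
}
```

If that still fails with `pipeline-error-perplexity-ai-llm-failed`, it may be worth double-checking that the Perplexity key is saved under the provider credentials (not just a custom-llm config) and that the model name is one Perplexity currently serves.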