dudleykong
07/25/2024, 4:30 PM

Vapi
07/25/2024, 4:31 PM
pipeline-error-together-ai-llm-failed
This error indicates a failure within the processing pipeline, specifically related to the Together AI LLM.
To get more information about this error, we would typically need to look into the logs associated with the error IDs you've provided. However, as a support bot, I don't have direct access to the system logs or databases.
I recommend reaching out to the support team with the error IDs you've mentioned. They will be able to investigate further and provide more detailed information about the cause of the errors and how to resolve them.
If there's anything else I can assist you with, please let me know!

Vapi
07/25/2024, 4:31 PM

dudleykong
07/25/2024, 4:31 PM

Vapi
07/25/2024, 4:32 PM

Shubham Bajaj
07/25/2024, 9:11 PM
Input validation error: `inputs` tokens + `max_new_tokens` must be <= 4097. Given: 3887 `inputs` tokens and 250 `max_new_tokens`
Shubham Bajaj
07/25/2024, 9:11 PM
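The validation error Shubham surfaced means the request exceeded the model's context window: 3887 input tokens plus 250 requested output tokens is 4137, over the 4097-token budget. A minimal sketch of that pre-flight check (assuming a 4097-token window as in the error message; `validate_request` and `clamp_max_new_tokens` are illustrative helpers, not Vapi or Together AI code):

```python
CONTEXT_WINDOW = 4097  # budget shared by input tokens and generated tokens

def validate_request(input_tokens: int, max_new_tokens: int) -> None:
    """Raise the same kind of error Together AI returns when the budget is exceeded."""
    if input_tokens + max_new_tokens > CONTEXT_WINDOW:
        raise ValueError(
            f"Input validation error: `inputs` tokens + `max_new_tokens` "
            f"must be <= {CONTEXT_WINDOW}. Given: {input_tokens} `inputs` "
            f"tokens and {max_new_tokens} `max_new_tokens`"
        )

def clamp_max_new_tokens(input_tokens: int, requested: int) -> int:
    """Shrink max_new_tokens so the request fits the window instead of failing."""
    return max(0, min(requested, CONTEXT_WINDOW - input_tokens))

# The failing request from the error message: 3887 + 250 = 4137 > 4097.
try:
    validate_request(3887, 250)
except ValueError as e:
    print(e)

# Clamping leaves 4097 - 3887 = 210 tokens for generation.
print(clamp_max_new_tokens(3887, 250))  # → 210
```

In practice the alternative fix is on the other side of the budget: trim or summarize the prompt (system message, transcript history) so fewer input tokens reach the model.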