GROQ ISSUE - URGENT
# support
n
CallID: 2b1aa042-b074-4e64-a411-b79479d16f7b 02:50:17:811 [WARN] Model request failed (attempt #1, reason: (Rate limit reached for model `llama-3.3-70b-versatile` in organization `org_01h************b10hfsqkt`, service tier `on_demand`, on requests per day (RPD): Limit 1000, Used 1000, Requested 1. Please try again in 1m8.977s. Visit https://console.groq.com/docs/rate-limits for more i...)).

I don't think this is my Groq account. Is this a VAPI account? If so, it has gone on-demand and run out of credit, which is killing all my agents that use Groq. Can I get advice on this ASAP?
v
@Nickers I understand the urgency of the situation. It seems like your organization has reached the rate limit for the `llama-3.3-70b-versatile` model on the `on_demand` service tier. This is indeed related to the VAPI account associated with your organization `org_01h************b10hfsqkt`. To address this issue, you have a few options:

1. **Wait for the Rate Limit to Reset**: If the rate limit is based on a daily quota, you can wait for it to reset at the start of the next day according to the time zone of the service provider.
2. **Upgrade Your Service Tier**: Consider upgrading to a higher service tier that offers more requests per day if this is a recurring issue and you need a higher limit to support your agents' operation.
3. **Optimize Usage**: Review your usage patterns to ensure you're using the API efficiently. This might involve caching results or batching requests to minimize the number of calls to the API.
4. **Contact Support**: For immediate assistance and to explore if there are any temporary measures or workarounds, it would be best to contact the support team of the service provider.

If you need further assistance or would like to explore these options, please let me know, and I can guide you through the next steps. If this is an issue that requires immediate attention from our support team, please ask support for more info on this.
You can continue this conversation by mentioning me in the message.
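The "Please try again in 1m8.977s" hint in the error above can be honoured programmatically instead of letting every retry fail. Below is a minimal Python sketch, assuming a client where the rate-limit error text is available on the raised exception; `parse_retry_after` and `call_with_backoff` are hypothetical helpers for illustration, not part of any Groq or VAPI SDK.

```python
import re
import time

def parse_retry_after(message):
    """Extract the suggested wait from a Groq-style rate-limit message.

    Groq errors embed a hint like "Please try again in 1m8.977s".
    Returns the wait in seconds, or None if no hint is found.
    """
    match = re.search(r"try again in (?:(\d+)m)?([\d.]+)s", message)
    if not match:
        return None
    minutes = int(match.group(1) or 0)
    seconds = float(match.group(2))
    return minutes * 60 + seconds

def call_with_backoff(request_fn, max_attempts=4, base_delay=2.0):
    """Retry request_fn, honouring the server's wait hint when present,
    otherwise falling back to exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except RuntimeError as err:  # stand-in for the client's rate-limit error type
            if attempt == max_attempts - 1:
                raise
            wait = parse_retry_after(str(err)) or base_delay * 2 ** attempt
            time.sleep(wait)
```

Note this only smooths over short per-minute limits; a daily (RPD) quota that is fully used, as in the log above, still requires waiting for the reset or raising the limit.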
n
I'm pretty sure this isn't my Groq account. I just selected Groq from the dropdown, but I'm sure I added my credentials to VAPI.
s
@Nickers 🔵 09:26:08:760 [user WARN] Model request failed (attempt #1, reason: (The model `llama-3.1-70b-versatile` has been decommissioned and is no longer supported. Please refer to https://console.groq.com/docs/deprecations for a recommendation on which model to use instead.)