Hello,
I’ve added my ElevenLabs API key to Vapi and successfully configured the burt voice, which works as expected for the support AI. However, when I try to add other voices from the ElevenLabs library (e.g., adam, dorothy, chris, michael), they do not work.
Here is the relevant part of my configuration:
public readonly voices = {
  'burt': 'burt',       // works
  'adam': 'adam',       // doesn't work
  'dorothy': 'dorothy', // doesn't work
  'chris': 'chris',     // doesn't work
  'michael': 'michael'  // doesn't work
} as const;
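For context, selectedVoice is simply looked up from this map, so whatever string appears on the right-hand side is what ends up as the voiceId. A simplified, standalone sketch of that lookup (the helper and type names here are illustrative, not my exact code):

const voices = {
  'burt': 'burt',
  'adam': 'adam'
} as const;

type VoiceKey = keyof typeof voices;

function getVoiceId(key: VoiceKey): string {
  // Returns the mapped name unchanged, e.g. 'adam' -> 'adam'
  return voices[key];
}

const selectedVoice = getVoiceId('adam'); // only 'burt' actually produces audio

The assistant config itself is then returned like this: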
return {
  transcriber: {
    provider: 'deepgram',
    model: 'nova-2',
    language: 'en-US'
  },
  model: {
    provider: 'openai',
    model: 'gpt-3.5-turbo',
    messages
  },
  voice: {
    provider: '11labs',
    voiceId: selectedVoice
  },
  name: this.config.value.assistantName
};
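For completeness, this is roughly how the resulting config is used to start a call with the @vapi-ai/web SDK (simplified from my actual setup; the key, system message, and assistant name below are placeholders):

import Vapi from '@vapi-ai/web';

const vapi = new Vapi('<vapi-public-key>'); // placeholder public key

vapi.start({
  transcriber: { provider: 'deepgram', model: 'nova-2', language: 'en-US' },
  model: {
    provider: 'openai',
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'system', content: 'You are a support assistant.' }]
  },
  // Swapping 'burt' in here works; 'adam', 'dorothy', etc. do not
  voice: { provider: '11labs', voiceId: 'adam' },
  name: 'Support AI'
});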
I’ve tried several other voice names from the Vapi voice library as well, with the same result: only burt, the voice already assigned to the support AI, produces audio.
Could you clarify:
How do I add and configure more voices from the ElevenLabs library?
Are there specific steps, naming conventions, or prerequisites for the voiceId values?
Is there documentation or an example that shows how to map ElevenLabs library voices to the correct voiceId? For instance, I’m unsure which of the two forms sketched below voiceId expects.
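The two forms I mean (the ID string here is just a placeholder, not a real value):

// Option A: the preset/display name, as in my current config
const byName = { provider: '11labs', voiceId: 'adam' };

// Option B: the raw ElevenLabs voice ID copied from the ElevenLabs dashboard
// ('<elevenlabs-voice-id>' is a placeholder, not a real ID)
const byId = { provider: '11labs', voiceId: '<elevenlabs-voice-id>' };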