Multilingual agent
# support
r
How do I add a multilingual agent where a single agent can speak multiple different languages, e.g. Arabic, Spanish, English, etc.? I used transcriber=deepgram, language=multi, ElevenLabs model=multilingual, but I'm not able to get this working.
@RaGeOP To add a multilingual agent capable of speaking multiple languages such as Arabic, Spanish, and English, you will need to ensure that the voice API provider you are using supports multilingual capabilities. Since you mentioned using ElevenLabs, which does offer multilingual support, you should be able to configure your voice agent to handle multiple languages. Here's a general approach to setting up a multilingual agent with ElevenLabs:
1. **Integration**: Ensure that ElevenLabs is properly integrated into your system.
2. **Language models**: Check that you have access to the multilingual models provided by ElevenLabs.
3. **Configuration**: Configure your voice agent to use a multilingual model. This typically involves setting the appropriate model or language parameter before making a request to the voice API (see the sketch below).
4. **Language detection**: Implement language detection if your system needs to automatically detect and switch between languages. This might require an additional service or feature from your voice API provider.
5. **Testing**: Test the agent with phrases in different languages to ensure that it can understand and respond correctly in each language.
If you are experiencing issues with setting up the multilingual agent, it could be due to a misconfiguration or a misunderstanding of the provider's API. In that case, review the ElevenLabs documentation on multilingual support or reach out to their support team for assistance. If you need further help with the configuration or have specific questions about the setup, please provide more details, and I'll do my best to assist you.
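As a rough sketch of step 3, the voice block of the assistant could look like the snippet below. This assumes a Vapi-style config like the one shared later in this thread; the voiceId is a placeholder, and the model should be one of ElevenLabs' multilingual-capable models (e.g. eleven_multilingual_v2, eleven_turbo_v2_5, or eleven_flash_v2_5).
{
    "voice": {
        "provider": "11labs",
        "model": "eleven_turbo_v2_5",
        "voiceId": "<your-multilingual-voice-id>"
    }
}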
r
Do I need to connect ElevenLabs manually using the ID in order to enable the agent to speak multiple languages?
@User
s
@User @Shubham Bajaj
b
Is codeSwitchingEnabled set for the deepgram transcriber?
s
@Shahmeer
1. Set codeSwitchingEnabled to true.
2. Use an 11labs multilingual voice.
3. Instruct the LLM to generate responses in the user's language.
Make sure the 11labs voice can speak every language your users might use and that Deepgram's codeSwitchingEnabled supports those languages.
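Putting those three points together, a minimal sketch of the relevant assistant fields might look like the block below. Field names follow the config shared further down in this thread; the voiceId and the system-prompt wording are placeholders for illustration only.
{
    "transcriber": {
        "provider": "deepgram",
        "model": "nova-2",
        "language": "multi",
        "codeSwitchingEnabled": true
    },
    "voice": {
        "provider": "11labs",
        "model": "eleven_flash_v2_5",
        "voiceId": "<multilingual-voice-id>"
    },
    "model": {
        "provider": "openai",
        "model": "gpt-4o",
        "messages": [
            {
                "role": "system",
                "content": "Detect the language the caller is speaking and always reply in that same language."
            }
        ]
    }
}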
j
Hello @Shubham Bajaj @Shahmeer, I'm trying the same thing here; below is my assistant. I can get it to speak (transcribe) EN and ES but no other language (transcription fails in other languages); also, even when switching between EN and ES, I never receive the language-change-detected server message?
{
    "id": "37154442-d217-4d84-a886-cf4e2ec9ba61",
    "orgId": "94a0b476-b4f6-4f9c-89e8-7a8475c70adb",
    "name": "Testing multilingual",
    "voice": {
        "model": "eleven_flash_v2_5",
        "voiceId": "Qrl71rx6Yg8RvyPYRGCQ",
        "provider": "11labs",
        "stability": 0.5,
        "similarityBoost": 0.75,
        "fillerInjectionEnabled": false
    },
    "createdAt": "2024-12-31T17:24:52.546Z",
    "updatedAt": "2025-01-02T15:08:03.052Z",
    "model": {
        "model": "gpt-4o",
        "messages": [
            {
                "role": "system",
                "content": "[Identité]\nVous êtes Eva, la secrétaire virtuelle de l'agence immobilière \"Trevi\". Vous êtes chargée de gérer les appels téléphoniques entrants et d'y répondre dans la langue de l'interlocuteur.\n\n [...]"
            }
        ],
        "provider": "openai",
        "temperature": 0.7
    },
    "firstMessage": "Bonjour, goeimorgen, hello. Quelle langue / taal / language do you speak ?",
    "endCallFunctionEnabled": true,
    "transcriber": {
        "model": "nova-2",
        "language": "multi",
        "provider": "deepgram",
        "codeSwitchingEnabled": true
    },
    "clientMessages": [
        "transcript",
        "hang",
        "function-call",
        "metadata",
        "tool-calls",
        "tool-calls-result"
    ],
    "serverMessages": [
        "end-of-call-report",
        "status-update",
        "hang",
        "function-call",
        "tool-calls",
        "language-change-detected"
    ]
[...]
}
(Note: with the following transcriber, the agent does speak multiple languages and switches between them, but Gladia's latency is much higher than Deepgram's, so I would of course prefer Deepgram.)
{ "provider": "gladia", "languageBehaviour": "automatic multiple languages", "model": "accurate" }
s
@Jeebs from eva.be Sorry for the late response. Are you still experiencing this problem? If so, could you share the following in a new #1211483291191083018 ticket:
- The recent call ID
- When exactly this happened (the timestamp)
- What response you expected to get
- What response you actually got instead
This would really help me figure out what went wrong!