Azure OpenAI API Key

Rachid Arzouni 20 Reputation points
2025-08-27T12:30:52.6866667+00:00

Hi,

In ElevenLabs (Conversational AI) you can use your own OpenAI setup as an LLM; you just need your API key.

https://elevenlabs.io/docs/conversational-ai/customization/llm/custom-llm

The thing is that we have an OpenAI LLM set up through Azure. Does anyone know if it's possible to use this setup as well? The endpoint for Azure (OpenAI) is completely different from OpenAI's own endpoint (if you have a setup directly from them). I tried with the Azure endpoint, but it didn't work (as expected).

I asked ElevenLabs about this as well, but the feedback I got was:
As long as they are able to offer us the exact format of chat/completions format as OpenAI forwards us, it works. If the format is slightly different however it will not be compatible.

Best regards,
Rachid

Azure OpenAI Service
An Azure service that provides access to OpenAI’s GPT-3 models with enterprise capabilities.

Accepted answer
  1. Jerald Felix 4,695 Reputation points
    2025-08-28T05:49:04.48+00:00

    Hello Rachid Arzouni ,

    Greetings,

    You’re absolutely right that Azure OpenAI uses a different endpoint structure compared to OpenAI’s native API, and this is usually where the compatibility issue arises when trying to integrate with platforms like ElevenLabs.

    The main difference is that Azure OpenAI requires a custom base URL that follows this format:

    https://<your-resource-name>.openai.azure.com/openai/deployments/<deployment-id>/chat/completions?api-version=<api-version>

    Whereas OpenAI’s native API typically uses:

    https://api.openai.com/v1/chat/completions

    So, if the third‑party tool (in this case, ElevenLabs) is hard‑coded to expect OpenAI’s format (/v1/chat/completions), it unfortunately won’t be able to talk directly to Azure’s endpoints unless the tool provides an option to configure both the endpoint URL and the headers.

    To make Azure OpenAI compatible, you typically need:

    1.	API Key → Passed in an `api-key` header (instead of an `Authorization: Bearer` token as with native OpenAI).
    
    2.	Endpoint URL → Your unique Azure endpoint as described above.
    
    3.	Deployment Name → Azure doesn't directly expose models by name (like `gpt-4o`); instead, you deploy a model and call it by your chosen deployment ID.
    
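    Putting those three pieces together, a raw Azure OpenAI request looks roughly like the sketch below (the resource name, deployment ID, api-version, and key are placeholders, not real values):

    ```python
    import json
    import urllib.request

    # Placeholder values -- substitute your own Azure resource details.
    RESOURCE = "my-resource"        # hypothetical Azure OpenAI resource name
    DEPLOYMENT = "my-gpt4o-deploy"  # your chosen deployment ID, not the model name
    API_VERSION = "2024-06-01"      # example api-version; check what your resource supports
    API_KEY = "your-azure-api-key"

    def azure_chat_url(resource: str, deployment: str, api_version: str) -> str:
        """Build the Azure OpenAI chat/completions URL in the format shown above."""
        return (f"https://{resource}.openai.azure.com/openai/deployments/"
                f"{deployment}/chat/completions?api-version={api_version}")

    def build_request(messages: list) -> urllib.request.Request:
        """Prepare a POST request with Azure's header convention."""
        url = azure_chat_url(RESOURCE, DEPLOYMENT, API_VERSION)
        body = json.dumps({"messages": messages}).encode()
        # Azure expects the key in an `api-key` header,
        # not `Authorization: Bearer ...` as with api.openai.com.
        return urllib.request.Request(
            url,
            data=body,
            headers={"api-key": API_KEY, "Content-Type": "application/json"},
        )
    ```

    Sending `urllib.request.urlopen(build_request(...))` against a real resource would then return the same chat/completions JSON shape that OpenAI's endpoint does.
    
    
    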

    Since ElevenLabs mentioned they only work if the API matches OpenAI’s format exactly, this means support for Azure isn’t guaranteed out-of-the-box. A potential workaround is creating a small proxy or adapter service that translates between ElevenLabs’ expected OpenAI API format and Azure OpenAI’s endpoint structure. This way, ElevenLabs calls your proxy as if it were OpenAI, and your proxy forwards/reshapes the request to Azure.

    So in summary:

    •	Direct use of Azure API keys with ElevenLabs likely won't work unless they support a custom endpoint and header format.
    
    •	The practical options right now are to check whether ElevenLabs provides advanced endpoint configuration, or to set up a proxy to bridge the gap.
    

    Best regards,

    Jerald Felix


1 additional answer

Sort by: Most helpful
  1. Rachid Arzouni 20 Reputation points
    2025-08-29T11:20:54.1133333+00:00

    Hi @Jerald Felix

    Thanks for the answer and clarification. I was kind of expecting that it wasn't possible in an easy way, but now I know for sure :)

    Best regards,
    Rachid

