Azure OpenAI in Azure AI Foundry Models API lifecycle

This article helps you understand the support lifecycle for Azure OpenAI APIs.

Note

New API response objects may be added to the API response at any time. We recommend you only parse the response objects you require.

API evolution

Previously, Azure OpenAI received monthly updates of new API versions. Taking advantage of new features required constantly updating code and environment variables with each new API release. Azure OpenAI also required the extra step of using Azure-specific clients, which created overhead when migrating code between OpenAI and Azure OpenAI.

Starting in August 2025, you can now opt in to our next generation v1 Azure OpenAI APIs which add support for:

  • Ongoing access to the latest features with no need to specify a new api-version each month.
  • Faster API release cycle with new features launching more frequently.
  • OpenAI client support with minimal code changes to swap between OpenAI and Azure OpenAI when using key-based authentication.
  • OpenAI client support for token-based authentication and automatic token refresh, without a dependency on a separate Azure OpenAI client, will be added for all currently supported languages. This support is coming soon for the Python and TypeScript/JavaScript libraries; .NET, Java, and Go support is currently available in preview.

Access to new API calls that are still in preview will be controlled by passing feature specific preview headers allowing you to opt in to the features you want, without having to swap API versions. Alternatively, some features will indicate preview status through their API path and don't require an additional header.

Examples:

  • /openai/v1/evals is in preview and requires passing an "aoai-evals":"preview" header.
  • /openai/v1/fine_tuning/alpha/graders/ is in preview and requires no custom header due to the presence of alpha in the API path.

For the initial v1 GA API launch we're only supporting a subset of the inference and authoring API capabilities. We'll rapidly add support for more capabilities.

Code changes

Last generation API

import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2025-04-01-preview",
    azure_endpoint="https://YOUR-RESOURCE-NAME.openai.azure.com"
)

response = client.responses.create(
    model="gpt-4.1-nano", # Replace with your model deployment name 
    input="This is a test."
)

print(response.model_dump_json(indent=2)) 

Next generation API

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/"
)

response = client.responses.create(   
  model="gpt-4.1-nano", # Replace with your model deployment name 
  input="This is a test.",
)

print(response.model_dump_json(indent=2)) 

  • OpenAI() client is used instead of AzureOpenAI().
  • base_url passes the Azure OpenAI endpoint and /openai/v1 is appended to the endpoint address.
  • api-version is no longer a required parameter with the v1 GA API.

v1 API support

Status

API Path Status
/openai/v1/chat/completions Generally Available
/openai/v1/embeddings Generally Available
/openai/v1/evals Preview
/openai/v1/files Generally Available
/openai/v1/fine_tuning/jobs/{fine_tuning_job_id}/checkpoints/{fine_tuning_checkpoint_id}/copy Preview
/openai/v1/fine_tuning/alpha/graders/ Preview
/openai/v1/fine_tuning/ Generally Available
/openai/v1/models Generally Available
/openai/v1/responses Generally Available
/openai/v1/vector_stores Generally Available

Preview headers

API Path Header
/openai/v1/evals "aoai-evals":"preview"
/openai/v1/fine_tuning/jobs/{fine_tuning_job_id}/checkpoints/{fine_tuning_checkpoint_id}/copy "aoai-copy-ft-checkpoints" : "preview"

Changes between v1 preview release and 2025-04-01-preview

  • Video generation support
  • NEW Responses API features:
    • Remote Model Context Protocol (MCP) servers tool integration
    • Support for asynchronous background tasks
    • Encrypted reasoning items
    • Image generation

Changes between 2025-04-01-preview and 2025-03-01-preview

Changes between 2025-03-01-preview and 2025-02-01-preview

Changes between 2025-02-01-preview and 2025-01-01-preview

Changes between 2025-01-01-preview and 2024-12-01-preview

Changes between 2024-12-01-preview and 2024-10-01-preview

Changes between 2024-09-01-preview and 2024-08-01-preview

  • max_completion_tokens added to support o1-preview and o1-mini models. max_tokens doesn't work with the o1 series models.
  • parallel_tool_calls added.
  • completion_tokens_details & reasoning_tokens added.
  • stream_options & include_usage added.

Changes between 2024-07-01-preview and 2024-08-01-preview API specification

Changes between 2024-05-01-preview and 2024-07-01-preview API specification

Changes between 2024-04-01-preview and 2024-05-01-preview API specification

Changes between 2024-03-01-preview and 2024-04-01-preview API specification

Latest GA API release

Azure OpenAI API version 2024-10-21 is currently the latest GA API release. This API version is the replacement for the previous 2024-06-01 GA API release.

Known issues

  • The 2025-04-01-preview Azure OpenAI spec uses OpenAPI 3.1. It's a known issue that OpenAPI 3.1 is currently not fully supported by Azure API Management.

Next steps