
Azure OpenAI Setup

Route AI agent traffic through TapPass while using Azure OpenAI as the LLM provider. Azure OpenAI offers EU-resident deployments (West Europe, Sweden Central), which makes it well suited to GDPR compliance.

| Benefit | Details |
| --- | --- |
| EU data residency | Deploy models in West Europe or Sweden Central; data never leaves the EU |
| Enterprise billing | Use your existing Azure EA/CSP agreement |
| Private networking | VNet integration, Private Endpoints, no public internet exposure |
| Content filtering | Azure's built-in content safety on top of TapPass governance |
| SLA | 99.9% availability backed by Microsoft |
```sh
# Azure CLI: create the Azure OpenAI resource
az cognitiveservices account create \
  --name tappass-openai \
  --resource-group your-rg \
  --kind OpenAI \
  --sku S0 \
  --location westeurope  # EU data residency
```
```sh
# Deploy a model into the resource
az cognitiveservices account deployment create \
  --name tappass-openai \
  --resource-group your-rg \
  --deployment-name gpt-4o-mini \
  --model-name gpt-4o-mini \
  --model-version "2024-07-18" \
  --model-format OpenAI \
  --sku-capacity 10 \
  --sku-name Standard
```

Add to your .env or environment:

```sh
# Azure OpenAI credentials (read by LiteLLM directly)
AZURE_API_KEY="your-azure-openai-key"
AZURE_API_BASE="https://tappass-openai.openai.azure.com/"
AZURE_API_VERSION="2024-10-21"
```
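A quick pre-flight check catches a missing credential before TapPass starts. This is a sketch, not part of the TapPass SDK; the variable names are the three from the block above:

```python
import os

# The three variables LiteLLM reads for Azure OpenAI
REQUIRED_AZURE_VARS = ("AZURE_API_KEY", "AZURE_API_BASE", "AZURE_API_VERSION")

def missing_azure_vars(env=os.environ):
    """Return the names of any required Azure variables that are unset or empty."""
    return [name for name in REQUIRED_AZURE_VARS if not env.get(name)]
```

Run it at startup and fail fast if the returned list is non-empty, rather than waiting for the first Azure call to error out.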
```python
from tappass import Agent

agent = Agent("http://localhost:9620", "tp_...")

# Explicitly request the Azure model
response = agent.chat("Analyze Q3 trends", model="azure/gpt-4o-mini")
```

Or set as the default model in your pipeline configuration.

TapPass uses LiteLLM’s azure/<deployment-name> format:

| TapPass model name | Azure deployment | Notes |
| --- | --- | --- |
| azure/gpt-4o-mini | gpt-4o-mini | Fast, cheap, good for most tasks |
| azure/gpt-4o | gpt-4o | Best quality, higher cost |
| azure/gpt-4 | gpt-4 | Previous generation |

Important: The deployment name in Azure must match the part after azure/. If your Azure deployment is named my-gpt4o, use azure/my-gpt4o in TapPass.
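The mapping rule above can be expressed as a small helper. This is a sketch for illustration, not part of the TapPass SDK:

```python
def tappass_model_name(deployment: str) -> str:
    """Turn a bare Azure deployment name into the azure/<deployment-name>
    identifier that TapPass (via LiteLLM) expects."""
    if not deployment or "/" in deployment:
        raise ValueError("expected a bare Azure deployment name, e.g. 'my-gpt4o'")
    return f"azure/{deployment}"
```

For the example in the note, `tappass_model_name("my-gpt4o")` yields `azure/my-gpt4o`.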

When TAPPASS_EU_DATA_RESIDENCY=true, TapPass automatically routes CONFIDENTIAL+ data to Azure OpenAI (when configured) or Mistral (EU-native fallback).

```sh
TAPPASS_EU_DATA_RESIDENCY=true
AZURE_API_KEY="..."
AZURE_API_BASE="https://your-resource.openai.azure.com/"
```

Routing priority for EU residency:

| Classification | Model | Reason |
| --- | --- | --- |
| PUBLIC | Requested model | No restriction |
| INTERNAL | Requested model | No restriction |
| CONFIDENTIAL | azure/gpt-4o (or mistral/mistral-large-latest if Azure is not configured) | EU-hosted |
| RESTRICTED | ollama/llama3.2 | Local only |
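The routing priority boils down to a few lines of logic. This sketch mirrors the documented behavior when TAPPASS_EU_DATA_RESIDENCY=true; it is not TapPass's actual implementation, and the model names are the ones from the table:

```python
def route_model(classification: str, requested: str, azure_configured: bool = True) -> str:
    """Pick a model for a request when EU data residency is enforced."""
    if classification in ("PUBLIC", "INTERNAL"):
        return requested                # no restriction
    if classification == "RESTRICTED":
        return "ollama/llama3.2"        # local only
    if classification == "CONFIDENTIAL":
        # EU-hosted: Azure OpenAI when configured, Mistral as EU-native fallback
        return "azure/gpt-4o" if azure_configured else "mistral/mistral-large-latest"
    raise ValueError(f"unknown classification: {classification}")
```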

For maximum security, use Azure Private Endpoints to ensure TapPass communicates with Azure OpenAI over a private network:

```sh
# Create the private endpoint
az network private-endpoint create \
  --name tappass-openai-pe \
  --resource-group your-rg \
  --vnet-name your-vnet \
  --subnet your-subnet \
  --private-connection-resource-id /subscriptions/.../tappass-openai \
  --group-ids account \
  --connection-name tappass-openai-connection
```

Then set AZURE_API_BASE to the private endpoint URL.

Instead of API keys, use Azure Managed Identity:

```sh
# Assign the Cognitive Services OpenAI User role to your TapPass VM/container
az role assignment create \
  --assignee <tappass-managed-identity-id> \
  --role "Cognitive Services OpenAI User" \
  --scope /subscriptions/.../tappass-openai
```

LiteLLM supports Azure Managed Identity automatically when AZURE_API_KEY is not set and AZURE_AD_TOKEN or DefaultAzureCredential is available.

TapPass tracks Azure OpenAI usage in the same audit trail as other providers:

  • Per-agent cost tracking (Azure pricing)
  • Token usage per call
  • Circuit breaker for Azure provider failures
  • Model routing decisions in audit trail
```sh
# Check Azure model availability
curl http://localhost:9620/v1/models | jq '.data[] | select(.id | startswith("azure/"))'
```