Configure the language models and business rules that power AI for Work. This guide covers adding pre-built and custom LLM integrations, setting up embedding models, and defining answering and entity rules.

General Purpose LLMs

AI for Work integrates with OpenAI, Azure OpenAI, and Google Gemini out of the box, and supports custom models via API endpoint. All configurations are managed from the Admin Console.

Supported Providers

| Provider | Available Models | Access Method | Best For |
| --- | --- | --- | --- |
| OpenAI | GPT-5-mini, GPT-5-nano, GPT-5, GPT-4.1, GPT-4.1-mini, GPT-4o, GPT-3.5-turbo, o3, o3-mini, o4-mini | OpenAI API key | General-purpose tasks, conversational AI, complex reasoning, content generation |
| Azure OpenAI | GPT-5-mini, GPT-5-nano, GPT-5, GPT-4.1, GPT-4.1-mini, GPT-4o, GPT-3.5-turbo, o3, o3-mini, o4-mini | Azure Portal / Azure OpenAI Service | Enterprise deployments requiring Azure infrastructure, compliance, and enhanced security |
| Google Gemini | Gemini 2.5 Pro (Recommended), Gemini 2.5 Flash (Recommended), Gemini 2.5 Flash Lite, Gemini 2.0 Flash, Gemini 2.0 Flash Lite | Google Vertex AI / Gemini Studio | Multi-modal tasks, fast responses, Google Cloud deployments |

Model Tiers

Select a tier based on task complexity and cost requirements.
| Tier | Best For | Example Models |
| --- | --- | --- |
| Basic | High-volume, straightforward tasks — classification, simple Q&A, routine inquiries | GPT-4o-mini, Gemini 2.5 Flash, Gemini 2.0 Flash |
| Standard | Complex reasoning, multi-step workflows, deeper contextual understanding — recommended default for all system prompts and orchestrators | GPT-4.1, GPT-4o, Gemini 2.5 Pro |
| Premium | Advanced reasoning and complex problem-solving requiring maximum AI capability | o3, o3-mini, o4-mini |
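
One way to apply the tiers in practice is a small routing helper that maps task complexity to a model from the table above. This is an illustrative sketch only — the tier labels and the model choices mirror the table, but the routing function itself is hypothetical, not part of the AI for Work API.

```python
# Map each tier to one example model from the tiers table.
# The tier keys are illustrative labels, not product identifiers.
TIER_MODELS = {
    "basic": "gemini-2.5-flash",  # classification, simple Q&A, routine inquiries
    "standard": "gpt-4.1",        # multi-step reasoning; recommended default
    "premium": "o3",              # maximum-capability reasoning
}

def pick_model(task_complexity: str) -> str:
    """Return a model name for the given tier, defaulting to Standard."""
    return TIER_MODELS.get(task_complexity, TIER_MODELS["standard"])
```

Defaulting to the Standard tier reflects the table's recommendation to use it for system prompts and orchestrators unless a task clearly needs more or less capability.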

Configuration

Pre-Built LLM

  1. Go to Admin Console > Assist Configuration > General Purpose.
  2. Click New and select a provider: OpenAI, Azure OpenAI, or Google Gemini.
  3. Enter the required details:
    • Integration Name — A unique identifier (for example, OpenAI-Production).
    • API Key — Your provider API key.
    • Model Name — Select from the dropdown.
  4. Review and accept the Policy Guidelines, then click Save.
  5. Confirm the integration appears as active in the General LLM Integrations list.
Provider-specific notes
| Provider | Notes |
| --- | --- |
| OpenAI | Get your API key from platform.openai.com. Rotate keys regularly and monitor usage for cost control. |
| Azure OpenAI | In the Azure Portal, open your Azure OpenAI resource and go to Keys and Endpoints. Copy the endpoint URL and an API key. Confirm your subscription has the required quotas; configure VNet rules if needed. |
| Google Gemini | Enable the Vertex AI API in Google Cloud Console or use Gemini Studio. Create a service account, generate credentials, and verify that selected models are enabled in Google Model Garden. Configure project-level billing before use. |

Custom LLM

Use this option for proprietary or self-hosted models exposed via API.
  1. Go to Admin Console > Assist Configuration > General Purpose.
  2. Click New and select Custom LLM.
  3. Enter the basic configuration:
    • Integration Name — A descriptive name for this integration.
    • Model Name — The model identifier.
    • Endpoint URL — The full API URL where your model is hosted.
  4. Configure API settings:
    • Method — Select the HTTP method (typically POST).
    • Max Request Tokens — Set a token limit to control cost and response size.
  5. Set up authentication:
    • Auth Type — Choose API Key, Bearer Token, or Custom Header.
    • Enter the credentials required by your model.
  6. Add custom headers if required:
    • Click + Add a Header and enter key-value pairs (for example, Content-Type: application/json).
  7. Test the connection:
    • Enter a sample payload and click Test.
    • Success confirms connectivity; Error returns details to help you troubleshoot.
  8. Accept the Policy Guidelines and click Save.
  9. Confirm the integration appears as active in the General LLM Integrations list.
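
The request the steps above configure can be sketched as follows. This is a minimal illustration, assuming Bearer Token authentication and a JSON payload; the endpoint URL, model identifier, and payload field names are placeholders for your own model's contract, not AI for Work defaults.

```python
import json
import urllib.request

# Placeholder values — substitute your own endpoint and credentials.
ENDPOINT_URL = "https://models.example.com/v1/generate"
API_KEY = "YOUR_API_KEY"
MAX_REQUEST_TOKENS = 2048  # mirrors the Max Request Tokens setting in step 4

def build_request(prompt: str) -> urllib.request.Request:
    """Assemble the POST request a Custom LLM integration would send."""
    payload = {
        "model": "my-model",               # Model Name from step 3
        "prompt": prompt,
        "max_tokens": MAX_REQUEST_TOKENS,
    }
    headers = {
        "Content-Type": "application/json",    # custom header from step 6
        "Authorization": f"Bearer {API_KEY}",  # Bearer Token auth from step 5
    }
    return urllib.request.Request(
        ENDPOINT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",                     # HTTP method from step 4
    )
```

Sending `build_request("ping")` with `urllib.request.urlopen` corresponds to the connection test in step 7: a 2xx response indicates connectivity, while an error response carries the details used for troubleshooting.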

Embedding Models

Embedding models convert text into vector representations, enabling semantic search, similarity matching, and other AI-powered features. Unlike keyword search, embeddings capture meaning and context, producing more intelligent results. Both pre-built and custom embedding models are supported.
Embedding models are required for the attachments feature, which lets users attach files and ask questions about their content.
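
The "meaning and context" comparison that embeddings enable usually reduces to cosine similarity between vectors. The sketch below uses toy 3-dimensional vectors to show the idea; real embedding models emit hundreds or thousands of dimensions, and the example vectors are invented for illustration.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Compare two embedding vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Toy "embeddings": a query and two candidate passages.
query = [0.9, 0.1, 0.0]
doc_similar = [0.8, 0.2, 0.1]    # close in meaning to the query
doc_unrelated = [0.0, 0.1, 0.9]  # semantically distant
```

Ranking passages by `cosine_similarity(query, passage)` is what lets semantic search surface relevant content even when the query shares no keywords with it — the behavior the attachments feature relies on when answering questions about file content.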

Managing Integrations

View active integrations

All configured models appear in the General LLM Integrations list.
| Field | Description |
| --- | --- |
| Integration Name | Identifier for the configuration |
| Provider | OpenAI, Azure OpenAI, Gemini, or Custom |
| Model Name | Specific model version in use |
| Status | Active, inactive, or error |
| Last Updated | Timestamp of the most recent change |

Modify an integration

  1. Locate the integration in the General LLM Integrations list.
  2. Click the edit icon or the integration name.
  3. Update the required fields — API key, model version, headers, and so on.
  4. Test the connection, then click Save.

Remove an integration

  1. Locate the integration in the General LLM Integrations list.
  2. Click Delete or Deactivate.
  3. Confirm when prompted.
  4. Reconfigure any dependent features to use an alternative model.

Business Rules

Business rules control how AI for Work selects entities and generates responses when specific keywords appear in user input. Use them to ensure answers align with organizational policies and goals. Access business rules at Admin Console > Business Rules. Rules from all published custom integrations in the account appear here. Two rule types are available:

Answering Rules

Answering rules define the response the system returns when a user query contains specific keywords. When a keyword is detected, the system displays the answer you configured instead of generating one dynamically.
| Field | Description |
| --- | --- |
| Connection | Toggle on to make the rule connection-specific — it triggers only when both the bot connection and keyword match. Toggle off for keyword-only matching, independent of any connection. |
| Rule | Enter the question and specify the answer the system should display. |

Entity Rules

Entity rules prefill entities in a query when the system detects specific keywords within a matching connection intent. This automates data entry and streamlines interactions that depend on predefined values.
| Field | Description |
| --- | --- |
| Connection | Select a custom connection from the available list. |
| Rule | Write the rule in detail, then click Build Flow to proceed. |
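
Conceptually, an entity rule maps a keyword within a matching connection to a set of prefilled entity values. The sketch below illustrates that mapping; the connection name, keyword, and entity names are hypothetical examples, not values from the product.

```python
# Hypothetical entity rules: keyword + connection -> prefilled entities.
ENTITY_RULES = [
    {
        "connection": "it-helpdesk",
        "keyword": "laptop",
        "entities": {"device_type": "laptop", "priority": "standard"},
    },
]

def prefill_entities(query: str, connection: str) -> dict:
    """Collect entity values to prefill when a rule's keyword and connection match."""
    filled = {}
    for rule in ENTITY_RULES:
        if rule["connection"] == connection and rule["keyword"] in query.lower():
            filled.update(rule["entities"])  # prefill instead of prompting the user
    return filled
```

When a rule fires, the matched entities are populated automatically, so the flow built via Build Flow can skip prompting the user for values the query already implies.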