Building Custom AI Providers

Learn how to build custom AI provider integrations to connect any LLM service with BoxLang AI.

This guide shows you how to build providers that work seamlessly with all BoxLang AI features, including streaming, tools, embeddings, and memory systems.

🎯 Why Custom Providers?

Build custom providers when you need to:

  • Enterprise LLMs - Connect to private AI deployments or custom endpoints

  • Emerging Services - Integrate new AI providers not yet supported

  • Custom Logic - Add organization-specific request/response handling

  • API Wrappers - Create simplified interfaces for complex AI services

  • Testing Mocks - Build mock providers for development and testing

🏗️ Provider Architecture

📝 IAiService Interface

All AI providers must implement the IAiService interface:

interface {
    /**
     * Get the name of the LLM
     */
    function getName();

    /**
     * Configure the service with an API key
     * @apiKey - The API key to use with the provider
     * @return The service instance
     */
    IAiService function configure( required any apiKey );

    /**
     * Invoke the provider service with an AiRequest object
     * @aiRequest The AiRequest object to send to the provider
     * @return The response from the service
     */
    function invoke( required AiRequest aiRequest );

    /**
     * Invoke the provider service in streaming mode
     * @aiRequest The AiRequest object to send to the provider
     * @callback A callback function called with each chunk: function( chunk )
     * @return void
     */
    function invokeStream( required AiRequest aiRequest, required function callback );

    /**
     * Generate embeddings for the given input text(s)
     * @embeddingRequest The embedding request object
     * @return The embeddings response from the provider
     */
    function embeddings( required AiEmbeddingRequest embeddingRequest );
}

🚀 Quick Start: Simple Custom Provider

Here's a minimal custom provider for an OpenAI-compatible service:
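A minimal sketch of such a provider, assuming your service speaks OpenAI's chat format so the class can extend BaseService and just point at a different endpoint. The `variables.name` / `variables.chatURL` properties and the `extends` path are assumptions here; check the module source for the real hooks.

```
// MyCompatibleService.bx -- illustrative sketch; property names and the
// extends path are assumptions, not the module's confirmed API
class extends="BaseService" {

    function init(){
        variables.name    = "myCompatible";
        // Any OpenAI-compatible chat completions endpoint
        variables.chatURL = "https://api.example.com/v1/chat/completions";
        return this;
    }

    /**
     * Required by IAiService
     */
    function getName(){
        return variables.name;
    }
}
```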

Usage:
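Usage might look like this (the class name and the `getSystemSetting()` lookup are illustrative):

```
// Instantiate, configure with an API key, and invoke with an AiRequest
myService = new MyCompatibleService()
    .configure( getSystemSetting( "MY_API_KEY", "" ) );

response = myService.invoke( myAiRequest ); // an AiRequest built elsewhere
println( response );
```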

🎨 Extending BaseService

The BaseService class provides a default OpenAI-compatible implementation that custom providers can extend:

Inherited Properties

Inherited Methods

💡 Provider Types

Type 1: OpenAI-Compatible (Simplest)

If your provider follows OpenAI's API format, just extend BaseService:

Examples in codebase:

  • OpenAIService.bx - Standard OpenAI

  • GroqService.bx - Groq (OpenAI-compatible)

  • DeepSeekService.bx - DeepSeek (OpenAI-compatible)

  • PerplexityService.bx - Perplexity (OpenAI-compatible)

Type 2: Custom Authentication

Override methods to handle non-standard authentication:
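A sketch of the idea, using Claude's x-api-key style. The `getHeaders()` hook is a hypothetical name; look up the actual header-building method in BaseService before overriding.

```
// Illustrative sketch -- the overridden method name is hypothetical
class extends="BaseService" {

    /**
     * Send the key as an x-api-key header instead of a Bearer token
     */
    struct function getHeaders(){
        return {
            "x-api-key"    : variables.apiKey,
            "Content-Type" : "application/json"
        };
    }
}
```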

Examples in codebase:

  • ClaudeService.bx - Uses x-api-key header instead of Bearer token

Type 3: Custom Request/Response Format

Override methods to transform request/response formats:
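One way this can look, sketched for a Gemini-style contents/parts payload. The AiRequest accessors and the `sendRequest()` / `transformResponse()` helpers are assumptions.

```
// Illustrative sketch -- helper names are assumptions
class extends="BaseService" {

    /**
     * Transform the request into the provider's native shape, call the
     * API, then normalize the raw response back to the common format.
     */
    function invoke( required AiRequest aiRequest ){
        var payload = {
            "contents" : arguments.aiRequest.getMessages().map( ( message ) => {
                return {
                    "role"  : message.role,
                    "parts" : [ { "text" : message.content } ]
                };
            } )
        };
        var rawResponse = sendRequest( payload );   // hypothetical HTTP helper
        return transformResponse( rawResponse );    // normalize back to common shape
    }
}
```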

Examples in codebase:

  • GeminiService.bx - Different message format

  • CohereService.bx - Custom request structure

Type 4: Custom Streaming

Override streaming to handle provider-specific SSE formats:
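Sketched below for an OpenAI-style `data:` SSE stream; the `streamRequest()` line reader is a hypothetical helper, and real providers vary in event names and terminator lines.

```
// Illustrative sketch -- streamRequest() is a hypothetical SSE line reader
class extends="BaseService" {

    function invokeStream( required AiRequest aiRequest, required function callback ){
        var onChunk = arguments.callback;

        streamRequest( arguments.aiRequest, ( line ) => {
            // OpenAI-style SSE: each event is "data: {json}", ending with "data: [DONE]"
            if ( line.startsWith( "data: " ) ) {
                var data = line.replace( "data: ", "" ).trim();
                if ( data != "[DONE]" ) {
                    onChunk( jsonDeserialize( data ) );
                }
            }
        } );
    }
}
```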

Examples in codebase:

  • ClaudeService.bx - Custom authentication in streaming

  • OllamaService.bx - Different SSE format

🛠️ Advanced Features

Tool/Function Calling Support

If your provider supports tools, format them correctly:
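As a sketch, converting tool definitions into a Claude-style schema (the Tool accessor names used here are assumptions):

```
// Illustrative sketch -- Tool accessor names are assumptions
class extends="BaseService" {

    /**
     * Map BoxLang AI tool definitions into the provider's tool schema
     */
    private array function formatTools( required array tools ){
        return arguments.tools.map( ( tool ) => {
            return {
                "name"         : tool.getName(),
                "description"  : tool.getDescription(),
                "input_schema" : tool.getSchema()
            };
        } );
    }
}
```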

Examples in codebase:

  • ClaudeService.bx - Full tool calling implementation with recursive handling

Custom Headers

Add provider-specific headers:

Examples in codebase:

  • ClaudeService.bx - Adds anthropic-version header

Embeddings Support

Override embeddings for custom embedding endpoints:
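A sketch following the common `{ model, input }` request and `{ data : [ { embedding } ] }` response convention; the endpoint URL, default model, and the `postJSON()` helper are assumptions.

```
// Illustrative sketch -- postJSON() is a hypothetical HTTP helper
class extends="BaseService" {

    function embeddings( required AiEmbeddingRequest embeddingRequest ){
        var payload = {
            "model" : arguments.embeddingRequest.getModel() ?: "my-embed-model",
            "input" : arguments.embeddingRequest.getInput()
        };
        var response = postJSON( "https://api.example.com/v1/embeddings", payload );
        // Return just the vectors, one per input
        return response.data.map( ( item ) => item.embedding );
    }
}
```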

Examples in codebase:

  • OpenAIService.bx - Sets default embedding model

  • VoyageService.bx - Custom embeddings implementation

  • CohereService.bx - Different embeddings format

🔧 Real-World Example: Complete Custom Provider

Here's a comprehensive example integrating a fictional AI service with all features:
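As an illustrative stand-in, here is a compact provider for a fictional "Acme AI" service, trimmed to the chat path (streaming, tools, and embeddings would follow the interface above). The endpoint, payload shape, and AiRequest accessors are assumptions.

```
// AcmeAIService.bx -- illustrative sketch for a fictional service
class {

    property name="apiKey";
    property name="chatURL" default="https://api.acme-ai.example/v1/chat";

    function getName(){
        return "acmeai";
    }

    IAiService function configure( required any apiKey ){
        variables.apiKey = arguments.apiKey;
        return this;
    }

    function invoke( required AiRequest aiRequest ){
        var payload = {
            "model"    : arguments.aiRequest.getModel(),
            "messages" : arguments.aiRequest.getMessages()
        };

        bx:http url="#variables.chatURL#" method="POST" result="local.httpResult" {
            bx:httpparam type="header" name="Authorization" value="Bearer #variables.apiKey#";
            bx:httpparam type="header" name="Content-Type" value="application/json";
            bx:httpparam type="body" value="#jsonSerialize( payload )#";
        }

        return jsonDeserialize( local.httpResult.fileContent );
    }
}
```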

Usage:
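Usage of such a provider might look like this (the AcmeAIService class name is hypothetical):

```
// Configure once, then invoke with any AiRequest
acme = new AcmeAIService().configure( getSystemSetting( "ACME_API_KEY", "" ) );

response = acme.invoke( myAiRequest ); // an AiRequest built elsewhere
println( response );
```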

📦 Registering Custom Providers

Module Registration

In your ModuleConfig.bx:

Interceptor implementation:

Application Registration

For non-module registration, use BoxRegisterInterceptor():

Direct Usage

Most common approach - instantiate directly:

✅ Best Practices

1. Configuration Validation

Validate configuration on initialization:
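For example, failing fast in configure() rather than on the first API call (the exception type name is a suggestion):

```
// Reject empty keys up front instead of failing later with a cryptic 401
IAiService function configure( required any apiKey ){
    if ( !len( trim( arguments.apiKey ) ) ) {
        throw(
            type    = "AiProviderException",
            message = "The [#getName()#] provider requires a non-empty API key"
        );
    }
    variables.apiKey = arguments.apiKey;
    return this;
}
```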

2. Error Handling

Provide detailed error information:
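A sketch of surfacing HTTP failures with the provider name and raw response body attached; the `sendRequest()` helper and its result keys are assumptions.

```
// Illustrative sketch -- sendRequest() is a hypothetical HTTP helper
function invoke( required AiRequest aiRequest ){
    var httpResult = sendRequest( arguments.aiRequest );

    if ( val( httpResult.statusCode ) >= 400 ) {
        throw(
            type         = "AiServiceException",
            message      = "Provider [#getName()#] returned HTTP #httpResult.statusCode#",
            extendedInfo = httpResult.fileContent // raw body helps debugging
        );
    }

    return jsonDeserialize( httpResult.fileContent );
}
```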

3. Event Announcements

Always announce requests/responses for observability:
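For instance, wrapping the call with announce(); the event names and payload keys shown are assumptions, so match whatever the core providers announce.

```
// Illustrative sketch -- event names and payload keys are assumptions
function invoke( required AiRequest aiRequest ){
    announce( "onAIRequest", { provider : getName(), request : arguments.aiRequest } );

    var response = super.invoke( arguments.aiRequest );

    announce( "onAIResponse", { provider : getName(), response : response } );
    return response;
}
```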

4. Logging Support

Respect logging configuration:

5. Defensive Programming

Handle null/missing data gracefully:
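For example, extracting message content without assuming the response shape (an OpenAI-style choices layout is assumed here):

```
// Return an empty string rather than erroring on unexpected shapes
private string function extractContent( required struct response ){
    var choices = arguments.response.choices ?: [];
    if ( choices.isEmpty() ) {
        return "";
    }
    var message = choices[ 1 ].message ?: {};
    return message.content ?: "";
}
```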

🧪 Testing Custom Providers

Unit Tests

Create comprehensive tests:
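A sketch using TestBox BDD specs; the service class and exception type are the hypothetical ones used in the sketches throughout this guide.

```
// Illustrative TestBox spec -- no live API calls needed for these cases
class extends="testbox.system.BaseSpec" {

    function run(){
        describe( "MyCompatibleService", () => {

            it( "reports its provider name", () => {
                expect( new MyCompatibleService().getName() ).toBe( "myCompatible" );
            } );

            it( "rejects an empty API key", () => {
                expect( () => new MyCompatibleService().configure( "" ) )
                    .toThrow( type = "AiProviderException" );
            } );
        } );
    }
}
```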

Integration Tests

Test with real API:

📚 Next Steps

🎓 Summary

Custom AI providers enable you to:

  • ✅ Connect any LLM service to BoxLang AI

  • ✅ Handle custom authentication and request formats

  • ✅ Implement streaming and embeddings support

  • ✅ Add organization-specific logic and transformations

  • ✅ Create mock providers for testing

  • ✅ Work seamlessly with all BoxLang AI features

Start with BaseService for OpenAI-compatible APIs, or override methods for custom implementations!
