agentconnect.providers package

Provider implementations for the AgentConnect framework.

This module provides the model providers that AI agents can use, with support for OpenAI, Anthropic, Groq, and Google AI models. Provider instances are created through a factory pattern keyed on the desired model provider.

Key components:

  • ProviderFactory: Factory class for creating provider instances

  • BaseProvider: Abstract base class for all providers

  • Specific providers: OpenAI, Anthropic, Groq, Google

class ProviderFactory

Bases: object

Factory class for creating provider instances.

This class implements the factory pattern for creating provider instances based on the desired model provider.

_providers

Dictionary mapping provider types to provider classes

classmethod create_provider(provider_type, api_key)

Create a provider instance.

Parameters:
  • provider_type (ModelProvider) – Type of provider to create

  • api_key (str) – API key for the provider

Return type:

BaseProvider

Returns:

Provider instance

Raises:

ValueError – If the provider type is not supported
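The factory dispatch described above can be illustrated with a minimal, self-contained sketch. The names here (`EchoProvider`, the string `"echo"` key) are stand-ins for illustration only, not the package's real `ModelProvider` enum or provider classes:

```python
from abc import ABC, abstractmethod


class BaseProviderSketch(ABC):
    """Stand-in for the package's BaseProvider."""

    def __init__(self, api_key):
        self.api_key = api_key

    @abstractmethod
    def get_available_models(self):
        ...


class EchoProvider(BaseProviderSketch):
    """Illustrative provider used only for this sketch."""

    def get_available_models(self):
        return ["echo-1"]


class ProviderFactorySketch:
    # Dictionary mapping provider types to provider classes,
    # mirroring the _providers attribute documented above.
    _providers = {"echo": EchoProvider}

    @classmethod
    def create_provider(cls, provider_type, api_key):
        try:
            provider_cls = cls._providers[provider_type]
        except KeyError:
            raise ValueError(f"Unsupported provider type: {provider_type!r}")
        return provider_cls(api_key)


provider = ProviderFactorySketch.create_provider("echo", api_key="sk-test")
```

Because lookup happens through the `_providers` mapping, adding a new backend only requires registering its class; callers never touch concrete provider types directly.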

classmethod get_available_providers()

Get all available providers and their models.

Return type:

Dict[str, Dict]

Returns:

Dictionary mapping provider names to provider information

class BaseProvider(api_key=None)

Bases: ABC

Abstract base class for all model providers.

This class defines the interface that all model providers must implement, including methods for generating responses, getting available models, and configuring the provider.

Parameters:

api_key (str | None)

api_key

API key for the provider

async generate_response(messages, model, **kwargs)

Generate a response from the language model.

Parameters:
  • messages (List[Dict[str, str]]) – List of message dictionaries with ‘role’ and ‘content’ keys

  • model (ModelName) – The model to use for generation

  • **kwargs – Additional arguments to pass to the model

Return type:

str

Returns:

Generated response text

Raises:

Exception – If there is an error generating the response
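The `messages` argument follows the common chat format of role/content dictionaries. A self-contained sketch of the call shape, using a dummy provider in place of a real model backend (no network calls):

```python
import asyncio


class DummyProvider:
    """Illustrative provider that echoes the last user message."""

    async def generate_response(self, messages, model="dummy-model", **kwargs):
        # Each message is a dict with 'role' and 'content' keys.
        last_user = [m for m in messages if m["role"] == "user"][-1]
        return f"echo: {last_user['content']}"


messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
reply = asyncio.run(DummyProvider().generate_response(messages))
```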

abstractmethod get_available_models()

Get a list of available models for this provider.

Return type:

List[ModelName]

Returns:

List of available model names

get_langchain_llm(model_name, **kwargs)

Get a LangChain chat model instance.

Parameters:
  • model_name (ModelName) – Name of the model to use

  • **kwargs – Additional arguments to pass to the model

Return type:

BaseChatModel

Returns:

LangChain chat model instance

class OpenAIProvider(api_key)

Bases: BaseProvider

Provider implementation for OpenAI models.

This class provides access to OpenAI’s language models, including GPT-4o, GPT-4.5, and o1 models.

Parameters:

api_key (str)

api_key

OpenAI API key

async generate_response(messages, model=ModelName.GPT4O_MINI, **kwargs)

Generate a response using an OpenAI model.

Parameters:
  • messages (List[Dict[str, str]]) – List of message dictionaries with ‘role’ and ‘content’ keys

  • model (ModelName) – The OpenAI model to use

  • **kwargs – Additional arguments to pass to the model

Return type:

str

Returns:

Generated response text

Raises:

Exception – If there is an error generating the response

get_available_models()

Get a list of available OpenAI models.

Return type:

List[ModelName]

Returns:

List of available OpenAI model names

class AnthropicProvider(api_key)

Bases: BaseProvider

Provider implementation for Anthropic Claude models.

This class provides access to Anthropic’s Claude models, including Claude 3 Opus, Sonnet, and Haiku variants.

Parameters:

api_key (str)

api_key

Anthropic API key

client

Anthropic client instance

async generate_response(messages, model=ModelName.CLAUDE_3_5_HAIKU, **kwargs)

Generate a response using an Anthropic Claude model.

Parameters:
  • messages (List[Dict[str, str]]) – List of message dictionaries with ‘role’ and ‘content’ keys

  • model (ModelName) – The Claude model to use

  • **kwargs – Additional arguments to pass to the model

Return type:

str

Returns:

Generated response text

Raises:

Exception – If there is an error generating the response

get_available_models()

Get a list of available Anthropic Claude models.

Return type:

List[ModelName]

Returns:

List of available Claude model names

get_langchain_llm(model_name, **kwargs)

Get a LangChain chat model instance for Anthropic.

Parameters:
  • model_name (ModelName) – Name of the Claude model to use

  • **kwargs – Additional arguments to pass to the model

Return type:

BaseChatModel

Returns:

LangChain chat model instance for Anthropic
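That `AnthropicProvider` overrides `get_langchain_llm` while other providers inherit the base implementation is a template-method style arrangement. A minimal sketch of the idea, with stand-in class names and string return values in place of real `BaseChatModel` instances:

```python
class BaseProviderSketch:
    """Stand-in base class; real providers return LangChain chat models."""

    def get_langchain_llm(self, model_name, **kwargs):
        # Default construction shared by most providers.
        return f"generic-llm:{model_name}"


class AnthropicProviderSketch(BaseProviderSketch):
    def get_langchain_llm(self, model_name, **kwargs):
        # Override when a backend needs provider-specific setup.
        return f"anthropic-llm:{model_name}"


generic = BaseProviderSketch().get_langchain_llm("some-model")
anthropic = AnthropicProviderSketch().get_langchain_llm("claude-model")
```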

class GroqProvider(api_key)

Bases: BaseProvider

Provider implementation for Groq models.

This class provides access to Groq’s hosted models, including Llama, Mixtral, and Gemma models.

Parameters:

api_key (str)

api_key

Groq API key

async generate_response(messages, model=ModelName.MIXTRAL, **kwargs)

Generate a response using a Groq-hosted model.

Parameters:
  • messages (List[Dict[str, str]]) – List of message dictionaries with ‘role’ and ‘content’ keys

  • model (ModelName) – The Groq model to use

  • **kwargs – Additional arguments to pass to the model

Return type:

str

Returns:

Generated response text

Raises:

Exception – If there is an error generating the response

get_available_models()

Get a list of available Groq models.

Return type:

List[ModelName]

Returns:

List of available Groq model names

class GoogleProvider(api_key)

Bases: BaseProvider

Provider implementation for Google Gemini models.

This class provides access to Google’s Gemini models, including Gemini 1.5 and Gemini 2.0 variants.

Parameters:

api_key (str)

api_key

Google AI API key

async generate_response(messages, model=ModelName.GEMINI1_5_FLASH, **kwargs)

Generate a response using a Google Gemini model.

Parameters:
  • messages (List[Dict[str, str]]) – List of message dictionaries with ‘role’ and ‘content’ keys

  • model (ModelName) – The Gemini model to use

  • **kwargs – Additional arguments to pass to the model

Return type:

str

Returns:

Generated response text

Raises:

Exception – If there is an error generating the response

get_available_models()

Get a list of available Google Gemini models.

Return type:

List[ModelName]

Returns:

List of available Gemini model names

Submodules