agentconnect.prompts.chain_factory module

Chain factory for creating LangGraph workflows.

This module provides factory functions for creating LangGraph workflows with different configurations and capabilities. It simplifies the process of creating complex agent workflows by providing pre-configured templates.

Note: The ChainFactory class is deprecated. Use the workflow functions instead.

class State

Bases: TypedDict

State type for basic conversation workflows.

messages: Annotated[Sequence[BaseMessage]]

Sequence of messages in the conversation
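A minimal sketch of how this state schema is typically declared for LangGraph, assuming the standard add_messages reducer (this page shows only the bare annotation, so the reducer is an assumption):

    from typing import Annotated, Sequence

    from langchain_core.messages import BaseMessage
    from langgraph.graph.message import add_messages
    from typing_extensions import TypedDict

    class State(TypedDict):
        # Conversation history. The add_messages reducer (assumed here) appends
        # new messages to the existing sequence instead of overwriting it.
        messages: Annotated[Sequence[BaseMessage], add_messages]
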
class ChainFactory

Bases: object

Factory for creating conversation chains.

Deprecated: Use the workflow functions instead.

static create_conversation_chain(provider_type, model_name, api_key, system_config)

Create a conversation chain with the specified configuration.

Parameters:
  • provider_type (ModelProvider) – Type of model provider to use

  • model_name (ModelName) – Name of the model to use

  • api_key (str) – API key for the provider

  • system_config (SystemPromptConfig) – Configuration for the system prompt

Return type:

Runnable

Returns:

A compiled Runnable representing the conversation chain
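Hedged usage sketch for the deprecated factory; only the call signature comes from this page, while the provider_type, model_name, api_key, and system_config values are assumed to be constructed elsewhere with the documented types:

    from agentconnect.prompts.chain_factory import ChainFactory  # deprecated

    chain = ChainFactory.create_conversation_chain(
        provider_type=provider_type,  # a ModelProvider member
        model_name=model_name,        # a ModelName supported by that provider
        api_key=api_key,              # provider API key string
        system_config=system_config,  # a configured SystemPromptConfig
    )
    # Input shape is assumed to match the State schema above.
    result = chain.invoke({"messages": [("user", "Hello!")]})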

create_agent_workflow(agent_type, system_config, llm, agent_registry=None, tools=None, prompt_templates=None, agent_id=None, custom_tools=None)

Create a workflow for an agent.

Parameters:
  • agent_type (str) – Type of agent workflow to create

  • system_config (SystemPromptConfig) – Configuration for the system prompt

  • llm (BaseChatModel) – Language model to use for the agent

  • agent_registry (Optional[AgentRegistry]) – Registry of agents for collaboration

  • tools (Optional[List[BaseTool]]) – Tools for the agent to use

  • prompt_templates (Optional[PromptTemplates]) – Templates for prompts

  • agent_id (Optional[str]) – ID of the agent

  • custom_tools (Optional[List[BaseTool]]) – Custom tools for the agent

Return type:

AgentWorkflow

Returns:

An agent workflow that can be compiled and run
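Illustrative call following the documented signature; the "ai" agent_type string and the pre-built llm, system_config, registry, and tools objects are placeholders, not values documented here:

    workflow = create_agent_workflow(
        agent_type="ai",              # assumed workflow type identifier
        system_config=system_config,  # SystemPromptConfig for the system prompt
        llm=llm,                      # any BaseChatModel instance
        agent_registry=registry,      # optional: enables agent collaboration
        tools=tools,                  # optional: extra BaseTool instances
        agent_id="assistant-1",       # optional: stable agent identifier
    )
    # The returned AgentWorkflow can then be compiled and run.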

create_collaboration_workflow(llm, agent_registry, system_prompt, memory_key='chat_history', max_iterations=10)

Create a collaboration workflow for agent-to-agent interaction.

Parameters:
  • llm (BaseChatModel) – Language model to use for the workflow

  • agent_registry (AgentRegistry) – Registry of agents for collaboration

  • system_prompt (str) – System prompt for the workflow

  • memory_key (str) – Key to use for storing chat history

  • max_iterations (int) – Maximum number of iterations for the workflow

Return type:

StateGraph

Returns:

A StateGraph representing the collaboration workflow
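Sketch using the documented defaults; the system prompt text and the pre-built llm and registry objects are placeholders. The returned StateGraph can be compiled with compile_workflow (documented below):

    graph = create_collaboration_workflow(
        llm=llm,
        agent_registry=registry,
        system_prompt="Coordinate requests between the registered agents.",
        memory_key="chat_history",  # default shown above
        max_iterations=10,          # cap on workflow iterations
    )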

create_custom_workflow(llm, nodes, edges, state_type, entry_point, tools=None)

Create a custom workflow with the specified nodes and edges.

Parameters:
  • llm (BaseChatModel) – Language model to use for the workflow

  • nodes (Dict[str, Callable]) – Dictionary mapping node names to node functions

  • edges (Dict[str, Dict[str, str]]) – Dictionary mapping source nodes to dictionaries of condition-target pairs

  • state_type (Any) – Type of state to use for the workflow

  • entry_point (str) – Name of the entry point node

  • tools (Optional[List[BaseTool]]) – Optional list of tools for the workflow

Return type:

StateGraph

Returns:

A StateGraph representing the custom workflow

Raises:

ValueError – If the entry point is not in the nodes dictionary
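Minimal sketch of the expected node and edge shapes; the node names, condition labels, and pass-through node bodies are illustrative assumptions, while the argument structure follows the documentation above:

    def draft(state):
        # Produce or update a draft in the state (details omitted).
        return state

    def review(state):
        # Review the draft and record a decision in the state (details omitted).
        return state

    nodes = {"draft": draft, "review": review}

    # Each source node maps condition labels to target node names.
    edges = {
        "draft": {"next": "review"},
        "review": {"revise": "draft"},
    }

    graph = create_custom_workflow(
        llm=llm,
        nodes=nodes,
        edges=edges,
        state_type=State,     # e.g. the State TypedDict defined above
        entry_point="draft",  # must be a key of nodes, otherwise ValueError
    )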

compile_workflow(workflow, config=None)

Compile a workflow into a runnable.

Parameters:
  • workflow (StateGraph) – StateGraph to compile

  • config (Optional[Dict[str, Any]]) – Optional configuration for the runnable

Return type:

Runnable

Returns:

A compiled Runnable
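Usage sketch; the invoke input shape is assumed to match the State schema above, and config is documented only as an optional Dict[str, Any]:

    runnable = compile_workflow(graph)
    result = runnable.invoke({"messages": [("user", "Summarize the plan.")]})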

create_runnable_from_workflow(workflow, config=None)

Create a runnable from a workflow with the specified configuration.

Parameters:
  • workflow (StateGraph) – StateGraph to create a runnable from

  • config (Optional[RunnableConfig]) – Optional configuration for the runnable

Return type:

Runnable

Returns:

A Runnable that can be used to execute the workflow
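Sketch using the standard LangChain RunnableConfig mapping; the tags and metadata values are arbitrary examples, and the streaming call relies only on the generic Runnable interface:

    from langchain_core.runnables import RunnableConfig

    config = RunnableConfig(tags=["agentconnect"], metadata={"demo": True})
    runnable = create_runnable_from_workflow(graph, config=config)
    for chunk in runnable.stream({"messages": [("user", "Hi there")]}):
        print(chunk)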