praval.decorators

Decorator-based Agent API for Praval Framework.

This module provides a Pythonic decorator interface for creating agents that automatically handle reef communication and coordination.

Example:

from praval import agent, chat, broadcast, start_agents, get_reef

@agent("explorer", responds_to=["concept_request"])
def explore_concepts(spore):
    concepts = chat("Find concepts related to: " + spore.knowledge.get("concept", ""))
    broadcast({"type": "discovery", "discovered": concepts.split(",")})
    return {"discovered": concepts}

# Start the agent system
start_agents(explore_concepts, initial_data={"type": "concept_request", "concept": "AI"})
get_reef().wait_for_completion()
get_reef().shutdown()

Functions

achat(message[, timeout])

Async version of the chat() function, for use within async agent handlers.

agent([name, channel, system_message, ...])

Decorator that turns a function into an autonomous agent.

broadcast(data[, channel, message_type])

Quick broadcast function that uses the current agent's communication.

chat(message[, timeout])

Quick chat function that uses the current agent's LLM with timeout support.

get_agent_info(agent_func)

Get information about an @agent decorated function.

praval.decorators.agent(name=None, channel=None, system_message=None, auto_broadcast=True, responds_to=None, memory=False, knowledge_base=None, on_error='log')[source]

Decorator that turns a function into an autonomous agent.

Parameters:
  • name (Optional[str]) – Agent name (defaults to function name)

  • channel (Optional[str]) – Channel to subscribe to (defaults to name + “_channel”)

  • system_message (Optional[str]) – System message (defaults to function docstring)

  • auto_broadcast (bool) – Whether to auto-broadcast return values

  • responds_to (Optional[List[str]]) – List of message types this agent responds to (None = all messages)

  • memory (Union[bool, Dict[str, Any]]) – Memory configuration - True for defaults, dict for custom config, False to disable

  • knowledge_base (Optional[str]) – Path to knowledge base files for auto-indexing

  • on_error (Union[str, Callable[[Exception, Any], None]]) – Error handling strategy:
      - "log" (default): Log error and continue processing
      - "raise": Re-raise exception to caller
      - "ignore": Silently ignore errors (not recommended)
      - callable: Custom error handler function(exception, spore)

Examples:

# Basic agent with message filtering
@agent("explorer", responds_to=["concept_request"])
def explore_concepts(spore):
    '''Find related concepts and broadcast discoveries.'''
    concepts = chat("Related to: " + spore.knowledge.get("concept", ""))
    return {"type": "discovery", "discovered": concepts.split(",")}

Memory-enabled agent:

@agent("researcher", memory=True)
def research_agent(spore):
    '''Research agent with memory capabilities.'''
    query = spore.knowledge.get("query")
    research_agent.remember(f"Researched: {query}")
    past_research = research_agent.recall(query)
    return {"research": "completed", "past_similar": len(past_research)}

Agent with knowledge base:

@agent("expert", memory=True, knowledge_base="./knowledge/")
def expert_agent(spore):
    '''Expert with pre-loaded knowledge base.'''
    question = spore.knowledge.get("question")
    relevant = expert_agent.recall(question, limit=3)
    return {"answer": [r.content for r in relevant]}

Agent with custom error handling:

def my_error_handler(error, spore):
    print(f"Error in agent: {error}")
    # Custom recovery logic here

@agent("processor", on_error=my_error_handler)
def process_agent(spore):
    '''Process with custom error handling.'''
    return {"processed": True}

praval.decorators.chat(message, timeout=10.0)[source]

Quick chat function that uses the current agent’s LLM with timeout support. Can only be used within @agent decorated functions.

Parameters:
  • message (str) – Message to send to the LLM

  • timeout (float) – Maximum time to wait for response in seconds

Return type:

str

Returns:

LLM response as string

Raises:

RuntimeError – If called outside of an @agent function
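
Example (a sketch; the agent and knowledge field names are illustrative):

@agent("writer")
def write_agent(spore):
    '''Draft a short reply using the agent's LLM, with a longer timeout.'''
    topic = spore.knowledge.get("topic", "")
    draft = chat("Write two sentences about: " + topic, timeout=30.0)
    return {"draft": draft}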

async praval.decorators.achat(message, timeout=10.0)[source]

Async version of the chat() function, for use within async agent handlers.

Parameters:
  • message (str) – Message to send to the LLM

  • timeout (float) – Maximum time to wait for response in seconds

Return type:

str

Returns:

LLM response as string

Raises:

RuntimeError – If called outside of an @agent function
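
Example (a sketch, assuming the @agent decorator accepts async handlers as this function's description implies; names are illustrative):

@agent("async_writer")
async def async_write_agent(spore):
    '''Draft a short reply from an async handler using the agent's LLM.'''
    topic = spore.knowledge.get("topic", "")
    draft = await achat("Write two sentences about: " + topic, timeout=30.0)
    return {"draft": draft}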

praval.decorators.broadcast(data, channel=None, message_type=None)[source]

Quick broadcast function that uses the current agent’s communication. Can only be used within @agent decorated functions.

Parameters:
  • data (Dict[str, Any]) – Data to broadcast

  • channel (Optional[str]) – Channel to broadcast to. Defaults to the channel set by start_agents(), or reef’s default channel if not in a start_agents() context.

  • message_type (Optional[str]) – Message type to set (automatically added to data)

Return type:

str

Returns:

Spore ID of the broadcast message

Raises:

RuntimeError – If called outside of an @agent function

Example

# Broadcast to all agents on the same channel (set by start_agents)
broadcast({"type": "analysis_request", "data": findings})

# Broadcast to a specific channel
broadcast({"type": "alert"}, channel="urgent_alerts")
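
# Alternatively, pass message_type instead of embedding "type" in the data dict;
# per the parameter description it is added to the data automatically (a sketch)
broadcast({"results": findings}, message_type="analysis_complete")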

praval.decorators.get_agent_info(agent_func)[source]

Get information about an @agent decorated function.

Parameters:

agent_func (Callable) – Function decorated with @agent

Return type:

Dict[str, Any]

Returns:

Dictionary with agent metadata
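
Example (a usage sketch; the exact keys of the returned dictionary are not documented here, so the key names below are illustrative assumptions):

info = get_agent_info(explore_concepts)

# Inspect agent metadata; key names are assumptions, not documented guarantees
print(info.get("name"))         # e.g. "explorer"
print(info.get("responds_to"))  # e.g. ["concept_request"]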