The InvokeAgent class is the primary interface between your LLM and real-world APIs. It manages chat history, tools, and agent memory, turning natural-language queries into structured API calls via tool-calling agents.
```python
from invoke_agent.agent import InvokeAgent
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o")
agent = InvokeAgent(llm, agents=["open-meteo"])
response = agent.chat("What's the weather in Berlin?")
print(response)
```

- No `agents` argument? If an `agents_map.yaml` exists in your working directory, it will be auto-loaded.
- You can pass explicit file paths or URLs directly (no `{name: path}` mapping needed):

```python
agent = InvokeAgent(llm, agents=[
    "./custom/calendar.json",
    "https://example.com/my_agents.txt",
    "google-calendar",
])
```
```python
InvokeAgent(
    llm: BaseChatModel,
    agents: Optional[Union[str, List[Union[str, dict]]]] = None,
    tools: Optional[List[BaseTool]] = None,
    context: Optional[str] = None,
    verbose: bool = False,
)
```

| Parameter | Description |
|---|---|
| `llm` | A LangChain-compatible chat model (e.g., `ChatOpenAI`, `ChatAnthropic`). |
| `agents` | `None` (default): loads `agents_map.yaml` if present, otherwise no integrations. `str`: path to a YAML map file. `List[Union[str, dict]]`: each entry may be a built-in alias (`"google-calendar"`), a bare file path or URL (name inferred from the JSON or filename), or an explicit mapping `{ "alias": "path_or_url" }`. |
| `tools` | Optional extra tools (defaults to `[api_executor]`). |
| `context` | Optional custom system prompt; if omitted, the internal `DEFAULT_CONTEXT` is used. Integrations are still appended afterward. |
| `verbose` | Whether to log intermediate steps (context building, tool calls, etc.). |
- **Context Assembly**: calls `build_context()` with your `context` (or the default) and your `agents` list/YAML.
- **Agent Wiring**: uses LangChain's `create_tool_calling_agent` plus your tools (defaults to `api_executor`).
- **Chat History**: maintains `SystemMessage`, `HumanMessage`, and `AIMessage` history for multi-turn coherence.
- **Tool Execution**: when the LLM emits a function call (`api_executor`), it performs the HTTP request and feeds the result back.
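The chat-history bookkeeping in the steps above can be sketched with plain role-tagged records. This is a simplified stand-in, not the library's code; the real implementation uses LangChain's `SystemMessage`, `HumanMessage`, and `AIMessage` classes:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Message:
    role: str      # "system", "human", or "ai"
    content: str

@dataclass
class ChatHistory:
    """Minimal stand-in for the System/Human/AI message log."""
    messages: List[Message] = field(default_factory=list)

    def add(self, role: str, content: str) -> None:
        self.messages.append(Message(role, content))

# Seed with the assembled context, then alternate human/ai turns;
# each chat() call appends to this log, giving multi-turn coherence.
history = ChatHistory()
history.add("system", "You are an API-calling assistant.")
history.add("human", "What's the weather in Berlin?")
history.add("ai", "It is 18 degrees Celsius and sunny.")
```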
```python
response = agent.chat("Book me a meeting at 2pm next Friday.")
```

Returns the model's final answer (after any necessary API calls). Each call updates the internal chat history.
To add custom tools:
```python
# my_custom_tools / my_tool are placeholders for your own LangChain tool;
# api_executor is the library's default HTTP-request tool.
from my_custom_tools import my_tool

agent = InvokeAgent(
    llm,
    agents=["open-meteo"],
    tools=[api_executor, my_tool],
)
```