# Integrations
Connect lionagi with external services, tools, and frameworks.
## Core
- LLM Providers -- OpenAI, Anthropic, Gemini, Ollama, and more
- Tools -- Turn any Python function into an LLM-callable tool
- MCP Servers -- Connect to Model Context Protocol tool servers
## Data
- Databases -- Serialize Branch state; use pydapter for storage adapters
- Vector Stores -- Use embeddings and vector search with tools
## External Frameworks
lionagi does not have native integrations with other AI frameworks, but you can wrap any framework's functionality as a lionagi tool:
- DSPy -- Wrap DSPy modules as tools
- LlamaIndex -- Wrap LlamaIndex query engines as tools
## Integration Pattern
Any Python function (sync or async) can become a tool:
```python
from lionagi import Branch

branch = Branch(tools=[your_function])
result = await branch.operate(instruction="...", actions=True)
```
This works with any external library -- database clients, HTTP APIs, vector stores, or other AI frameworks. See Tools for details.
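As a concrete sketch, any plain function with type hints and a docstring can serve as a tool; lionagi derives the tool's schema from the signature. The `get_weather` function below is purely illustrative (a stub, not a real API), and the Branch usage is shown in comments since it requires a configured model provider:

```python
def get_weather(city: str) -> str:
    """Return a one-line weather report for a city (stubbed)."""
    # A real tool would call an HTTP client, database, or framework here.
    return f"Sunny in {city}"

# Registering it as a tool (requires lionagi and provider credentials):
#   from lionagi import Branch
#   branch = Branch(tools=[get_weather])
#   result = await branch.operate(
#       instruction="What's the weather in Paris?",
#       actions=True,
#   )
```

The same shape applies whether the function body wraps a DSPy module, a LlamaIndex query engine, or any other client: the function signature is the integration boundary.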