Integrations¶
For Production Systems
This section shows how to connect LionAGI to real-world systems: databases, APIs, tools, and other AI frameworks, so you can build comprehensive AI applications on top of external services.
Available Integrations¶
Core Infrastructure¶
- LLM Providers - OpenAI, Anthropic, local models
- Vector Stores - Qdrant, Pinecone, Chroma
- Databases - PostgreSQL, MongoDB, Redis
- Tools - External APIs and services
- MCP Servers - Model Context Protocol integration
AI Framework Orchestration¶
- LlamaIndex RAG - Wrap RAG capabilities as operations
- DSPy Optimization - Embed prompt optimization
Meta-Orchestration¶
LionAGI's key advantage: orchestrate any existing AI framework without code changes.
```python
# Your existing framework code runs unchanged
builder.add_operation(operation=your_existing_workflow)
```
This works with:
- LangChain workflows
- CrewAI crews
- AutoGen conversations
- Custom Python functions
- External APIs
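As a minimal sketch of this pattern (all names here are hypothetical stand-ins, not LionAGI's actual API): an existing synchronous workflow from any framework can be wrapped as an async operation, which is the shape `builder.add_operation(operation=...)` expects.

```python
import asyncio

def legacy_summarize(text: str) -> str:
    # Stand-in for an existing framework workflow (a LangChain chain,
    # a CrewAI crew, or any plain Python function).
    return text[:40] + "..." if len(text) > 40 else text

async def summarize_operation(text: str) -> str:
    # Wrap the blocking call in a worker thread so the orchestrator's
    # event loop stays free while the legacy code runs.
    return await asyncio.to_thread(legacy_summarize, text)

result = asyncio.run(summarize_operation(
    "LionAGI orchestrates existing frameworks without code changes."
))
print(result)
```

The existing function is untouched; only a thin async adapter is added around it.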
Integration Patterns¶
- Wrapper Operations: Embed external tools as LionAGI operations
- Multi-Framework: Coordinate multiple frameworks in single workflow
- Gradual Migration: Keep existing code while gaining orchestration benefits
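The multi-framework pattern can be sketched as follows, with hypothetical placeholder functions standing in for operations from two different frameworks; the point is that one workflow composes them, with the output of one feeding the next.

```python
import asyncio

async def retrieve(query: str) -> list:
    # Hypothetical retrieval step, e.g. backed by a LlamaIndex index.
    return [f"doc about {query}"]

async def generate(docs: list) -> str:
    # Hypothetical generation step, e.g. backed by a LionAGI branch.
    return f"answer based on {len(docs)} document(s)"

async def multi_framework_pipeline(query: str) -> str:
    # Coordinate both frameworks in a single workflow.
    docs = await retrieve(query)
    return await generate(docs)

answer = asyncio.run(multi_framework_pipeline("vector stores"))
print(answer)
```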
When You Need Integrations¶
Use Integrations When:
- Persistent data: Need to store workflow results in databases
- External knowledge: RAG systems with vector stores and knowledge bases
- Tool augmentation: Agents need access to APIs, calculators, or specialized services
- Framework combination: Want to orchestrate existing LangChain/CrewAI workflows
- Production deployment: Need monitoring, logging, and enterprise infrastructure
Getting Started¶
Integration Strategy
1. Start simple: Begin with LLM Providers and Tools
2. Add persistence: Connect Databases for workflow state
3. Scale up: Add Vector Stores for knowledge-intensive workflows
4. Orchestrate: Integrate existing frameworks with meta-orchestration patterns
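The first two steps of this strategy can be sketched in plain Python (everything here is a hypothetical stub, not LionAGI's API): a simple tool an agent could call, plus a database that persists the workflow result.

```python
import sqlite3

def add_tool(a: float, b: float) -> float:
    # Step 1 "tool": a trivial function an agent could be given access to.
    return a + b

# Step 2 "persistence": store workflow results so later runs can read them.
# An in-memory SQLite database stands in for a real PostgreSQL/Redis store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (step TEXT, value REAL)")

value = add_tool(2, 3)
conn.execute("INSERT INTO results VALUES (?, ?)", ("add", value))

row = conn.execute("SELECT value FROM results WHERE step = 'add'").fetchone()
print(row[0])
```

Swapping the stubs for real providers, tools, and databases follows the same shape: compute a result in one operation, persist it, and let later operations read the stored state.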