
Installation

Recommended: Using uv

LionAGI works best with uv for fast, reliable dependency management:

uv add lionagi

Alternative: pip

You can also install with pip, though uv resolves and installs dependencies faster:

pip install lionagi
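Either way, you can confirm the package installed correctly before configuring a provider. A quick check from Python, using only the standard library:

import importlib.metadata

# Prints the installed lionagi version, or raises PackageNotFoundError if it isn't installed.
print(importlib.metadata.version("lionagi"))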

Set Your API Key

API Key Required

LionAGI needs an LLM provider API key to work. Choose one:

# OpenAI
export OPENAI_API_KEY=your-key-here

# Anthropic
export ANTHROPIC_API_KEY=your-key-here

Or skip the API key entirely and run models locally: just install Ollama.
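Before running the test below, you can confirm from Python that a key is visible to the process. A minimal check using only the standard library (skip this if you're using Ollama):

import os

# Sanity check: at least one provider key should be visible to this process.
if not (os.getenv("OPENAI_API_KEY") or os.getenv("ANTHROPIC_API_KEY")):
    raise RuntimeError("No LLM provider API key found in the environment.")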

Test Installation

Verify Everything Works

from lionagi import Branch, iModel
import asyncio

async def test():
    # Create an agent backed by an OpenAI chat model.
    agent = Branch(chat_model=iModel(provider="openai", model="gpt-4o-mini"))
    # Send a message and print the model's reply.
    response = await agent.communicate("Hello")
    print(f"LionAGI says: {response}")

asyncio.run(test())
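A successful run prints a short greeting from the model; an authentication error usually means the API key from the previous step isn't set in the current shell.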

Provider Options

# OpenAI
iModel(provider="openai", model="gpt-4o-mini")

# Anthropic  
iModel(provider="anthropic", model="claude-3-5-sonnet-20241022")

# Local (Ollama)
iModel(provider="ollama", model="llama3")
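Any of these can be passed as chat_model when constructing a Branch. For example, to point the same check at a local model, a sketch assuming Ollama is installed, running, and has llama3 pulled:

from lionagi import Branch, iModel
import asyncio

async def test_local():
    # Same pattern as the test above, but backed by a local Ollama server.
    agent = Branch(chat_model=iModel(provider="ollama", model="llama3"))
    print(await agent.communicate("Hello"))

asyncio.run(test_local())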