
Minimal Usage

This example shows the simplest possible RecoAgent setup - just a few lines of code to get started with RAG.

Overview

This example demonstrates:

  • Basic RecoAgent initialization
  • Simple question answering
  • Minimal configuration required
  • Foundation for more complex examples

Prerequisites

  • Python 3.8+
  • RecoAgent installed (pip install recoagent)
  • OpenAI API key (or another LLM provider)

Code Implementation

Here's the complete minimal example:

import os
from dotenv import load_dotenv
from recoagent import RecoAgent

# Load environment variables
load_dotenv()

def main():
    # Initialize RecoAgent with minimal configuration
    agent = RecoAgent(
        llm_provider="openai",
        llm_model="gpt-3.5-turbo",
        embedding_model="text-embedding-ada-002"
    )

    # Add some sample knowledge
    documents = [
        "RecoAgent is an enterprise RAG platform built with LangGraph and LangChain.",
        "It supports hybrid retrieval combining BM25 and vector embeddings.",
        "The platform includes built-in evaluation metrics and safety guardrails.",
        "RecoAgent supports multiple vector stores including OpenSearch, Azure AI Search, and Vertex AI.",
        "The system includes LangSmith integration for comprehensive observability and tracing."
    ]

    agent.add_documents(documents)
    print(f"āœ… Added {len(documents)} documents to knowledge base")

    # Ask a question
    question = "What is RecoAgent?"
    response = agent.ask(question)

    # Print the response
    print(f"\nšŸ¤” Question: {question}")
    print(f"šŸ’” Answer: {response.answer}")
    print(f"šŸŽÆ Confidence: {response.confidence:.2f}")
    print(f"šŸ“š Sources: {len(response.sources)} documents")

    # Show source information
    if response.sources:
        print("\nšŸ“– Source Documents:")
        for i, source in enumerate(response.sources, 1):
            print(f"  {i}. {source[:80]}...")

if __name__ == "__main__":
    main()

Running the Example

  1. Install RecoAgent:

    pip install recoagent
  2. Set up your API key (or use a .env file, as shown after these steps):

    export OPENAI_API_KEY="your-api-key-here"
  3. Run the example:

    python minimal_usage.py
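
Because the script calls load_dotenv(), you can instead keep the key in a .env file next to minimal_usage.py rather than exporting it in every shell:

OPENAI_API_KEY="your-api-key-here"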

Expected Output

āœ… Added 5 documents to knowledge base

šŸ¤” Question: What is RecoAgent?
šŸ’” Answer: RecoAgent is an enterprise RAG platform built with LangGraph and LangChain. It supports hybrid retrieval combining BM25 and vector embeddings, and includes built-in evaluation metrics and safety guardrails.
šŸŽÆ Confidence: 0.95
šŸ“š Sources: 3 documents

The exact answer wording, confidence score, and source count will vary from run to run.

Understanding the Code

Key Components

  1. RecoAgent Initialization:

    agent = RecoAgent()
    • Uses the default configuration when called without arguments (the example above passes the same defaults explicitly)
    • Automatically sets up vector store and LLM
    • Requires minimal setup
  2. Document Addition:

    agent.add_documents([...])
    • Adds documents to the knowledge base
    • Automatically processes and indexes them
    • Supports various document formats
  3. Question Answering:

    response = agent.ask("What is RecoAgent?")
    • Processes the question through the RAG pipeline
    • Returns structured response with answer and metadata
    • Includes confidence score and source references (combined in the sketch after this list)
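
The pieces above can be combined into a small end-to-end helper. The sketch below uses only the API shown in this example (add_documents() taking a list of strings, and ask() returning an object with answer, confidence, and sources); reading documents from *.txt files and the 0.7 confidence threshold are illustrative choices, not RecoAgent defaults.

from pathlib import Path

from recoagent import RecoAgent

def answer_from_files(question: str, docs_dir: str = "docs") -> None:
    # Load plain-text documents from a local folder (any list of strings works)
    documents = [p.read_text(encoding="utf-8") for p in Path(docs_dir).glob("*.txt")]

    agent = RecoAgent()  # default configuration (see below)
    agent.add_documents(documents)

    response = agent.ask(question)

    # Use the confidence score to decide whether to surface the answer
    # (0.7 is an arbitrary example threshold, not a RecoAgent default)
    if response.confidence >= 0.7:
        print(f"Answer: {response.answer}")
        print(f"Backed by {len(response.sources)} source document(s)")
    else:
        print("Low-confidence answer; add more documents or rephrase the question.")

answer_from_files("What is RecoAgent?")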

Default Configuration

When you initialize RecoAgent() without parameters, it uses these defaults (an explicit equivalent is sketched after the list):

  • LLM: OpenAI GPT-3.5-turbo
  • Embeddings: OpenAI text-embedding-ada-002
  • Vector Store: In-memory store (for development)
  • Retrieval: Hybrid retrieval (BM25 + vector)
  • Chunking: 500 tokens with 50 token overlap
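
The no-argument call is roughly equivalent to spelling these defaults out. This sketch uses only the constructor parameters shown elsewhere on this page (llm_provider, llm_model, embedding_model, chunk_size, chunk_overlap); the vector store and retrieval defaults are applied internally and have no constructor parameters documented here.

from recoagent import RecoAgent

# Default setup
agent_default = RecoAgent()

# Roughly equivalent explicit setup (based on the defaults listed above)
agent_explicit = RecoAgent(
    llm_provider="openai",                     # LLM: OpenAI GPT-3.5-turbo
    llm_model="gpt-3.5-turbo",
    embedding_model="text-embedding-ada-002",  # Embeddings: text-embedding-ada-002
    chunk_size=500,                            # Chunking: 500 tokens...
    chunk_overlap=50                           # ...with 50 token overlap
)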

Variations and Modifications

Custom Configuration

from recoagent import RecoAgent

# Initialize with custom configuration
agent = RecoAgent(
    llm_provider="openai",
    llm_model="gpt-4",
    embedding_model="text-embedding-ada-002",
    chunk_size=1000,
    chunk_overlap=100
)

print("RecoAgent initialized with custom configuration")

Multiple Questions

# Ask multiple questions
questions = [
    "What is RecoAgent?",
    "What retrieval methods does it support?",
    "What safety features are included?"
]

for question in questions:
    response = agent.ask(question)
    print(f"Q: {question}")
    print(f"A: {response.answer}")
    print(f"Confidence: {response.confidence}")
    print("---")

Error Handling

try:
    response = agent.ask("What is RecoAgent?")
    print(f"Answer: {response.answer}")
except Exception as e:
    print(f"Error: {e}")
    print("Make sure your API key is set correctly")

Common Issues and Solutions

API Key Not Set

Error: OpenAI API key not found

Solution: Set your API key as an environment variable:

export OPENAI_API_KEY="your-api-key-here"

Import Error

ImportError: No module named 'recoagent'

Solution: Install RecoAgent:

pip install recoagent

Rate Limit Exceeded

Error: Rate limit exceeded

Solution: Wait a moment and try again, or upgrade your OpenAI plan. A simple retry helper is sketched below.
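
If rate limits are frequent, wrapping agent.ask() in a retry with exponential backoff usually helps. The helper below is a generic Python sketch; it retries on any exception because this page does not document a specific rate-limit exception type for RecoAgent.

import time

def ask_with_retry(agent, question, max_attempts=3, base_delay=2.0):
    """Retry agent.ask() with exponential backoff (2s, 4s, 8s, ...)."""
    for attempt in range(1, max_attempts + 1):
        try:
            return agent.ask(question)
        except Exception as exc:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            wait = base_delay * 2 ** (attempt - 1)
            print(f"Attempt {attempt} failed ({exc}); retrying in {wait:.0f}s...")
            time.sleep(wait)

response = ask_with_retry(agent, "What is RecoAgent?")
print(response.answer)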

Next Steps

Now that you have a working minimal example, try these next steps:

  1. Simple RAG Example - Add more documents and complex questions (coming soon)
  2. Basic Agent Example - Create your first LangGraph agent (coming soon)
  3. Vector Search Example - Explore vector similarity search (coming soon)
  4. How-To: Custom Configuration - Set up production vector stores (coming soon)
  • Quickstart Tutorial - Step-by-step tutorial
  • Installation Guide - Complete setup guide (coming soon)
  • API Reference - Detailed API documentation
  • Community Examples - More examples from the community