Compass

In Progress

AI Productivity System

Python · Multi-Agent · MCP · Redis
Role: Lead Developer
Duration: 3 months

Overview

Compass is a sophisticated AI productivity system that leverages multiple domain-specific agents to handle complex tasks. The system uses a custom Model Context Protocol (MCP) server to orchestrate communication between agents, enabling seamless collaboration on multi-step workflows. Each agent is specialized for specific domains—research, coding, analysis, and communication—allowing for expert-level performance across diverse tasks.

Architecture

The system follows a hub-and-spoke architecture where the MCP server acts as the central coordinator. Incoming requests are analyzed by a router agent that determines the optimal execution path. Domain agents communicate through the MCP server using a standardized message protocol, with Redis providing both caching and message queuing capabilities.
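The hub-and-spoke flow above can be sketched in a few lines. This is an illustrative stand-in, not the actual Compass code: the `AgentMessage` fields and `MCPServer` class names are assumptions about what a standardized message protocol with a central dispatcher might look like.

```python
from dataclasses import dataclass, field

# Hypothetical standardized message envelope; field names are illustrative.
@dataclass
class AgentMessage:
    sender: str
    recipient: str
    task: str
    context: dict = field(default_factory=dict)

class MCPServer:
    """Central hub: agents register handlers, and every message
    is routed through the server rather than agent-to-agent."""

    def __init__(self):
        self.agents = {}

    def register(self, name, handler):
        self.agents[name] = handler

    def dispatch(self, msg: AgentMessage):
        # All inter-agent traffic passes through this single point,
        # which is where caching and queuing (Redis) would hook in.
        return self.agents[msg.recipient](msg)

hub = MCPServer()
hub.register("research", lambda m: f"research handled: {m.task}")
result = hub.dispatch(AgentMessage("router", "research", "summarize findings"))
```

Centralizing dispatch this way is what makes the caching and queuing layers possible: there is exactly one choke point to instrument.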

Key Features

Domain-Specific Agents

Specialized agents for research, coding, analysis, and communication tasks, each fine-tuned for optimal performance in their domain.

Custom MCP Server

A Model Context Protocol server that orchestrates agent communication, manages context windows, and handles tool invocations efficiently.

Redis Caching Layer

Intelligent caching system that stores frequently used tool results and context, dramatically reducing API calls and response times.

Dynamic Task Routing

Smart routing system that analyzes incoming requests and delegates to the most appropriate agent based on task complexity and domain.
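A minimal sketch of such a router, assuming a simple keyword heuristic; the real system presumably uses an LLM-based classifier, and the keyword table here is entirely illustrative.

```python
# Illustrative domain -> trigger-word table; a production router would
# likely classify with a model rather than substring matching.
DOMAIN_KEYWORDS = {
    "coding": ("bug", "function", "refactor"),
    "research": ("paper", "survey", "sources"),
    "analysis": ("trend", "metric", "dataset"),
}

def route(request: str) -> str:
    """Return the name of the agent best suited to handle the request."""
    text = request.lower()
    for domain, words in DOMAIN_KEYWORDS.items():
        if any(word in text for word in words):
            return domain
    # Fall back to the general-purpose communication agent.
    return "communication"
```

For example, `route("Fix this bug in my parser")` delegates to the coding agent, while an unmatched request falls through to the communication agent.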

Tech Stack

Backend

Python
Core language for agent orchestration and business logic
Go
Core business logic
FastAPI
High-performance async API framework

AI/ML

MCP (Model Context Protocol)
Protocol for standardized AI model communication
AutoGen
Agentic framework for building AI applications
vLLM
LLM inference engine for local model inference

Database

Redis
High-performance caching and session management
PostgreSQL
Relational database
MongoDB
NoSQL database

DevOps

Docker
Containerization for consistent deployment

Challenges & Solutions

Challenge

High latency from repeated tool invocations across multiple agents, causing slow response times for complex queries.

Solution

Implemented a multi-level Redis caching strategy that stores tool results with intelligent TTL based on data volatility, reducing redundant API calls by 60%.
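The volatility-aware TTL idea can be sketched as follows. A plain dict stands in for Redis here so the example is self-contained, and the volatility tiers and TTL values are assumptions; in production this would map onto Redis `SET` with an `EX` expiry.

```python
import time

# Illustrative volatility tiers -> TTL in seconds (assumed values).
TTL_BY_VOLATILITY = {"static": 3600, "slow": 300, "fast": 30}

class ToolCache:
    """TTL cache for tool results. A dict stands in for Redis;
    the clock is injectable so expiry is testable."""

    def __init__(self, clock=time.monotonic):
        self._store = {}
        self._clock = clock

    def set(self, key, value, volatility="slow"):
        expires = self._clock() + TTL_BY_VOLATILITY[volatility]
        self._store[key] = (value, expires)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if self._clock() >= expires:
            # Stale result: evict and force a fresh tool invocation.
            del self._store[key]
            return None
        return value
```

Keying results by tool name plus arguments means any agent asking the same question within the TTL window gets the cached answer instead of triggering another API call.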

Challenge

Token usage exploding due to context being passed between multiple agents for multi-step tasks.

Solution

Designed a context compression system that summarizes relevant information before inter-agent communication, cutting token usage by 50% while maintaining accuracy.
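A toy version of the idea: before a message crosses the hub, older turns are shortened while the most recent turns pass through intact. The real system summarizes with an LLM; simple truncation stands in here, and the `keep_recent`/`max_chars` parameters are illustrative.

```python
def compress_context(messages, keep_recent=2, max_chars=80):
    """Shorten older conversation turns before inter-agent handoff.

    The newest `keep_recent` turns are kept verbatim; older turns
    longer than `max_chars` are truncated (an LLM summarizer would
    replace this truncation step in practice).
    """
    compressed = []
    cutoff = len(messages) - keep_recent
    for i, msg in enumerate(messages):
        if i < cutoff and len(msg) > max_chars:
            msg = msg[:max_chars].rstrip() + "…"
        compressed.append(msg)
    return compressed
```

The key property is that recency is preserved exactly while the long tail of history shrinks, which is where most of the token savings come from.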

Challenge

Difficulty in maintaining consistent behavior across domain-specific agents.

Solution

Created a shared behavioral framework with standardized prompts and output formats, ensuring consistent user experience regardless of which agent handles the request.
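One way such a shared framework can look in code, assuming a common prompt template that each domain agent fills in; the template wording and the JSON output schema below are hypothetical, not Compass's actual prompts.

```python
# Hypothetical shared behavioral template; the style rules and the
# output schema are illustrative, not the project's real prompts.
BASE_PROMPT = (
    "You are the {domain} agent.\n"
    "Follow the shared style guide: be concise, cite tool results, "
    "and never answer outside your domain.\n"
    'Respond only in the standard format: '
    '{{"summary": ..., "details": ..., "confidence": ...}}'
)

def build_prompt(domain: str) -> str:
    """Instantiate the shared template for one domain agent."""
    return BASE_PROMPT.format(domain=domain)
```

Because every agent inherits the same rules and output schema, downstream consumers can parse responses uniformly no matter which agent produced them.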

Results

Reduced response latency to <100ms via Redis caching
Cut API costs by 30% by bypassing redundant inference
Decreased input token usage by 50% using dynamic tool retrieval
95% task completion accuracy across all domains