Note: This is a fork of Karpathy's llm-council with added support for Microsoft Foundry and Azure deployment options.
The idea of this repo is that instead of asking a question to your favorite LLM provider (e.g. OpenAI GPT 5.2, Mistral Large 3, DeepSeek, Kimi K2, xAI Grok 4, etc.), you can group them into your "LLM Council". This repo is a simple, local web app that essentially looks like ChatGPT, except it uses OpenRouter to send your query to multiple LLMs; it then asks them to review and rank each other's work, and finally a Chairman LLM produces the final response.
In a bit more detail, here is what happens when you submit a query:
- Stage 1: First opinions. The user query is given to all LLMs individually, and the responses are collected. The individual responses are shown in a "tab view", so that the user can inspect them all one by one.
- Stage 2: Review. Each individual LLM is given the responses of the other LLMs. Under the hood, the LLM identities are anonymized so that no LLM can play favorites when judging the outputs. Each LLM is then asked to rank them by accuracy and insight.
- Stage 3: Final response. The designated Chairman of the LLM Council takes all of the models' responses and compiles them into a single final answer that is presented to the user.
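As a rough illustration of the three stages, here is a minimal sketch in Python, assuming an async httpx client talking to the OpenRouter chat-completions endpoint. The model names, prompts, and function names are illustrative and not the repo's actual code:

```python
# Minimal sketch of the three council stages (illustrative, not the repo's actual code).
import asyncio
import os

import httpx

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
COUNCIL_MODELS = ["model-a", "model-b", "model-c"]  # hypothetical model names
CHAIRMAN_MODEL = "model-chairman"                   # hypothetical

async def ask(client: httpx.AsyncClient, model: str, prompt: str) -> str:
    """One chat-completion call against the OpenRouter API."""
    resp = await client.post(
        OPENROUTER_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

async def council(query: str) -> str:
    async with httpx.AsyncClient() as client:
        # Stage 1: every council member answers the query independently.
        answers = await asyncio.gather(*(ask(client, m, query) for m in COUNCIL_MODELS))

        # Stage 2: each member ranks the anonymized (numbered, unattributed) answers.
        numbered = "\n\n".join(f"Response {i + 1}:\n{a}" for i, a in enumerate(answers))
        review = f"Rank the following responses by accuracy and insight:\n\n{numbered}"
        rankings = await asyncio.gather(*(ask(client, m, review) for m in COUNCIL_MODELS))

        # Stage 3: the Chairman compiles answers and rankings into one final response.
        final = (
            f"Question: {query}\n\nCandidate responses:\n{numbered}\n\n"
            "Rankings:\n" + "\n\n".join(rankings) + "\n\nWrite the single best final answer."
        )
        return await ask(client, CHAIRMAN_MODEL, final)
```

Note that the anonymization in Stage 2 comes from passing numbered responses with no model names attached.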
This project was 99% vibe coded as a fun Saturday hack because I wanted to explore and evaluate a number of LLMs side by side in the process of reading books together with LLMs. It's nice and useful to see multiple responses side by side, and also the cross-opinions of all LLMs on each other's outputs. I'm not going to support it in any way, it's provided here as is for other people's inspiration and I don't intend to improve it. Code is ephemeral now and libraries are over, ask your LLM to change it in whatever way you like.
The project uses uv for project management.
Backend:

```bash
uv sync
```

Frontend:

```bash
cd frontend
npm install
cd ..
```

The application supports two providers: OpenRouter (default) and Microsoft Foundry.
Create a .env file in the project root:
```
PROVIDER=openrouter
OPENROUTER_API_KEY=sk-or-v1-...
```

Get your API key at openrouter.ai. Make sure to purchase the credits you need, or sign up for automatic top up.
Create a .env file in the project root:
```
PROVIDER=azure
AZURE_ENDPOINT=https://llm-council-foundry.openai.azure.com/openai/v1/
```

Azure Authentication:
Microsoft Foundry uses Azure Entra (formerly Azure AD) authentication via DefaultAzureCredential. Make sure you are authenticated with Azure CLI or have appropriate environment variables set:
```bash
# Login with Azure CLI
az login
```

Or set environment variables for service principal authentication. See the Azure Identity documentation for more details.
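For illustration, here is a minimal sketch of how a backend could use DefaultAzureCredential to obtain an Entra token and call a Foundry deployment through the OpenAI-compatible /openai/v1/ endpoint. It assumes the azure-identity and openai packages and is not necessarily how this repo wires it up:

```python
# Illustrative sketch of Entra ID auth against a Microsoft Foundry endpoint
# (not necessarily the repo's actual implementation).
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import OpenAI

AZURE_ENDPOINT = "https://llm-council-foundry.openai.azure.com/openai/v1/"

# DefaultAzureCredential tries environment variables, managed identity,
# Azure CLI login, and other sources in a fixed order.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

# The /openai/v1/ endpoint speaks the standard OpenAI API, so the plain OpenAI
# client works with a bearer token in place of an API key. The token expires
# (roughly an hour), so a long-running service would refresh it periodically.
client = OpenAI(base_url=AZURE_ENDPOINT, api_key=token_provider())

reply = client.chat.completions.create(
    model="gpt-5",  # Azure deployment name, not the full model identifier
    messages=[{"role": "user", "content": "Hello, council."}],
)
print(reply.choices[0].message.content)
```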
Edit backend/config.py to customize the council:
For OpenRouter:
PROVIDER = "openrouter"
COUNCIL_MODELS = [
"DeepSeek-V3.2",
"Kimi-K2-Thinking",
"Mistral-Large-3",
"grok-4",
]
CHAIRMAN_MODEL = "gpt-5.2"For Microsoft Foundry:
PROVIDER = "azure"
AZURE_ENDPOINT = "https://llm-council-foundry.openai.azure.com/openai/v1/"
COUNCIL_MODELS = [
"deepseek-v3", # Azure deployment name
"kimi-k2", # Azure deployment name
"mistral-large-3", # Azure deployment name
"grok-4", # Azure deployment name
]
CHAIRMAN_MODEL = "gpt-5"Note: For Microsoft Foundry, use deployment names as they appear in your Microsoft Foundry resource, not the full model identifiers.
Option 1: Use the start script

```bash
./start.sh
```

Option 2: Run manually

Terminal 1 (Backend):

```bash
uv run python -m backend.main
```

Terminal 2 (Frontend):

```bash
cd frontend
npm run dev
```

Then open http://localhost:5173 in your browser.
The application can be deployed to Azure Container Apps for production use. The deployment scripts are located in the scripts/deploy-container-app/ directory.
Prerequisites:

- Azure CLI installed and authenticated (`az login`)
- Docker installed (for building container images)
- An Azure subscription with appropriate permissions

Deployment steps:

- Navigate to the deployment directory:

  ```bash
  cd scripts/deploy-container-app
  ```

- Configure your deployment: Edit the `deploy.sh` script to set your Azure resource names (resource group, container registry, container app environment, etc.).

- Deploy the application:

  ```bash
  ./deploy.sh
  ```
This script will:
- Create an Azure Resource Group
- Create an Azure Container Registry (ACR)
- Build and push Docker images for both backend and frontend
- Create an Azure Container Apps environment
- Deploy both containers to Azure Container Apps
- Configure environment variables and ingress settings
- Access your application: After deployment, the script will output the URL of your deployed application.
When deploying to Azure Container Apps, make sure to set the following environment variables in the deployment configuration:
- `PROVIDER`: Set to `azure` or `openrouter`
- `AZURE_ENDPOINT`: Your Microsoft Foundry endpoint (if using Azure)
- `OPENROUTER_API_KEY`: Your OpenRouter API key (if using OpenRouter)
For Microsoft Foundry, the application will use Managed Identity for authentication when deployed to Container Apps.
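As a sketch of how these variables might be consumed at startup (the actual backend/config.py may differ; names are illustrative):

```python
# Illustrative provider selection from environment variables;
# the actual backend/config.py may differ.
import os

PROVIDER = os.environ.get("PROVIDER", "openrouter")

if PROVIDER == "azure":
    # With Managed Identity (or az login locally), DefaultAzureCredential handles
    # the token, so only the endpoint needs to be configured.
    AZURE_ENDPOINT = os.environ["AZURE_ENDPOINT"]
else:
    # OpenRouter requires an API key.
    OPENROUTER_API_KEY = os.environ["OPENROUTER_API_KEY"]
```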
To update an existing deployment:
```bash
cd scripts/deploy-container-app
./update.sh
```

To remove all deployed resources:

```bash
cd scripts/deploy-container-app
./cleanup.sh
```

Tech stack:

- Backend: FastAPI (Python 3.10+), async httpx, OpenRouter API
- Frontend: React + Vite, react-markdown for rendering
- Storage: JSON files in `data/conversations/`
- Package Management: uv for Python, npm for JavaScript