The intelligent proxy that intercepts, logs, and analyzes your LLM API traffic—giving you unprecedented visibility into your AI operations.
Everything you need to monitor and optimize your AI usage
Capture every request and response with full context, metadata, and timing information.
Monitor token usage, costs, and performance metrics across all your LLM providers.
Works seamlessly with OpenAI, Anthropic, Google, Azure, Cohere, and more.
Zero-configuration HTTP/HTTPS proxy on port 8888. Just point your client at it and monitor (see the example after the feature list).
Organized file-based storage with intelligent indexing for fast searches.
Native desktop app for macOS, Windows, and Linux with system tray integration.
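For example, here is a minimal sketch of routing an LLM API call through the proxy, assuming it is running locally on its default port 8888 and that you are calling the OpenAI chat completions endpoint; the model name and the OPENAI_API_KEY environment variable are placeholders, not part of this project.

```python
import os
import requests

# Assumption: the proxy is listening locally on its default port 8888.
PROXY_URL = "http://localhost:8888"

# Route all HTTP/HTTPS traffic from this session through the proxy so
# every request and response is captured for logging and analysis.
session = requests.Session()
session.proxies = {"http": PROXY_URL, "https": PROXY_URL}

# Example call to the OpenAI chat completions API (placeholder model and key).
# Depending on how the proxy handles TLS interception, you may also need to
# trust its CA certificate, e.g. via the `verify=` argument.
response = session.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(response.json())
```

Any client or SDK that honors standard proxy settings (for example the HTTPS_PROXY environment variable) can be pointed at the proxy the same way.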
Choose your platform and start monitoring in seconds
Latest release from GitHub Actions • All platforms supported • Open source