tapes

Transparent telemetry infrastructure for agents

Install

Build from source with Go 1.25+:

git clone https://github.com/papercomputeco/tapes.git
cd tapes
go build -o tapesprox ./cmd/proxy

Or use nix develop if you have Nix installed.

Run

Start the proxy, pointing it at your LLM provider (defaults to Ollama):

./tapesprox \
  -listen ":8080" \
  -upstream "http://localhost:11434" \
  -db "./tapes.db"
Flags:

-listen    Address to listen on (default: :8080)
-upstream  LLM provider URL (default: http://localhost:11434)
-db        SQLite database path (empty = in-memory)
-debug     Enable debug logging

Use

Send chat requests to the proxy instead of directly to your LLM:

curl -X POST http://localhost:8080/api/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama2", "messages": [{"role": "user", "content": "Hello!"}], "stream": false}'

The proxy forwards the request to your upstream LLM and records both the request and the response as nodes in the DAG.
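Because the proxy speaks the same chat API as the upstream, pointing a client at it only means swapping the base URL. A minimal Go sketch of the same request as the curl example above (the struct fields mirror that JSON body; nothing here is specific to tapes):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// chatRequest mirrors the JSON body from the curl example above.
type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
	Stream   bool      `json:"stream"`
}

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

func main() {
	body, _ := json.Marshal(chatRequest{
		Model:    "llama2",
		Messages: []message{{Role: "user", Content: "Hello!"}},
		Stream:   false,
	})
	fmt.Println(string(body)) // the payload sent to the proxy

	// Swap the direct provider URL for the proxy address; everything
	// else about the client stays the same.
	resp, err := http.Post("http://localhost:8080/api/chat", "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("proxy not running:", err)
		return
	}
	defer resp.Body.Close()
}
```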

Inspect

Query the DAG to explore stored conversations:

GET /dag/stats           Total nodes, roots, and leaves
GET /dag/history         List all conversation histories
GET /dag/history/:hash   Full history up to a specific node
GET /dag/node/:hash      Get a single node by hash
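A client can decode these responses with the standard library. A sketch for /dag/stats — note the field names (nodes, roots, leaves) are a guess based on the description above, not a documented schema, so check the actual JSON the endpoint returns before relying on them:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// dagStats is an assumed shape for the /dag/stats response,
// inferred from "total nodes, roots, and leaves" above.
type dagStats struct {
	Nodes  int `json:"nodes"`
	Roots  int `json:"roots"`
	Leaves int `json:"leaves"`
}

func main() {
	// In a real client this body would come from
	// http.Get("http://localhost:8080/dag/stats").
	sample := []byte(`{"nodes": 12, "roots": 2, "leaves": 3}`)

	var stats dagStats
	if err := json.Unmarshal(sample, &stats); err != nil {
		panic(err)
	}
	fmt.Printf("nodes=%d roots=%d leaves=%d\n", stats.Nodes, stats.Roots, stats.Leaves)
}
```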