AI Chat Agent

Ask an AI about your traffic, generate rules, and get curl commands

The dashboard includes an embedded AI chat agent powered by Claude Code. It can see your captured traffic, understand the proxy's architecture, answer questions, and take actions like creating intercept rules or generating curl commands.

Prerequisites

The agent shells out to the claude CLI, so Claude Code must be installed and available on the PATH of the machine running the proxy. The chat panel's status dot turns green when the CLI is detected.

The Chat Panel

````
┌────────────────────────────────────────────┐
│ AI Chat                    [toggles]  [-]  │ ← Header (collapse/expand)
├────────────────────────────────────────────┤
│ [x] Traffic [ ] Selected [ ] Rules [ ] Src │ ← Context checkboxes
│ [====traffic====|==rules==|              ] │ ← Token budget bar
├────────────────────────────────────────────┤
│                                            │
│ You: Show me all POST requests to /api/*   │
│                                            │
│ Claude: I found 12 POST requests...        │ ← Scrollable messages
│ ```                                        │
│ POST /api/users (200, 42ms)                │
│ ```                                        │
│                                            │
├────────────────────────────────────────────┤
│ [Type a message...                ] [Send] │ ← Input area
└────────────────────────────────────────────┘
````

The panel is fixed to the bottom-right corner. When collapsed, it shows as a small pill button reading "AI Chat" with a green dot if Claude is available.

Context Toggles

You control exactly what information the agent sees using the checkbox toggles in the panel header. Each enabled toggle adds a block of data to the system prompt sent to Claude:

| Toggle | What It Includes | Token Cost |
|---|---|---|
| Traffic | Summary table of the most recent 200 captured requests (method, status, host, path, type, size, time) | Medium (~2-5k tokens) |
| Selected | Full detail of the currently selected traffic entry (headers, bodies, timing). Select an entry in the traffic list first. | Variable (depends on body size, max ~10k tokens) |
| Rules | All current intercept rules as JSON | Low (~100-500 tokens) |
| Source | Key ProxyServer source files (server.js, proxy-server.js, tls-handler.js, traffic-store.js, etc.) | High (~15-25k tokens) |
| Browser | Cookies and localStorage from the browser extension (if installed) | Variable |
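
A minimal sketch of how these toggles might map to prompt blocks. The function name `buildContext` and the data shapes are illustrative assumptions, not the actual ProxyServer code:

```javascript
// Hypothetical sketch: assemble the system prompt from enabled toggles.
// Each enabled toggle contributes one labeled block of context.
function buildContext(toggles, data) {
  const blocks = [];
  if (toggles.traffic) {
    // One summary line per entry, capped at the most recent 200 requests
    const rows = data.traffic.slice(-200).map(
      (e) => `${e.method} ${e.host}${e.path} ${e.status} ${e.size}B ${e.time}ms`
    );
    blocks.push(`## Captured traffic\n${rows.join('\n')}`);
  }
  if (toggles.selected && data.selected) {
    // Full detail of the selected entry, including headers and bodies
    blocks.push(`## Selected entry\n${JSON.stringify(data.selected, null, 2)}`);
  }
  if (toggles.rules) {
    blocks.push(`## Intercept rules\n${JSON.stringify(data.rules, null, 2)}`);
  }
  return blocks.join('\n\n');
}
```

Keeping each toggle as an independent block is what makes the per-block token accounting in the budget bar straightforward.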

Token Budget Bar

The colored bar below the toggles shows the approximate token breakdown of the current context. Each segment represents one context block, and the total is displayed as "~Nk tokens". This helps you understand how much context you're sending per message.
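
The numbers are approximate by design. A common heuristic, which a bar like this could plausibly use, is roughly four characters per token; the segment widths then follow from each block's share of the total:

```javascript
// Rough token estimate: ~4 characters per token is a common heuristic
// for English text. This is an assumption, not the panel's exact formula.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Width of each budget-bar segment, as a percentage of the total context.
function segmentPercents(blocks) {
  const counts = blocks.map((b) => estimateTokens(b.text));
  const total = counts.reduce((a, b) => a + b, 0) || 1; // avoid divide-by-zero
  return counts.map((c) => (100 * c) / total);
}
```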

What You Can Ask

Traffic Analysis

Request Details

(Toggle "Selected" ON and click a request first)

Rule Management

Architecture Questions

(Toggle "Source" ON)

Slash Commands

| Command | Action |
|---|---|
| /reset | Clear the conversation history and start fresh |
| /compact | Summarize the conversation to reduce context size |
| /help | Show available commands and tips |
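
Slash commands are handled locally rather than being sent to the model. A dispatch for the three commands above might look like this sketch (the `session` shape and `summarize` stub are illustrative assumptions, not the actual ChatHandler API):

```javascript
// Illustrative stub: a real /compact would ask the model to summarize.
function summarize(history) {
  return `Summary of ${history.length} earlier messages`;
}

// Returns true if the input was a slash command handled locally,
// false if it is a plain message that should go to Claude.
function handleSlashCommand(input, session) {
  if (!input.startsWith('/')) return false;
  switch (input.trim().split(/\s+/)[0]) {
    case '/reset':
      session.history = []; // start fresh
      break;
    case '/compact':
      // Collapse the history into a single summary message
      session.history = [{ role: 'system', text: summarize(session.history) }];
      break;
    case '/help':
      session.reply('Commands: /reset, /compact, /help');
      break;
    default:
      session.reply(`Unknown command: ${input}`);
  }
  return true; // never forwarded to the model
}
```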

How It Works Internally

```
┌────────────┐    WebSocket     ┌─────────────┐   stdin/stdout    ┌──────────┐
│  Chat UI   │    chat:send     │ ChatHandler │  per-turn spawn   │  claude  │
│ (chat.js)  │ ───────────────► │             │ ────────────────► │   CLI    │
│            │                  │             │                   │ process  │
│            │    chat:chunk    │ ContextBldg │ ◄─streaming JSON──│          │
│            │ ◄─────────────── │             │                   │ (tools:  │
│            │    chat:done     │ ClaudeSessn │                   │  Read,   │
│            │ ◄─────────────── │             │                   │  Write,  │
└────────────┘                  └─────────────┘                   │  Edit,   │
                                                                  │  Bash)   │
                                                                  └──────────┘
```
  1. User types a message in the chat panel
  2. The chat UI (chat.js) sends a chat:send WebSocket message with the text, selected entry ID, and context toggles
  3. WSBridge routes the chat:* message to ChatHandler
  4. ChatHandler calls ContextBuilder to assemble the system prompt from enabled context blocks
  5. ClaudeSession spawns claude -p <prompt> --output-format stream-json as a subprocess
  6. Streaming JSON events are parsed and forwarded as chat:chunk messages to the client
  7. When the process completes, a chat:done message is sent with the full response
  8. Conversation history is maintained server-side and replayed as context in subsequent turns

Browser Extension (Optional)

An optional Chrome extension in the extension/ directory sends the current page's cookies and localStorage to the proxy, making them available to the AI agent.
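
A hypothetical sketch of what the extension's "Send Context" action might do: collect the page's storage and POST it to the proxy. The endpoint path `/api/browser-context` is an assumption, not the actual route, and `document.cookie` omits HttpOnly cookies (a real extension could use the chrome.cookies API instead):

```javascript
// Gather the current page's cookies and localStorage into one object.
// Note: document.cookie cannot see HttpOnly cookies.
function collectPageContext() {
  const storage = {};
  for (let i = 0; i < localStorage.length; i++) {
    const key = localStorage.key(i);
    storage[key] = localStorage.getItem(key);
  }
  return { url: location.href, cookies: document.cookie, localStorage: storage };
}

// POST the collected context to the proxy. The endpoint is hypothetical.
function sendContext(proxyOrigin) {
  return fetch(`${proxyOrigin}/api/browser-context`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(collectPageContext()),
  });
}
```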

Installing

  1. Open chrome://extensions/
  2. Enable "Developer mode"
  3. Click "Load unpacked" and select the extension/ directory
  4. Click the extension icon and "Send Context" while on any page

Once sent, toggle "Browser" ON in the chat panel to include this data in the AI's context.