SSH Proxy Service for Bandit Runner

This is a standalone Node.js HTTP server that provides SSH connectivity for the Bandit Runner agent running in Cloudflare Workers.

Why is this needed?

Cloudflare Workers cannot run Node-based SSH client libraries such as ssh2, so the agent hands SSH work off to this external HTTP proxy instead.
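
Everything the Worker-side agent needs is plain HTTP against the endpoints defined below. A minimal sketch of a Worker calling the proxy (PROXY_URL is a placeholder for your deployed proxy; the request and response shapes mirror the server.ts handlers in this guide):

// Hypothetical helper running inside the Cloudflare Worker.
const PROXY_URL = 'https://your-ssh-proxy.example.com'

async function runBanditCommand(username: string, password: string, command: string): Promise<string> {
  // Open an SSH session via the proxy and keep the returned connectionId.
  const connectRes = await fetch(`${PROXY_URL}/ssh/connect`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      host: 'bandit.labs.overthewire.org',
      port: 2220,
      username,
      password,
    }),
  })
  const connect = (await connectRes.json()) as {
    connectionId: string | null
    success: boolean
    message: string
  }
  if (!connect.success || !connect.connectionId) throw new Error(connect.message)

  // Run a single command over the open session.
  const execRes = await fetch(`${PROXY_URL}/ssh/exec`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ connectionId: connect.connectionId, command }),
  })
  const exec = (await execRes.json()) as { output: string; exitCode: number; success: boolean }

  // Tear the session down when finished.
  await fetch(`${PROXY_URL}/ssh/disconnect`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ connectionId: connect.connectionId }),
  })

  return exec.output
}

The connectionId returned by /ssh/connect is the handle every later /ssh/exec and /ssh/disconnect call must pass.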

Setup

1. Create a new Node.js project

mkdir ssh-proxy
cd ssh-proxy
npm init -y

2. Install dependencies

npm install express ssh2 cors dotenv
npm install --save-dev @types/express @types/node @types/ssh2 @types/cors typescript tsx

3. Create server.ts

import express from 'express'
import { Client } from 'ssh2'
import cors from 'cors'

const app = express()
app.use(cors())
app.use(express.json())

// Store active connections
const connections = new Map<string, Client>()

// POST /ssh/connect
app.post('/ssh/connect', async (req, res) => {
  const { host, port, username, password, testOnly } = req.body
  
  // Security: Only allow connections to Bandit server
  if (host !== 'bandit.labs.overthewire.org' || port !== 2220) {
    return res.status(403).json({ 
      success: false, 
      message: 'Only connections to bandit.labs.overthewire.org:2220 are allowed' 
    })
  }

  const client = new Client()
  const connectionId = `conn-${Date.now()}-${Math.random().toString(36).slice(2, 11)}`
  let responded = false

  client.on('ready', () => {
    responded = true
    if (testOnly) {
      client.end()
      return res.json({
        connectionId: null,
        success: true,
        message: 'Password validated successfully'
      })
    }

    connections.set(connectionId, client)
    res.json({
      connectionId,
      success: true,
      message: 'Connected successfully'
    })
  })

  client.on('error', (err) => {
    // Errors can also fire after the connection is established; only respond once.
    if (responded) {
      connections.delete(connectionId)
      return
    }
    responded = true
    res.status(400).json({
      connectionId: null,
      success: false,
      message: `Connection failed: ${err.message}`
    })
  })

  client.connect({
    host,
    port,
    username,
    password,
    readyTimeout: 10000,
  })
})

// POST /ssh/exec
app.post('/ssh/exec', async (req, res) => {
  const { connectionId, command, timeout = 30000 } = req.body
  const client = connections.get(connectionId)

  if (!client) {
    return res.status(404).json({ 
      success: false, 
      error: 'Connection not found' 
    })
  }

  let output = ''
  let stderr = ''
  let finished = false
  const startedAt = Date.now()

  // Respond at most once: either here on timeout, or below when the stream closes.
  const timeoutHandle = setTimeout(() => {
    if (finished) return
    finished = true
    res.json({
      output: output + '\n[Command timed out]',
      exitCode: 124,
      success: false,
      duration: Date.now() - startedAt,
    })
  }, timeout)

  client.exec(command, (err, stream) => {
    if (err) {
      clearTimeout(timeoutHandle)
      return res.status(500).json({ 
        success: false, 
        error: err.message 
      })
    }

    stream.on('data', (data: Buffer) => {
      output += data.toString()
    })

    stream.stderr.on('data', (data: Buffer) => {
      stderr += data.toString()
    })

    stream.on('close', (code: number) => {
      clearTimeout(timeoutHandle)
      if (finished) return
      finished = true
      res.json({
        output: output || stderr,
        exitCode: code,
        success: code === 0,
        duration: Date.now() - startedAt,
      })
    })
  })
})

// POST /ssh/disconnect
app.post('/ssh/disconnect', (req, res) => {
  const { connectionId } = req.body
  const client = connections.get(connectionId)

  if (client) {
    client.end()
    connections.delete(connectionId)
    res.json({ success: true, message: 'Disconnected' })
  } else {
    res.status(404).json({ success: false, message: 'Connection not found' })
  }
})

// GET /ssh/health
app.get('/ssh/health', (req, res) => {
  res.json({ 
    status: 'ok', 
    activeConnections: connections.size 
  })
})

const PORT = process.env.PORT || 3001
app.listen(PORT, () => {
  console.log(`SSH Proxy running on port ${PORT}`)
})

4. Add to package.json

{
  "scripts": {
    "dev": "tsx watch server.ts",
    "build": "tsc",
    "start": "node dist/server.js"
  }
}
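
The dev script relies on tsx (added to the dev dependencies in step 2), and the build/start scripts assume tsc emits compiled output to dist/. A minimal tsconfig.json matching that layout, offered as an assumption you can adjust:

{
  "compilerOptions": {
    "target": "ES2022",
    "module": "CommonJS",
    "moduleResolution": "node",
    "outDir": "dist",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true
  },
  "include": ["server.ts"]
}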

5. Run locally

npm run dev

6. Deploy (optional)

You can deploy to:

  • Fly.io (recommended for low latency)
  • Railway
  • Render
  • Heroku

Example Fly.io deployment:

fly launch
fly deploy

Security Notes

  • The proxy hardcodes the allowed SSH target to bandit.labs.overthewire.org:2220
  • No other SSH connections are permitted
  • Connections are held in an in-memory map and are never cleaned up automatically; add an idle-timeout sweep (e.g. close connections after 1 hour; see the sketch under Environment Variables)
  • Rate limiting should be added for production

Environment Variables

PORT=3001
MAX_CONNECTIONS=100
CONNECTION_TIMEOUT_MS=3600000  # 1 hour
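
server.ts above never actually reads these variables (dotenv is installed in step 2 but not loaded). A minimal sketch of additions to server.ts that wire them in and sweep idle connections, assuming a lastUsed map that /ssh/connect and /ssh/exec would also need to update:

import 'dotenv/config'

const MAX_CONNECTIONS = Number(process.env.MAX_CONNECTIONS ?? 100)
const CONNECTION_TIMEOUT_MS = Number(process.env.CONNECTION_TIMEOUT_MS ?? 3_600_000)

// Track the last time each connection was used (set this in /ssh/connect and /ssh/exec).
const lastUsed = new Map<string, number>()

// In /ssh/connect, before creating the client: reject new sessions once the pool is full.
// if (connections.size >= MAX_CONNECTIONS) {
//   return res.status(429).json({ success: false, message: 'Too many active connections' })
// }

// Every minute, close connections that have been idle longer than CONNECTION_TIMEOUT_MS.
setInterval(() => {
  const now = Date.now()
  for (const [id, client] of connections) {
    if (now - (lastUsed.get(id) ?? 0) > CONNECTION_TIMEOUT_MS) {
      client.end()
      connections.delete(id)
      lastUsed.delete(id)
    }
  }
}, 60_000)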

Testing

# Test connection
curl -X POST http://localhost:3001/ssh/connect \
  -H "Content-Type: application/json" \
  -d '{"host":"bandit.labs.overthewire.org","port":2220,"username":"bandit0","password":"bandit0"}'

# Test command execution
curl -X POST http://localhost:3001/ssh/exec \
  -H "Content-Type: application/json" \
  -d '{"connectionId":"<ID_FROM_ABOVE>","command":"ls -la"}'

# Disconnect
curl -X POST http://localhost:3001/ssh/disconnect \
  -H "Content-Type: application/json" \
  -d '{"connectionId":"<ID>"}'

Next Steps

  1. Build and deploy this service
  2. Update SSH_PROXY_URL in wrangler.jsonc to point to your deployed proxy
  3. Set OPENROUTER_API_KEY secret in Cloudflare Workers
  4. Test the full integration
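
For steps 2 and 3, a hedged sketch of the Cloudflare side, assuming the Worker reads the proxy URL from a vars entry in wrangler.jsonc (the exact key name and file layout in this repo may differ):

{
  "vars": {
    "SSH_PROXY_URL": "https://your-ssh-proxy.fly.dev"
  }
}

The OpenRouter key should be stored as a Worker secret rather than a plain var:

npx wrangler secret put OPENROUTER_API_KEY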