{
  "$schema": "https://pointcast.xyz/BLOCKS.md",
  "id": "0287",
  "url": "https://pointcast.xyz/b/0287",
  "channel": {
    "code": "FD",
    "slug": "front-door",
    "name": "Front Door",
    "purpose": "AI, interfaces, agent-era thinking.",
    "color600": "#185FA5",
    "color800": "#0B3E73"
  },
  "type": {
    "code": "READ",
    "label": "READ",
    "description": "Long-form text — essay, dispatch, article."
  },
  "title": "Codex project #5 — VideoLens · analyze any YouTube block",
  "dek": "Mike remembered a neat service that did data + sentiment analysis on YouTube videos. Instead of folding it into TrackLab, it spins off as a standalone primitive: paste a URL, get metadata + audio features + transcript + sentiment arc + topics + palette + engagement, all composed from 7 APIs into one JSON.",
  "body": "Author: mh+cc. Source: Mike chat 2026-04-19 18:15 PT — 'yah, and it'd be neat to do data and sentiment analysis on the youtube video as a feature, and any other interesting data, i saw a neat service like this once'. cc spun the idea out to its right shape.\n\nThe right PointCast expression: VideoLens, a standalone analysis primitive, not a TrackLab feature. Reasoning — PointCast has ~15 WATCH-type YouTube embeds today (Alan Watts, November Rain, Purple Rain, Chakra Tune-Up, plus others). Each could benefit from a lens of 'what's actually going on in this video.' If VideoLens is a separate endpoint, TrackLab consumes it, AND every /b/{id} WATCH page can optionally toggle a LENS panel, AND /tv can reference it in future slide types.\n\nVideoLens composes 7 APIs into one payload: YouTube Data API (metadata, engagement), Spotify (audio features when track matches), AssemblyAI or YouTube auto-captions (transcript), HuggingFace (sentiment arc, topics), Cloudflare Images or node-vibrant (visual palette), YouTube comments sample (audience sentiment). Each component degrades gracefully — no matching Spotify track = audio features null, comments disabled = sampleSize 0, payload shape stays stable.\n\nArchitecture questions Codex answers: A1 composition strategy (recommend a streaming endpoint for UX, simple fan-out-wait for simplicity), A2 caching (30-day KV cache keyed on YouTube ID, PC_VIDEOLENS_KV namespace), A3 rate limiting (Mike wallet-authed = unlimited, anons = 3/IP/day), A4 partial-success handling (every field nullable, warnings array at root), A5 consumers (TrackLab, /b/{id} WATCH panel, future /tv slide).\n\nSecrets needed: YOUTUBE_API_KEY, SPOTIFY_CLIENT_ID + SPOTIFY_CLIENT_SECRET, ASSEMBLYAI_API_KEY, HUGGINGFACE_API_KEY. Mike binds these in the Cloudflare Pages dashboard (same pattern as RESEND_API_KEY for the email outbound work). API code reads them via env.XXX.\n\nDeliverables: architecture doc, main /api/videolens/analyze endpoint, client-side lib + type definitions, VideoLensPanel component, standalone /videolens demo page. Linkage: /b/{id} WATCH pages get an optional LENS chip, TrackLab uses VideoLens as its first fetch, /for-agents + /agents.json document the endpoint. Budget ~6-10 hours — the most API-integration-heavy of the five projects.\n\nEnd of PR: run VideoLens on /b/0262 (Alan Watts), commit the resulting payload as docs/samples/videolens-0262.json. Proof of concept.\n\nCodex queue now at 5: Pulse (game), STATIONS (channel-flip), YeePlayer v1 (multiplayer), TrackLab (content creation), VideoLens (content enrichment). Five substantive projects, ~17-30 hours total if all five ship. Mike's ChatGPT Pro tier + Max Codex access supports the batch.\n\nFull spec: docs/briefs/2026-04-19-codex-videolens.md.",
  "timestamp": "2026-04-20T02:15:00.000Z",
  "size": "2x1",
  "noun": 287,
  "readingTime": "2 min",
  "external": {
    "label": "Full VideoLens brief ↗",
    "url": "https://github.com/MikeHoydich/pointcast/blob/main/docs/briefs/2026-04-19-codex-videolens.md"
  },
  "meta": {
    "tag": "broadcast",
    "surface": "codex-handoff",
    "primitive": "videolens"
  },
  "author": "mh+cc",
  "source": "Mike chat 2026-04-19 18:15 PT — 'yah, and it'd be neat to do data and sentiment analysis on the youtube video as a feature, and any other interesting data, i saw a neat service like this once'. cc scoped the primitive + wrote the spec.",
  "mood": "sprint-pulse",
  "moodUrl": "https://pointcast.xyz/mood/sprint-pulse",
  "companions": [
    {
      "id": "0286",
      "label": "Codex project #4 — TrackLab (its sibling creator)",
      "surface": "block"
    },
    {
      "id": "0285",
      "label": "Codex project #3 — YeePlayer v1",
      "surface": "block"
    },
    {
      "id": "0262",
      "label": "Alan Watts — the demo track for VideoLens proof",
      "surface": "block"
    },
    {
      "id": "0282",
      "label": "Broadcast mode — the arc this extends",
      "surface": "block"
    }
  ],
  "clock": null
}