Internal Platform | Social Intelligence & Analytics
Agent Vibes is an internal Next.js 15 application for collecting, analyzing, and visualizing social media sentiment about AI coding agents. The platform automates tweet collection via Apify, processes sentiment using Google Gemini, and presents insights through interactive dashboards.
🔒 Internal Use Only - Currently deployed on Vercel for internal team access. Not yet public.
🚀 Features
- ✅ Apify Pipeline - Automated tweet collection and sentiment analysis
- ✅ Dashboard - Real-time visualization of social sentiment trends
- ✅ Mock Prototypes - Static design references for rapid iteration
- Node.js 20+
- Access to internal Supabase, Apify, and Gemini accounts
# Clone and install
git clone https://github.com/sourcegraph-community/agent-vibes.git
cd agent-vibes
npm install
# Configure environment (copy from team 1Password vault or ask #ops-oncall)
cp .env.example .env.local
# Edit .env.local with credentials
# Start development server
npm run dev

Visit http://localhost:3000/dashboard
For detailed setup instructions, see the Apify Pipeline Testing Guide.
This project follows Vertical Slice Architecture (VSA) - features are organized as self-contained slices rather than horizontal layers.
src/
  ApifyPipeline/              # Feature: Social media intelligence pipeline
    Web/                      # User-initiated HTTP requests
      Application/            # Command handlers & orchestration
      Core/                   # Pure business logic
      DataAccess/             # Database operations (Supabase)
      ExternalServices/       # Third-party integrations (Apify, Gemini)
    Background/               # Time-triggered scheduled jobs
    Tests/                    # Unit & integration tests
    Docs/                     # Feature documentation
- Framework: Next.js 15 (App Router, Turbopack)
- Language: TypeScript (strict mode)
- Styling: Tailwind CSS v4
- Database: Supabase (PostgreSQL)
- Testing: Vitest
- Linting: ESLint v9 + ESLint Stylistic (no Prettier)
npm run dev              # Start dev server with Turbopack
npm run build            # Production build
npm run start            # Start production server

npm test                 # Run unit tests (Vitest)
npm run test:watch       # Watch mode
npm run test:ui          # Vitest UI
npm run check            # TypeScript + ESLint
npm run check:fix        # Auto-fix and format code
npm run typecheck        # TypeScript only
npm run lint             # ESLint only
npm run lint:fix         # Fix linting issues

npm run health-check               # Validate environment & connections
npm run apply-migrations           # Apply database migrations programmatically (requires psql)
npm run enqueue:backfill           # Queue historical data (run once, configurable)
npm run process:backfill           # Process backfill batch (manual, repeat per batch)
npm run replay:sentiments          # Retry failed sentiment processing
npm run cleanup:raw-tweets         # Archive old raw data
npm run cleanup:sentiment-failures # Remove stale failure records
npm run rotate:supabase            # Rotate Supabase secrets (ops only)

Note: All Apify Pipeline scripts automatically load .env.local via dotenv. Ensure environment variables are configured before running. Set DATABASE_URL (or SUPABASE_URL/SUPABASE_DB_PASSWORD with optional SUPABASE_DB_HOST and SUPABASE_DB_PORT) to the Supabase session pooler connection string so migrations run over IPv4, and install the psql client locally.
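For reference, a minimal sketch of that resolution order, assuming hypothetical helper names and default values (the real logic lives in the apply-migrations script):

```typescript
// Sketch of the documented fallback order for the migration connection string.
// Helper and defaults are illustrative; the real logic lives in the script.
import { config } from 'dotenv';

config({ path: '.env.local' }); // all pipeline scripts load .env.local this way

function resolveDatabaseUrl(env: NodeJS.ProcessEnv = process.env): string {
  // Preferred: an explicit session-pooler connection string (works over IPv4).
  if (env.DATABASE_URL) return env.DATABASE_URL;

  // Fallback: derive one from SUPABASE_URL + SUPABASE_DB_PASSWORD, with
  // optional SUPABASE_DB_HOST / SUPABASE_DB_PORT overrides.
  const { SUPABASE_URL, SUPABASE_DB_PASSWORD, SUPABASE_DB_HOST, SUPABASE_DB_PORT } = env;
  if (!SUPABASE_URL || !SUPABASE_DB_PASSWORD) {
    throw new Error('Set DATABASE_URL or SUPABASE_URL + SUPABASE_DB_PASSWORD in .env.local');
  }
  const projectRef = new URL(SUPABASE_URL).hostname.split('.')[0];
  // Prefer pointing SUPABASE_DB_HOST at the session pooler host; the direct
  // db.<ref>.supabase.co host shown here may be IPv6-only.
  const host = SUPABASE_DB_HOST ?? `db.${projectRef}.supabase.co`;
  const port = SUPABASE_DB_PORT ?? '5432';
  return `postgresql://postgres:${encodeURIComponent(SUPABASE_DB_PASSWORD)}@${host}:${port}/postgres`;
}
```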
Automated social intelligence system that collects tweets about AI coding agents, analyzes sentiment, and stores insights for visualization.
Status: ✅ Production-ready, deployed on Vercel
Key Components:
- Tweet Collection - Apify actor scrapes Twitter based on tracked keywords
- Normalization - Standardizes tweet data and deduplicates
- Sentiment Analysis - Google Gemini classifies sentiment (positive/neutral/negative)
- Storage - Supabase PostgreSQL with views for analytics
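As an illustration of how these stages fit together, here is a rough TypeScript sketch; type and method names are hypothetical and do not mirror the slice's real module contracts.

```typescript
// Illustrative sketch of the four pipeline stages (names are hypothetical).
interface RawTweet { id: string; payload: unknown; collectedAt: string }
interface NormalizedTweet { id: string; text: string; keyword: string; postedAt: string }
type Sentiment = 'positive' | 'neutral' | 'negative';

interface TweetPipeline {
  collect(keywords: string[]): Promise<RawTweet[]>;                    // Apify actor run
  normalize(raw: RawTweet[]): NormalizedTweet[];                       // standardize + dedupe
  analyze(tweet: NormalizedTweet): Promise<Sentiment>;                 // Google Gemini classification
  store(tweet: NormalizedTweet, sentiment: Sentiment): Promise<void>;  // Supabase tables + views
}
```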
Quick Links:
- Dashboard: /dashboard (overview, keywords, tweets)
- API Endpoints: /api/start-apify-run, /api/process-sentiments, /api/process-backfill
- Documentation: src/ApifyPipeline/README.md
- Collection Strategy: docs/apify-pipeline/date-based-collection-strategy.md - Backfill vs Regular Collection
- Testing Guide: docs/apify-pipeline/local-testing-guide.md
- Operational Runbook: src/ApifyPipeline/Docs/ApifyPipeline-start-apify-run-runbook.md
Static design references for rapid iteration and stakeholder previews.
Available Mocks:
- Apify Tweet Scraper: http://localhost:3000/mocks/apify-tweet-scraper
  - Data: mocks/apify-tweet-scraper/data/*.json
- Agent Intelligence Dashboard: http://localhost:3000/mocks/analytics-dashboard
  - Data: mocks/analytics-dashboard/data/dashboard.json
Update JSON fixtures to adjust stats, copy, or chart data - routes reload on every request.
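To illustrate why fixture edits show up without a rebuild, a mock page can read its JSON from disk on each request. The file path and component below are a hypothetical sketch, not the actual implementation:

```tsx
// Hypothetical sketch of a mock page that re-reads its fixture per request.
import { readFile } from 'node:fs/promises';
import path from 'node:path';

export const dynamic = 'force-dynamic'; // opt out of caching so edits appear immediately

export default async function AnalyticsDashboardMock() {
  const fixturePath = path.join(process.cwd(), 'mocks/analytics-dashboard/data/dashboard.json');
  const dashboard = JSON.parse(await readFile(fixturePath, 'utf8'));
  // The real mock renders a full dashboard; this sketch only dumps the fixture.
  return <pre>{JSON.stringify(dashboard, null, 2)}</pre>;
}
```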
| Variable | Purpose | Required | Where to Get | 
|---|---|---|---|
| SUPABASE_URL | Database connection | ✅ Yes | Team 1Password vault | 
| SUPABASE_SERVICE_ROLE_KEY | Server DB access | ✅ Yes | Team 1Password vault | 
| NEXT_PUBLIC_SUPABASE_URL | Client DB access | ✅ Yes | Same as SUPABASE_URL | 
| NEXT_PUBLIC_SUPABASE_ANON_KEY | Client DB access | ✅ Yes | Team 1Password vault | 
| APIFY_TOKEN | Tweet collection | ✅ Yes | Ask #ops-oncall | 
| APIFY_ACTOR_ID | Actor to run | ✅ Yes | apidojo/tweet-scraper | 
| COLLECTOR_LANGUAGE | Tweet language filter (default 'en') | No | |
| COLLECTOR_REUSE_EXISTING | Reuse recent Apify run (default false) | No | |
| GEMINI_API_KEY | Sentiment analysis | ✅ Yes | Ask #ops-oncall | 
| INTERNAL_API_KEY | Manual API auth | | Generate: openssl rand -hex 32 | 
Secrets Management:
- Development: Stored in .env.local (git-ignored)
- Production: Stored in Vercel project environment variables
- Rotation: Quarterly via npm run rotate:supabase (ops team)
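As a sketch of how a script can fail fast when required configuration is missing (the real validation lives in the health-check script; this helper is hypothetical):

```typescript
// Hypothetical startup check; the real validation lives in the health-check script.
const REQUIRED_VARS = [
  'SUPABASE_URL',
  'SUPABASE_SERVICE_ROLE_KEY',
  'NEXT_PUBLIC_SUPABASE_URL',
  'NEXT_PUBLIC_SUPABASE_ANON_KEY',
  'APIFY_TOKEN',
  'APIFY_ACTOR_ID',
  'GEMINI_API_KEY',
] as const;

export function assertRequiredEnv(env: NodeJS.ProcessEnv = process.env): void {
  const missing = REQUIRED_VARS.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
}
```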
- ESLint v9 flat config - see eslint.config.mjs
- ESLint Stylistic handles formatting (no Prettier)
- Run npm run check:fix to auto-format before committing
- CI checks enforce these standards
- Strict mode enabled
- Explicit types for public APIs
- Rely on inference internally
- Path alias: @/* maps to project root
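For example, a hypothetical import resolved through the alias (the exact module path is illustrative):

```typescript
// "@/..." resolves from the project root, avoiding deep relative paths ("../../..").
import { StartApifyRunCommandHandler } from '@/src/ApifyPipeline/Web/Application/StartApifyRunCommandHandler';
```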
| Artifact | Naming Scheme | Example | 
|---|---|---|
| Components | PascalCase | DashboardHeader.tsx | 
| Hooks | camelCase use* | useKnockNotifications.ts | 
| API Routes | route.ts in folders | app/api/start-apify-run/route.ts | 
| Commands/Queries | {Verb}{Subject}{Type} | StartApifyRunCommand.ts | 
| Handlers | {Command}Handler | StartApifyRunCommandHandler.ts | 
- Feature ownership - Slices own their entire use case
- REPR flow - Request → Endpoint → Processing → Response
- CQRS - Separate commands (mutations) from queries (reads)
- Explicit boundaries - Cross-slice via contracts, not shared internals
See VSA Architecture Guide for details.
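A rough sketch of how the naming scheme and the REPR/CQRS conventions combine for a command slice; field names and return shapes are assumptions, not the real contracts:

```typescript
// Illustrative only: {Verb}{Subject}{Type} command, {Command}Handler handler,
// and the Request → Endpoint → Processing → Response flow.
export interface StartApifyRunCommand {
  triggerSource: string;
  maxItemsPerKeyword?: number;
}

export class StartApifyRunCommandHandler {
  async handle(command: StartApifyRunCommand): Promise<{ runId: string }> {
    // Processing: orchestrate Core logic, DataAccess, and ExternalServices here.
    return { runId: 'example-run-id' };
  }
}

// The endpoint (app/api/start-apify-run/route.ts) parses the request, invokes
// the handler, and maps the result to an HTTP response.
```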
Current Status: Deployed internally for team access
Environment:
- Platform: Vercel
- Branch: main (auto-deploys)
- Domain: Internal Vercel URL (not public domain yet)
Cron Jobs (Vercel): Currently disabled in this repo (vercel.json has no crons). Use manual triggers from the testing guide. Re-enable later by adding cron definitions to vercel.json.
Configuration:
- Environment variables set in Vercel project settings
- Cron definitions in vercel.json (currently none; see the Cron Jobs note above)
- Build command: npm run build
Schema:
- keywords - Tracked search terms (4 Amp-related keywords)
- cron_runs - Execution history and metrics
- raw_tweets - Original Apify payloads
- normalized_tweets - Standardized tweet data
- tweet_sentiments - Gemini analysis results
- backfill_batches - Historical data processing queue
- sentiment_failures - Failed processing attempts
Views:
- vw_daily_sentiment - Aggregated daily trends
- vw_keyword_trends - Keyword-level analytics
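For example, a server-side read of the daily view with supabase-js might look like this sketch (not taken from the dashboard code):

```typescript
import { createClient } from '@supabase/supabase-js';

// Server-side only: the service role key must never reach the browser.
const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!,
);

export async function fetchDailySentiment() {
  // Reads the aggregated daily trends view populated by the pipeline.
  const { data, error } = await supabase
    .from('vw_daily_sentiment')
    .select('*')
    .limit(30);

  if (error) throw error;
  return data;
}
```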
Migrations:
- Primary: src/ApifyPipeline/DataAccess/Migrations/20250929_1200_InitApifyPipeline.sql
- Seeds: src/ApifyPipeline/DataAccess/Seeds/20250929_1230_KeywordsSeed.sql
Apply Migrations:
# Option 1: Programmatically (recommended for local)
# Requires psql client + Supabase session pooler DATABASE_URL in .env.local
npm run apply-migrations
# Option 2: Supabase CLI
supabase db push
# Option 3: Manual via Supabase Studio SQL Editor
# Execute SQL files in order from src/ApifyPipeline/DataAccess/Migrations/

- DATABASE_URL should point at the Supavisor session mode pooler (e.g. postgresql://postgres.<ref>@aws-1-<region>.pooler.supabase.com:5432/postgres).
- If DATABASE_URL is absent, the script falls back to SUPABASE_URL + SUPABASE_DB_PASSWORD, with optional SUPABASE_DB_HOST/SUPABASE_DB_PORT.
- The SQL is idempotent: rerunning migrations skips existing enums, triggers, and policies, and the seeds insert their dependent raw_tweets rows automatically.
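Conceptually, Option 1 runs each migration file in order through psql against that connection string. The sketch below is a simplified illustration, not the actual script:

```typescript
// Hypothetical sketch: apply each SQL file in order via psql.
import { execFileSync } from 'node:child_process';
import { readdirSync } from 'node:fs';
import path from 'node:path';

const migrationsDir = 'src/ApifyPipeline/DataAccess/Migrations';
const databaseUrl = process.env.DATABASE_URL!; // session pooler connection string

for (const file of readdirSync(migrationsDir).filter((f) => f.endsWith('.sql')).sort()) {
  // ON_ERROR_STOP makes psql fail fast instead of continuing past an error.
  execFileSync(
    'psql',
    [databaseUrl, '--set', 'ON_ERROR_STOP=1', '-f', path.join(migrationsDir, file)],
    { stdio: 'inherit' },
  );
}
```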
"Environment variable not found"
# Verify .env.local exists and variable names match
cat .env.local | grep -E "SUPABASE_URL|APIFY_TOKEN|GEMINI_API_KEY"
# Restart dev server after changes

"Supabase connection failed"
- Check if project is active (not paused)
- Verify URL format: https://[project-ref].supabase.co
- Confirm service role key (not anon key)
"No keywords available"
-- Check keywords in Supabase Studio SQL Editor
SELECT * FROM keywords WHERE enabled = true;
-- Re-run seed if empty
-- Execute: src/ApifyPipeline/DataAccess/Seeds/20250929_1230_KeywordsSeed.sql

"Scripts not loading .env.local"
# All Apify Pipeline scripts (npm run health-check, enqueue:backfill, etc.) 
# automatically load .env.local via dotenv (installed as dev dependency)
# If you see "Missing required environment variables", verify:
# 1. .env.local exists in project root
# 2. Variable names match exactly (SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY, etc.)
# 3. No syntax errors in .env.local file

More troubleshooting: See Local Testing Guide
Full end-to-end testing guide: docs/apify-pipeline/local-testing-guide.md
Quick validation:
# Validate environment
npm run health-check
# Run unit tests
npm test
# Trigger manual collection
curl -X POST http://localhost:3000/api/start-apify-run \
  -H "Content-Type: application/json" \
  -d '{"triggerSource": "manual-test", "ingestion": {"maxItemsPerKeyword": 10}}'- Framework: Vitest
- Config: vitest.config.ts
- Location: src/ApifyPipeline/Tests/Unit/
- Coverage: Core transformations, validators, business rules
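A minimal example of the kind of unit test kept under Tests/Unit; the helper under test is invented for illustration and is not a real Core module:

```typescript
// Hypothetical test file, e.g. src/ApifyPipeline/Tests/Unit/classifyScore.test.ts
import { describe, expect, it } from 'vitest';

// Invented helper for illustration; real transformations live in the slice's Core folder.
const classifyScore = (score: number): 'positive' | 'neutral' | 'negative' =>
  score > 0.2 ? 'positive' : score < -0.2 ? 'negative' : 'neutral';

describe('classifyScore', () => {
  it('maps scores to sentiment labels', () => {
    expect(classifyScore(0.8)).toBe('positive');
    expect(classifyScore(0)).toBe('neutral');
    expect(classifyScore(-0.5)).toBe('negative');
  });
});
```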
- Apify Pipeline Feature README - Feature architecture & development guide
- Local Testing Guide - Comprehensive testing procedures
- Internal Testing Quickstart - 10-minute quick reference
- Readiness Checklist - Pre-deployment validation
- Operational Runbook - Production procedures & monitoring
- Incident Response Guide - Troubleshooting & recovery
- Specification - Technical requirements
- Overview - System architecture & data flow
- Implementation Plan - Development roadmap
@stylistic/indent triggers a stack overflow on large TSX trees with TypeScript 5.9.
Mitigation: app/mocks/** and mocks/** are ignored in eslint.config.mjs.
Resolution: Remove ignores once upstream fix ships (eslint-stylistic#915).
| Area | Contact | Channel | 
|---|---|---|
| Development | Engineering Team | #agent-vibes-dev | 
| Operations | Platform Ops | #ops-oncall | 
| Analytics | Analytics Guild | #analytics-insights | 
| Incidents | On-call Engineer | #backend-support | 
| Secrets/Access | DevOps Team | #ops-oncall | 
- Branch from main - git checkout -b feature/your-feature-name
- Follow code standards - Run npm run check before committing; use npm run check:fix to auto-format
- Write tests - Unit tests for business logic, plus manual testing using the test guide
- Submit PR - Link to any related issues or Slack threads and request review from team lead
- Deploy - Merge to main triggers auto-deploy to Vercel
Internal Use Only - Not licensed for public distribution.
- Next.js Docs: https://nextjs.org/docs
- Apify Docs: https://docs.apify.com/
- Supabase Docs: https://supabase.com/docs
- Vercel Docs: https://vercel.com/docs
- Gemini API Docs: https://ai.google.dev/docs