Programmatic SEO in 2026: How I'm Using AI + n8n to Generate Hundreds of Pages

Tags: seo, automation, n8n, ai
By Ryan Cwynar · 4 min read

I've been building a system that automatically generates hundreds of SEO-optimized pages without me lifting a finger. Here's the full technical breakdown.

The Problem

Local service businesses live and die by search. If you're a sign company in Palm Beach, you need to rank for:

  • "custom signs palm beach"
  • "vehicle wraps west palm beach"
  • "banners boynton beach"
  • ...and hundreds of variations

Manually creating pages for every service + location combo? Impossible at scale. That's where programmatic SEO comes in.

The Architecture

My pipeline has three main components:

  1. Keyword Mining (n8n workflow, runs every 8 hours)
  2. Keyword Storage (PostgreSQL with deduplication)
  3. Page Generation (Node.js scripts + AI)

Let me break down each piece.

Part 1: Keyword Mining with n8n

n8n is my secret weapon for automation. It's self-hosted, visual, and connects to everything.

My keyword mining workflow:

Schedule (every 8h) → Select Base Keywords → Google Autocomplete API → Process Results → Store in Postgres
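The "Process Results" step can be sketched as a small transform. This is a minimal sketch, assuming Google's unofficial autocomplete endpoint (`suggestqueries.google.com`), whose response is shaped like `[query, [suggestions]]`; in practice n8n's HTTP Request node does the fetching, and the endpoint and payload shape are assumptions, not part of the workflow export.

```javascript
// Fetch raw autocomplete suggestions for one query.
// Endpoint and response shape are assumptions (unofficial API).
async function fetchSuggestions(query) {
  const url =
    "https://suggestqueries.google.com/complete/search?client=firefox&q=" +
    encodeURIComponent(query);
  const res = await fetch(url);
  return res.json(); // e.g. ["custom signs", ["custom signs palm beach", ...]]
}

// Pure step: turn a raw autocomplete payload into rows ready for Postgres.
function toKeywordRows(raw, project) {
  const [sourceQuery, suggestions] = raw;
  return suggestions
    .filter((s) => s !== sourceQuery) // drop echoes of the seed itself
    .map((keyword) => ({ keyword, source_query: sourceQuery, project }));
}
```

Keeping the parsing step pure makes it easy to test without hitting the network.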

Smart Query Selection

Instead of hammering the same queries repeatedly, I built intelligence into the selection:

  • Random base selection: Pick from a pool of seed keywords
  • Expand from existing: Take successful keywords and find variations
  • Location cycling: Rotate through target cities

This keeps the mining fresh and discovers long-tail opportunities.
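The three selection strategies above can be combined in a few lines. This is an illustrative sketch; the seed pool and city list are placeholders for whatever the real workflow pulls from config.

```javascript
// Illustrative pools; the real workflow loads these from configuration.
const seeds = ["custom signs", "vehicle wraps", "banners"];
const cities = ["palm beach", "west palm beach", "boynton beach"];

// Random base selection: pick a seed keyword at random.
function randomBase(pool, rand = Math.random) {
  return pool[Math.floor(rand() * pool.length)];
}

// Location cycling: rotate through target cities by run number.
function cityForRun(runIndex) {
  return cities[runIndex % cities.length];
}

// Build the ~25 queries for one scheduled run.
function buildQueries(runIndex, count = 25) {
  return Array.from(
    { length: count },
    () => `${randomBase(seeds)} ${cityForRun(runIndex)}`
  );
}
```

Injecting the random source (`rand`) keeps the selection deterministic under test.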

The Numbers

  • ~25 queries per run
  • ~100 keywords harvested every 8 hours
  • Deduplication keeps the table from bloating with repeat suggestions
  • Running cost: $0 (self-hosted n8n + free autocomplete APIs)

Part 2: Database Schema

PostgreSQL handles the keyword storage:

CREATE TABLE seo_keywords (
  id SERIAL PRIMARY KEY,
  keyword TEXT NOT NULL,
  source_query TEXT,
  project TEXT NOT NULL,
  processed BOOLEAN DEFAULT FALSE,
  page_slug TEXT,
  created_at TIMESTAMP DEFAULT NOW()
);

CREATE TABLE seo_pages (
  id SERIAL PRIMARY KEY,
  slug TEXT UNIQUE NOT NULL,
  service TEXT NOT NULL,
  location TEXT NOT NULL,
  title TEXT,
  description TEXT,
  h1 TEXT,
  content TEXT,
  keywords TEXT[],
  created_at TIMESTAMP DEFAULT NOW()
);

The processed flag is key—it tracks which keywords have been turned into pages.
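The deduplication itself can live in the insert. A minimal sketch with the node-postgres (`pg`) client, assuming a unique index the schema above doesn't show (e.g. `CREATE UNIQUE INDEX ON seo_keywords (keyword, project)`): `ON CONFLICT DO NOTHING` is what silently drops repeat harvests.

```javascript
// Dedup insert: ON CONFLICT DO NOTHING assumes a unique index on
// (keyword, project), which is an addition to the schema shown above.
const DEDUP_INSERT = `
  INSERT INTO seo_keywords (keyword, source_query, project)
  VALUES ($1, $2, $3)
  ON CONFLICT (keyword, project) DO NOTHING`;

// Insert harvested rows; returns how many were actually new.
async function storeKeywords(pool, rows) {
  let inserted = 0;
  for (const r of rows) {
    const res = await pool.query(DEDUP_INSERT, [
      r.keyword,
      r.source_query,
      r.project,
    ]);
    inserted += res.rowCount; // 0 when the keyword already existed
  }
  return inserted;
}
```

Returning the new-row count also gives the workflow a cheap health metric per run.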

Part 3: AI-Powered Page Generation

This is where it gets interesting. A Node.js script:

  1. Groups keywords by service + location
  2. Generates content using Claude API
  3. Creates Next.js pages with proper metadata
  4. Marks keywords as processed

Content Generation Prompt

The AI prompt is crucial. I include:

  • Business context (what the company does)
  • Target service and location
  • Related keywords to naturally include
  • Desired tone and structure

const prompt = `Write a comprehensive service page for ${service} in ${location}.

Business: ${businessContext}
Related keywords to include naturally: ${keywords.join(", ")}

Requirements:
- Local focus with neighborhood mentions
- Specific benefits for this service
- Clear calls to action
- ~800-1200 words
- SEO-optimized but readable`;

Page Structure

Each generated page includes:

  • Dynamic title: "[Service] in [Location] | [Business Name]"
  • Meta description: AI-generated, keyword-rich
  • H1: Service + location focused
  • Content: Unique, locally relevant copy
  • Schema markup: LocalBusiness + Service schemas
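The LocalBusiness + Service schema markup can be emitted as a JSON-LD object per page. This is one way to do it; the field values and `business` shape are illustrative, not the author's exact implementation.

```javascript
// Build JSON-LD for one generated page. Field choices are illustrative.
function buildSchema(page, business) {
  return {
    "@context": "https://schema.org",
    "@graph": [
      {
        "@type": "LocalBusiness",
        name: business.name,
        address: { "@type": "PostalAddress", addressLocality: page.location },
        url: `${business.baseUrl}/${page.slug}`,
      },
      {
        "@type": "Service",
        name: `${page.service} in ${page.location}`,
        provider: { "@type": "LocalBusiness", name: business.name },
        areaServed: page.location,
        description: page.description,
      },
    ],
  };
}

// In a Next.js App Router page, render it as:
// <script type="application/ld+json"
//   dangerouslySetInnerHTML={{ __html: JSON.stringify(buildSchema(page, business)) }} />
```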

The Results

After 2 weeks of running:

  • 347 keywords harvested
  • 89 pages generated
  • 12 first-page rankings for long-tail terms
  • 3x organic traffic increase

The best part? It runs on autopilot. I check in occasionally to review quality, but the machine does the work.

Avoiding Google Penalties

Programmatic SEO has a bad reputation because people do it wrong. Here's how I avoid penalties:

  1. Unique content: Every page has genuinely different copy
  2. Real value: Pages answer actual search intent
  3. No thin content: Minimum 800 words per page
  4. Internal linking: Pages link to each other naturally
  5. Human review: I spot-check generated content

Google's spam policies don't prohibit programmatically created content; they target scaled content made primarily to manipulate rankings rather than help people. The key word is value.

Tech Stack Summary

  • n8n: Workflow automation (self-hosted)
  • PostgreSQL: Keyword and page storage
  • Node.js: Page generation scripts
  • Claude API: Content generation
  • Next.js 16: Frontend with App Router
  • Vercel/Self-hosted: Deployment

Total monthly cost: ~$20 (mostly Claude API for generation)

Try It Yourself

The core concept is simple:

  1. Mine keywords for your niche automatically
  2. Store and dedupe in a database
  3. Generate pages with AI assistance
  4. Deploy and let Google index

Start small—one service, a few locations. Validate it works before scaling.

What's Next

I'm adding:

  • Automatic performance tracking: Which pages rank, which don't
  • Content refresh: Update underperforming pages
  • Competitor monitoring: What keywords are they ranking for?

Programmatic SEO isn't about gaming Google. It's about efficiently creating the content your audience is searching for. The AI just makes it possible to do at scale.


Building something similar? I'd love to hear about it. Find me on LinkedIn or Twitter.