
The AI-Powered Job Search Pipeline That Applies While You Sleep

A 6-layer automated system: scrape, score, generate, apply, track - with a human in the loop.

The modern job search is broken. You spend 20 minutes per application: reading the job description, tailoring your resume, writing a cover letter, filling out the same form fields you've filled out a thousand times. Multiply that by 10 applications a day and your entire week is gone - with nothing to show for it except a growing spreadsheet of unanswered submissions.

I decided to engineer my way out of it. Over a weekend, I designed a 6-layer pipeline that scrapes jobs from multiple platforms, scores them against my profile using AI, auto-generates tailored cover letters, automates LinkedIn Easy Apply submissions, and tracks everything in a CRM-style pipeline. The whole thing runs on n8n with a human approval gate - nothing fires without my sign-off.

System Architecture

Layer 1: Multi-Platform Search (JobSpy)
    │ Cron: daily 8am
    ▼
Layer 2: AI Scoring (Claude API)
    │ Score 0-100 per job
    ▼
Layer 3: Cover Letter Gen (Claude + Qdrant RAG)
    │ Tailored per role
    ▼
Layer 4: Human Approval Queue
    │ < YOU REVIEW HERE
    ▼
Layer 5: Browser Automation (Playwright MCP)
    │ LinkedIn Easy Apply
    ▼
Layer 6: Tracking (NocoDB + HubSpot Pipeline)

Layer 1 - Multi-Platform Job Search

JobSpy is a Python library that scrapes Indeed, LinkedIn, Glassdoor, ZipRecruiter, and Google Jobs in a single call. It replaces a patchwork of limited single-platform API integrations with broad coverage from one script:

from jobspy import scrape_jobs

# Pull postings from four boards in one call; hours_old=48
# limits results to jobs posted in the last two days.
jobs = scrape_jobs(
    site_name=["linkedin", "indeed", "glassdoor", "zip_recruiter"],
    search_term="Marketing Automation Manager",
    location="Dallas, TX",
    results_wanted=50,
    hours_old=48,
    country_indeed="USA",
    linkedin_fetch_description=True,  # full descriptions, needed for AI scoring
)
jobs.to_csv("jobs.csv", index=False)

In n8n, this runs as an Execute Command node triggered by a daily cron at 8am. The CSV output is parsed into JSON and each job creates a record in NocoDB with status "new".
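If it helps to see that handoff concretely, here's a minimal sketch of the CSV-to-record step. The CSV column names (title, company, location, job_url, description) are the ones JobSpy emits; the NocoDB field names are my own and should be adjusted to your table schema:

```python
import csv
import io

def csv_to_nocodb_records(csv_text: str) -> list[dict]:
    """Parse the JobSpy CSV export into NocoDB-ready records."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        {
            "title": row.get("title", ""),
            "company": row.get("company", ""),
            "location": row.get("location", ""),
            "url": row.get("job_url", ""),
            "description": row.get("description", ""),
            "status": "new",  # every scraped job enters the pipeline as "new"
        }
        for row in rows
    ]
```

In the actual workflow, n8n's built-in CSV parsing and NocoDB nodes do this; the sketch just shows the shape of the data crossing the boundary.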

Layer 2 - AI Job Scoring

For each new job in NocoDB, an n8n webhook triggers a Claude API call. Claude reads the full job description and scores it 0–100 against my profile:

SYSTEM: You are a job fit scoring engine.
Score this job 0-100 for this candidate.
Return JSON only: { score, salary_est, title_fit,
  skills_match[], red_flags[], summary }

CANDIDATE PROFILE:
- Titles: Marketing Manager, Director of Marketing, Demand Gen
- Min salary: $90,000
- Skills: HubSpot, Meta Ads, Google Ads, n8n, Python, AI/automation
- Location: Dallas TX or Remote

JOB DESCRIPTION: {job_description}

Scoring thresholds:

  • 80–100: Auto-add to apply queue - strong match, worth pursuing
  • 60–79: Add to review queue - I decide manually
  • Below 60: Archive - don't waste time
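The routing itself is simple enough to sketch. The function below encodes the thresholds above; the Claude call is omitted, and the status names are placeholders for whatever your NocoDB table uses:

```python
def route_job(score: int) -> str:
    """Map a 0-100 fit score to a pipeline status per the thresholds."""
    if score >= 80:
        return "apply_queue"   # strong match: auto-add to apply queue
    if score >= 60:
        return "review_queue"  # borderline: human decides
    return "archived"          # below 60: don't waste time
```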

Layer 3 - Cover Letter Generation

For jobs in the apply queue, Claude generates a tailored 3-paragraph cover letter. The key differentiator: my resume is stored as vector embeddings in Qdrant, so Claude retrieves only the most relevant experience for each specific role:

Write a concise 3-paragraph cover letter.
P1: Why I'm excited about THIS company specifically.
P2: Most relevant experience (from context below).
P3: Clear ask - request interview, confident close.

Tone: confident, specific, not generic.
Max 250 words. No 'I am writing to apply'.
Company: {company} | Role: {title}
Relevant resume context: {qdrant_retrieved_chunks}
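The retrieval step is what makes the letters specific. In production Qdrant does the vector search server-side, but the underlying idea fits in a few lines: embed the job description, then pull the resume chunks whose embeddings are most similar. This toy version uses raw cosine similarity over pre-computed vectors (the embeddings themselves would come from an embedding model):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k_chunks(query_vec: list[float], chunks: list[tuple], k: int = 3) -> list[str]:
    """Return the k resume chunks whose embeddings best match the job's.

    `chunks` is a list of (text, embedding) pairs; Qdrant performs this
    ranking for you in the real pipeline.
    """
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:k]]
```

Those top-k chunks become the `{qdrant_retrieved_chunks}` context in the prompt above, so Claude only ever sees the experience that's actually relevant to the role.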

Layer 4 - The Human Gate

This is the critical safety layer. Nothing submits without my approval. The n8n workflow presents each batch of scored, letter-generated applications in an approval interface:

  • n8n Wait node with approval webhook - simplest approach, already in the stack
  • NocoDB form view filtered to "pending_approval" status
  • Each entry shows: job title, company, score, salary estimate, cover letter preview, red flags
  • I approve, reject, or edit before anything fires
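The gate boils down to a small state machine: only records in "pending_approval" can advance, and only via an explicit human decision. A sketch, with status and decision names that are assumptions rather than anything n8n or NocoDB prescribes:

```python
VALID_DECISIONS = {"approve": "approved", "reject": "rejected", "edit": "pending_edit"}

def apply_decision(record: dict, decision: str) -> dict:
    """Advance a pending application per a human decision.

    Anything not in "pending_approval" is refused outright, which is
    what keeps the gate strict: nothing reaches Layer 5 unreviewed.
    """
    if record.get("status") != "pending_approval":
        raise ValueError(f"cannot decide on status {record.get('status')!r}")
    if decision not in VALID_DECISIONS:
        raise ValueError(f"unknown decision {decision!r}")
    return {**record, "status": VALID_DECISIONS[decision]}
```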

Layer 5 - Browser Automation

Approved applications get submitted via Playwright MCP, which automates LinkedIn Easy Apply. Claude reads the accessibility tree of the application form and fills fields dynamically. For complex ATS systems (Workday, Greenhouse), the system drafts everything and sends me a review link, and I click submit manually.

Layer 6 - Application Tracking

Applied jobs sync to HubSpot as Deals in a "Job Search" pipeline, giving me a familiar kanban view from Applied through Phone Screen, Interview, and Offer stages.
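The sync is a straightforward mapping from a tracked job to a Deal properties payload. `dealname`, `pipeline`, and `dealstage` are real HubSpot deal property names, but the pipeline and stage IDs below are placeholders for the ones HubSpot assigns to your custom "Job Search" pipeline:

```python
def job_to_deal(job: dict) -> dict:
    """Shape a tracked job as the properties payload for HubSpot's
    POST /crm/v3/objects/deals endpoint."""
    return {
        "properties": {
            "dealname": f"{job['title']} @ {job['company']}",
            "pipeline": "job_search",  # placeholder pipeline ID
            "dealstage": "applied",    # placeholder stage ID
            "description": job.get("url", ""),
        }
    }
```

In n8n this is a single HubSpot node; the sketch just shows what crosses the wire.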

Real Results

Running this system passively for two weeks while still employed full-time, I was able to identify and score 200+ jobs, generate tailored applications for the top 30, and submit 15 targeted applications - all with roughly 20 minutes of daily review time. The callback rate on AI-scored, AI-lettered applications was significantly higher than with my previous spray-and-pray approach.


Edward Chalupa is a digital marketing specialist and founder of Whtnxt, a digital marketing and automation consultancy. Connect with him on LinkedIn or explore more at echalupa.com.

If you're interested in automating LinkedIn applications specifically, I covered that in a separate deep-dive.

Key Takeaways

Automate the mechanical, not the strategic. The system handles scraping, scoring, and form-filling. I still decide which roles to pursue and how to position myself in interviews.

RAG beats templates every time. Generic cover letters get ignored. Cover letters grounded in your actual work - pulled dynamically from a vector store - feel authentic because they are.

Track everything. The NocoDB pipeline isn't just for organization. The data it generates (response rates by platform, score correlation with interview rates) feeds back into optimizing the system over time.

Download the Workflow

The full Job Search Pipeline workflow is available as a ready-to-import n8n JSON file. It includes the Schedule Trigger, Python/JobSpy scraping node, Claude scoring with the prompt template, NocoDB integration for pipeline tracking, cover letter generation, and the human approval gate. All credentials and API keys have been replaced with placeholders.

Download Job Search Pipeline Workflow

Requires: Python + jobspy installed on your n8n server, an Anthropic API key, a NocoDB API token, and SMTP credentials.
