
MyScout: An Autonomous AI Career Agent

1 January 2026

The modern career search is fundamentally a massive data-processing problem. You spend 90% of your time scrolling through irrelevant job descriptions, fighting with Applicant Tracking Systems (ATS), and drafting repetitive cover letters, leaving very little energy for the actual high-value work: human networking and interview preparation.

I wanted to completely invert this dynamic.

I designed and engineered MyScout: a local, autonomous Python agent that handles the entire discovery, evaluation, and drafting lifecycle. It acts as my personal recruiter, scraping job boards while I sleep, mathematically scoring roles against my personal goals, and teeing up highly tailored application assets for my review.

Here is a deep dive into the architecture and the engineering decisions behind it.


⚙️ The System Architecture

Instead of building a monolithic script, I designed MyScout with a modular pipeline. It separates discovery (Scraping), evaluation (Matching), and creation (Generating) into distinct engines.

Here is the data flow:

[Obsidian Knowledge Base] ──(Context)──> [Search Engine] ──(Queries)──> [WebScout (Playwright)]
                                                                               │
                                                                         (Raw Job Data)
                                                                               ↓
[CRM Dashboard] <──(Markdown Assets)── [Generator (Gemini Pro)] <──(Score > 60%)── [Matcher (Gemini Flash)]
       │
       └──(Handoff)──> [MdToLtx Engine] ──(Compiles)──> [Beautiful PDF Resume]

(Note: I built a completely separate rendering engine to compile these Markdown assets into professional LaTeX PDFs. You can read about the MdToLtx Engine here).
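The flow above can be sketched as a single orchestration pass. This is a minimal illustration, not the actual MyScout internals: the module interfaces (`hunter.scrape()`, `matcher.score()`, `generator.draft()`, `crm.queue()`) are hypothetical names standing in for the real engines, while the 60% threshold comes from the scoring rules described later in the post.

```python
from dataclasses import dataclass

SCORE_THRESHOLD = 0.60  # jobs scoring below this are discarded


@dataclass
class Job:
    url: str
    title: str
    description: str


def run_pipeline(hunter, matcher, generator, crm):
    """One end-to-end pass: scrape -> score -> generate -> file for review."""
    for job in hunter.scrape():          # WebScout (Playwright)
        score = matcher.score(job)       # Gemini Flash evaluation
        if score < SCORE_THRESHOLD:
            continue                     # low-quality match: drop it
        assets = generator.draft(job)    # Gemini Pro resume + cover letter
        crm.queue(job, assets)           # lands in the Markdown CRM dashboard
```

Each engine only needs to honour a small interface, which is what keeps the pipeline modular rather than monolithic.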


🕵️ The "Hunter" & Stealth Engineering

Scraping dynamic Single Page Applications (SPAs) like LinkedIn and Seek is notoriously difficult. Their DOM structures are fragile, CSS classes are hashed on every deploy, and aggressive anti-bot software blocks automated traffic.

To solve this, I built the Hunter module using Playwright:

  • WCAG Exploitation: Instead of relying on ever-changing div classes or data-* attributes to find job cards, I programmed the scraper to target aria-label attributes (e.g., button[aria-label^='Dismiss']). Because web accessibility (WCAG) compliance is a legal requirement, these tags almost never change, making the scraper incredibly resilient to UI redesigns.
  • The "Humanizer": To bypass CAPTCHAs and bot-detection, the system injects randomized behavioral delays (1.5s - 3.5s), simulates mouse movements, and occasionally scrolls down the page to mimic human reading patterns.
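Both ideas can be sketched in a few lines of Playwright-flavoured Python. The `aria-label` selector and the 1.5s–3.5s delay range are from the design above; the function names and the per-click pause range are illustrative, not the real Hunter code:

```python
import random
import time


def human_pause(lo: float = 1.5, hi: float = 3.5) -> float:
    """Sleep for a randomized interval to mimic human reading pauses."""
    delay = random.uniform(lo, hi)
    time.sleep(delay)
    return delay


def dismiss_popups(page) -> None:
    """Close overlays via their accessibility labels, not hashed CSS classes.

    WCAG-mandated aria-labels survive UI redeploys, so this selector is far
    more durable than targeting auto-generated class names.
    """
    for btn in page.locator("button[aria-label^='Dismiss']").all():
        btn.click()
        human_pause(0.3, 0.8)  # small pause between clicks, like a person
```

`page` here is a Playwright sync-API `Page`; the same pattern works for job cards, pagination buttons, and any other control that carries a stable accessibility label.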

🛡️ The "Governor" & State Management

A major risk with web automation is getting your accounts banned for rate-limit violations.

I built a Governor module backed by a local SQLite database. It strictly enforces a daily action budget (e.g., 200 actions/day) across page navigations, detail scrapes, and API generation calls. To maximize efficiency, the database also stores a hash of every job URL to prevent duplicate processing: if MyScout has already generated a resume for a job (content_created=True), it skips it entirely on the next run, saving API tokens and browser actions.
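The two responsibilities, budget enforcement and URL-hash deduplication, fit naturally in one small class. This is a minimal sketch assuming a two-table schema (the real Governor's schema and method names may differ; the 200/day cap and `content_created` flag are from the description above):

```python
import hashlib
import sqlite3
from datetime import date

DAILY_LIMIT = 200  # actions/day, per the budget described above


class Governor:
    """Tracks the daily action budget and seen jobs in local SQLite."""

    def __init__(self, db_path: str = "myscout.db"):
        self.db = sqlite3.connect(db_path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS actions (day TEXT PRIMARY KEY, count INTEGER)")
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS jobs "
            "(url_hash TEXT PRIMARY KEY, content_created INTEGER DEFAULT 0)")

    def allow(self) -> bool:
        """Consume one unit of today's budget; False once the cap is hit."""
        today = date.today().isoformat()
        row = self.db.execute(
            "SELECT count FROM actions WHERE day = ?", (today,)).fetchone()
        used = row[0] if row else 0
        if used >= DAILY_LIMIT:
            return False
        if row:
            self.db.execute(
                "UPDATE actions SET count = count + 1 WHERE day = ?", (today,))
        else:
            self.db.execute("INSERT INTO actions VALUES (?, 1)", (today,))
        return True

    def mark_done(self, url: str) -> None:
        """Record that assets were generated for this job."""
        h = hashlib.sha256(url.encode()).hexdigest()
        self.db.execute("INSERT OR REPLACE INTO jobs VALUES (?, 1)", (h,))

    def already_done(self, url: str) -> bool:
        """Skip jobs whose assets were already generated on a prior run."""
        h = hashlib.sha256(url.encode()).hexdigest()
        row = self.db.execute(
            "SELECT content_created FROM jobs WHERE url_hash = ?", (h,)).fetchone()
        return bool(row and row[0])
```

Hashing the URL keeps the primary key fixed-width and avoids storing long query strings as keys.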

🧠 The AI Scoring Engine

I didn't just want a bot that spammed resumes to every "Mechanical Engineer" job it found. I wanted it to filter for quality.

Once the Hunter scrapes a job, it passes the text to the Matcher (powered by Google Gemini Flash). The LLM evaluates the job against my personal profile using a strict, weighted scoring algorithm:

  1. Technical Match (50%): Does the stack align with my core skills?
  2. Career Goal Match (30%): Is this a lateral move, or does it advance my trajectory?
  3. Values & Culture Match (20%): Does the company align with my work style?

If a job scores below 60%, the agent instantly discards it and updates the SQLite database. If it passes, the job is promoted to the Generator module (using the heavier Gemini Pro model) to draft a highly tailored resume and cover letter.
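The LLM produces the three sub-scores, but the aggregation and promotion decision are deterministic. A minimal sketch of that step, using the weights and 60% threshold above (the dictionary keys are illustrative labels, not the Matcher's actual response schema):

```python
WEIGHTS = {"technical": 0.50, "career": 0.30, "values": 0.20}
THRESHOLD = 0.60  # below this, the job is discarded


def weighted_score(subscores: dict) -> float:
    """Combine the three 0-1 sub-scores into a single weighted score."""
    return sum(WEIGHTS[key] * subscores[key] for key in WEIGHTS)


def should_promote(subscores: dict) -> bool:
    """True if the job should be handed to the Generator module."""
    return weighted_score(subscores) >= THRESHOLD
```

Keeping the weighting outside the prompt means the LLM only has to judge each dimension in isolation, which is a much easier task than asking it to do arithmetic reliably.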

📚 Obsidian Context Injection

Where does the AI get the context to score the jobs and write the resumes? My personal knowledge base.

I built a custom ObsidianLoader that recursively parses my local Obsidian Markdown vault. It follows my [[WikiLinks]] to pull in my exact work history, technical skills, and career goals dynamically. This means I never have to update a hard-coded prompt. If I learn a new skill, I just type it into my Obsidian notes and MyScout instantly incorporates it into the next application it writes.
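The core of that loader is wikilink resolution: read a note, find every `[[WikiLink]]`, and inline the linked notes recursively. A minimal sketch under the assumption that each note lives at `<vault>/<Name>.md` (the real ObsidianLoader may handle aliases, headings, and folders differently):

```python
import re
from pathlib import Path

# Captures the note name in [[Note]], [[Note|alias]], and [[Note#heading]]
WIKILINK = re.compile(r"\[\[([^\]|#]+)")


def load_note(vault: Path, name: str, seen=None) -> str:
    """Recursively inline a note and every note it [[links]] to."""
    seen = seen if seen is not None else set()
    if name in seen:                 # guard against circular links
        return ""
    seen.add(name)
    path = vault / f"{name}.md"
    if not path.exists():            # dangling link: skip silently
        return ""
    text = path.read_text(encoding="utf-8")
    for link in WIKILINK.findall(text):
        text += "\n\n" + load_note(vault, link.strip(), seen)
    return text
```

The `seen` set is essential: Obsidian vaults are full of mutual links (Profile → Skills → Profile), so a naive recursion would never terminate.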

🚀 The Result

Every morning, I wake up to a centralized Markdown CRM dashboard. MyScout has already searched the market, discarded the garbage, and placed 3-5 highly curated, perfectly drafted applications into a QUEUED folder for my review.

By automating the highest-friction parts of the job hunt, I reclaimed hours of my day to focus on building actual projects and preparing for interviews.