Tag: AI-in-practice

  • I Automated My Job Search, Then Deliberately Broke the “Apply” Button

    I built an AI tool to automate my job search.
    Then I intentionally stopped it from applying anywhere.

    Not to move faster. To make better decisions.

    The problem: high effort, low signal

    Ten days into my job search, I realized something was off.

    I was spending more than two hours a day across LinkedIn, Xing, StepStone, and Glassdoor. Multiple logins. Duplicated listings. Half-maintained notes and spreadsheets.

    Despite all that effort:

    • ~50 listings reviewed per day
    • ~8 applications sent

    The bottleneck wasn’t effort. It was signal quality.

    Most of my time went into false starts. Roles that looked promising until I noticed a German language requirement halfway through. Or a seniority mismatch buried at the end of the JD.

    The scroll felt productive. It wasn’t.

    The actual constraints

    Once I stepped back, the problem reduced to three concrete constraints:

    1. Recency
      Job boards don’t reliably prioritize fresh listings. I was spending time on roles that were likely already filled.
    2. Relevance
      Titles are noisy. “Product Manager” spans a massive experience range, and you only know which one it is after reading the JD.
    3. State management
      The same role appears on multiple platforms. I couldn’t reliably tell what I’d already seen, shortlisted, or applied to.

    That’s when I stopped searching and started building.

    The result is a script that runs once a day and handles discovery end to end. Here’s how it works:

    1. Searches multiple job platforms simultaneously using Exa, a search API designed for structured content extraction.
    2. Filters each listing against my criteria: role type, experience level, location, language requirements. Claude Haiku reads each job and decides whether it’s a match (a minimal sketch of this step follows the list).
    3. Extracts metadata like posting date, remote/hybrid/onsite, and whether the position is still active.
    4. Scores each listing’s relevance by how well my CV matches the job description.
    5. Deduplicates across sources. Same job posted on LinkedIn and Indeed? It shows up once (sketched below).
    6. Exports to Excel with columns for my notes, application status, and tracking (sketched below).
    7. Backs up the sheet before each run so I never lose my notes.
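
    Steps 1, 2, and 4 are the core of the script. Here is a minimal sketch of how they can fit together, assuming the exa_py and anthropic Python SDKs; the query, criteria, CV summary, and the screen helper are illustrative placeholders, not the exact code in the repo.

      # pip install exa-py anthropic
      import json

      import anthropic
      from exa_py import Exa

      exa = Exa(api_key="EXA_API_KEY")  # placeholder; the real key lives in config
      claude = anthropic.Anthropic()    # reads ANTHROPIC_API_KEY from the environment

      # Step 1: search via Exa and pull each listing's page text with the result.
      results = exa.search_and_contents(
          "product manager jobs in Berlin",  # illustrative query
          num_results=25,
          text=True,
      )

      # Steps 2 and 4: Claude Haiku judges each listing against fixed criteria
      # and scores the CV match. Criteria and CV summary are placeholders.
      CRITERIA = "product role, senior level, Berlin or remote, no German required"
      CV_SUMMARY = "..."  # short summary of my CV, omitted here

      def screen(listing_text: str) -> dict:
          msg = claude.messages.create(
              model="claude-3-haiku-20240307",
              max_tokens=200,
              messages=[{
                  "role": "user",
                  "content": (
                      f"Criteria: {CRITERIA}\nCandidate CV: {CV_SUMMARY}\n\n"
                      f"Job description:\n{listing_text[:6000]}\n\n"
                      'Reply with JSON only: {"match": true or false, "score": 0-100}'
                  ),
              }],
          )
          # assumes the model honors the JSON-only instruction
          return json.loads(msg.content[0].text)

      screened = [(r, screen(r.text or "")) for r in results.results]
      matches = [r for r, verdict in screened if verdict.get("match")]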
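
    Deduplication (step 5) doesn’t need anything clever. One workable approach, sketched here as an assumption rather than the repo’s actual logic, is to key each listing on a normalized company + title pair and keep the first occurrence:

      import re

      def _norm(s: str) -> str:
          # lowercase and collapse punctuation so near-identical strings collide
          return re.sub(r"[^a-z0-9]+", " ", s.lower()).strip()

      def dedup_key(listing: dict) -> str:
          # the same role on LinkedIn and Indeed should produce the same key
          return f"{_norm(listing['company'])}|{_norm(listing['title'])}"

      def deduplicate(listings: list[dict]) -> list[dict]:
          seen: dict[str, dict] = {}
          for job in listings:
              seen.setdefault(dedup_key(job), job)  # first occurrence wins
          return list(seen.values())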
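
    Steps 6 and 7 round out a run. A sketch with pandas and openpyxl follows; the file name, column names, and the rows variable are assumptions for illustration:

      import shutil
      from datetime import datetime
      from pathlib import Path

      import pandas as pd  # pip install pandas openpyxl

      SHEET = Path("jobs.xlsx")  # illustrative file name

      # Step 7: copy the current sheet aside before writing, so a bad run
      # can't wipe out my notes.
      if SHEET.exists():
          stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
          shutil.copy(SHEET, SHEET.with_name(f"jobs-backup-{stamp}.xlsx"))

      # Step 6: write the filtered listings, newest first, with empty
      # tracking columns. `rows` is assumed to be a list of dicts with
      # title/company/url/posted/score keys.
      df = pd.DataFrame(rows)
      for col in ("notes", "status"):
          if col not in df.columns:
              df[col] = ""
      df.sort_values("posted", ascending=False).to_excel(SHEET, index=False)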

    Total build time: ~1.5 hours.

    Results

    First run:

    • 127 raw listings
    • 97 after deduplication
    • 43 after filtering

    Two-thirds of the listings were never worth my attention.

    The important part wasn’t efficiency. It was cognitive load.

    Instead of scanning 100+ listings to find 20 usable ones, I now review ~40 pre-qualified roles, sorted by recency, with my notes preserved across days.

    The trade-offs

    Every tool has trade-offs. I made mine deliberately.

    Haiku over more powerful models. I used Claude Haiku 3, the cheapest model in the Claude family. Smarter models exist, but Haiku is good enough for this task. It correctly identifies whether a JD matches my criteria. That’s all I need. Cost per run: less than $0.01.

    Batch processing over real-time alerts. I run the script once a day. No push notifications, no live feed. Fresh listings once a day is plenty.

    Excel over a custom dashboard. I wanted to build something and ship it quickly, not spend weeks on a SaaS product. I already live in spreadsheets, so there’s zero learning curve. The output is a file I can open, filter, sort, and annotate without building a UI.

    The feature I chose not to build

    My first instinct was to go further. Auto-fill applications. Pre-populate cover letters. Maybe even auto-submit.

    I stopped myself.

    Because a job application isn’t just a form. It’s a representation of me. No company is looking for a bot that can fill fields quickly. They’re looking for someone who understood the role, thought about the team, and made a conscious decision to apply.

    I didn’t want my job search to become mechanical. I wanted it to have intention. To reflect how I think, what I’ve done, and what I’m actually excited about.

    So, I broke the Apply button on purpose.

    The scraper handles discovery. Every application decision is still mine.

    What changed

    Before, I spent entire days scrolling. Fifty listings reviewed. Eight applications sent. Exhausted by bad fits. Constantly unsure of what I’d already seen or applied to.

    Now, I spend about thirty minutes reviewing a filtered list. I apply to roughly the same number of roles, but each one is deliberate. Each cover letter is written for that role. Each resume tweak has a reason behind it. I’m not reacting anymore. I’m choosing.

    The quality of each application went up. More importantly, I felt good about every single one I sent.

    The line I drew

    Automate the noise. Stay present for the signal.

    The soul-crushing scroll through irrelevant listings is gone. The meaningful decisions stayed human.

    The code is open source. You can fork it, adapt the criteria, and run it yourself.

    GitHub Repo Link: https://github.com/ruby-catharin/ai-job-search

    What would you never automate in your job search?