
Automating Job Applications with AI

Data Processing N8N AI Automation JSON Web Scraping API Integration
Seth Martin
Public Sector IT Specialist

The Problem

Downsizing in IT roles and an oversaturated talent pool have made for an increasingly competitive job market. While HR departments have streamlined hiring through applicant tracking systems (ATS) like BambooHR, most applicants are still manually re-entering their resumes into every company website's proprietary forms. As ATS vendors integrate AI agents into their frameworks, the efficiency gap becomes too great to ignore. I knew that to get ahead, I would have to leverage the same technology.

The general concept of using AI for job applications is certainly not novel, but it is guarded like a trade secret. AI automation in general suffers from a "get rich quick" mentality that stifles its growth, and nowhere is that more apparent than here: searching for AI job applications surfaces page after page of cookie-cutter SaaS landing pages, each selling someone's simple (but hidden) Zapier or n8n automation. I was certainly not about to pay for something I could build myself, so I dedicated a couple of days to learning the current ecosystem of AI automation and crafted a consistently efficient job-matching and job-applying machine.

Overview of the Automated n8n Workflow
Complete Workflow

The Outline

The workflow required for such a system is quite broad, encompassing initial data scraping, AI processing, and output handling. I broke the process down into four steps.

  1. Scrape and handle initial data from external sources.
  2. Process data through an AI Agent.
  3. Enrich data with outreach information.
  4. Perform outreach and populate spreadsheets.

Scraping Data

I created HTTP Request nodes in n8n to send API calls to popular scrapers on Apify, a cloud platform that runs community-built scraping actors in Docker containers. Each actor's input can be configured as JSON directly in n8n, so there is no need to access Apify itself.
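As a rough sketch, the HTTP Request node's configuration boils down to an endpoint and a JSON input body like the one below. The actor ID and input field names are placeholders for illustration; each actor's README documents its actual input schema.

```javascript
// Hypothetical input body for an Apify job-board actor, entered as
// JSON in the n8n HTTP Request node. ACTOR_ID and the field names
// are illustrative assumptions, not the workflow's actual values.
const endpoint =
  "https://api.apify.com/v2/acts/ACTOR_ID/run-sync-get-dataset-items";

const actorInput = {
  position: "IT Specialist", // job title search query
  location: "United States", // geographic filter
  maxItems: 1000,            // cap on results per run
};

// n8n sends this string as the POST body of the HTTP Request node.
const body = JSON.stringify(actorInput);
```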

Zoom-in of n8n Nodes
Scraping Nodes

Wanting to maximize the total number of jobs I could scrape, I gathered results from two major job boards: Indeed and LinkedIn. While each scraper worked well on its own, combining them required data normalization, since each was built by a different developer with its own output schema. Normalization was performed with n8n's Rename Keys node.
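The renaming the node performs can be sketched as two small mapping functions. The source field names below are assumptions about each scraper's output, not the actual schemas:

```javascript
// Sketch of the Rename Keys normalization, written as plain functions.
// The input field names are assumed examples; the real scrapers'
// output schemas will differ.
function normalizeIndeedJob(job) {
  return {
    title: job.positionName, // assumed Indeed-scraper field name
    company: job.companyName,
    description: job.description,
    url: job.externalApplyLink,
  };
}

function normalizeLinkedInJob(job) {
  return {
    title: job.jobTitle, // assumed LinkedIn-scraper field name
    company: job.company,
    description: job.descriptionText,
    url: job.applyUrl,
  };
}
```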

Zoom-in of n8n Node
Normalization Node

The data is then combined with a Merge node. The result is one (very long) JSON array of normalized fields for every scraped job; my first run of this automation (with very liberal search criteria) returned around 1,700 jobs. Not all of the merged data is needed, so redundant fields are pruned with a Set node.
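Together, the Merge and Set steps amount to concatenating the two streams and keeping a whitelist of fields, roughly like this (the field list is an assumed example):

```javascript
// Sketch of the Merge + Set behavior: concatenate both normalized
// streams, then keep only the fields the rest of the workflow uses.
const keepFields = ["title", "company", "description", "url"];

function mergeAndPrune(indeedJobs, linkedInJobs) {
  return [...indeedJobs, ...linkedInJobs].map((job) =>
    Object.fromEntries(keepFields.map((key) => [key, job[key]]))
  );
}
```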

Zoom-in of n8n Node
Merge Node
Zoom-in of n8n Node
Set Node

AI Processing

An AI agent then compares my resume and skill sheet against each scraped job title and description. A system prompt conditions the agent to act as an employment matchmaker, and it is instructed to respond with either a "successful" or "unsuccessful" match. This is the longest-running API call in the workflow, as the model evaluates every posting to ensure the matches genuinely align with my skills.
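The payload the model node sends can be sketched as a standard chat-completion request. The prompt wording and model name below are illustrative assumptions, not the exact ones in the workflow:

```javascript
// Sketch of the chat payload behind the Message a Model node. The
// model name and prompt text are placeholders for illustration.
function buildMatchRequest(resumeText, job) {
  return {
    model: "gpt-4o-mini", // assumed model choice
    messages: [
      {
        role: "system",
        content:
          "You are an employment matchmaker. Compare the candidate's " +
          "resume to the job posting and reply with exactly one word: " +
          "'successful' or 'unsuccessful'.",
      },
      {
        role: "user",
        content: `Resume:\n${resumeText}\n\nJob: ${job.title}\n${job.description}`,
      },
    ],
  };
}
```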

Zoom-in of n8n Node
Message a Model Node

Enriching Data With Outreach Information

Simply submitting a resume to flooded ATS pipelines, increasingly screened by AI, no longer appears as effective as it once was. Reaching out directly to the decision-makers behind these job postings has proven more impactful.

Unfortunately, and unsurprisingly, neither Indeed nor LinkedIn openly provides a decision-maker's email address alongside a job posting. Thankfully, sales teams have similar needs, and that demand has created a market of excellent email finders, many of which integrate directly with AI automation tools. One finder with a powerful API and good documentation is AnyMailFinder.

The catch with AnyMailFinder is that it requires a registered domain to perform a search, so a Filter node tosses out any items (job postings) that lack a company URL in their original listing. The same filter also eliminates the "unsuccessful" matches from earlier.
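The filter's logic reduces to a two-condition predicate, sketched below with assumed field names (`companyUrl` for the listing's URL, `match` for the AI verdict):

```javascript
// Sketch of the Filter node's conditions: keep a posting only if it
// has a company URL and the AI marked it a "successful" match.
// Field names are assumptions for illustration.
function passesFilter(job) {
  return Boolean(job.companyUrl) && job.match === "successful";
}
```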

Zoom-in of n8n Node
Filter Node

Following the filter, another HTTP request is made with an API call to AnyMailFinder.

Zoom-in of n8n Node
Enrichment Node

Due to how AnyMailFinder works, there must be public contact information associated with the company's registered domain. Some companies deliberately keep their public domain and contact information strictly separate as a cybersecurity measure, which dramatically limits the tool's reach. Companies for which no decision-maker email can be found are tossed out.

Zoom-in of n8n Node
Filter Node

To avoid coming across as spammy, duplicate emails (from companies with several job postings) are removed with a Remove Duplicates node.
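The deduplication step is essentially a first-seen-wins pass over the email field, which can be sketched as:

```javascript
// Sketch of the Remove Duplicates behavior: keep only the first
// posting seen per decision-maker email, so each contact receives
// a single message even if their company has several openings.
function dedupeByEmail(jobs) {
  const seen = new Set();
  return jobs.filter((job) => {
    if (seen.has(job.email)) return false;
    seen.add(job.email);
    return true;
  });
}
```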

Zoom-in of n8n Node
Remove Duplicates Node

Outreach and Logging

With all the needed data gathered, it is time to produce the output. The first node in this sequence fetches my resume from Google Drive; setting up that connection was slightly tedious and involved creating a Google Cloud Console project.

Zoom-in of n8n Node
Download File Node

Following this node, an email is sent using several expression fields drawn from the data normalized earlier, so details such as the decision-maker's name, email address, and the job title change for each posting. The resume downloaded in the previous step is attached to this node.
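Conceptually, the expression fields assemble a per-posting message like the sketch below. The greeting and body copy are placeholders, not the actual outreach text:

```javascript
// Sketch of how the Send Email node's expression fields compose a
// message per posting. Wording and field names are illustrative.
function buildEmail(job) {
  return {
    to: job.email,
    subject: `Application: ${job.title} at ${job.company}`,
    body:
      `Hi ${job.contactName},\n\n` +
      `I came across the ${job.title} opening at ${job.company} and ` +
      `believe my background is a strong fit. My resume is attached.\n`,
  };
}
```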

Zoom-in of n8n Node
Send Email Node

Finally, I output a selection of fields into a Google Sheet for ease of access later.

Zoom-in of n8n Node
Append Row in Sheet Node

Results

Within roughly thirty minutes, the workflow scraped over 1,700 jobs from Indeed and LinkedIn and automatically sent application emails to the potential decision-makers of over 300 relevant positions. Instead of my resume getting lost in large-scale ATS systems, it is being seen by the people who need it, for roles where I would excel. As I write this, just a day after implementing the automation, I have already received several positive email replies and had directors check out my LinkedIn profile. I will leave the system running weekly, continuously scraping for new opportunities while I focus on expanding my technical skills. There is more cool stuff to play with!