
Keyword Auto-Optimizer (GSC)

Semantically optimize pages for keywords using N-grams and AI!


Most pages attract impressions for far more queries than they actually mention on-page. Those “ghost keywords” sit in Google Search Console (GSC) reports hinting at opportunity, yet combing through them manually is tedious. Keyword Auto-Optimizer (GSC) bridges that gap: it pulls real GSC queries for a URL, scans the page, uncovers missing or lightly-used phrases, and suggests natural placements—all in one run.

Walkthrough: how the app is wired in Moonlit

Step 1 – Bring in the evidence

A Google Search Console node calls the Search Analytics API for the chosen property, filtered to the exact URL and the last 90 days. We keep only the “query” dimension plus its metrics, ensuring we’re working with terms that already earn clicks. In parallel, a Web Scraper node grabs the body text of the same page; running both at once keeps the flow snappy. This data-gathering approach is similar to the one we discuss in our post on auditing website content at scale.
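For reference, the sketch below approximates what those two inputs gather if you rebuilt them outside Moonlit. It assumes an already-authorised Search Console client; the function names, the 250-row cap, and the credential handling are illustrative rather than part of the app.

```python
# A minimal sketch, not the Moonlit nodes themselves. Assumes OAuth/service-account
# credentials are already available; error handling is omitted.
from datetime import date, timedelta

import requests
from bs4 import BeautifulSoup
from googleapiclient.discovery import build


def fetch_gsc_queries(credentials, property_url: str, page_url: str) -> list[dict]:
    """Pull query-level rows for one page over the last 90 days."""
    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": str(date.today() - timedelta(days=90)),
        "endDate": str(date.today()),
        "dimensions": ["query"],  # keep only the query dimension plus its metrics
        "dimensionFilterGroups": [{
            "filters": [{"dimension": "page", "operator": "equals", "expression": page_url}]
        }],
        "rowLimit": 250,  # illustrative cap
    }
    response = service.searchanalytics().query(siteUrl=property_url, body=body).execute()
    return response.get("rows", [])  # each row: keys, clicks, impressions, ctr, position


def fetch_page_text(page_url: str) -> str:
    """Grab the visible body text of the same page."""
    html = requests.get(page_url, timeout=30).text
    return BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)
```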

Step 2 – Crunch n-grams & relevance

A custom Python Function does the heavy lifting (a simplified sketch follows this list):

  • Tokenises the page, builds 2- to 6-word n-grams, and cleans punctuation.

  • Uses a SentenceTransformer model (all-MiniLM-L6-v2) to embed both the n-grams and the GSC queries (plus simple plurals).

  • Measures cosine similarity; only n-grams scoring ≥ 0.70 to the query centroid make the cut—capturing semantically close phrases, not just exact matches.

  • Counts how often each relevant n-gram appears, then outputs two things: a flat list of the first 20 GSC queries and a tidy JSON table with occurrence counts and relevance scores.
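Condensed into one place, that relevance filter might look like the sketch below. It assumes the query list and page text from Step 1; the 0.70 threshold, the 2- to 6-word window, and the plural handling mirror the description above, while the function names are placeholders.

```python
# A simplified sketch of the n-gram relevance filter, not the app's exact script.
import json
import re

from sentence_transformers import SentenceTransformer, util

MODEL = SentenceTransformer("all-MiniLM-L6-v2")


def build_ngrams(text: str, n_min: int = 2, n_max: int = 6) -> list[str]:
    """Tokenise the page, strip punctuation, and build 2- to 6-word n-grams."""
    tokens = re.findall(r"[a-z0-9']+", text.lower())
    grams = set()
    for n in range(n_min, n_max + 1):
        for i in range(len(tokens) - n + 1):
            grams.add(" ".join(tokens[i:i + n]))
    return sorted(grams)


def relevant_ngrams(page_text: str, queries: list[str], threshold: float = 0.70):
    """Return the first 20 GSC queries plus a JSON table of relevant n-grams."""
    grams = build_ngrams(page_text)
    # Add simple plural variants of the queries, as in the workflow description.
    expanded = queries + [q + "s" for q in queries if not q.endswith("s")]
    centroid = MODEL.encode(expanded, convert_to_tensor=True).mean(dim=0, keepdim=True)
    scores = util.cos_sim(MODEL.encode(grams, convert_to_tensor=True), centroid).squeeze(1)

    page_lower = page_text.lower()
    table = [
        {"ngram": g, "occurrences": page_lower.count(g), "relevance": round(float(s), 3)}
        for g, s in zip(grams, scores)
        if float(s) >= threshold
    ]
    return queries[:20], json.dumps(table, indent=2)
```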

Step 3 – Draft the optimisations

A GPT-4o Chat Model receives the n-gram table, the full keyword list, and the raw page text. It performs two passes (a rough sketch follows the list):

  1. For any n-gram that never appears, it writes a suggestion showing the current sentence and a revised version with the phrase inserted (bolded for clarity).

  2. For n-grams that appear 1-4 times, it flags them as “under-optimised” and proposes additional natural placements—respecting rules like changing “near me” to “near you”.
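Outside Moonlit, the same two-pass brief could be sent to the model roughly like this. The instruction wording is paraphrased from the description above rather than copied from the actual node prompt, and the snippet assumes an OpenAI API key in the environment.

```python
# A rough, hand-rolled equivalent of the drafting step; the real app uses a
# Moonlit Chat Model node, and this prompt wording is paraphrased, not copied.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def draft_optimisations(ngram_table_json: str, gsc_queries: list[str], page_text: str) -> str:
    instructions = (
        "You are optimising a web page for the supplied GSC keywords.\n"
        "Pass 1: for every n-gram with 0 occurrences, show the current sentence and a "
        "revised sentence with the phrase inserted in bold.\n"
        "Pass 2: for n-grams appearing 1-4 times, flag them as under-optimised and "
        "propose additional natural placements. Rewrite 'near me' as 'near you'."
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": instructions},
            {
                "role": "user",
                "content": (
                    f"N-gram table:\n{ngram_table_json}\n\n"
                    f"GSC keywords:\n{', '.join(gsc_queries)}\n\n"
                    f"Page text:\n{page_text}"
                ),
            },
        ],
    )
    return response.choices[0].message.content
```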

These techniques reflect broader strategies detailed in our building high-quality AI content pipelines article.

| Original Sentence | Optimised Sentence (inserted n-gram in bold) |
| --- | --- |
| Our service is available in multiple locations across the city. | Our service is available in multiple locations **near you** across the city. |
| We offer fast and reliable support for all customers. | We offer **24/7 customer support** for all customers. |
| Contact us today to learn more about our solutions. | Contact us today to learn more about our **AI-powered** solutions. |

Step 4 – Quality control

A second Chat Model quickly audits the output: removes duplicate or unchanged sentences, edits awkward wording, and ensures every keyword actually exists in the GSC list. The final Markdown is routed to the “Optimized” output.
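The audit itself is an LLM pass, but if you want a deterministic backstop you could add a small check like the one below before routing to the output. The suggestion shape shown here is an assumption for illustration, not the app's internal format.

```python
# Not part of the original app: a deterministic guard alongside the QC Chat Model.
# The suggestion dict shape is assumed for illustration.
def filter_suggestions(suggestions: list[dict], gsc_queries: list[str]) -> list[dict]:
    """Drop suggestions that are unchanged or cite a keyword absent from the GSC list."""
    allowed = {q.lower() for q in gsc_queries}
    kept = []
    for s in suggestions:
        unchanged = s["original"].strip().lower() == s["optimised"].strip().lower()
        known_keyword = s["keyword"].lower() in allowed
        if not unchanged and known_keyword:
            kept.append(s)
    return kept
```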

Ways to customise

| Customisation Option | Implementation Details | Expected Benefit |
| --- | --- | --- |
| Tighten focus | Filter queries to only those with >50 impressions or positions 8–20 in the script. | Targets high-potential keywords for more impactful optimisations. |
| Adjust similarity threshold | Change the 0.70 semantic similarity bar up or down. | Controls how closely suggestions match the original query intent. |
| Swap embedding model | Use a language-specific embedding model for non-English content. | Improves relevance and accuracy for multilingual sites. |
| Add CMS update step | Push approved sentences to WordPress or Webflow via API after QC. | Automates publishing, reducing manual effort and errors. |
| Bring in competitor context | Scrape the top 5 SERP pages, extract n-grams, and feed them into the relevance filter. | Inspires new ideas and ensures content is competitive. |
| Surface UX flags | Chain a Chat Model to check suggestions against style or tone guidelines. | Ensures consistency and adherence to brand voice before publishing. |
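As a concrete example of the first row, the “Tighten focus” option is just a short filter over the Search Analytics rows before the n-gram step. The thresholds come from the table; the function name is illustrative and the row shape follows the Search Analytics API response used in Step 1.

```python
# Thresholds from the table above; row shape follows the Search Analytics API
# response (each row has "keys", "impressions", "position", ...).
def tighten_focus(rows: list[dict]) -> list[str]:
    """Keep queries with more than 50 impressions or an average position of 8-20."""
    keep = []
    for row in rows:
        query = row["keys"][0]
        if row.get("impressions", 0) > 50 or 8 <= row.get("position", 0) <= 20:
            keep.append(query)
    return keep
```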

Running at scale with Bulk Runs

  1. Export a CSV containing two columns: URL and its matching GSC Property Name.

  2. Open Moonlit’s Bulk Runs, select the Keyword Auto-Optimizer app, and upload the file.

  3. Map the columns to the two inputs and launch. Each row spins up an independent run, and you can download a combined CSV/Markdown report when the job finishes.
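If you generate the input file programmatically, it only needs the two columns described in step 1. The column headers and URLs below are placeholders to be mapped onto the app's inputs in the Bulk Runs screen.

```python
# Example bulk-run input; column headers and URLs are placeholders.
import csv

rows = [
    {"URL": "https://example.com/services/", "GSC Property Name": "https://example.com/"},
    {"URL": "https://example.com/pricing/", "GSC Property Name": "https://example.com/"},
]

with open("bulk_run.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["URL", "GSC Property Name"])
    writer.writeheader()
    writer.writerows(rows)
```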

Why it matters

Refreshing existing content routinely pays bigger, faster dividends than publishing something brand new. By automating the grunt work—query extraction, semantic comparison, sentence rewrites—you free up hours per page while making sure every suggestion is grounded in real user searches. Early testers have reported quick CTR lifts simply by adding the missing phrases surfaced by this workflow.

Start Engineering your
Content Growth Engine