[Image: three interlocking gears representing IndexNow, Google API, and XML Sitemaps working together]

The Three-Protocol Indexing Stack

You can build the best content on the internet and still fail if search engines never find it. Most publishers treat indexing as a one-step process: submit a sitemap, wait, hope. That passivity costs real money. Every day a page sits unindexed is a day it earns nothing.

The problem compounds across a network. Sixteen sites means sixteen domains competing for crawler attention, sixteen pools of new URLs that search engines must discover, evaluate, and decide whether to render. Leaving this to chance is not a strategy. You need a stack.

This post covers the three-protocol indexing stack: IndexNow, Google URL Inspection API, and XML Sitemaps — all three running simultaneously, triggered automatically on every deploy. It is the single highest-impact piece of infrastructure you can add to a multi-site operation.

Protocol 1: IndexNow

IndexNow is the closest thing the web has to real-time indexing notification. You publish a page, POST its URL to the IndexNow API, and within minutes Bing, Yandex, Naver, Seznam, and Yep know it exists.

The scale of IndexNow adoption is worth understanding:

  • 10,000 URLs per single POST request with no daily submission limits
  • 5 billion daily submissions globally across all publishers
  • 22% of Bing's clicked URLs originate from IndexNow-submitted pages

That last number is the one that should change your behavior. Nearly a quarter of the URLs people actually click on Bing got there because someone proactively told Bing they existed. If you are not using IndexNow, you are leaving Bing traffic on the table.

How IndexNow Works

The protocol uses a shared key model. You generate a key (a UUID works), host a verification file containing that key at your domain root, and POST URLs to the IndexNow endpoint. Every participating search engine receives the notification through a shared hub — you submit once, all engines get it.

# Generate your key
KEY=$(uuidgen | tr '[:upper:]' '[:lower:]')

# Create the verification file in your build source directory
# (it must be served from the domain root, e.g. https://example.com/<key>.txt)
echo "$KEY" > "src/${KEY}.txt"

# Submit URLs
curl -X POST "https://api.indexnow.org/indexnow" \
  -H "Content-Type: application/json" \
  -d "{
    \"host\": \"example.com\",
    \"key\": \"${KEY}\",
    \"keyLocation\": \"https://example.com/${KEY}.txt\",
    \"urlList\": [
      \"https://example.com/new-article/\",
      \"https://example.com/updated-guide/\"
    ]
  }"

A 200 OK response means the URLs were submitted successfully. A 202 Accepted means the submission was received but the key has not been validated yet — common on the first submission with a new key. Either counts as success; 4xx responses (malformed payload, invalid key, URLs that do not match the host) do not.
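In a Node deploy script, the same call can be wrapped in a small helper that maps these status codes to readable outcomes. This is a sketch: it assumes Node 18+ for the global fetch, and example.com stands in for your host.

```javascript
// Build the JSON body the IndexNow endpoint expects.
function buildIndexNowPayload(host, key, urls) {
  return {
    host,
    key,
    keyLocation: `https://${host}/${key}.txt`,
    urlList: urls,
  };
}

// Map the documented IndexNow status codes to readable outcomes.
function interpretStatus(status) {
  if (status === 200) return 'submitted';              // URLs submitted successfully
  if (status === 202) return 'pending-key-validation'; // received; key check pending
  if (status === 400) return 'invalid-request';
  if (status === 403) return 'invalid-key';
  if (status === 422) return 'urls-do-not-match-host';
  if (status === 429) return 'rate-limited';
  return `unexpected-${status}`;
}

// POST a batch of URLs (up to 10,000 per call) to the shared endpoint.
async function submitToIndexNow(host, key, urls) {
  const res = await fetch('https://api.indexnow.org/indexnow', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildIndexNowPayload(host, key, urls)),
  });
  return interpretStatus(res.status);
}
```

Splitting payload construction from status interpretation keeps both halves testable without hitting the network.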

The Google Gap

There is one notable absence from the IndexNow participant list: Google. As of early 2026, Google has not adopted IndexNow. They have acknowledged it, they have experimented with it, but they have not committed. This is exactly why you need protocol two.

Protocol 2: Google URL Inspection API

Google's push-indexing endpoint is the programmatic version of the "Request Indexing" button in Google Search Console. Two distinct APIs get lumped under this label: the Search Console URL Inspection API, which reports a URL's index status, and the Indexing API (used in the code below), which actively notifies Google that a URL was added or updated. Google documents the Indexing API for job posting and livestream pages only, so treat results on other content types as best-effort. Either way, instead of clicking a button manually for every new page, you call an API endpoint.

This matters because Google's organic discovery of new pages on new domains is slow. A fresh domain with no backlinks and no crawl history might wait days or weeks for Googlebot to discover a new URL through sitemap crawling alone. The URL Inspection API cuts that timeline dramatically.

How It Works

The API requires a Google Cloud project with Search Console API enabled and a service account with access to your Search Console properties. The call itself is straightforward:

const { google } = require('googleapis');

async function requestIndexing(siteUrl, pageUrl) {
  const auth = new google.auth.GoogleAuth({
    keyFile: 'service-account.json',
    scopes: ['https://www.googleapis.com/auth/indexing']
  });

  const indexing = google.indexing({ version: 'v3', auth });

  const result = await indexing.urlNotifications.publish({
    requestBody: {
      url: pageUrl,
      type: 'URL_UPDATED'
    }
  });

  return result.data;
}

Rate Limits and Quotas

Google imposes limits. The Indexing API's documented default quota is 200 publish requests per day per project, and the separate URL Inspection API allows 2,000 inspections per day per property. For a satellite site publishing four to eight new pages per day, either ceiling is more than sufficient.

The key insight is that you do not need to submit every URL every day. You submit new and recently updated URLs only. A diff-based approach — comparing your current sitemap against yesterday's snapshot — identifies exactly which URLs need submission.
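That diff can be sketched in a few lines of Node. The regex assumes well-formed sitemaps with one URL per <loc> element; a production version would also compare lastmod values so updated pages get resubmitted, not just new ones.

```javascript
// Pull every <loc> value out of a sitemap document.
function extractUrls(sitemapXml) {
  return [...sitemapXml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => m[1].trim());
}

// Return the URLs present in the current sitemap but missing from
// yesterday's snapshot — exactly the set that needs submission.
function diffSitemaps(snapshotXml, currentXml) {
  const known = new Set(extractUrls(snapshotXml));
  return extractUrls(currentXml).filter((url) => !known.has(url));
}
```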

Protocol 3: XML Sitemaps

The oldest protocol in the stack, and the one most people stop at. XML sitemaps are passive — you publish a file, search engines crawl it when they feel like it. But sitemaps serve a critical function the other two protocols do not: they provide a complete inventory.

IndexNow tells Bing about new URLs. Google's API tells Google about new URLs. But neither provides the full picture. An XML sitemap tells every search engine about every URL on your site and its last modification date. (The spec also defines changefreq and priority, but Google has said it ignores both; an accurate lastmod is the field that actually matters.)

Why Sitemaps Still Matter

When Googlebot crawls your sitemap, it cross-references the URLs against its index. Pages it has not seen get queued for crawling. Pages with a lastmod date newer than its last crawl get re-evaluated. This passive discovery catches anything the active protocols might have missed.

For a network of sixteen sites, you should generate sitemaps automatically during every build:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/new-article/</loc>
    <lastmod>2026-04-07</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
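If your static site generator does not emit this already, a build-time sitemap renderer is only a few lines. This is a sketch; the shape of the page records (loc, lastmod, priority) is an assumption about your build system, and XML-escaping of special characters in URLs is omitted for brevity.

```javascript
// Render a minimal sitemap from a list of page records.
// Each record is assumed to look like { loc, lastmod, priority }.
// Note: URLs containing &, <, or > would need XML-escaping first.
function renderSitemap(pages) {
  const entries = pages
    .map(
      (p) =>
        `  <url>\n    <loc>${p.loc}</loc>\n    <lastmod>${p.lastmod}</lastmod>\n    <priority>${p.priority}</priority>\n  </url>`
    )
    .join('\n');
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    entries +
    '\n</urlset>\n'
  );
}
```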

One caveat: the old sitemap ping endpoints are gone. Google deprecated its ping URL in 2023 (it now returns a 404), and Bing steers publishers toward IndexNow instead. Register the sitemap once in Google Search Console and Bing Webmaster Tools, and advertise it in robots.txt so every crawler can find it:

# robots.txt
Sitemap: https://example.com/sitemap.xml

The Post-Deploy Automation Hook

Running all three protocols manually defeats the purpose. The stack should fire automatically on every deploy. Here is the flow:

  1. Build completes and generates the new sitemap
  2. Diff script compares the new sitemap against the previously deployed one
  3. New/changed URLs are identified
  4. Three parallel submissions fire:
    • IndexNow POST with the changed URL list
    • Google URL Inspection API call for each new URL (respecting rate limits)
    • Updated sitemap goes live, to be picked up on Google's and Bing's next sitemap crawl (registered once in Search Console and Bing Webmaster Tools)
  5. Current sitemap snapshot is saved for the next diff

In a CI/CD pipeline (GitHub Actions, for example), this is a post-deploy step that runs after the hosting provider confirms the deployment succeeded. The script takes 5-15 seconds for a typical batch of new URLs.

Why Running All Three Matters

Each protocol covers a gap the others leave:

Protocol              | Engines Covered                  | Speed   | Completeness
----------------------|----------------------------------|---------|---------------
IndexNow              | Bing, Yandex, Naver, Seznam, Yep | Minutes | New URLs only
Google URL Inspection | Google                           | Hours   | New URLs only
XML Sitemaps          | All                              | Days    | Full inventory

IndexNow is fast but does not reach Google. Google's API is Google-specific. Sitemaps are universal but slow. Running all three means every search engine learns about your new content through at least two channels, and usually three.

The redundancy is intentional. Search engine infrastructure is not perfectly reliable. An IndexNow submission might not propagate due to a temporary backend issue. A Google API call might hit a transient error. The sitemap provides the safety net — eventually, every URL gets discovered.

Network-Level Indexing Stats

After running the three-protocol stack across a sixteen-site network for three months, the indexing metrics speak for themselves. Pages on sites using the full stack typically appear in Bing's index within hours and in Google's index within one to three days. Sites relying solely on sitemaps often see delays of one to three weeks for new pages on fresh domains.

For new domains — which have no crawl history and minimal backlinks — the difference is particularly stark. The "Discovered — currently not indexed" status in Google Search Console clears significantly faster when you are actively pushing URLs through the inspection API.

Getting the Full Implementation

This post covers the concepts and the core code patterns. The complete implementation — including the diff-based URL detection script, the CI/CD integration for GitHub Actions, the error handling and retry logic, network-wide IndexNow key management, and the monitoring dashboard that tracks submission success rates across all sixteen sites — is covered in The $100 Network by J.A. Watte. Chapter 17 walks through the three-protocol stack end to end, and Chapter 33 covers the automated build hook implementation.

Stop waiting for search engines to find you. Tell them.


The SEO foundations for these techniques are covered in The $20 Agency, Chapters 3-5. This article builds on those basics with advanced multi-site strategies from The $100 Network.
