
Crawl Depth


[Figure: Crawl depth's impact on discovery. Homepage at depth 0, category pages at depth 1, pages at depths 2 and 3+, with crawl priority falling from high to medium to low.]

The click distance tax

Crawl depth is how many clicks it takes to reach a page from your homepage. The homepage is depth 0. Pages linked directly from it are depth 1. Pages linked from those are depth 2. And so on. Google's crawler follows internal links like a user would, but it doesn't have infinite patience. Pages buried 5+ clicks deep get crawled less frequently and indexed more slowly, and they rank worse even if the content is excellent.
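Concretely, crawl depth is the shortest click path from the homepage over your internal link graph, which you can compute with a breadth-first search. A minimal sketch (the link graph below is a made-up example, not real data):

```python
from collections import deque

def crawl_depths(link_graph, homepage):
    """Breadth-first search: depth = shortest click path from the homepage."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for linked_page in link_graph.get(page, []):
            if linked_page not in depths:  # first discovery = shortest path
                depths[linked_page] = depths[page] + 1
                queue.append(linked_page)
    return depths

# Hypothetical site where products sit behind a category -> subcategory chain
links = {
    "/": ["/category", "/blog"],
    "/category": ["/category/widgets"],
    "/category/widgets": ["/product-a"],
    "/blog": ["/blog/post-1"],
}
print(crawl_depths(links, "/"))
# {'/': 0, '/category': 1, '/blog': 1, '/category/widgets': 2, '/blog/post-1': 2, '/product-a': 3}
```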

Why? Because crawl depth is a proxy for importance. If you bury a page six layers into your site, Google assumes it's not critical. You wouldn't hide your best content behind a maze of navigation. Shallow pages signal priority. Deep pages signal "archive material, low value."

The practical cutoff

Google doesn't publish official limits, but SEO data consistently shows ranking drop-offs after depth 3. Pages at depth 0-2 have strong crawl frequency and ranking potential. Depth 3 is marginal. Depth 4+ is the danger zone. Your page might still get indexed eventually, but it'll take weeks or months, and it'll struggle to rank even with good content and backlinks.

For sites under 1,000 pages, most content should be within 3 clicks of the homepage. For large sites, aim to keep high-priority pages (top products, pillar content, conversion pages) within 2 clicks. Everything else can go deeper, but expect diminishing SEO returns.
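One rough way to operationalize those cutoffs is to flag priority pages that sit deeper than their target depth. A sketch building on the depth map above (the thresholds and the priority list are illustrative assumptions, not fixed rules):

```python
PRIORITY_MAX_DEPTH = 2  # assumed target for top products, pillar content, conversion pages
GENERAL_MAX_DEPTH = 3   # assumed target for everything else on a small-to-mid site

def depth_violations(depths, priority_pages):
    """Return (url, depth, limit) for pages deeper than their target depth."""
    flagged = []
    for url, depth in depths.items():
        limit = PRIORITY_MAX_DEPTH if url in priority_pages else GENERAL_MAX_DEPTH
        if depth > limit:
            flagged.append((url, depth, limit))
    return flagged

# e.g. depth_violations(crawl_depths(links, "/"), {"/product-a"})
# -> [('/product-a', 3, 2)]
```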

Why depth explodes

It's usually architecture laziness. You organize your site hierarchically: homepage → category → subcategory → product. Logical, right? But if you've got multiple levels of categories, products end up 5-6 clicks deep. E-commerce sites with deep taxonomies are chronic offenders.

Blogs do it with date-based archives. Homepage → Year → Month → Day → Post. Four clicks to reach any individual article. Meanwhile, your best-performing content from two years ago is invisible to crawlers because it's trapped in /2023/04/12/.

Faceted navigation adds phantom depth. The visible menu structure might be shallow, but filter combinations create URL chains that push effective depth way higher than the UI suggests.

Flattening your architecture

The fix is internal linking that shortcuts the hierarchy. Don't rely solely on vertical category trees. Add horizontal connections.

Homepage links to key pages. Featured products, top blog posts, pillar content. Put them in the footer, sidebar, or hero carousels. Link directly from depth 0 to whatever matters most, bypassing the hierarchy entirely.

Breadcrumbs with multiple paths. Instead of showing one linear path (Home > Cat > Subcat > Product), link to related categories or sibling pages. Gives crawlers alternate routes.

Related content modules. "You might also like" blocks, recommended reading, cross-category product links. These create lateral connections that flatten effective depth by giving crawlers multiple entry points to the same page.

XML sitemaps. Not a substitute for good architecture, but they help. Pages listed in your sitemap get discovered faster even if they're deep in the tree. Google treats sitemap inclusion as a weak priority signal.
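Generating a basic sitemap is simple enough to script. A minimal sketch using Python's standard library (the URLs and lastmod dates are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(entries, path="sitemap.xml"):
    """Write a minimal XML sitemap from (url, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in entries:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
        ET.SubElement(url_el, "lastmod").text = lastmod
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Placeholder entries: deep pages you still want discovered quickly
build_sitemap([
    ("https://example.com/product-a", "2024-01-15"),
    ("https://example.com/blog/post-1", "2024-02-03"),
])
```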

The crawl budget overlap

Crawl depth and crawl budget are related but distinct. Depth is about discoverability. Budget is about resources. Deep pages consume more crawl budget because Google has to follow more links to find them. If you've got 50,000 pages at depth 4, Google might never reach them all within your daily budget. Flattening the architecture makes crawling more efficient, which stretches your budget further.

Monitoring depth in practice

Use Screaming Frog, Sitebulb, or similar crawlers to map your site and calculate depth metrics. Filter by depth and look at page types. Are important pages (products, articles, landing pages) clustered at depth 3+? That's your problem.
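If your crawler exports the internal URL list as CSV, the depth distribution per page type is a short script away. A sketch assuming a Screaming Frog-style export with "Address" and "Crawl Depth" columns (adjust the column names and the page-type rules to your own tool and URL patterns):

```python
import csv
from collections import Counter

def depth_by_page_type(export_path):
    """Count URLs per (page type, depth) bucket from a crawler's CSV export."""
    buckets = Counter()
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if not row.get("Crawl Depth"):   # skip rows with no depth value
                continue
            depth = int(row["Crawl Depth"])  # assumed column name
            url = row["Address"]             # assumed column name
            # Crude page-type guess from the URL path; adapt to your site
            page_type = "product" if "/product" in url else "blog" if "/blog" in url else "other"
            buckets[(page_type, depth)] += 1
    return buckets

# for (ptype, depth), count in sorted(depth_by_page_type("internal_all.csv").items()):
#     print(f"{ptype:8} depth {depth}: {count}")
```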

Check which deep pages actually get traffic in Google Analytics. If valuable pages are buried but still performing, it means users are finding them via search despite depth. That's a hint to promote them higher in your structure.

Look at Search Console's "Discovered – currently not indexed" report. High counts often correlate with excessive depth. Google found the pages via internal links but didn't bother indexing them because they looked low-priority based on depth and a lack of external signals.

When depth doesn't matter

If a page has strong backlinks, it can rank well even at depth 5. External links bypass your internal architecture. Google discovers the page via someone else's link, sees the authority signals, and indexes it regardless of depth. But that's the exception, not the rule. Most pages rely on internal links for discovery.

Also, some pages should be deep. Legal disclaimers, old blog archives, niche product variants with no search demand. Depth is a tool. Use it intentionally to signal priority, not accidentally because you didn't think about navigation.

[Figure: Crawl frequency by depth. Depth 1 roughly daily, depth 2 weekly, depth 3 monthly, depth 4+ rarely. Deeper pages get crawled exponentially less often.]