
How to Fix “Discovered – currently not indexed” in Google Search Console

If Google Search Console reports pages as “Discovered – currently not indexed,” this guide walks you through a step-by-step process to diagnose the cause and apply practical fixes. You’ll learn what triggers the status, how to resolve the most common causes (crawl budget, robots rules, canonicalization, content quality, and server errors), and which actions prompt Google to crawl and index your pages faster.


Step 1: Request Indexing

Indexing request workflow




  • Verify the page is crawlable

    • Ensure there is no noindex header/meta, no robots.txt block, and no canonical pointing elsewhere.

    • Confirm the server returns 200 OK for the page.
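
    If you prefer to script these checks, here is a minimal sketch in Python (the URL and robots.txt location are placeholders; the requests library must be installed). It approximates, but does not replace, what the URL Inspection live test reports:

      import re
      import requests
      from urllib import robotparser

      URL = "https://example.com/page"  # placeholder; use the affected URL

      # robots.txt: the path must not be disallowed for Googlebot
      rp = robotparser.RobotFileParser("https://example.com/robots.txt")
      rp.read()
      print("robots.txt allows Googlebot:", rp.can_fetch("Googlebot", URL))

      # The server should answer 200 OK, not a redirect or an error
      resp = requests.get(URL, allow_redirects=False, timeout=10)
      print("Status:", resp.status_code)

      # The X-Robots-Tag response header must not contain "noindex"
      print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "(not set)"))

      # Print any robots meta tag and canonical link for manual review
      for pattern in (r'<meta[^>]+name=["\']robots["\'][^>]*>',
                      r'<link[^>]+rel=["\']canonical["\'][^>]*>'):
          match = re.search(pattern, resp.text, re.IGNORECASE)
          print(match.group(0) if match else "(tag not found)")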




  • Use URL Inspection (Google Search Console)

    • Paste the exact URL into URL Inspection.

    • Click Test live URL to confirm Google can access the current version of the page.

    • If the live test passes, click Request Indexing.

    • If it fails, fix the reported issue (robots, redirect, server error) and retest.




  • Check Coverage and canonical status

    • After requesting indexing, re-inspect the URL and confirm its status moves to “Indexing requested” and then Indexed, rather than remaining “Discovered – currently not indexed.”

    • If Search Console shows a different canonical, fix canonical tags or internal links to point to the preferred URL.




  • Use sitemaps for bulk or recurring signals

    • Add the URL to an up-to-date XML sitemap and submit or refresh the sitemap in Search Console.

    • Sitemaps help Google discover and prioritize pages at scale.
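
    For reference, a minimal sitemap entry looks like this (the URL and date are placeholders); keeping <lastmod> accurate helps Google prioritize recently updated pages:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>https://example.com/page</loc>
          <lastmod>2024-01-15</lastmod>
        </url>
      </urlset>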




  • Boost discoverability naturally

    • Add or improve internal links from indexed, high-authority pages.

    • Share the URL on social profiles and authoritative sites to create discovery signals.

    • Ensure the page is linked from your homepage or main navigation if it’s important.




  • Avoid overusing Request Indexing

    • Request indexing after you’ve fixed issues or published meaningful updates.

    • Frequent unnecessary requests won’t speed up indexing and can hit practical limits.




  • Monitor progress

    • Reinspect the URL after a few hours or days.

    • Check the Coverage and Core Web Vitals reports for follow-up issues (the standalone Mobile Usability report has been retired).




  • Optional: API and special cases

    • The Indexing API is limited to certain content types (e.g., job postings, livestreams). For regular pages, rely on URL Inspection and sitemaps.



Step 2: Improve Internal Link Structure



Why it matters



  • Internal links signal page importance and help Google find and crawl pages faster.

  • Pages with few or no internal links often remain “discovered” but unindexed.



Concrete actions



Find orphan or weakly linked pages



  • Run a crawl with tools such as Screaming Frog, Sitebulb, or Ahrefs Site Audit to list pages with zero or very few internal incoming links.

  • Prioritize pages in GSC labeled “Discovered – currently not indexed.”
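
  If you want a quick first pass before running a full crawler, you can compare the URLs in your sitemap against the URLs your hub pages actually link to. A simplified Python sketch (the sitemap URL and hub pages are placeholders; a dedicated crawler follows links recursively and is far more thorough):

    import re
    import requests

    SITEMAP = "https://example.com/sitemap.xml"  # placeholder
    HUB_PAGES = ["https://example.com/", "https://example.com/blog/"]  # placeholders

    # URLs you are telling Google about
    sitemap_urls = set(re.findall(r"<loc>(.*?)</loc>",
                                  requests.get(SITEMAP, timeout=10).text))

    # Absolute URLs actually linked from your main hub pages
    # (relative hrefs would need urllib.parse.urljoin() resolution first)
    linked = set()
    for page in HUB_PAGES:
        html = requests.get(page, timeout=10).text
        linked.update(re.findall(r'href=["\'](https?://example\.com[^"\']*)', html))

    # Orphan candidates: in the sitemap, but not linked from any hub page
    for url in sorted(sitemap_urls - linked):
        print(url)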



Add contextual, relevant links from high-authority pages



  • Link from your site’s top-performing pages (home, category hubs, popular posts) to target pages.

  • Use descriptive, keyword-relevant anchor text that reflects the target page’s topic.



Create or strengthen hub pages



  • Build or optimize category and pillar pages that logically group and link to related content.

  • Ensure hub pages are linked in navigation, breadcrumbs, or prominent sections so crawlers follow them easily.



Reduce click depth



  • Keep important pages within 2–3 clicks of the homepage or main hub pages.

  • Move or create links so pages aren’t buried deep in the structure.



Use navigation, breadcrumbs, and footer links sparingly and purposefully



  • Add links where they help users and bots discover content (breadcrumbs for hierarchy, footer for key resources).

  • Do not overload the footer or navigation with every page—prioritize important pages.



Ensure links are crawlable



  • Use standard HTML <a href> links; navigation built only with JavaScript click handlers may not be followed by Googlebot.

  • Avoid rel="nofollow" on internal links to pages you want crawled and indexed.
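
For example (hypothetical paths), only the first link below is reliably crawlable:

  <!-- Crawlable: a standard anchor with an href Googlebot can follow -->
  <a href="/guides/indexing/">Indexing guide</a>

  <!-- Not reliably crawlable: no href; navigation happens only in JavaScript -->
  <span onclick="location.href='/guides/indexing/'">Indexing guide</span>

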
Consolidate internal linking for similar content



  • Replace many low-value pages linking to the same target with fewer, stronger links from relevant pages.

  • Merge or canonicalize thin or duplicate pages rather than scattering internal links across many low-value URLs.



Use internal link volume wisely



  • One strong contextual link from a relevant, authoritative page is better than many shallow links.

  • Keep link placement visible (in-body) rather than buried in comments or widgets.



Update your XML sitemap and submit it in GSC after major structural changes



  • Ensure newly linked pages are in your sitemap and request indexing for prioritized URLs.



Monitor impact



  • Track changes in GSC (Index Coverage, URL Inspection, Crawl Stats) and in analytics (internal referral paths).

  • Re-audit internal links periodically, especially after adding new content.



Quick checklist



  • Identify orphan or weakly linked pages

  • Add contextual links from high-authority pages

  • Create or optimize hub pages and breadcrumbs

  • Reduce click depth to 3 or fewer

  • Use crawlable HTML links; avoid rel="nofollow"

  • Update the sitemap and request indexing

  • Monitor GSC for indexing changes



Step 3: Complete Troubleshooting Reference



  1. Meaning of “Discovered – currently not indexed”



    • Google found the URL (via a link or sitemap) but has not crawled or indexed it yet.

    • Common causes include crawl budget limits, low priority, blocked resources, soft 404s, duplicate or thin content, temporary server errors, or indexing queue delays.




  2. Immediate troubleshooting checklist (quick run)



    • URL Inspection (Search Console)

      • Inspect the URL, check crawlability, run a live test, and review any detected errors.

      • If the live test is successful, click “Request Indexing” (limited use; not guaranteed).



    • Check robots and meta tags

      • robots.txt: ensure the path is not disallowed.

      • Meta robots: ensure noindex is not present.

      • X-Robots-Tag header: ensure it is not set to noindex in the server response.
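
      In practice, these three signals look like this (paths are placeholders); each must be absent, or set to allow indexing, for the page to be eligible:

        # robots.txt – a rule like this would block crawling of the page:
        User-agent: *
        Disallow: /private/

        <!-- robots meta tag in the page <head> – must not be present: -->
        <meta name="robots" content="noindex">

        HTTP response header – must not be sent for pages you want indexed:
        X-Robots-Tag: noindex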



    • Confirm HTTP status and redirects

      • Live test should show 200 OK for the canonical URL; avoid 3xx chains or 4xx/5xx.

      • Ensure the canonical tag points to this URL or to the intended canonical.
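
      A short Python sketch (placeholder URL; requests installed) that surfaces any redirect chain and the final status:

        import requests

        resp = requests.get("https://example.com/page", timeout=10)
        for hop in resp.history:           # each 3xx hop, in order
            print(hop.status_code, hop.url)
        print(resp.status_code, resp.url)  # final answer; should be 200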



    • Review sitemap

      • Include the URL in the XML sitemap submitted in Search Console.

      • Ensure the sitemap is accessible (200) and up to date.

      • Remove low-value or duplicate URLs from the sitemap.



    • Check content quality and uniqueness

      • Provide substantial, original content that adds value.

      • Avoid thin pages, doorway pages, or near duplicates.



    • Improve internal linking

      • Add contextual internal links from high-traffic, regularly crawled pages.

      • Use a clear site architecture that surfaces the page within 2–3 clicks from the homepage.



    • Ensure mobile readiness and renderability

      • The page must be mobile-friendly and renderable without blocked JS/CSS.

      • Use URL Inspection’s “View crawled page” and “Screenshot” to confirm.



    • Server performance and availability

      • Check server logs for crawl errors, long response times, and frequent 5xx.

      • Fix slow responses or intermittent downtime that deter crawling.
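
      To quantify how your server treats Googlebot, a rough pass over a combined-format access log (the log path is a placeholder; adjust the parsing to your log format):

        from collections import Counter

        statuses = Counter()
        with open("/var/log/nginx/access.log") as log:  # placeholder path
            for line in log:
                if "Googlebot" not in line:
                    continue
                parts = line.split('"')
                if len(parts) < 3:
                    continue
                # in combined log format, the status follows the quoted request
                statuses[parts[2].split()[0]] += 1

        print(statuses)  # frequent 5xx or 429 responses here deter crawling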



    • Remove duplicate or parameter issues

      • Use canonical URLs for parameter variants; Search Console’s legacy URL Parameters tool has been retired.

      • Implement rel=canonical to the preferred version.
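
      For example, every parameter or duplicate variant should carry a canonical tag pointing at the one preferred URL (placeholder shown):

        <!-- on https://example.com/page/?utm_source=newsletter -->
        <link rel="canonical" href="https://example.com/page/">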



    • Structured data and noindex signals

      • Ensure structured data does not contradict your indexing intent.

      • Remove any accidental meta tags or headers that instruct noindex.






  3. Advanced actions (if quick fixes do not work)



    • Submit a clean sitemap with only indexable URLs, then add or resubmit it in Search Console.

    • Fix soft 404s and thin content: merge pages or add unique content, then request indexing.

    • Increase crawl frequency: publish fresh internal content and improve internal linking; high-quality backlinks help.

    • Use hreflang correctly (if applicable) to avoid duplicate treatment across locales; see the example after this list.

    • If sitewide issues exist (heavy JS rendering, blocked resources), fix them, then test live and request indexing for representative URLs or resubmit the sitemap.
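
    A minimal hreflang set, assuming hypothetical English and German variants; every page in the set should list all variants, including itself:

      <link rel="alternate" hreflang="en" href="https://example.com/en/page/">
      <link rel="alternate" hreflang="de" href="https://example.com/de/page/">
      <link rel="alternate" hreflang="x-default" href="https://example.com/en/page/">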




  4. Crawl budget and queuing



    • For very large sites, prioritize important pages via internal linking and dedicated sitemaps; Google largely ignores the sitemap <priority> attribute, so low-value pages may simply wait in the queue longer.

    • Avoid mass “Request Indexing” actions; focus on high-priority batches after fixes.




  5. Monitoring and verification



    • After fixes, re-inspect URLs and request indexing; allow 24–72 hours for changes (can be longer).

    • Monitor Coverage > Excluded and Indexing reports in Search Console; track server logs and crawl stats.

    • Use site-specific searches sparingly to verify indexing; rely on Search Console as the primary source.




  6. Common mistakes to avoid



    • Leaving pages in the sitemap that are blocked or noindexed.

    • Requesting indexing before fixing rendering or server issues.

    • Relying solely on Request Indexing for many pages without addressing root causes.

    • Creating thin or duplicate pages and expecting fast indexing.




  7. When to escalate



    • If “Discovered – currently not indexed” persists for more than two weeks on important pages after your fixes: check server logs for Googlebot activity, run render tests, make sure your firewall or CDN is not blocking or rate-limiting Googlebot (see the verification sketch below), and ask for help in the Google Search Central Community.
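
    Before changing firewall or CDN rules, confirm that log entries claiming to be Googlebot really are Google. A Python sketch of the standard reverse-plus-forward DNS check (the IP is a placeholder; take real ones from your logs):

      import socket

      def is_real_googlebot(ip):
          # Reverse DNS must resolve to googlebot.com or google.com ...
          try:
              host = socket.gethostbyaddr(ip)[0]
          except OSError:
              return False
          if not host.endswith((".googlebot.com", ".google.com")):
              return False
          # ... and the forward lookup must map back to the same IP
          return ip in {info[4][0] for info in socket.getaddrinfo(host, None)}

      print(is_real_googlebot("66.249.66.1"))  # placeholder IP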




  8. Quick action plan (priority order)



    1. Inspect the URL in Search Console and run a Live Test.

    2. Fix robots, meta, X-Robots-Tag, and HTTP status issues.

    3. Ensure the page has unique, substantial content.

    4. Add internal links and include the page in the sitemap.

    5. Request Indexing and monitor for 24–72 hours.

    6. If unchanged, check server logs, rendering, and crawl budget; iterate.




  9. Tools to use



    • Google Search Console (URL Inspection, Coverage, Sitemaps)

    • Live Test in URL Inspection

    • robots.txt report in Search Console (the legacy robots.txt Tester has been retired)

    • Server logs and analytics

    • Lighthouse and browser developer tools for rendering and mobile checks (the standalone Mobile-Friendly Test has been retired)

    • Screaming Frog or Sitebulb for site crawling and duplicate detection




  10. Suggested meta checklist for your landing page



    • URL inspected in Search Console: [ ]

    • robots.txt allows URL: [ ]

    • No noindex meta or X-Robots-Tag: [ ]

    • 200 OK and correct canonical: [ ]

    • Included in XML sitemap: [ ]

    • Rich, unique content present: [ ]

    • Internal links from high-authority pages: [ ]

    • Page renders without blocked resources: [ ]

    • Server stable, under 500 ms typical response: [ ]

    • Request indexing after fixes: [ ]




  11. Expected timeline



    • Minor fixes: indexing often within 24–72 hours after Request Indexing.

    • Larger sites or crawl budget issues: days to weeks after structural fixes.



