Informational · 900 words · 12 prompts ready · Updated 17 Apr 2026

Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)

Informational article in the How to Consolidate Duplicate Content During a Migration topical map — Migration Implementation & QA content group. 12 copy-paste AI prompts for ChatGPT, Claude & Gemini covering SEO outline, body writing, meta tags, internal links, and Twitter/X & LinkedIn posts.

Overview

The Pre-launch Indexability Checklist confirms that robots.txt, meta noindex, and X-Robots-Tag settings do not prevent intended pages from being crawled or indexed, and verifies robots.txt rules against the Robots Exclusion Protocol (REP, originally published in 1994 and standardized as RFC 9309 in 2022). The checklist inspects three layers (robots.txt at the site root, the HTTP X-Robots-Tag header for non-HTML resources, and the HTML meta robots tag) for correct directives, response codes, and canonical behavior. Typical verifications include a live curl -I header check, Google Search Console URL Inspection for rendering and indexability status, and a crawl simulation to ensure consolidated URLs are reachable before launch. The objective is to avoid accidentally suppressing migrated or consolidated content.

Mechanically, indexability control works because robots.txt implements the Robots Exclusion Protocol by telling bots which paths to avoid, while meta robots tags and the X-Robots-Tag header deliver indexation directives at the moment a resource is fetched. Tools such as Screaming Frog and Google Search Console reveal crawlability and meta/header signals, and HTTP tooling such as curl or wget validates server-side X-Robots-Tag responses. For migration-focused QA, the crawlability checklist should include fetching /robots.txt at the site root to confirm directives, and testing canonical chains, redirects, and status codes so that consolidated pages are not blocked. The duplicate-content migration workflow benefits from staging checks that mirror production hostnames and from recording differences between live headers and the pre-launch robots configuration. Checks should also cover user-agent-specific rules and basic sitemap alignment.
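A minimal sketch of checks for the three layers described above; example.com, the sample file contents, and the file names are placeholder assumptions, and the commented curl commands show the live equivalents you would run against staging or production:

```shell
# Layer 1: robots.txt directives (live check: curl -s https://example.com/robots.txt)
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /staging/
Sitemap: https://example.com/sitemap.xml
EOF
grep -E '^(User-agent|Disallow|Sitemap)' robots.txt

# Layer 2: HTML meta robots (live check: curl -s https://example.com/ | grep -i 'name="robots"')
page='<meta name="robots" content="index, follow">'
echo "$page" | grep -o 'content="[^"]*"'

# Layer 3: X-Robots-Tag header (live check: curl -sI https://example.com/file.pdf)
cat > headers.txt <<'EOF'
HTTP/2 200
content-type: application/pdf
x-robots-tag: noindex
EOF
grep -i '^x-robots-tag' headers.txt
```

Each grep surfaces the directive that layer is responsible for, which is exactly the evidence a pre-launch audit should record.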

An important nuance is that blocking with robots.txt is not equivalent to deindexing: search engines may index a URL without crawling it if external links point to it, so relying solely on robots.txt can leave obsolete or duplicate URLs visible. Equally problematic is combining a robots.txt disallow with meta noindex on the same URL, because the disallow prevents the crawler from fetching the page and thus prevents discovery of the noindex directive. For non-HTML assets such as PDFs, images, and attachments, the X-Robots-Tag header is required to control indexation; forgotten X-Robots-Tag rules commonly cause legacy files to remain indexed after a relaunch. In an ecommerce category consolidation this can surface as product pages indexed under old faceted URLs despite canonical and redirect plans. A pre-launch audit should document header evidence for these assets.
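The disallow-plus-noindex conflict can be caught mechanically before launch. A minimal sketch, using a hypothetical path and a locally simulated meta tag standing in for a fetched page:

```shell
# robots.txt blocks the old section outright
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /old-category/
EOF

path="/old-category/duplicate-page"
meta='<meta name="robots" content="noindex">'   # what the page itself serves

# Disallow rules match by prefix; if the path is blocked, the crawler never
# fetches the page and therefore never sees the noindex directive
rule=$(grep '^Disallow:' robots.txt | awk '{print $2}')
case "$path" in
  "$rule"*)
    echo "$meta" | grep -q 'noindex' && \
      echo "CONFLICT: $path is disallowed, so its noindex is never discovered"
    ;;
esac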

The practical takeaway is to verify three control points before launch: accessible /robots.txt directives; visible meta robots or X-Robots-Tag responses for pages and non-HTML assets; and live-crawl confirmation that canonical and redirect targets are fetchable and return the intended status codes. Execution can combine automated runs in Screaming Frog or other site crawlers, scripted curl -I header checks for batches of URLs, and Google Search Console index status reports to reconcile discrepancies. Recording the pre-launch state and re-checking immediately after the switchover minimizes regressions in site migration indexability. This page contains a structured, step-by-step framework.
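The scripted batch check could look like the sketch below. It reads pre-captured header files so it runs offline; in practice each file would be produced with curl -sI against a priority URL, and the file names shown are assumptions:

```shell
# Capture step (live, one file per priority URL):
#   curl -sI "https://example.com/page" > page.headers
cat > sample.headers <<'EOF'
HTTP/2 200
content-type: text/html
x-robots-tag: noindex
EOF

# Verification step: status code plus any indexation-suppressing header
for f in *.headers; do
  status=$(head -n 1 "$f" | awk '{print $2}')
  tag=$(grep -i '^x-robots-tag' "$f" | cut -d' ' -f2-)
  printf '%s status=%s x-robots-tag=%s\n' "$f" "$status" "${tag:-none}"
done
```

Any URL reporting a non-200 status or an unexpected noindex goes straight onto the fix list before the switchover.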

How to use this prompt kit:
  1. Work through prompts in order — each builds on the last.
  2. Click any prompt card to expand it, then click Copy Prompt.
  3. Paste into Claude, ChatGPT, or any AI chat. No editing needed.
  4. For prompts marked "paste prior output", paste the AI response from the previous step first.
Article Brief

Primary keyword: prelaunch indexability checklist

Title: Pre-launch Indexability Checklist

Tone: authoritative, practical, evidence-based

Content group: Migration Implementation & QA

Audience: Technical SEOs, content & migration managers, and developers preparing a site migration or major relaunch; intermediate to advanced knowledge; goal is to prevent indexability issues before launch

Angle: A concise, prescriptive pre-launch checklist focused specifically on Robots, meta noindex, and X-Robots-Tag issues within the broader duplicate-content migration workflow, with actionable verifications, tooling commands, and edge-case rules for e-commerce and multilingual sites

Target keywords:
  • robots.txt
  • noindex
  • X-Robots-Tag
  • crawlability checklist
  • pre-launch SEO checklist
  • indexation tags
  • site migration indexability
Planning Phase

1. Article Outline

Full structural blueprint with H2/H3 headings and per-section notes

Setup: You are writing an SEO-optimized, ready-to-write outline for the article titled 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. This article sits in the topical map 'How to Consolidate Duplicate Content During a Migration' and the intent is informational. Produce a full blueprint the writer will use to draft a 900-word article. Include H1, all H2s, H3 subheadings, precise word targets per section, and specific notes on what each section must cover (technical checks, commands, examples, edge cases: e-commerce and multilingual). The outline must prioritize clarity for developers and SEOs, show where to include code snippets and tool outputs, and signal where to link to the pillar article 'How to Audit for Duplicate Content Before a Migration'. Do not write the article body—only the structured outline. Output format: Return a ready-to-write outline listing H1, H2, H3, word targets and per-section notes in bullet form. Keep it actionable and organized for direct handoff to a writer.

2. Research Brief

Key entities, stats, studies, and angles to weave in

Setup: Provide a research brief that the writer must follow when drafting 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. This article is informational and must reference authoritative sources, tooling, and recent statistics. List 8-12 specific entities, studies, statistics, tools, or expert names and a one-line note explaining why each should be woven into the article. Include trending angles like crawl budget optimization, migration-related indexation loss, and Googlebot header handling changes. Make sure to include up-to-date tools (site crawlers, curl examples, Search Console reports) and name Google documentation where appropriate. Output format: Return an ordered list with each item as 'Entity/Tool/Study — one-line justification'.
Writing Phase

3. Introduction Section

Hook + context-setting opening (300-500 words) that scores low bounce

Setup: Write the introduction for 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. The article topic is pre-launch indexability checks for migrations and relaunches; intent is informational for technical SEOs and content teams. Start with a single-sentence hook that grabs a technical audience (highlight risk and value). Then provide a concise context paragraph explaining why robots.txt, meta noindex, and X-Robots-Tag are critical during migrations and duplicate-content consolidation. Include a clear thesis sentence: what this checklist will accomplish for the reader. Finish by telling the reader what they will learn and a one-sentence transition into the checklist. Length requirement: 300-500 words; engaging, low-bounce, and pragmatic; avoid marketing language. Output format: Return only the introduction text, ready to publish.

4. Body Sections (Full Draft)

All H2 body sections written in full — paste the outline from Step 1 first

Setup: You will convert a provided outline into the full body of the article 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. Paste the exact outline you received from Step 1 immediately below this prompt before the AI runs. The article must be 900 words total and follow the outline exactly. Write every H2 section in full before moving to the next, include H3 subsections where indicated, and insert short code examples (robots.txt lines, meta noindex HTML, X-Robots-Tag header examples) and curl commands where relevant. Provide clear step-by-step checks, expected outcomes, and quick verification commands for devs (e.g., curl -I, site: queries, Search Console coverage checks). Include transitions between sections and keep the tone authoritative and practical. Account for edge cases (e-commerce faceted navigation, hreflang/multilingual pages). Maintain readability for technical and content readers. Output format: Return the full article body following the pasted outline, formatted with headings, code blocks, and examples, totaling ~900 words.

5. Authority & E-E-A-T Signals

Expert quotes, study citations, and first-person experience signals

Setup: Generate E-E-A-T signals the writer will inject into 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. The audience: technical SEOs and migration leads. Provide five specific expert quotes the author can include (write the full quote and suggest the speaker name and credentials, e.g., 'John Doe, Former Google Webmaster Trends Analyst'). Then list three real studies/reports to cite (title, publisher, year, and why relevant). Finally supply four experience-based, first-person sentences the author can personalize (short, specific lines like 'On one migration I found...'). Output format: Return 5 expert quotes with suggested credentials, 3 study citations with brief notes, and 4 first-person personalization sentences in separate labeled lists.

6. FAQ Section

10 Q&A pairs targeting PAA, voice search, and featured snippets

Setup: Create a 10-question FAQ for 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. Questions should target People Also Ask, featured snippet queries, and voice-search phrasing for technical SEOs and devs (use natural language and short interrogative forms). For each question provide a concise answer of 2-4 sentences, specific and actionable, avoiding boilerplate. Include answers for common edge cases (X-Robots-Tag for non-HTML resources, noindex vs. disallow, robots.txt for staging). Output format: Return 10 numbered Q&A pairs; each answer must be 2-4 sentences and optimized for snippet capture.

7. Conclusion & CTA

Punchy summary + clear next-step CTA + pillar article link

Setup: Write the conclusion for 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. It must recap the checklist's key takeaways, reinforce the risks of skipping these checks, and give a single, specific next-step CTA telling the reader exactly what to do next (e.g., run checklist, schedule pre-launch audit, or follow a verification command). Include a one-sentence bridge encouraging readers to read the pillar article 'How to Audit for Duplicate Content Before a Migration' with a contextual reason to click. Length: 200-300 words, decisive, action-oriented. Output format: Return only the conclusion text ready to publish.
Publishing Phase

8. Meta Tags & Schema

Title tag, meta desc, OG tags, Article + FAQPage JSON-LD

Setup: Generate SEO metadata and JSON-LD for 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. Provide: (a) title tag 55-60 characters optimized for the primary keyword, (b) meta description 148-155 characters summarizing the article's value, (c) OG title, (d) OG description, and (e) a full Article + FAQPage JSON-LD block ready to paste into the page header. Include schema fields for author, datePublished, and include the 10 FAQ Q&A pairs inside the FAQPage. Use the primary keyword naturally in title and OG tags. Output format: Return these 5 elements and then the full JSON-LD code block only; no other commentary.

10. Image Strategy

6 images with alt text, type, and placement notes

Setup: Produce an image strategy for 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. Paste your final article draft below this prompt so the AI can align images to specific paragraphs. Recommend 6 images: for each, describe what the image shows, the exact place in the article it should appear (e.g., under H2 'Robots.txt checks'), the SEO-optimized alt text including the keyword 'Pre-launch Indexability Checklist' or related secondary keyword, and specify whether to use a photo, infographic, screenshot, or diagram. Also recommend file naming conventions (lowercase, hyphens) and preferred dimensions for hero and inline images. Output format: Return a numbered list of 6 image recommendations with all requested details.
Distribution Phase

11. Social Media Posts

X/Twitter thread + LinkedIn post + Pinterest description

Setup: Create three platform-native social posts to promote 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. Paste your final article title and URL below this prompt before generating to allow insertion. Then provide: (a) an X/Twitter thread opener plus three follow-up tweets (each tweet 240 characters or less), using a technical hook and actionable tips; (b) a LinkedIn post (150-200 words) in a professional tone with a hook, one data-backed insight, and a clear CTA linking to the article; (c) a Pinterest description (80-100 words) that is keyword-rich and explains what the pin links to. Keep messaging consistent, use the primary keyword once in each platform text, and include a URL placeholder if no URL is pasted. Output format: Return three labeled sections: X thread, LinkedIn post, Pinterest description.

12. Final SEO Review

Paste your draft — AI audits E-E-A-T, keywords, structure, and gaps

Setup: Perform a final SEO audit of the draft for 'Pre-launch Indexability Checklist (Robots, Noindex, X-Robots-Tag)'. Paste the complete draft (HTML or plain text) immediately after this prompt. The AI should check: keyword placement and density for the primary keyword and 3 secondary keywords, E-E-A-T gaps (author bio, citations, expert quotes), readability estimate (Flesch or comparable) and suggestions, heading hierarchy issues, risk of duplicate-angle content vs. pillar article, content freshness signals, and mobile snippet optimization. Then return five specific, prioritized improvement suggestions with implementation steps (e.g., 'add schema for Author with affiliation; insert curl example under H2 and show exact snippet'). Output format: Return (1) a short diagnostic summary, (2) the checklist results for each audit area, and (3) five concrete recommendations with exact edits or lines to add.
Common Mistakes
  • Relying solely on robots.txt disallow to prevent indexing — robots.txt can block crawling but not indexing if other sites link to the pages.
  • Confusing noindex with disallow: using both on the same URL means the disallow stops crawlers from fetching the page, so the meta noindex or X-Robots-Tag directive is never seen.
  • Forgetting X-Robots-Tag for non-HTML resources (PDFs, images, attachments) so these assets remain indexed after migration.
  • Leaving staging or dev environments crawlable because robots.txt is not properly configured or is ignored by developers.
  • Not testing headers with curl or an HTTP inspector; authors assume meta tags exist but forget server-side header overrides (CDN, proxy).
  • Neglecting hreflang and canonical interactions with noindex and X-Robots-Tag for multilingual sites, causing loss of preferred-language pages.
  • Using robots.txt rules that unintentionally block whole directories through prefix matching (e.g., Disallow: /product also matches /product-images/).
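The directory-blocking pitfall comes from robots.txt prefix matching: a rule like Disallow: /product (no trailing slash) matches every path that begins with that string. A quick simulation with hypothetical paths:

```shell
# robots.txt Disallow rules match by path prefix, so a rule meant for the
# /product/ section also blocks /product-images/ (paths are hypothetical)
rule="/product"
for path in /product/shoes /product-images/shoe.jpg /press/; do
  case "$path" in
    "$rule"*) echo "BLOCKED: $path" ;;
    *)        echo "allowed: $path" ;;
  esac
done
```

Both /product/shoes and /product-images/shoe.jpg come back BLOCKED; only /press/ is allowed. Adding the trailing slash (Disallow: /product/) limits the rule to the intended directory.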
Pro Tips
  • When testing X-Robots-Tag, always use curl -I to inspect the raw header from the production origin (not via CDN) and include example commands in the article: curl -I https://example.com/page | grep -i x-robots-tag.
  • Recommend a two-step pre-launch verification: automated crawler run (Screaming Frog or Sitebulb) plus manual header checks for a sample of 50 priority URLs (homepage, category pages, top 10 product pages, canonical targets).
  • Advise adding temporary server-side logging to capture 200 responses for pages expected to be noindexed to confirm search engine agents can fetch the page and see the directive before launch.
  • For e-commerce faceted navigation, suggest using canonical+noindex for parameterized pages and provide exact robots.txt and X-Robots-Tag examples instead of vague guidance.
  • Include a rollback playbook section: if a noindex is accidentally left on live pages, the fastest recovery is to remove the noindex and force a re-crawl via Search Console URL Inspection for priority pages.
  • Prefer X-Robots-Tag over meta noindex for non-HTML assets and include a short server configuration example (Apache header set, Nginx add_header) tailored to common setups.
  • Use targeted site: queries (e.g., site:example.com/category/) in Google to quickly validate whether critical sections are indexed post-launch, then track with Search Console coverage reports and, for larger sites, the URL Inspection API.
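Expanding the server-configuration tip, a sketch of snippets that set X-Robots-Tag on common non-HTML assets; the file names and extension list are illustrative assumptions to adapt per stack:

```shell
# Nginx snippet: X-Robots-Tag for non-HTML assets (extension list is illustrative)
cat > xrobots-nginx.conf <<'EOF'
location ~* \.(pdf|doc|docx|jpg|png)$ {
    add_header X-Robots-Tag "noindex, nofollow";
}
EOF

# Apache equivalent (requires mod_headers), for a vhost or .htaccess
cat > xrobots-apache.conf <<'EOF'
<FilesMatch "\.(pdf|doc|docx|jpg|png)$">
    Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
EOF

grep -l 'X-Robots-Tag' xrobots-nginx.conf xrobots-apache.conf
```

After deploying either snippet, verify with curl -sI against a sample PDF to confirm the header actually reaches the client through any CDN or proxy layer.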