Free Resource

AI Visibility Checklist

25 signals that determine whether ChatGPT, Claude, Perplexity, and Gemini can find, understand, and recommend your product. Check each one manually — or run a free AI visibility audit to check all 25 at once.

6 categories · 25 signals · 100 points total · Free — no signup required
AI Crawl Access (20 pts)

Before an AI assistant can recommend your product, it needs permission to crawl your site and a structured way to understand it. These 5 signals ensure AI crawlers can index your content.

sitemap.xml is present at /sitemap.xml

Why it matters

Search engines and AI crawlers use sitemaps to discover all your pages. Without one, AI systems may miss important product pages.

How to fix it

Most frameworks (Next.js, Nuxt, Astro) can generate sitemaps automatically with a plugin or built-in route.

sitemap.xml contains 3 or more URLs

Why it matters

A sitemap with only one URL suggests a shallow content architecture and gives crawlers little to discover beyond your homepage.

How to fix it

Include your homepage, features, pricing, about, and blog pages at minimum.
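A minimal sitemap covering those pages might look like the following (the domain and paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/features</loc></url>
  <url><loc>https://example.com/pricing</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/blog</loc></url>
</urlset>
```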

robots.txt allows GPTBot (ChatGPT's crawler)

Why it matters

GPTBot is the official OpenAI/ChatGPT crawler. Blocking it prevents your site from being indexed in ChatGPT's knowledge base.

How to fix it

Add 'User-agent: GPTBot' with 'Allow: /' to your robots.txt. Do not block it unless you have a specific reason.

robots.txt allows ClaudeBot (Anthropic's crawler)

Why it matters

ClaudeBot is Anthropic's official web crawler. Blocking it prevents Claude from knowing about your product.

How to fix it

Add 'User-agent: ClaudeBot' with 'Allow: /' to your robots.txt.

robots.txt allows PerplexityBot

Why it matters

PerplexityBot powers Perplexity AI's web index. If it's blocked, Perplexity cannot cite your product in answers.

How to fix it

Add 'User-agent: PerplexityBot' with 'Allow: /' to your robots.txt.
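The three crawler rules above can live together in a single robots.txt. A minimal example (replace example.com with your own domain):

```text
# robots.txt — allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://example.com/sitemap.xml
```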

Content Quality (20 pts)

AI systems analyze your site's content to understand what you do. JavaScript-heavy sites with minimal crawlable text score poorly — AI crawlers read the raw HTML, so content that only appears after JavaScript executes may be invisible to them.

Homepage has 500+ words of visible, crawlable text

Why it matters

LLMs need sufficient text to extract product information, positioning, and value propositions. Pages under 500 words give AI little to work with.

How to fix it

Ensure your key marketing content is server-rendered and appears in the HTML source, not just after JavaScript executes.

Text-to-HTML ratio is above 15%

Why it matters

A low text-to-HTML ratio indicates a JavaScript-heavy page with minimal real content — a pattern common in SPA frameworks where text is injected dynamically.

How to fix it

Use server-side rendering (SSR) or static generation for your marketing pages. Check your homepage's raw HTML source to verify.
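One rough way to check this signal yourself is to compare the length of the visible text against the length of the raw HTML. A minimal Python sketch using only the standard library (the parsing is simplified: it skips script and style blocks but not, say, hidden elements, and the 15% threshold is a guideline, not a standard):

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth inside script/style tags

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)


def text_to_html_ratio(html: str) -> float:
    """Ratio of visible text length to total raw HTML length."""
    if not html:
        return 0.0
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / len(html)


sample = "<html><head><script>var x=1;</script></head><body><p>Hello world</p></body></html>"
print(round(text_to_html_ratio(sample), 2))  # → 0.13 — well below the 15% guideline
```

Run it against the raw HTML of your homepage (for example, fetched with curl, not the DOM after rendering) to see whether your marketing copy actually appears in the source.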

3+ pages are reachable from internal links on the homepage

Why it matters

Internal links help AI crawlers discover more of your site. A single-page app with no deep links limits what LLMs know about your product.

How to fix it

Add navigation links to your features, pricing, about, and blog pages from your homepage.

Navigation links are present and crawlable

Why it matters

Well-structured navigation tells AI systems what sections of your site exist and how your content is organized.

How to fix it

Use standard HTML anchor tags for navigation. JavaScript-only routing may be invisible to crawlers.
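For example, plain anchor tags inside a nav element are crawlable without executing any JavaScript (paths are illustrative):

```html
<!-- Standard anchors: visible in the raw HTML, no JS required -->
<nav>
  <a href="/features">Features</a>
  <a href="/pricing">Pricing</a>
  <a href="/about">About</a>
  <a href="/blog">Blog</a>
</nav>
```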

Product Clarity (15 pts)

AI systems recommend products when they can confidently describe what the product does. Vague or missing H1s, no features documentation, and no pricing reduce citation confidence.

A clear H1 headline is present on the homepage

Why it matters

The H1 is the most authoritative statement about what your product does. AI systems weight it heavily when generating product descriptions.

How to fix it

Your H1 should describe what the product does and who it's for in one clear sentence. Avoid 'Welcome to [Company]' — be explicit.

Features page or feature keywords are present

Why it matters

When someone asks an AI 'what does X do?', it needs feature information to give a complete answer. Missing features = incomplete AI responses.

How to fix it

Add a /features page or include a features section on your homepage with clear feature titles and descriptions.

Pricing page or pricing information is present

Why it matters

Pricing is frequently asked about in AI queries ('how much does X cost?'). A missing pricing page means AI cannot answer this question accurately.

How to fix it

Add a /pricing page and link to it from your navigation. Include your free tier if you have one.

Structured Data & Meta (20 pts)

Structured data is the highest-fidelity machine-readable format for product information. JSON-LD schema and proper meta tags allow AI systems to extract accurate, structured facts about your product.

OpenGraph og:title tag is set

Why it matters

og:title is used by AI systems when generating summaries and recommendations. It should match your product name and core value proposition.

How to fix it

Set og:title to '[Product Name] — [One-sentence value proposition]'. Most meta tag libraries handle this.

OpenGraph og:description tag is set

Why it matters

og:description is frequently extracted by AI systems when generating product summaries. It should clearly state what the product does.

How to fix it

Write a 1-2 sentence og:description that explains the product's core function and primary benefit.

OpenGraph og:image tag is set

Why it matters

Visual context helps AI systems build richer product profiles. A missing og:image reduces visual citations in AI-generated responses.

How to fix it

Create a 1200x630px OG image with your product name and key value proposition.
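Taken together, the three OpenGraph tags above look like this in your page head (the product name, copy, and image URL are placeholders):

```html
<meta property="og:title" content="ExampleApp — AI visibility audits for SaaS" />
<meta property="og:description" content="ExampleApp scans your site for the signals AI assistants use and tells you exactly what to fix." />
<meta property="og:image" content="https://example.com/og-image.png" />
```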

Page title tag is present and descriptive

Why it matters

The <title> tag is one of the most heavily weighted page signals. Vague titles like 'Home' give AI systems minimal product information.

How to fix it

Use the format '[Product Name] — [Value Proposition] | [Brand]' for your homepage title.

Meta description is present (150-160 characters)

Why it matters

Meta descriptions are frequently used as summaries by AI systems. They appear in citations and AI-generated comparisons.

How to fix it

Write a descriptive meta description that includes your product name, primary function, and one key benefit.

JSON-LD SoftwareApplication, Product, or Organization schema is present

Why it matters

This is the single highest-impact structured data signal. JSON-LD gives AI systems precise, machine-readable facts: product name, category, description, pricing URL, and more.

How to fix it

Add a <script type='application/ld+json'> block with SoftwareApplication schema. AIExposureTool generates this for you.
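A sketch of a SoftwareApplication block, using standard schema.org properties (the name, description, URL, and pricing are placeholders to replace with your own):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "ExampleApp",
  "applicationCategory": "DeveloperApplication",
  "operatingSystem": "Web",
  "description": "Audits your site's AI visibility and generates the fixes.",
  "url": "https://example.com",
  "offers": {
    "@type": "Offer",
    "price": "0",
    "priceCurrency": "USD"
  }
}
</script>
```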

Canonical URL is set

Why it matters

Canonical URLs tell AI crawlers which version of a URL is authoritative, preventing duplicate content penalties.

How to fix it

Set rel='canonical' in your page head to your preferred URL. Most Next.js and meta tag setups handle this automatically.
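The title, meta description, and canonical signals above come down to three lines in your page head (all values here are placeholders):

```html
<title>ExampleApp — AI visibility audits for SaaS | ExampleApp</title>
<meta name="description" content="ExampleApp scans your site for the 25 signals AI assistants check and tells you exactly which ones you are failing." />
<link rel="canonical" href="https://example.com/" />
```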

Agent Readiness (10 pts)

llms.txt files are a direct, unambiguous brief for AI agents. When a Claude or ChatGPT agent browses your site, llms.txt is the first file it should read — designed specifically for LLMs.

llms.txt file exists at /llms.txt

Why it matters

llms.txt is the AI equivalent of robots.txt — a plain-text brief that tells LLMs exactly what your product does, who it's for, and where to find key information. AI agents will look for it first.

How to fix it

Create a plain text file at /llms.txt with: product name, tagline, description, target audience, key use cases, pricing summary, and links to important pages. AIExposureTool generates this automatically.
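A short sketch of what such a file can contain, loosely following the proposed llms.txt convention (an H1 with the product name, a blockquote summary, then sections of links); every detail below is placeholder content:

```text
# ExampleApp

> ExampleApp audits your site's AI visibility and generates the fixes.
> Built for SaaS founders and marketing teams.

## Key pages

- Features: https://example.com/features
- Pricing: https://example.com/pricing (free tier available)
- About: https://example.com/about
```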

llms-full.txt file exists at /llms-full.txt

Why it matters

llms-full.txt is an extended version of llms.txt with complete product documentation. It gives AI systems enough context to answer detailed product questions accurately.

How to fix it

Create /llms-full.txt with full feature descriptions, methodology, all pricing tiers, and comprehensive documentation. AIExposureTool generates this for you.

Trust & Social Proof (15 pts)

AI systems assess trust signals before recommending products — this mirrors Google's EEAT (Experience, Expertise, Authoritativeness, Trustworthiness) framework. Real evidence of real users matters.

Testimonials are present on the homepage

Why it matters

Real testimonials with names, roles, and specific outcomes increase AI citation confidence. AI systems are less likely to recommend products that appear to have no users.

How to fix it

Add 2-3 testimonials with real names, their role, their company (if applicable), and specific outcomes ('went from score 42 to 91 in two days').

Customer logos or trust marks are visible

Why it matters

Client logos, press mentions, and 'as seen in' marks signal that the product has been vetted by recognizable third parties.

How to fix it

Add logos of notable customers, press outlets that have featured you, or integration partners. Even 3-4 logos meaningfully improve this signal.

Quantifiable metrics are present (user counts, company counts, scan counts)

Why it matters

Specific numbers ('1,000+ sites scanned', '500+ founders') are more credible to AI systems than vague claims ('many users').

How to fix it

Add a stats section or integrate metrics into your headline area. Be specific and keep numbers up to date.

Check all 25 signals automatically

Instead of checking each signal manually, paste your URL into AIExposureTool and get your AI Exposure Score (0-100) in 30 seconds — with exactly which signals you're passing and failing.
