How We Made This Website Visible to ChatGPT: A Complete Technical Breakdown
We optimised getfoundbychatgpt.com to be recommended by AI assistants. Here's exactly what we did — structured data, llms.txt, pre-rendering, and more.
AI Visibility Consultant · 10+ years in tech
We help businesses get recommended by ChatGPT. But do we practice what we preach? This post breaks down every single thing we implemented to make getfoundbychatgpt.com visible to AI assistants — structured data, machine-readable files, pre-rendering, and more.
Consider this a technical case study. If you're a developer or business owner wanting to understand what AI visibility actually looks like in practice, this is for you.
The Goal
When someone asks ChatGPT, Claude, or Perplexity "How do I get my business recommended by AI?" or "What is AI visibility?", we want this site to be cited or recommended.
1. Structured Data (JSON-LD Schemas)
Structured data is machine-readable information embedded in your HTML that helps AI understand what your page is about. We use the JSON-LD format, which Google recommends and which AI systems can parse reliably.
Here's what we added to our homepage:
Organization Schema
Tells AI who we are, what we do, and our areas of expertise:
{
"@type": "Organization",
"name": "Get Found By ChatGPT",
"description": "AI visibility consultancy helping businesses get recommended by ChatGPT...",
"knowsAbout": [
"AI Visibility",
"ChatGPT Optimization",
"Generative Engine Optimization",
"Structured Data"
]
}
Service Schema
Describes exactly what we sell, including pricing:
{
"@type": "Service",
"name": "AI Visibility Audit for Websites",
"description": "We analyse how AI assistants understand your business...",
"offers": {
"price": "190",
"priceCurrency": "GBP"
}
}
FAQPage Schema
This is crucial. We added 12 questions and answers that match what people actually ask AI assistants:
- "What is an AI visibility review?"
- "How does ChatGPT decide which businesses to recommend?"
- "Why isn't my business being recommended by ChatGPT?"
- "What is Generative Engine Optimization (GEO)?"
- "Is AI visibility different from SEO?"
Each question has a detailed answer. When an AI assistant encounters a query like "How does ChatGPT recommend businesses?", it can pull directly from this structured data.
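For illustration, here is roughly how one of those Q&As is expressed in FAQPage markup. The structure is standard schema.org; the answer text below is a paraphrase, not our exact copy:

{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "How does ChatGPT decide which businesses to recommend?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "ChatGPT draws on its training data and, where available, live web results. Businesses with clear, structured, machine-readable information about what they do are easier to understand and recommend."
      }
    }
  ]
}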
Person Schema (E-E-A-T)
Google's E-E-A-T framework (Experience, Expertise, Authoritativeness, Trustworthiness) matters for AI too. We added a Person schema for the author:
{
"@type": "Person",
"name": "Alex Rapier",
"jobTitle": "AI Visibility Consultant",
"knowsAbout": ["AI Visibility", "ChatGPT Optimization", "Structured Data"],
"description": "10+ years in software engineering..."
}
HowTo Schema
A step-by-step process that AI can reference when someone asks "How do I get recommended by ChatGPT?" (a markup sketch follows the steps):
- Purchase the AI Visibility Review
- We Audit Your Website Across AI Platforms
- Receive Your AI Readiness Score
- Join Your 30-Minute Walkthrough Call
- Implement the Prioritised Action Plan
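In markup terms, each step becomes a HowToStep inside a HowTo. A trimmed sketch, with the name and exact fields illustrative rather than copied from our live schema:

{
  "@context": "https://schema.org",
  "@type": "HowTo",
  "name": "How to Get Recommended by ChatGPT",
  "step": [
    { "@type": "HowToStep", "position": 1, "name": "Purchase the AI Visibility Review" },
    { "@type": "HowToStep", "position": 2, "name": "We Audit Your Website Across AI Platforms" },
    { "@type": "HowToStep", "position": 3, "name": "Receive Your AI Readiness Score" },
    { "@type": "HowToStep", "position": 4, "name": "Join Your 30-Minute Walkthrough Call" },
    { "@type": "HowToStep", "position": 5, "name": "Implement the Prioritised Action Plan" }
  ]
}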
2. The llms.txt File
This is an emerging standard — a plain-text file at /llms.txt that provides AI-readable information about your site. Think of it like robots.txt, but for language models.
Our llms.txt includes:
- What we do (plain English description)
- Who we help (target audience)
- Service details and pricing
- Key facts about the business
- Links to all important pages
You can view ours at /llms.txt.
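The format itself is deliberately simple: markdown-style plain text. An abbreviated sketch is below; the paths and wording here are illustrative, the real file is at /llms.txt:

# Get Found By ChatGPT

> UK-based AI visibility consultancy helping businesses get recommended by
> AI assistants like ChatGPT, Claude, Gemini, and Perplexity.

## Services
- AI Visibility Audit for websites: £190 one-off, 24-hour turnaround.

## Key Pages
- https://getfoundbychatgpt.com/: homepage and service overview
- https://getfoundbychatgpt.com/blog: guides on AI visibility and GEO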
3. AI Plugin Manifest
We created a /.well-known/ai-plugin.json file — the format OpenAI uses for ChatGPT plugins. Even if you're not building a plugin, this file helps AI systems understand your site:
{
"name_for_model": "getfoundbychatgpt",
"description_for_model": "UK-based AI visibility consultancy helping local businesses get recommended by ChatGPT, Claude, Gemini...",
"contact_email": "hello@getfoundbychatgpt.com"
}
4. robots.txt for AI Crawlers
Many sites accidentally block AI crawlers. We explicitly allow them:
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Claude-Web
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Anthropic-AI
Allow: /
We also added a reference to llms.txt:
LLMs: https://getfoundbychatgpt.com/llms.txt
5. Pre-rendering (Critical for SPAs)
This is where many sites fail. Our site is built with React — a single-page application (SPA). By default, the HTML looks like this:
<body>
<div id="root"></div> <!-- Empty! -->
</body>
AI bots and search engines see an empty page. The content only appears after JavaScript runs.
The fix: pre-rendering. We use Puppeteer at build time to render every page and save the full HTML.
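Here is a stripped-down sketch of what that build step can look like. It is illustrative rather than our exact script: the port, output paths, and route list are assumptions, and it presumes the built site is already being served locally.

// prerender.ts -- a minimal sketch, not our exact build script.
// Assumes the built SPA is served at localhost:4173 (e.g. `vite preview`)
// and that `routes` lists every page to render.
import puppeteer from "puppeteer";
import { mkdir, writeFile } from "node:fs/promises";
import path from "node:path";

const routes = ["/", "/blog", "/blog/what-is-ai-visibility"]; // illustrative

async function prerender(): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  for (const route of routes) {
    // Wait for the network to go idle so React has finished rendering.
    await page.goto(`http://localhost:4173${route}`, { waitUntil: "networkidle0" });

    // page.content() returns the fully rendered HTML, not the empty shell.
    const html = await page.content();

    const outDir = path.join("dist", route);
    await mkdir(outDir, { recursive: true });
    await writeFile(path.join(outDir, "index.html"), html);
  }

  await browser.close();
}

prerender().catch((err) => {
  console.error(err);
  process.exit(1);
});

Running a step like this after the normal build means bots receive the same HTML users eventually see, with no JavaScript required.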
Now when a bot visits, they see:
<body>
<div id="root">
<header>...</header>
<section>Be the business ChatGPT recommends...</section>
<section>Frequently Asked Questions...</section>
<!-- All content visible! -->
</div>
</body>
6. Hidden Machine-Readable Content
We added a visually-hidden section specifically for AI parsing — plain text facts that don't need styling:
<section class="sr-only" data-ai-readable="true">
<h2>What is GetFoundByChatGPT.com?</h2>
<p>GetFoundByChatGPT.com is a UK-based AI visibility consultancy.</p>
<p>We help businesses get recommended by AI assistants like ChatGPT.</p>
<h3>Service Offered</h3>
<p>Service: AI Visibility Audit for websites.</p>
<p>Price: £190 one-off payment.</p>
<p>Turnaround: 24 hours.</p>
...
</section>
The sr-only class makes it invisible to users but keeps it in the DOM for bots.
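For reference, a common way to implement an sr-only utility class (ours may differ slightly) is:

.sr-only {
  position: absolute;
  width: 1px;
  height: 1px;
  padding: 0;
  margin: -1px;
  overflow: hidden;
  clip: rect(0, 0, 0, 0);
  white-space: nowrap;
  border: 0;
}

Unlike display: none, this keeps the text rendered (just visually collapsed), and bots parsing the raw HTML see it either way.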
7. Content That Answers Questions
AI assistants recommend content that directly answers user queries. We created blog posts targeting the exact questions people ask:
- What is AI Visibility?
- How to Get Recommended by ChatGPT
- Why Doesn't ChatGPT Mention My Business?
- How to Optimize Your Website for AI
Each post has its own structured data (Article schema, Breadcrumb schema) and is written to directly answer the query in the title.
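A trimmed sketch of what Article markup for one of these posts might look like (the date is a placeholder and fields are abbreviated):

{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What is AI Visibility?",
  "author": { "@type": "Person", "name": "Alex Rapier" },
  "publisher": { "@type": "Organization", "name": "Get Found By ChatGPT" },
  "datePublished": "2025-01-01"
}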
The Complete Checklist
Here's everything we implemented:
| Category | Implementation |
|---|---|
| Structured Data | Organization, Service, FAQPage, Person, HowTo, WebSite, Article, Breadcrumb schemas |
| AI-Specific Files | llms.txt, .well-known/ai-plugin.json |
| Crawling | robots.txt with AI bot rules, comprehensive sitemap |
| HTML | Link rel="llms", full meta tags, pre-rendered content |
| Hidden Content | AIReadableFacts component with plain-text business info |
| Technical | Puppeteer pre-rendering, static site generation |
Does It Work?
The proof is in the testing. After implementing these changes:
- Google's Rich Results Test detects our FAQPage and HowTo schemas
- Our llms.txt is accessible to AI crawlers
- The full HTML content is visible in page source (no JavaScript required)
- All AI bot user agents are explicitly allowed
We're not promising immediate results — AI training data has lag times, and recommendations depend on many factors. But we've removed every technical barrier that would prevent AI from understanding and recommending our site.
What You Can Do
If you want to implement these changes for your own site, start with:
- Add FAQPage schema — This has the highest impact. Include questions your customers actually ask.
- Create llms.txt — Plain text, easy to implement, increasingly recognised.
- Check robots.txt — Make sure you're not blocking AI crawlers.
- Pre-render if using a SPA — Critical for React, Vue, Angular sites.
- Write content that answers questions — Not marketing fluff, but direct answers.
Want us to do this for you?
Our AI Visibility Review analyses your site and tells you exactly what to fix — with a prioritised action plan and walkthrough call.
Get your AI Visibility Review →
Ready to improve your AI visibility?
Get a comprehensive audit of how AI assistants see your business, with actionable recommendations to get found.
Check Your AI Visibility