Citable

Fix the structural reasons AI can't read you.

Schema overhaul, Core Web Vitals, indexability, hreflang, internal linking, redirect cleanup. Shipped to production with deployment validation. AI crawlers reward fast, semantically clean, structurally clear sites — we ship exactly that.

Three ways in.

Pricing

01 Technical SEO Audit
900 EUR
One-off · 5 business days

Full crawl. Core Web Vitals analysis. Indexability review. Schema audit. Internal linking analysis. Hreflang verification. Redirect mapping. Mobile rendering audit. Delivered as prioritized fix list with severity grading.

02 Technical SEO Sprint
2,800 EUR
Flat fee · 6 weeks

Audit + implementation. Schema deployment (full set). Core Web Vitals optimization. Indexability fixes. Hreflang. Internal linking restructure. Robots.txt and sitemap rebuild. Redirect cleanup. Shipped to your live site with deployment validation.

03 SEO Continuity
2,000 – 3,000 EUR / mo
Retainer · 6-month minimum

Monthly technical health monitoring, content optimization, internal linking maintenance, CWV tracking, Search Console issue resolution, and rank tracking on priority keywords. Designed as a paired add-on for GEO Growth clients.

Six questions buyers ask first.

FAQ

01 What technical fixes most directly improve AI crawlability?

In order of impact: first, ensure robots.txt explicitly allows GPTBot, ClaudeBot, PerplexityBot, OAI-SearchBot, and Google-Extended. Many sites that implemented broad bot blocks during the scraping concerns of 2023 and 2024 are inadvertently blocking AI crawlers. Second, eliminate JavaScript rendering barriers on content that matters for citation. AI crawlers have inconsistent JavaScript execution capability. Server-side rendered content (Next.js App Router, for example) is reliably accessible. Third, fix broken internal links that prevent crawlers from reaching important pages. Fourth, resolve Core Web Vitals failures on top pages, as page quality signals influence AI model training data selection.
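
As a sketch, explicit allow rules for the major AI crawlers can be grouped into a single robots.txt block (paths and any additional rules depend on your site):

  User-agent: GPTBot
  User-agent: OAI-SearchBot
  User-agent: ClaudeBot
  User-agent: PerplexityBot
  User-agent: Google-Extended
  Allow: /

Because crawlers follow the most specific matching group, a block like this restores access for the named bots even if a broad User-agent: * disallow remains elsewhere in the file.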

02 Is metadata enough to improve AI visibility, or does it have to be in the body content?

Metadata alone is insufficient. Meta descriptions and title tags inform traditional search, but AI models primarily extract from body content: the actual sentences on the page. The most impactful changes are in the opening sentences of each section (lead with the answer), the H2 and H3 structure (use question formats), and the body paragraphs (short, direct, citation-ready). Schema markup in the head confirms what the page is about, but the content of the page is what gets extracted and cited.
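
A minimal illustration of the answer-first pattern (the question and wording are placeholders):

  <h2>How long does a technical SEO audit take?</h2>
  <p>A technical SEO audit of a mid-sized site takes about five business days.
  The crawl itself runs in hours; prioritizing and grading the findings takes the rest.</p>

The H2 mirrors a question a buyer would actually type, and the first sentence is a complete, liftable answer.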

03 How do I know if my website is technically ready for AI search?

Run four checks. First, search your brand name in ChatGPT and Perplexity and note whether the information returned is accurate and current. Inaccurate AI responses often indicate outdated or conflicting entity signals. Second, check your robots.txt file for blocks on AI crawlers (GPTBot, PerplexityBot, ClaudeBot). Third, validate your schema markup on your five most important pages using Google's Rich Results Test. Fourth, run a Core Web Vitals report in Search Console and note any pages failing on LCP or CLS. Citable's Technical SEO Audit covers all four systematically and delivers a prioritized fix list within 5 business days.
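
The robots.txt check is the easiest to automate. A rough sketch in Python using only the standard library (the domain is a placeholder):

  from urllib import robotparser

  SITE = "https://www.example.com"  # replace with your own domain
  AI_CRAWLERS = ["GPTBot", "OAI-SearchBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

  rp = robotparser.RobotFileParser()
  rp.set_url(SITE + "/robots.txt")
  rp.read()  # fetch and parse the live robots.txt

  for bot in AI_CRAWLERS:
      status = "allowed" if rp.can_fetch(bot, SITE + "/") else "BLOCKED"
      print(f"{bot}: {status} at site root")

Run it against the site root first, then against your five most important URLs.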

04 What schema markup types are most important for AI citation?

The highest-impact schema types in order: Organization (establishes your brand identity across the knowledge graph), FAQPage (directly maps your content to question-format prompts), Service or Product (establishes what you offer and at what price), Article (for blog and journal content, enables author attribution), and BreadcrumbList (improves content hierarchy understanding). Person schema on About pages strengthens E-E-A-T signals. All schema should be implemented as JSON-LD in the document head, not as microdata inline in content.
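
A stripped-down example of the JSON-LD pattern, placed in the document head (the values are placeholders, not a complete Organization profile):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Company",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": ["https://www.linkedin.com/company/example-company"]
  }
  </script>

FAQPage, Service, Article, and BreadcrumbList blocks follow the same structure, each with its own required properties.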

05 Does site speed affect AI search visibility?

Indirectly, yes. Core Web Vitals performance is a signal in Google's systems, which influence AI Overviews. For standalone LLMs like ChatGPT and Perplexity, page speed affects how reliably AI crawlers can access and index content during their crawl cycles. Pages that time out or load slowly during crawler visits produce incomplete or missing index entries. Lighthouse 95+ performance is the target for every page Citable builds or optimizes.

06 What is an llms.txt file and do I need one?

llms.txt is an emerging convention (analogous to robots.txt) that provides AI language models with a structured, human-readable summary of your site's most important content and how it should be interpreted. It is not yet universally adopted by AI systems but signals awareness and intent to AI crawlers and the teams building them. Citable deploys llms.txt on every site it builds as a forward-looking signal. For existing sites, adding llms.txt is a low-effort, high-upside technical addition that takes under an hour to implement correctly.
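
The draft convention is a plain Markdown file served at /llms.txt: an H1 with the site name, a one-line summary, and sections of annotated links to the pages that matter most. A minimal sketch (names and URLs are placeholders):

  # Example Company

  > Technical SEO and AI-visibility services.

  ## Services

  - [Technical SEO Audit](https://www.example.com/services/technical-seo): full crawl, schema audit, Core Web Vitals review
  - [GEO Growth](https://www.example.com/services/geo): ongoing AI-visibility retainer

  ## About

  - [Company](https://www.example.com/about): team, methodology, contact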

Start with the audit.

900 EUR. 5 business days. Prioritized fix list with severity grading.