Tutorials

How to Fix SEO on an AI-Built Website: Complete 2026 Guide

Diagnose and fix SEO on sites built with Lovable, v0, Bolt, or Claude. A step-by-step guide with checks, fixes, and when to migrate.

Amit, Founder, aitowp.agency · 14 min read

Your AI-built site is live. It looks good. You submitted a sitemap to Google three weeks ago and nothing has happened. Impressions: flat. Clicks: zero. Rankings: nowhere.

This guide walks through the exact checks to diagnose what’s wrong, the ten fixes that solve 95% of AI-site SEO problems, and — when fixing it in place isn’t worth the time — the migration path that gets you ranking in weeks instead of months.

This is the longest post on the site. It’s meant to be a reference you come back to, not a one-time read. Use the table of contents below to jump to the section that matches where you are right now.

What this guide covers:

  1. The five-minute diagnosis — is your SEO actually broken?
  2. Why AI-built sites fail at SEO (the short version)
  3. The 10-point SEO fix checklist
  4. Tool-specific gotchas for Lovable, v0, Bolt.new, and Claude artifacts
  5. The two paths forward — fix in place vs. migrate to WordPress
  6. FAQ
  7. The honest bottom line

The five-minute diagnosis

Before you fix anything, you need to know what’s actually wrong. Three checks take five minutes and will tell you whether you have a cosmetic SEO problem or a foundational one.

Test 1: View source on your homepage

Open your site’s homepage in Chrome. Right-click, “View page source.” Do NOT use “Inspect Element” — that shows you the rendered DOM after JavaScript has run. You want the raw HTML Googlebot receives.

In the source, search for the first paragraph of text on your homepage (use Ctrl+F). Is it there? Is your <h1> there? Are your meta tags in the <head>?

What you want to see: Real HTML with real content, complete <title>, <meta name="description">, and <meta property="og:image"> tags.

What you don’t want to see: A near-empty <body> with a <div id="root"></div> and a JavaScript bundle. That means your content is rendered client-side, and Googlebot may never see it reliably.
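If you want to script this check instead of eyeballing the source, a rough heuristic is to strip scripts and markup from the raw `<body>` and see how much visible text survives. This is a sketch, not a crawler: the function name, the 50-character threshold, and the regex-based stripping are all assumptions you should tune.

```typescript
// Heuristic: does raw HTML look like a client-rendered shell?
// A real page carries its headline and copy in the source; a shell is a
// near-empty <body> wrapping a mount node like <div id="root">.
function looksClientRendered(rawHtml: string): boolean {
  const bodyMatch = rawHtml.match(/<body[^>]*>([\s\S]*?)<\/body>/i);
  if (!bodyMatch) return true; // no <body> at all: definitely suspicious

  // Drop script blocks, then all remaining tags, and measure what's left.
  const visibleText = bodyMatch[1]
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, "")
    .trim();

  return visibleText.length < 50; // threshold is arbitrary; adjust to taste
}
```

Run it against the raw response of `curl https://yoursite.com` (not the rendered DOM) and you get the same answer as the view-source test, in a form you can put in CI.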

Test 2: Check Google Search Console

If you haven’t set up Search Console yet, do that first — it’s free and takes fifteen minutes. Once it’s set up:

  1. Open “Pages” in the Indexing section.
  2. Look at the “Why pages aren’t indexed” breakdown.
  3. The most common AI-site status is “Discovered — currently not indexed” (Google found the URL but hasn’t prioritized crawling it) or “Crawled — currently not indexed” (Google crawled it but didn’t find it worth adding to the index).

Both point to the same root cause: Google didn’t see enough signals to justify indexing. That’s what the rest of this guide fixes.

Test 3: Run a Lighthouse SEO audit

In Chrome, open DevTools → Lighthouse → check “SEO” → select “Mobile” → run the audit.

What the score means:

  • 95–100: Your HTML foundation is solid. Problems are probably in content quality, backlinks, or competitive positioning — not technical SEO.
  • 85–94: Minor technical gaps. Fixable in an afternoon.
  • 70–84: Significant technical issues. You need the full checklist below.
  • Under 70: Your site has foundational SEO problems. Fixing in place will take days, not hours.

Most default Lovable, v0, and Bolt.new builds score in the 70–85 range. That’s the honest starting point.

Why AI-built sites fail at SEO

Before we get into the fix checklist, the short version of what’s actually happening: AI builders optimize for the thing you asked them for (a live site, fast) and leave the SEO plumbing to you. That plumbing — meta tags per route, schema markup, a sitemap, server-rendered HTML — is invisible to the person generating the site and almost never gets added later.

Each tool fails in slightly different ways. If you know which tool you're on and want the tool-specific details, the comparison posts on this site cover them in depth — read the relevant one first, then come back here.

The 10-point SEO fix checklist

Work through these in order. Items 1 and 2 solve the biggest problems; items 3 through 7 cover the standard technical SEO stack; items 8 through 10 handle performance and content quality.

1. Make sure Google sees real HTML

This is the foundational fix. Everything else is decoration if Google can’t read your content.

The test: In Search Console, use URL Inspection on your homepage, click “Test live URL”, then click “View crawled page” → “Screenshot”. If the screenshot matches your real site, Google can see it. If it’s blank or shows a loading spinner, you have a rendering problem.

The fix: This is not a small fix. It means either (a) server-side rendering your entire site, (b) adding prerendering for crawlers, or (c) migrating to a platform that does SSR by default. For most AI-built sites, option (c) is dramatically faster than (a) or (b).

If your tool doesn’t support SSR natively (Lovable, Bolt.new), this is the one item that single-handedly justifies a migration. No amount of meta-tag polish compensates for Google not being able to read your content.

2. Unique title and description per page

Every page needs its own <title> and <meta name="description">. Not the same one copy-pasted across the whole site. Not the tool’s default fallback. Unique, written-for-humans, under 60 characters for titles and under 160 for descriptions.

The test: View source on three different pages of your site and compare the <title> tags. Are they different?

The fix: Most AI tools let you set per-route metadata through their config files or page-level exports (Next.js export const metadata, for example). Set every page explicitly. Don’t rely on defaults.
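The length and uniqueness rules above are easy to enforce with a small audit helper run over your page metadata. This is a sketch under assumptions — the `PageMeta` shape, the helper name, and the exact limits mirror the guidance in this item, not any tool's API.

```typescript
interface PageMeta {
  path: string;        // route, e.g. "/pricing"
  title: string;       // contents of <title>
  description: string; // contents of <meta name="description">
}

// Flag titles over 60 chars, descriptions over 160 chars,
// and any title reused across more than one page.
function auditMeta(pages: PageMeta[]): string[] {
  const problems: string[] = [];
  const seenTitles = new Map<string, string>(); // title -> first path seen

  for (const p of pages) {
    if (p.title.length > 60) problems.push(`${p.path}: title over 60 chars`);
    if (p.description.length > 160)
      problems.push(`${p.path}: description over 160 chars`);

    const prior = seenTitles.get(p.title);
    if (prior) problems.push(`${p.path}: duplicate title (also on ${prior})`);
    seenTitles.set(p.title, p.path);
  }
  return problems;
}
```

Feed it every route on your site; an empty result means item 2 is done.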

3. JSON-LD schema markup

Schema markup is how Google decides whether your page is eligible for rich results — star ratings, FAQ dropdowns, breadcrumbs, article cards. Without schema, you’re competing in SERPs with just a blue link while better-optimized sites show rich cards.

Minimum schema every site needs:

  • Organization (sitewide, in your root layout) — tells Google who publishes the site.
  • WebSite (sitewide) — enables the sitelinks search box in Google.
  • BreadcrumbList (per page) — powers breadcrumb trails in SERPs.

Additional schema per content type:

  • Article or BlogPosting — for blog posts.
  • FAQPage — for any page with a Q&A section.
  • Product, Service, or LocalBusiness — for commercial pages.

Generate schema as JSON-LD in a <script type="application/ld+json"> block in your <head>. Test with Google’s Rich Results Test before shipping.
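As a concrete example, the sitewide Organization block can be generated like this. The helper name and values are placeholders; the `@context`/`@type` keys are standard schema.org JSON-LD.

```typescript
// Build an Organization JSON-LD <script> block for the root layout.
// name and url are placeholders — swap in your own publisher details.
function organizationJsonLd(name: string, url: string): string {
  const data = {
    "@context": "https://schema.org",
    "@type": "Organization",
    name,
    url,
  };
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}
```

Paste the output into your `<head>`, then confirm it in the Rich Results Test before shipping — schema that doesn't validate is schema Google ignores.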

4. XML sitemap

Google needs a sitemap to efficiently discover every page on your site. Most AI builders don’t generate one automatically.

What good looks like:

  • Sitemap available at a predictable URL (/sitemap.xml or /sitemap-index.xml).
  • Every indexable page is listed.
  • lastmod dates update when content changes.
  • Submitted to Google Search Console under “Sitemaps.”
  • Referenced in your robots.txt (Sitemap: https://yoursite.com/sitemap.xml).

If your tool doesn’t generate one, you’ll need to build it yourself or use a static generator.
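If you do end up building it yourself, a sitemap is just XML over a list of URLs. A minimal sketch (the `SitemapEntry` shape and function name are assumptions; the `urlset` namespace is the standard one from sitemaps.org):

```typescript
interface SitemapEntry {
  loc: string;      // absolute URL of the page
  lastmod?: string; // ISO date, e.g. "2026-01-15"
}

// Emit a minimal sitemap.xml body from a list of indexable URLs.
function buildSitemap(entries: SitemapEntry[]): string {
  const urls = entries
    .map((e) => {
      const lastmod = e.lastmod ? `<lastmod>${e.lastmod}</lastmod>` : "";
      return `<url><loc>${e.loc}</loc>${lastmod}</url>`;
    })
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${urls}\n</urlset>`
  );
}
```

Write the output to `/sitemap.xml` at build time so `lastmod` stays honest — a sitemap whose dates never change is a weaker signal than no dates at all.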

5. robots.txt configuration

robots.txt tells crawlers what they can and can't access. Most AI-built sites either lack one entirely or ship with overly restrictive defaults that block useful paths.

Minimum viable robots.txt:

User-agent: *
Allow: /

Sitemap: https://yoursite.com/sitemap.xml

Don’t block /api/, /_next/, or asset directories unless you have a specific reason. Check what your tool ships by default and adjust.

6. Canonical URLs

Canonical URLs tell Google which version of a page is the “real” one when duplicate or near-duplicate URLs exist. Without canonicals, Google picks one — and it’s often not the one you want.

The fix: Every page should have a <link rel="canonical" href="..."> tag in the head, pointing to the definitive URL of that page. Self-referencing canonicals are normal and correct.

Watch for: Trailing slash inconsistencies (/about vs /about/), http vs https, www vs non-www. Pick one and canonicalize to it everywhere.
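One way to make "pick one and canonicalize to it everywhere" concrete is a single normalization function that every page runs its own URL through. This sketch assumes one particular policy — https, non-www, trailing slash, no query params — but the point is that the policy lives in exactly one place:

```typescript
// Normalize any internal URL to one canonical form:
// https, non-www host, trailing slash, no query or fragment.
// Adjust the policy to match your site; just apply it everywhere.
function canonicalize(rawUrl: string): string {
  const u = new URL(rawUrl);
  u.protocol = "https:";
  u.hostname = u.hostname.replace(/^www\./, "");
  if (!u.pathname.endsWith("/")) u.pathname += "/";
  u.search = ""; // tracking params never belong in a canonical
  u.hash = "";
  return u.toString();
}
```

Whatever emits your `<link rel="canonical">` tags should call this, so `/about`, `/about/`, and `www.` variants all declare the same definitive URL.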

7. Open Graph and Twitter Card meta

These don’t directly affect Google rankings, but they affect click-through rates from social shares, which affects traffic, which Google notices. Also: when someone pastes your URL into LinkedIn or Slack, you want a real preview, not a blank card.

Minimum OG tags per page:

<meta property="og:title" content="Your page title" />
<meta property="og:description" content="Your page description" />
<meta property="og:image" content="https://yoursite.com/og-image.jpg" />
<meta property="og:url" content="https://yoursite.com/this-page/" />
<meta property="og:type" content="article" />

Image should be 1200×630 pixels, under 5MB, and a JPG or PNG. Test your implementation with Meta’s Sharing Debugger and Twitter’s Card Validator.
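If your site runs on Next.js (as v0 output does), you don't have to hand-write those tags — the App Router metadata export renders them for you. A sketch with placeholder values; the `openGraph` field names follow Next.js's Metadata API:

```typescript
// Next.js App Router: export this object as `metadata` from a page or
// layout file and Next.js emits the OG tags. All values are placeholders.
const metadata = {
  title: "Your page title",
  description: "Your page description",
  openGraph: {
    title: "Your page title",
    description: "Your page description",
    url: "https://yoursite.com/this-page/",
    type: "article",
    images: ["https://yoursite.com/og-image.jpg"],
  },
};
```

The same object is where your per-page `<title>` and description from item 2 live, so one export covers items 2 and 7 together.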

8. Core Web Vitals

Google uses Core Web Vitals — LCP, CLS, INP — as a ranking signal. Not a huge one, but real, and the numbers also correlate strongly with conversion rates.

Targets on mobile:

  • LCP (Largest Contentful Paint): under 2.5 seconds
  • CLS (Cumulative Layout Shift): under 0.1
  • INP (Interaction to Next Paint): under 200ms

Most AI-built sites fail on LCP because they ship heavy JavaScript bundles that delay the first meaningful render. Fixes include code splitting, image optimization, font preloading, and edge caching. On Lovable and Bolt.new in particular, optimizing CWV without migrating is hard because you have limited control over the build output.

9. Internal linking structure

Internal links are how Google’s crawler discovers pages and how PageRank flows through your site. A site with no internal links is a site where every page is fighting for attention alone.

Rules of thumb:

  • Every page should be reachable from the homepage in three clicks or fewer.
  • Important pages (your tool pages, pricing, core service pages) should have the most internal links pointing to them.
  • Anchor text should describe the destination, not say “click here.”
  • Related content blocks (like the one at the bottom of this post) multiply link juice naturally.

AI-built sites often have weak internal linking because they’re generated page by page without a holistic view of the site graph. This is a fixable problem — just takes deliberate editing.
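The three-clicks rule is checkable by treating your site as a graph and breadth-first searching from the homepage. A sketch, assuming you can dump each page's internal links into a simple map (the function name and shape are mine, not any tool's output):

```typescript
// BFS over the internal link graph: return pages more than `maxDepth`
// clicks from the homepage, or unreachable from it entirely.
// `links` maps each page path to the paths it links to.
function deepPages(links: Record<string, string[]>, maxDepth = 3): string[] {
  const depth = new Map<string, number>([["/", 0]]);
  const queue = ["/"];

  while (queue.length) {
    const page = queue.shift()!;
    for (const target of links[page] ?? []) {
      if (!depth.has(target)) {
        depth.set(target, depth.get(page)! + 1);
        queue.push(target);
      }
    }
  }

  return Object.keys(links).filter(
    (p) => !depth.has(p) || depth.get(p)! > maxDepth
  );
}
```

Anything this flags either needs a link from somewhere shallower or a hard look at whether it should exist at all.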

10. Content depth and quality

This is the one nobody wants to hear. Thin content — pages under 300 words, pages with mostly boilerplate copy, pages that exist only to target a keyword — doesn’t rank in 2026. Google’s content quality thresholds have tightened every year since 2022, and the bar is now higher than most AI-generated copy.

What this looks like in practice:

  • Blog posts under 800 words rarely rank for competitive queries.
  • Service pages need real specifics, not just “we do X.”
  • FAQ sections with five real questions beat FAQ sections with twenty vague ones.
  • Content written for a specific persona outperforms content written “for Google.”

If you’re ten items deep in this checklist and still not ranking, it’s probably content quality — not technical SEO.

Tool-specific gotchas

Each AI builder has quirks that need specific workarounds beyond the general checklist above.

Lovable-specific issues

Lovable outputs a client-rendered React + Vite application. The biggest gotcha: the default build has almost no server-rendered HTML, so every SEO signal has to be hand-rolled. Even with per-route metadata configured, Googlebot often sees a blank shell.

If you’re on Lovable and committed to staying, investigate whether you can export the code and add prerendering. If you’re going to migrate, Lovable to WordPress is the fastest path. Most projects ship in 7 days with full SEO setup included.

v0-specific issues

v0 uses Next.js, which supports SSR. That’s good news — the foundational rendering problem is smaller. But v0’s defaults still don’t generate a sitemap, set up schema, or configure canonicals. You can add all of these in Next.js, but the time cost surprises people.

The other v0 gotcha: Vercel’s edge functions can produce URL parameters that look like duplicate content to Google. Watch your Search Console “Crawled — currently not indexed” report for signs of this. If you need a migration, v0 to WordPress handles the rebuild while preserving your design.

Bolt.new-specific issues

Bolt.new’s default output is a pure client-rendered React + Vite SPA hosted on .bolt.host. The three compounding issues: no SSR, no SEO plumbing, and limited hosting control.

A popular community workaround is <noscript> blocks — giving Googlebot a fallback copy of your content if it never runs the JavaScript. This works partially, but it’s a patch on a deeper problem. Long-term, most Bolt.new marketing sites end up migrating. Bolt.new to WordPress is a 7-day pixel-accurate rebuild.

Claude Artifact-specific issues

Claude artifacts are interactive HTML/JS previews, not full websites. They lack routing, sitemaps, meta tags per page, and structured data by design — because they weren’t meant to be public sites.

If you deployed a Claude artifact as your marketing site, the honest assessment is that fixing SEO in place isn’t really possible — the artifact format doesn’t support it. The working path is rebuilding as a real site. Claude Artifact to WordPress takes the same 7 days.

The two paths forward

Once you’ve done the diagnosis and understand where your site stands, you have two real choices. Don’t let anyone tell you there are three — “just add a few meta tags and you’ll be fine” isn’t a path, it’s wishful thinking.

Path A: Fix it yourself in place

Realistic time investment:

  • Small site (single page or under five pages): 10–20 hours of technical work.
  • Multi-page marketing site: 30–60 hours.
  • Ecommerce or multi-region: 80+ hours.

What you need:

  • Working knowledge of your AI tool’s code output.
  • Comfort editing Next.js, React, or Vite configurations.
  • Search Console access and patience to iterate.
  • Time to maintain the SEO stack going forward — schema doesn’t write itself for every new page.

When this makes sense:

  • You’re technical and enjoy this kind of work.
  • You need to stay on your current stack for business reasons (team skill, integrations, etc.).
  • You have time but not budget.

Path B: Migrate to WordPress

Realistic time investment: 7 days, with us doing the work.

What you get:

  • Pixel-accurate rebuild in WordPress (Elementor or Gutenberg).
  • Full SEO stack configured: schema, sitemap, Rank Math, Search Console, Core Web Vitals tuning.
  • Training video for your team.
  • Fourteen days of post-launch support.

When this makes sense:

  • Traffic growth is time-sensitive — every week the site isn’t ranking is a week of compounding traffic you’ll never recover.
  • You value predictability and a fixed scope over ongoing tinkering.
  • You want non-technical team members to edit content without burning AI credits.

Starter projects begin at $299, Standard at $599, Pro at $1,299. Every quote is fixed-scope, no hourly billing. If you’d like us to audit your current AI-built site before deciding, audits are free and take 24 hours.

FAQ

Will fixing these 10 things guarantee my site ranks? No. Technical SEO is necessary but not sufficient. You also need content quality, backlinks, and domain authority — which take months to build. But without the technical foundation, the rest doesn’t matter.

How long until I see results after migrating to WordPress? Typical timeline on a new domain: first indexing in 2–6 weeks, first rankings in 3–6 months, meaningful traffic in 6–12 months. An existing domain with migration-preserved URLs can see results faster because the authority carries over.

Can I just add schema markup and sitemap to my Lovable/Bolt site to fix this? You can add them, but the rendering problem is upstream of schema. Googlebot needs to see your content before schema becomes useful. Schema on a client-rendered site is decoration on a foundation problem.

What’s the single biggest SEO improvement for an AI-built site? Moving from client-side to server-side rendering. Everything else is secondary. This is why migration to a platform with SSR by default (WordPress, Astro, hand-built Next.js with proper SSR) tends to deliver more SEO lift than any amount of in-place tweaking.

How much does it cost to have someone do this? Fixing SEO in place on an AI-built site, if you hire a freelancer or agency, typically costs $1,500–$5,000 and takes 2–6 weeks — with no guarantee the foundational rendering problem gets solved. Migrating to WordPress through us costs $299–$1,299 and takes 7 days. The math tends to favor migration for most sites.

The honest bottom line

AI website builders are phenomenal prototyping tools. They are not, and were not designed to be, SEO-optimized publishing platforms. The SEO gaps in their default output aren’t bugs — they’re tradeoffs the builders made in favor of speed and simplicity.

If you’re at the point of reading a 14-minute SEO guide, the validation phase is over. You have a real product, real content, and a real need for organic traffic. The fastest honest path from where you are to where you want to be is usually a platform that prioritizes SEO by default.

If that’s WordPress and you want help, tell us about your site — the audit is free and we’ll give you a fixed quote within 24 hours. If you’d rather stay on your current stack, this guide is yours to work through. Either way, the goal is the same: a site Google can see, readers can find, and you can grow.

2 slots left this week

Ready to turn your AI prototype into a real WordPress site?

Free audit. 7-day delivery. No lock-in. Let's talk.

24-hour audit turnaround · Fixed-scope quote · No obligation