I Audited My Own AI Agency's Website. It Was Invisible to Google.
I built an AI agency website. It looked amazing. Google couldn't see a single word on it.
What Happened
We launched nyclaw.io with everything you would want from an AI agency site. Beautiful design. Clear messaging. Transparent pricing. An explanation of our OODA Loop consulting framework. A contact form. Service pages for AI automation and AI-powered marketing.
We built it on Next.js — one of the most popular modern web frameworks. We used Tailwind CSS for styling. We deployed it on Vercel. Everything loaded fast, looked clean, and worked perfectly on mobile.
There was just one problem: traffic was zero. Not low. Zero.
Weeks went by. No organic search impressions. No clicks. Nothing in Google Search Console. We were getting more traffic from telling people the URL in conversation than from the entire internet.
So we did what any self-respecting AI agency should do: we audited ourselves. What we found was embarrassing, educational, and — if you have a website built on a modern JavaScript framework — probably happening to you too.
The Audit: 8 Things We Got Wrong
We went through our site with fresh eyes and a technical checklist. Here is every issue we found, in order of severity.
1. The homepage was entirely client-rendered
This was the big one. Our homepage component had 'use client' at the top of the file. In Next.js App Router, that single directive tells the framework: "Do not render this on the server. Ship it as JavaScript and let the browser handle it."
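As a sketch, the top of the offending file looked roughly like this (the component body is elided; names are illustrative, not our actual code):

```tsx
// The directive applies to the whole module: every component defined in
// this file is shipped as JavaScript and rendered in the browser,
// not on the server.
'use client';

export default function Home() {
  // Headline, services, pricing, contact form: none of it reaches
  // the initial HTML that crawlers receive.
  return <main>{/* …full page content… */}</main>;
}
```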
What that means for SEO: when Google's crawler requests our homepage, it receives an HTML file that looks roughly like this:
```html
<html>
  <body>
    <div id="__next"></div>
    <script src="/_next/static/chunks/main.js"></script>
  </body>
</html>
```

That is it. No headline. No service descriptions. No pricing. No contact info. Just an empty container and a JavaScript file that the browser is supposed to execute to fill in the content. Google's crawler can execute JavaScript, but it is slower, less reliable, and deprioritized. For a new site with no domain authority, it meant we were effectively invisible.
2. No sitemap.xml
A sitemap tells search engines what pages exist on your site and how often they change. We did not have one. Google was left to discover our pages by following links — but since the homepage was invisible, there were no links to follow. It was a dead end right at the front door.
3. No robots.txt
A robots.txt file gives crawlers basic directives: what to crawl, what to skip, where to find the sitemap. We had nothing. No explicit rules, no sitemap reference. It is like opening a store with no signage — not even a door number.
4. No structured data (JSON-LD)
Structured data is machine-readable markup that tells Google: "This is a professional service business, located here, offering these services, at this price range." Without it, Google had to guess what we were. It guessed nothing, because it could not even see the page content.
5. No Open Graph tags
When someone shared our link on LinkedIn, Twitter, or iMessage, they got a broken preview. No title, no description, no image. It looked like a spam link. This killed any organic sharing potential before it started.
6. Service pages in the nav did not actually exist
Our navigation had links to service-specific pages, but those routes were never created. They were anchor links on the homepage pointing to sections that only existed once JavaScript rendered them. From a search engine's perspective, those pages did not exist at all. From a user's perspective, clicking them from Google would lead nowhere useful.
7. Only 7 indexable pages total
Our competitors in the AI consulting space had 50 to 200+ indexable pages. We had 7. Fewer pages means fewer opportunities to rank for search queries, fewer entry points for potential clients, and less overall domain authority. Content is surface area, and we had almost none.
8. No FAQ sections
FAQ sections serve double duty. They answer real questions your prospects have, and they are eligible for Google's rich results (the expandable Q&A that appears directly in search). They are also prime content for AI search tools like ChatGPT, Perplexity, and Google AI Overviews to cite. We had none. Easy wins, left on the table.
Why This Happens (And Why It Is More Common Than You Think)
Here is the core issue, explained simply.
There are two fundamental ways a website can work:
Server-Side Rendering (SSR)
The server builds the full HTML page and sends it to the browser (and to Google). The content is immediately readable. This is how traditional websites work, and it is what search engines expect.
Client-Side Rendering (CSR)
The server sends a mostly empty HTML shell. JavaScript runs in the browser and fills in the content after the page loads. The content only exists after JavaScript executes.
The analogy: Server-side rendering is like a storefront with the products displayed in the window. Anyone walking by can see what you sell. Client-side rendering is like having the world's best storefront with the shutters permanently closed. The products are in there — but only if you walk inside, flip the lights on, and wait for the display to assemble itself.
Google is the person walking by. If the shutters are closed, they keep walking.
Here is the frustrating part: Next.js App Router defaults to Server Components. Out of the box, it does the right thing. But the moment you add 'use client' at the top of a file — often because you need a single interactive element like a form or a dropdown — everything in that file becomes client-rendered. The entire page. All of it.
This is not a bug. It is how React and Next.js work. But if you do not understand the distinction, you can accidentally make your entire site invisible to search engines with two words at the top of a file. That is exactly what we did.
The Fix: What We Changed
Once we understood the problems, fixing them was straightforward. Here is exactly what we did.
Separated interactive and static content
Our homepage was 95% static text and 5% interactive elements (a contact form). We extracted the form into its own small Client Component and made the main page a Server Component. Now 95% of the content ships as ready-to-read HTML. The form still works perfectly — it just lives in its own client boundary.
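A sketch of the split, with illustrative file and component names (ContactForm is a placeholder, not our actual component):

```tsx
// app/components/ContactForm.tsx: the only client boundary.
'use client';

export function ContactForm() {
  // Form state and submit handling live here, and only here.
  return <form>{/* …inputs… */}</form>;
}

// app/page.tsx (separate file): no 'use client', so it stays a
// Server Component and its text ships as ready-to-read HTML.
import { ContactForm } from './components/ContactForm';

export default function Home() {
  return (
    <main>
      <h1>{/* …headline… */}</h1> {/* real HTML in the server response */}
      <ContactForm />             {/* only this part hydrates in the browser */}
    </main>
  );
}
```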
Added sitemap.ts and robots.ts
Next.js has built-in file conventions for generating these: a sitemap.ts or robots.ts placed in the app directory is served automatically as /sitemap.xml and /robots.txt. We created both. Every page we want indexed is now listed. Google knows exactly what exists and where to find it.
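A minimal sketch of the two files, assuming a handful of static routes (the paths below are illustrative, and in a real project each function is the default export of its own file, typed with Next's MetadataRoute):

```typescript
const BASE_URL = "https://nyclaw.io";

// app/sitemap.ts: Next.js serves the returned array as /sitemap.xml.
function sitemap() {
  const routes = ["", "/services", "/pricing", "/contact"]; // illustrative paths
  return routes.map((path) => ({
    url: `${BASE_URL}${path}`,
    lastModified: new Date(),
    changeFrequency: "monthly" as const,
    priority: path === "" ? 1.0 : 0.8, // homepage gets top priority
  }));
}

// app/robots.ts: Next.js serves this as /robots.txt.
function robots() {
  return {
    rules: [{ userAgent: "*", allow: "/" }],
    sitemap: `${BASE_URL}/sitemap.xml`, // point crawlers at the sitemap
  };
}
```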
Added JSON-LD structured data
We added three schemas: ProfessionalService (who we are and what we do), WebSite (site-level information), and FAQPage (our frequently asked questions). Google now understands our business at a machine-readable level.
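The shape of one of those payloads, sketched with placeholder values (nothing below is our actual business data; the field names come from schema.org):

```typescript
// ProfessionalService payload. In a Server Component it gets serialized
// into a <script type="application/ld+json"> tag in the page.
const professionalService = {
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  name: "Example AI Agency", // placeholder name
  url: "https://nyclaw.io",
  description: "AI automation and consulting for small businesses",
  priceRange: "$$",
  areaServed: "New York, NY",
};

// Rendered into the page, e.g. in JSX:
// <script
//   type="application/ld+json"
//   dangerouslySetInnerHTML={{ __html: JSON.stringify(professionalService) }}
// />
const jsonLd = JSON.stringify(professionalService);
```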
Added Open Graph and Twitter Card meta tags
Every page now has proper OG title, description, and image tags. Links shared on social media show rich previews with our branding. This is table stakes for any modern website, and we were missing it entirely.
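In Next.js App Router these tags come from a per-page metadata export. A sketch with placeholder copy (a real file would declare it as `export const metadata: Metadata` with the type imported from "next"):

```typescript
// Next.js turns these fields into <title>, <meta name="description">,
// Open Graph, Twitter Card, and canonical tags in the page head.
const metadata = {
  title: "AI Workflow Automation | Example Agency", // placeholder title
  description: "We build AI automations that save teams hours every week.",
  alternates: { canonical: "https://nyclaw.io/services/automation" },
  openGraph: {
    title: "AI Workflow Automation",
    description: "AI automations that save teams hours every week.",
    url: "https://nyclaw.io/services/automation",
    images: ["/og/automation.png"], // 1200x630 renders well on most platforms
  },
  twitter: { card: "summary_large_image" },
};
```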
Built real service pages at real routes
Instead of anchor links to homepage sections, we created dedicated pages for AI Workflow Automation, AI Strategy & Consulting, and AI-Powered Marketing. Each page has its own metadata, structured data, and content. More pages means more surface area for Google to index and rank.
Added FAQ sections across the site
Every major page now has a FAQ section with real questions our prospects ask. These are marked up with FAQPage schema so they are eligible for Google rich results and AI search citations. It is free real estate in the search results.
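A sketch of how plain question/answer pairs map onto that schema (the sample question is a placeholder):

```typescript
// Build a FAQPage JSON-LD payload from plain question/answer pairs.
type Faq = { question: string; answer: string };

function faqPageSchema(faqs: Faq[]) {
  return {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: faqs.map((f) => ({
      "@type": "Question",
      name: f.question,
      acceptedAnswer: { "@type": "Answer", text: f.answer },
    })),
  };
}

const schema = faqPageSchema([
  { question: "How long does an AI audit take?", answer: "Usually under a week." },
]);
```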
Before & After
Here is the difference, side by side.
| Metric | Before | After |
|---|---|---|
| What Google sees | `<div id="__next"></div>` | 2,000+ words of HTML content |
| Structured data schemas | 0 | 3 (ProfessionalService, WebSite, FAQPage) |
| Sitemap | None | 10+ pages in sitemap.xml |
| Meta tags | None | Title, description, OG, Twitter, canonical on every page |
| Lighthouse SEO score | Untested (invisible) | Targeting 90+ |
| Social sharing preview | Broken / blank | Rich preview with title, description, branding |
The site went from a beautiful page that no search engine could read to a fully indexable, structured, search-optimized presence — without changing a single word of the visible copy. The content was always good. It was just locked behind a JavaScript wall.
Your Turn: Is YOUR Website Invisible?
If your website was built on React, Next.js, Vue, Angular, or any modern JavaScript framework, there is a real chance you have the same problem. Here is how to check in 60 seconds.
The 60-Second Visibility Test
- Open your homepage in a browser.
- Right-click anywhere and select "View Page Source." (Not "Inspect Element" — that shows the JavaScript-rendered version. You want the raw HTML the server sends.)
- Search for your headline text. Can you find the actual words that appear on your homepage? Your business name? Your service descriptions?
- If yes: Good. Your content is server-rendered and Google can probably see it.
- If no: You have the same problem we had. Your site looks great to humans but is invisible to search engines.
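The same check can be scripted. A sketch for Node 18+ (which ships a global fetch); the URL and phrases are placeholders:

```typescript
// Does a phrase appear in the raw server HTML, before any JavaScript runs?
function isVisibleInHtml(html: string, phrase: string): boolean {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // drop inline scripts
    .replace(/<[^>]+>/g, " ");                   // strip remaining tags
  return text.toLowerCase().includes(phrase.toLowerCase());
}

// Fetch a page the way a crawler first sees it and report each phrase.
async function auditVisibility(url: string, phrases: string[]) {
  const html = await (await fetch(url)).text(); // raw HTML, no JS executed
  for (const phrase of phrases) {
    console.log(`${isVisibleInHtml(html, phrase) ? "visible" : "MISSING"}  ${phrase}`);
  }
}

// Example (placeholder URL and copy):
// auditVisibility("https://yoursite.com", ["Your headline", "Your business name"]);
```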
While you are at it, check these too:
- Sitemap: Go to yoursite.com/sitemap.xml. Do you get an XML file listing your pages? If you get a 404, you have no sitemap.
- Robots.txt: Go to yoursite.com/robots.txt. You should see crawl directives and a sitemap reference.
- Structured data: Paste your URL into Google's Rich Results Test. Does it find any schemas?
- Social preview: Share your link in a private Slack channel or iMessage. Does it show a rich preview with a title and description, or a blank box?
If any of these checks failed, your website is leaking potential traffic every single day. The good news: every one of these issues is fixable, usually in less than a week.
The bad news: every day you wait is another day your competitors are showing up in search results and you are not.
Not Sure If Your Site Is Visible?
We will audit your website for free and tell you exactly what Google can and cannot see. No pitch, no commitment. Just the data.
Get a Free AI Audit →

Frequently Asked Questions
Why isn't my website showing up on Google?
The most common reasons are: your site is client-side rendered (Google sees an empty page), you have no sitemap.xml telling Google what pages exist, your robots.txt is blocking crawlers, or you have no meta tags or structured data helping Google understand your content. A technical SEO audit can identify exactly which issues apply to your site.
What is client-side rendering and why is it bad for SEO?
Client-side rendering (CSR) means your website content is generated by JavaScript in the browser after the page loads. Search engines like Google primarily read the initial HTML your server sends. If that HTML is mostly empty — just a shell with script tags — Google sees a blank page. Server-side rendering (SSR) sends fully formed HTML so search engines can read every word immediately.
How do I check if my website is invisible to search engines?
Right-click anywhere on your homepage and select "View Page Source." If you see your actual text content (headlines, paragraphs, service descriptions), your site is server-rendered and visible. If you see mostly empty HTML with a bunch of JavaScript file references and a single div like <div id="__next"></div>, your content is client-rendered and likely invisible to search engines.
What is structured data and why does it matter?
Structured data (JSON-LD) is machine-readable code you add to your pages that tells search engines exactly what your content is — a business, an article, a FAQ, a product. It helps Google display rich results (star ratings, FAQ dropdowns, business info panels) and helps AI search tools like ChatGPT and Perplexity cite your content accurately.
How long does it take to fix SEO problems?
The technical fixes themselves can often be implemented in a few days to a week. However, seeing results in Google takes longer — typically 2 to 8 weeks for Google to re-crawl and re-index your site, and 2 to 6 months to see meaningful ranking improvements. The sooner you fix the issues, the sooner the clock starts.
Want us to check YOUR site?
Get a Free AI Audit →