I Should Have Thought About SEO From Day One: A Belated Optimization Story

Two weeks after launch, my Google search traffic was exactly zero. It turned out that from Googlebot's perspective, my site was an empty HTML file. That's when I finally understood the fundamental limitation of a React SPA.

The Shock of Zero Search Traffic

After deploying the project on GitHub Pages, I checked Google Search Console daily. Zero impressions, zero clicks. After a week with no change, I told myself "indexing probably hasn't happened yet." By week two, I faced reality.

The URL Inspection tool revealed that Google's rendered version of my pages was nearly empty. Just a title tag and a loading spinner, no actual content. The reason was simple. A React SPA only renders content after JavaScript executes, and Googlebot doesn't always execute JavaScript perfectly.

The Fundamental SEO Problem with SPAs

Traditional websites send fully formed HTML from the server. When Googlebot requests a page, it receives HTML with content already in place. React SPAs work differently. The server sends an empty shell, and content only appears after JavaScript runs in the browser.

Google officially states it can render JavaScript, but in practice it uses a two-phase indexing process. The first pass reads the raw HTML. The second pass renders JavaScript. Getting to that second pass takes time, and not every page renders perfectly.

If I had used Next.js or Remix, server-side rendering would have solved this naturally. But this project started with Vite, and switching frameworks at this point would have meant rebuilding from scratch.

Pre-Rendering: The Pragmatic Compromise

If SSR wasn't an option, the alternative was pre-rendering. This means generating each page's HTML at build time. When Googlebot requests a page, it receives a static HTML file with content already present.

The concept is straightforward. After the build completes, a headless browser visits each page, executes the JavaScript, and saves the rendered HTML to a file. Deploy those HTML files, and Googlebot can read the content without executing any JavaScript.

The downside is that build time grows with the number of pages. As I added guide pages, columns, and the card encyclopedia, the pre-rendering targets expanded to dozens of pages, and build times grew accordingly. Still far simpler than maintaining an SSR server, though.

Dynamic Meta Tags with react-helmet-async

Pre-rendering solved the HTML body problem, but meta tags were another issue. Every page had the same title and description. If every page in Google's search results shows identical metadata, users have no way to tell which page to click.

I introduced react-helmet-async to set unique titles, descriptions, and og:image tags per page. For a card detail page, that meant something like "Tarot Master | The Fool Card Interpretation": metadata that accurately reflects the page content.
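Deriving that metadata per page might look like the following sketch. The site name, title pattern, and field names here are illustrative assumptions, not the project's actual code.

```javascript
// buildCardMeta.js -- hedged sketch of per-page metadata derivation
// for react-helmet-async. Names and paths are placeholders.
const SITE_NAME = 'Tarot Master';

function buildCardMeta(card) {
  return {
    title: `${SITE_NAME} | ${card.name} Card Interpretation`,
    description: `Meaning, symbolism, and reading tips for ${card.name}.`,
    // Hypothetical og:image path per card slug.
    ogImage: `/og/cards/${card.slug}.png`,
  };
}

module.exports = { buildCardMeta };
```

In a component, these values would be rendered through react-helmet-async's `<Helmet>` component (with a `HelmetProvider` at the app root), which swaps the document head as the user navigates.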

Open Graph tags and Twitter Card tags were added too. These ensure link previews display properly when shared on social media. Not a direct SEO factor, but social sharing drives meaningful traffic.

Automated Sitemap Generation

For Google to discover all pages on the site, a sitemap is essential. Maintaining one manually is error-prone whenever pages are added, so I built a script that generates it automatically during the build process.

The script extracts all routes from the router configuration, fills in dynamic route parameters (card detail pages and so on), and produces a complete sitemap.xml. Every build produces a fresh, up-to-date sitemap.
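The core of such a script can be sketched as below. The domain, route list, and card slugs are placeholders; in the real build, the static routes would come from the router configuration rather than a hard-coded array.

```javascript
// generate-sitemap.js -- minimal sketch of build-time sitemap generation.
// Domain and routes are placeholders, not the project's actual values.
function withCardRoutes(staticRoutes, cardSlugs) {
  // Expand dynamic routes (e.g. card detail pages) into concrete URLs.
  return [...staticRoutes, ...cardSlugs.map((slug) => `/cards/${slug}`)];
}

function buildSitemap(routes, origin = 'https://example.com') {
  const urls = routes
    .map((route) => `  <url><loc>${origin}${route}</loc></url>`)
    .join('\n');
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    `${urls}\n</urlset>\n`
  );
}

module.exports = { withCardRoutes, buildSitemap };
```

Wiring this into the build command means the sitemap can never drift out of sync with the routes that actually exist.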

I pointed to the sitemap in robots.txt and submitted it to Google Search Console. Within days of submission, indexing speed improved noticeably.
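The robots.txt entry itself is a single `Sitemap` directive; the domain here is a placeholder:

```text
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```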

Search Engine Registration: Google and Naver

For Korean-language content, Google alone isn't enough. Visibility on Naver, South Korea's dominant search engine, also matters. I registered the site with Naver Search Advisor and verified ownership.

Naver's indexing behaves differently from Google's. External websites tend to rank lower than content on Naver's own blogging and community platforms, and indexing is slower. Still, a site that isn't registered won't appear in results at all. Covering the basics is always worthwhile.

Google Search Console's URL Inspection tool proved invaluable. I could verify whether specific pages were properly indexed and check for mobile usability issues. After applying pre-rendering, a re-inspection confirmed that content was rendering correctly.

Structured Data: JSON-LD Schema Markup

To stand out in search results, structured data helps. Adding JSON-LD schema markup helps Google understand page content more precisely and can trigger rich snippets in search results.

I applied WebApplication schema to the main page, Article schema to guide pages, FAQPage schema to FAQ-containing pages, and ItemList schema to the card encyclopedia to structure all 78 cards as a list.
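The ItemList markup for the encyclopedia can be generated from the same card data. A sketch, with placeholder domain and URL structure:

```javascript
// itemListSchema.js -- sketch of ItemList JSON-LD for the card
// encyclopedia. Domain and URL paths are illustrative.
function cardListSchema(cards, origin = 'https://example.com') {
  return {
    '@context': 'https://schema.org',
    '@type': 'ItemList',
    numberOfItems: cards.length,
    itemListElement: cards.map((card, i) => ({
      '@type': 'ListItem',
      position: i + 1, // schema.org ListItem positions are 1-based
      name: card.name,
      url: `${origin}/cards/${card.slug}`,
    })),
  };
}

module.exports = { cardListSchema };
```

The resulting object is serialized with `JSON.stringify` into a `<script type="application/ld+json">` tag in the page head, which react-helmet-async can also manage per page.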

Whether structured data directly affects ranking is debated, but rich snippets demonstrably improve click-through rates. On the same search results page, which result would you click: the one showing star ratings and FAQ expandables, or the plain text one?

"SEO From Day One" Is the Right Call

Looking back, I should have considered SEO at the project's start. Retrofitting pre-rendering required significant build pipeline changes, and adding meta tags to every page all at once was a substantial effort.

If I had chosen Next.js from the beginning, SSR would have been built in. But at the time, Vite's fast development experience felt more important, and SEO was something I figured I'd deal with "later." A classic side project mistake.

The lesson is clear. If a project needs search traffic, SEO is a "from the start" concern, not a "later" concern. Framework choice, routing structure, and meta tag design should be decided early. The cost of fixing later always exceeds the cost of doing it right the first time.

What's Next

The technical SEO foundation was in place, but generating actual search traffic requires content worth searching for. In Part 14, I cover the strategic pivot from a feature-focused app to a content hub, and the AI-powered content production workflow that made it possible.
