AI Search Optimisation: Make Your Structured Data Accessible
In the rapidly evolving landscape of search engine optimisation (SEO), adapting to AI-driven search engines is crucial. A recent investigation has highlighted a significant challenge for websites that rely on JavaScript to add structured data. AI crawlers, such as GPTBot, ClaudeBot, and PerplexityBot, struggle to access structured data injected via JavaScript, potentially leading to missed opportunities in AI search visibility.
Understanding the Challenge
Structured data, typically implemented in JSON-LD format, provides critical context to search engines about the content on a webpage. This data helps improve search visibility and enrich search results with features like rich snippets. However, when structured data is added dynamically using JavaScript, it may not be included in the initial HTML response provided by the server.
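For readers unfamiliar with the format, below is a minimal sketch of a JSON-LD payload and the script tag it occupies when it is already present in the served HTML. The organisation details are placeholders invented for illustration, not taken from any real site.

```typescript
// A minimal, hypothetical JSON-LD payload describing an organisation.
// All field values below are placeholders.
const organizationJsonLd = {
  "@context": "https://schema.org",
  "@type": "Organization",
  name: "Example Co",
  url: "https://www.example.com",
  logo: "https://www.example.com/logo.png",
};

// When hardcoded or server-rendered, this is what a crawler should find
// inside the initial HTML response:
const jsonLdScriptTag = `<script type="application/ld+json">
${JSON.stringify(organizationJsonLd, null, 2)}
</script>`;

console.log(jsonLdScriptTag);
```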
AI crawlers, unlike traditional search engine bots, do not execute JavaScript. They primarily rely on the raw HTML returned from the server. Consequently, if JSON-LD or other structured data elements are added client-side through tools such as Google Tag Manager (GTM), AI crawlers will likely miss them.
Key Findings About JSON-LD & AI Crawlers
Elie Berreby, founder of SEM King, conducted an experiment to analyse the impact of using GTM to insert JSON-LD without server-side rendering (SSR). The findings revealed several critical insights:
Initial HTML Load
When an AI crawler requests a webpage, the server returns the initial HTML response. If JSON-LD is injected via JavaScript, it will not appear in that response and is therefore invisible to the crawler.
Client-Side JavaScript Execution
JavaScript modifies the Document Object Model (DOM) dynamically after the page loads. AI crawlers, which do not render JavaScript, cannot access dynamically generated structured data.
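As an illustration, the snippet below sketches the client-side injection pattern the experiment describes, roughly what a GTM custom HTML tag ends up doing in the browser. The payload and field values are placeholders.

```typescript
// A sketch of the pattern the article warns about: injecting JSON-LD into the
// DOM after the page has loaded. The payload below is a placeholder.
const jsonLd = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "Example headline",
};

// Create the <script type="application/ld+json"> element and append it to <head>.
const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(jsonLd);
document.head.appendChild(script);

// The tag now exists in the rendered DOM, but it was never part of the HTML the
// server sent, so a crawler that does not execute JavaScript never sees it.
```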
Crawlers Without JavaScript Rendering
Unlike Googlebot, which can execute JavaScript and capture dynamically loaded content, AI crawlers do not render pages at all. Any JSON-LD added after the page loads will therefore be missed.
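One way to approximate what a non-rendering crawler receives is to fetch the raw HTML yourself and check whether any JSON-LD is present before JavaScript runs. The sketch below assumes Node 18+ (for the global fetch API); the URL and user-agent string are purely illustrative.

```typescript
// Fetch the raw HTML (no JavaScript execution) and check for JSON-LD,
// approximating what a non-rendering crawler would receive.
async function hasStaticJsonLd(url: string): Promise<boolean> {
  const response = await fetch(url, {
    // Illustrative user-agent; real crawler strings differ.
    headers: { "User-Agent": "Mozilla/5.0 (compatible; ExampleBot/1.0)" },
  });
  const html = await response.text();
  return html.includes("application/ld+json");
}

hasStaticJsonLd("https://www.example.com/").then((found) => {
  console.log(found ? "JSON-LD present in initial HTML" : "No JSON-LD in initial HTML");
});
```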
Why Traditional Search Engines Are Different
Traditional search engines, such as Google, have sophisticated crawling capabilities that allow them to render JavaScript and detect dynamically inserted structured data. This enables them to capture JSON-LD inserted via GTM or other methods after the page has fully loaded.
In contrast, AI search engines primarily focus on extracting insights from the static HTML served by the website. As a result, businesses relying solely on client-side rendering techniques may find their structured data invisible to AI crawlers, negatively impacting their AI search rankings.
Google’s Warning on JavaScript Overuse
Google has cautioned against the excessive use of JavaScript for critical SEO elements, including structured data. In a recent podcast, Google Search Relations experts discussed how websites often complicate their structure by heavily relying on JavaScript.
Martin Splitt, Google’s Search Developer Advocate, emphasised the importance of balancing JavaScript with essential SEO practices, such as ensuring key content is available in the initial HTML response. John Mueller echoed this sentiment, recommending the use of simpler methods like static HTML to ensure accessibility.
Best Practices for Structured Data Optimisation
To maximise visibility in AI search engines, website owners should adopt strategies that prioritise accessibility. Below are some effective approaches:
- Server-Side Rendering (SSR): Pre-render structured data on the server so that it is included in the initial HTML response. This ensures AI crawlers can access the data without relying on JavaScript execution (a minimal sketch follows this list).
- Static HTML: Hardcode schema markup directly into the HTML file. This eliminates any dependency on JavaScript and guarantees that all crawlers can read the structured data.
- Prerendering: Generate static snapshots of web pages where JavaScript has already been executed. These pre-rendered pages provide fully rendered HTML, making structured data visible to AI crawlers.
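As a concrete illustration of the SSR approach mentioned above, the sketch below uses Express (assuming that package is installed); the route, product data, and markup are placeholders. The key point is that the JSON-LD is serialised on the server and shipped inside the initial HTML response.

```typescript
import express from "express";

const app = express();

app.get("/products/example", (_req, res) => {
  // Placeholder product data; in practice this would come from a database or CMS.
  const jsonLd = {
    "@context": "https://schema.org",
    "@type": "Product",
    name: "Example Product",
    description: "A placeholder product used for illustration.",
  };

  // The JSON-LD is embedded in the HTML sent as the initial response, so
  // crawlers that never execute JavaScript still receive it.
  res.send(`<!DOCTYPE html>
<html>
  <head>
    <title>Example Product</title>
    <script type="application/ld+json">${JSON.stringify(jsonLd)}</script>
  </head>
  <body>
    <h1>Example Product</h1>
  </body>
</html>`);
});

app.listen(3000);
```

The same outcome can be achieved by hardcoding the JSON-LD script tag directly into a static HTML file, or by serving pre-rendered snapshots in which the JavaScript has already been executed.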
By implementing these techniques, websites can ensure their structured data is accessible across all search platforms, including AI-driven search engines.
Why This Matters for AI Search Optimisation
AI-powered search engines are gaining prominence, and they operate under different principles compared to traditional search engines. Ensuring structured data visibility is crucial to maintaining a competitive edge in AI-driven search results.
Businesses relying on JavaScript-based solutions, such as GTM, must reconsider their approach to structured data deployment. By transitioning to server-side rendering or static HTML, they can ensure their content remains discoverable and ranks effectively in AI searches.
Final Thoughts
AI search optimisation requires a forward-thinking approach to structured data management. As AI crawlers continue to evolve, adapting best practices now will ensure sustained visibility and better search performance in the future.
Implementing SSR, static HTML, or prerendering methods will empower businesses to leverage structured data effectively and maintain a strong online presence across both traditional and AI search platforms.