Last week, I attended my first Google Search Central event in Zurich.
Many conversations centred on the latest Search Console releases, the /llms.txt file, concerns about the looming Core Update (it happened, RIP), and broader predictions about where the industry is heading in the new year.
In this post, I bring together the key highlights of the conference from my perspective.
llms.txt – Just Do It (But Be Careful)
The llms.txt file is simply a plain-text file containing content formatted as Markdown (.md).
Despite the buzz it has generated in the AI space, the file is not yet recognised as a standard across the web. Martin Splitt pointed out that the file is harmless to your AI visibility; you should probably just get the ball rolling rather than spend time debating it.
What does it mean?
Just do it, but be careful. Rather than spending five hours debating with stakeholders, why not spend 15 minutes adding the file to the root of your site?
And that’s fair.
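For reference, here is a minimal sketch of what such a file might look like, using a hypothetical example.com site. The structure (an H1 title, a blockquote summary, then H2 sections of links) follows the llmstxt.org proposal, which, as noted above, is not yet a recognised standard:

```markdown
# Example Store

> A hypothetical e-commerce site selling outdoor gear and apparel.

## Docs

- [Product catalogue](https://example.com/products.md): full product listing
- [Shipping policy](https://example.com/shipping.md): delivery times and costs

## Optional

- [Company history](https://example.com/about.md): background on the brand
```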
But the trickiest bit, which went largely unnoticed, is how you configure the file.
I’ve seen examples of sites dynamically appending the /llms.txt URL path to certain category pages.
It’s not a big deal in itself, as these URLs typically return HTTP 404 and were rarely requested by search engine bots over time.
Nevertheless, you should be cautious. Exposing or dynamically routing /llms.txt through multiple page templates may create an attractive entry point for malicious AI agents or bots. At scale, this behaviour can cascade into security incidents and increase the risk of resource exhaustion.
Not to mention that you shouldn’t link to the llms.txt file across your site, or it will get indexed and users may surface it in search results.
Either way, the llms.txt file has no proven impact on AI visibility – it’s food for the headless Chrome instances commonly used for web scraping.
Unless you want to make it easier for AI scrapers to nose through your content?
Go play around with the llms.txt file, but make sure it is configured with security in mind.
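One way to avoid the dynamic-routing pitfall described above is to serve the file as a single static asset at the root only. A minimal sketch, assuming an nginx front end (paths and the optional noindex header are illustrative, not prescriptive):

```nginx
# Serve llms.txt as a static file from the site root only.
# The exact-match location prevents page templates from
# dynamically routing /llms.txt under other URL paths.
location = /llms.txt {
    root /var/www/html;
    default_type text/plain;
    # Optional: keep the file out of search indices, since
    # it should not be linked internally or surfaced to users.
    add_header X-Robots-Tag "noindex" always;
}
```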
Driving success through the lenses of the user
This is a rewind to the first speech of the day.
Nikola Todorovic made a point about the usual narrative of optimising for the end user rather than obsessing over search engine algorithms. Content creators focusing on superficial metrics like title length or paragraph count, instead of the user experience, will not maintain their position for long.
SEOs are advised to use Google Search Console to map out the three pillars of search: Crawling, Indexing, and Serving.
The recent feature enhancements in GSC were also discussed:
| Feature | Purpose | Key Details |
| --- | --- | --- |
| Query Groups (Clusters) | Solves the problem of analysing query-level data, where dozens of variations (misspellings, synonyms) reflect the same user intent. | Helps understand the main user intent of the audience by showing aggregated performance data per group. This tool is AI-powered, calculated daily, and is an internal Search Console view that has nothing to do with ranking. |
| Custom Annotations | Allows users to add their own contextual notes directly to performance charts. This eliminates the need to maintain external spreadsheets to track changes. | Should be used for tracking infrastructure changes (e.g. site migration), new SEO efforts (e.g. implementing a plug-in), content focus changes, or relevant external events (e.g. elections or holidays). Notes are limited to 120 characters and are shared with everyone who has access to the property. |
| AI-powered Configuration | Uses natural language (LLM-based) to transform free-text input into the appropriate report filters and settings. | Speeds up the analysis of search traffic by quickly configuring complex filters (e.g. comparing data across quarters or focusing on phone searches for specific keywords). Currently supports only the performance report for search results. |
Why does it matter?
Just like everything in tech nowadays, SEO is subject to planned obsolescence driven by the continuous evolution of search: hundreds or thousands of launches, new ranking changes, new abuse policies, and new AI versions (e.g. Gemini).
Google Search Console is still catching up on bringing data retrieval in the UI up to speed, but these new releases might ease your day-to-day and please your webmaster.
Bear in mind, I believe this is still the typical Google move to distract the public from other breakthroughs (i.e. the Core Update).
Either way, it doesn’t even move the elephant in the room: the Google Search Console UI is still affected by serious sampling, poor filtering, and anonymisation issues.
Storing data in BigQuery and clustering keywords is now easier than ever with LLMs.
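As a toy illustration of the clustering idea (a crude stand-in for the embedding- or LLM-based grouping mentioned above), here is a sketch that greedily groups near-duplicate queries, such as misspellings, using only the Python standard library. The queries and threshold are hypothetical:

```python
from difflib import SequenceMatcher

def cluster_queries(queries, threshold=0.8):
    """Greedily group near-duplicate queries (misspellings, close variants).

    Each query joins the first existing cluster whose seed query is
    sufficiently similar; otherwise it starts a new cluster.
    """
    clusters = []  # list of lists; the first element of each is the seed
    for q in queries:
        for cluster in clusters:
            if SequenceMatcher(None, q, cluster[0]).ratio() >= threshold:
                cluster.append(q)
                break
        else:
            clusters.append([q])
    return clusters

# Hypothetical GSC queries: the misspelling joins the seed's cluster,
# while the reordered and unrelated queries form their own.
queries = ["buy running shoes", "buy runing shoes",
           "running shoes buy", "trail hiking boots"]
print(cluster_queries(queries))
```

In practice you would swap the string-similarity check for embeddings or an LLM call, but the greedy grouping loop stays the same.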
Every Monday, Marco Giordano unpacks serious tips on making good decisions with Google data. Make sure to subscribe to seotistics for free.
Weekly/Monthly Report in Search Console
Google has just slotted a weekly/monthly report view into the Search Console UI.
The feature aggregates data by week and month, promising a more polished and granular view of your search traffic.
What does it mean?
It’s neither granular nor polished data. It’s fine if you need to please your client, but it doesn’t solve your daily struggle with poorly sampled data and anonymisation on steroids.
- Does it solve the typical 2-day data lag in Looker Studio? No.
- Does it solve sampling and anonymisation issues? No.
The real question is: when will they make AI Overviews and AI Mode data available?
They say this time next year we’ll have come to a point…
Why Google Trends, Search Console, and Analytics Don’t Match
Enter Daniel Waisberg, analyst on the Google Search Console team.
Daniel provided an interesting breakdown of GA4 versus GSC. It’s quite elementary if you ask me, but crucial to bring up.
I found the breakdown between Google Trends and Google Search Console more interesting – and that’s the part where you might say it is BS!
Google Trends
- Data is aggregated by topic or query
- Data can include non-indexed URLs
- Long-tail queries may be underrepresented, as the data does not necessarily reflect your proprietary or first-party signals
Google Search Console
- Data is aggregated by canonical URL
- Data comes from indexed URLs only
- Long-tail queries may be overrepresented, as this is first-party data tied to your site
What’s the Difference?
- Google Trends: Uses a relative index from 0 to 100, where 100 represents peak interest within the selected timeframe
- Search Console: Provides absolute metrics, similar to a click counter, including:
- Impressions (how often a site or query appears in search results)
- Clicks (traffic received from search)
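To make the relative-versus-absolute distinction concrete, here is a small sketch that rescales an absolute click series (GSC-style) into a Trends-style 0–100 index. The weekly click figures are hypothetical:

```python
def to_relative_index(values):
    """Rescale absolute counts (e.g. GSC clicks per week) into a
    Trends-style 0-100 index, where 100 marks peak interest in
    the selected window."""
    peak = max(values)
    if peak == 0:
        return [0 for _ in values]
    return [round(v / peak * 100) for v in values]

weekly_clicks = [120, 300, 150, 60]  # hypothetical GSC clicks per week
print(to_relative_index(weekly_clicks))  # the peak week becomes 100
```

Note what gets lost: the index tells you the second week was the peak, but not whether that peak was 300 clicks or 300,000, which is exactly why the two tools can tell different stories.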
Why does it matter?
We should always prioritise the site-specific demand reflected in our GSC performance report. Google Trends shows relative popularity, which is an important view of the competitive arena, although its signals may come from pages that are old or irrelevant to our business.
Structured Data for Shopping & Merch
Although the speech was highly tailored to shopping retailers, there were a couple of interesting takeaways that would apply to other businesses (e.g. travel).
All roads lead to @Organization schema
It is the foundation of brand recognition in Google’s Knowledge Graph and plays a critical role in helping Google disambiguate your brand in Search. Regardless of your unique value proposition, implementing Organization markup is essential.
Once in place, it enables advanced use cases such as enriched shipping annotations (e.g. handling time, shipping conditions, member-tier eligibility) and loyalty program integrations.
Why does it matter?
They made a big deal out of loyalty programs, and membership pages can be enhanced with schema. Brands can define membership tiers (such as Gold or Platinum), associated benefits like points, discounts, or free shipping, and even restrict specific shipping conditions to certain member tiers.
As long as you’re able to deploy schema site-wide without technical restrictions or quirky JavaScript post-injections, you’re in a good place to push for marking up member programs and your shopping listings. Pay special attention to elements like prices – they need schema markup like daily bread.
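A minimal JSON-LD sketch of the idea, combining Organization markup with a loyalty program tier. The brand, URLs, and tier are hypothetical, and the loyalty-program property names (hasMemberProgram, MemberProgram, MemberProgramTier) should be verified against Google’s loyalty program structured data documentation before deploying:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Store",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "hasMemberProgram": {
    "@type": "MemberProgram",
    "name": "Example Rewards",
    "hasTiers": [
      {
        "@type": "MemberProgramTier",
        "name": "Gold",
        "hasTierBenefit": "https://schema.org/TierBenefitLoyaltyShipping"
      }
    ]
  }
}
```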
Next Year will be all about Crawl Management
As AI bots emerge, Google’s John Mueller suggested SEOs should focus on ways to control crawl budget.
What does it mean?
If you’ve been following me, you know I’m resetting the agenda around server logs.
Whether to redesign my approach to building prompts and customise SEO tracking, or simply to dissect crawl patterns for specific use cases, including tracking down new user agents to add to robots.txt and llms.txt files.
As the next wave of AI bots emerges, server logs will become more valuable.
The demand will not be limited to security teams; product, SEO, infrastructure, and data teams will all want visibility into how these agents interact with the site, what they request, and how they behave at scale.
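As a starting point for that kind of visibility, here is a minimal sketch that tallies requests from a handful of known AI crawler user agents in an access log (combined log format). The bot list is illustrative and non-exhaustive; verify current user-agent strings against each vendor’s published documentation:

```python
import re
from collections import Counter

# Substrings of some known AI crawler user agents (non-exhaustive;
# check each vendor's documentation for the current strings).
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "CCBot", "Bytespider"]

# The user agent is the last quoted field in combined log format.
UA_RE = re.compile(r'"[^"]*" "(?P<ua>[^"]*)"$')

def tally_ai_bots(log_lines):
    """Count requests per AI bot across an iterable of log lines."""
    counts = Counter()
    for line in log_lines:
        m = UA_RE.search(line)
        if not m:
            continue
        ua = m.group("ua")
        for bot in AI_BOTS:
            if bot in ua:
                counts[bot] += 1
    return counts

# Two hypothetical log lines: one AI bot, one regular browser.
sample = [
    '1.2.3.4 - - [01/Dec/2025:10:00:00 +0000] "GET /llms.txt HTTP/1.1" 200 512 "-" "GPTBot/1.0"',
    '5.6.7.8 - - [01/Dec/2025:10:01:00 +0000] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
print(tally_ai_bots(sample))
```

In production you would stream the real access log (or query it in BigQuery), but the pattern is the same: extract the user agent, match it against a maintained bot list, and aggregate.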
In most organisations, the main challenge is not budget and often not even technical resources. The real constraint is soft skills.
The key is to build a business case that translates access to server logs into revenue, risk reduction, or operational efficiency.
It may sound like overkill, but in practice this is the direction the industry is moving. Once access is granted, it is important to become comfortable with SQL. For that, you can use LLMs to help you through.
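To show how approachable that SQL can be, here is a sketch against the Search Console bulk data export in BigQuery. The table and field names follow Google’s documented export schema, but replace `project.dataset` with your own export destination and verify the schema in your project:

```sql
-- Top queries from the GSC bulk export, last 28 days.
SELECT
  query,
  SUM(clicks) AS clicks,
  SUM(impressions) AS impressions
FROM `project.dataset.searchdata_site_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
  AND is_anonymized_query = FALSE
GROUP BY query
ORDER BY clicks DESC
LIMIT 100;
```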
Demonising the December Core Update Rollout (until it happened)
“This update should be out soon, but it is unclear if it will happen in the next couple of days or after the holiday season.”
“John said he wouldn’t be surprised if it was launched in the coming weeks. He then added that hopefully not before the holidays.”
That’s literally the text I sent out in an email recap to my SEO team after the event. RIP.
What does it mean?
On the core update itself, what can you do? Just go about your business and review after the holidays.
I quizzed John with a tricky question about content generated at scale with GPT, to see whether he foresaw any future spam update clamping down on its visibility.
He said:
“It doesn’t matter how you build pages as long as they serve an active search intent, complement the brand proposition and align naturally with the core business and TOV”.
Yours truly, Aunt Google