The Web Almanac is an annual report that interprets the HTTP Archive dataset into meaningful insight, combining large-scale measurement with expert interpretation from industry specialists.
To get insight into what the 2025 report can tell us about what is really happening in SEO, I spoke with one of the authors of the SEO chapter update, Chris Green, a well-known industry expert with over 15 years of experience.
Chris shared with me some surprises about the adoption of llms.txt files, the extent to which CMS platforms are shaping SEO far more than we realize, and little-known details surfaced by the data that would normally go unnoticed.
You can watch the full interview with Chris in the IMHO recording at the end, or continue reading the article summary.
“I think the data [in the Web Almanac] helped to show me that there’s still a lot broken. The web is really messy. Really messy.”
Bot Management Is No Longer ‘Google, Or Not Google?’
Although bot management has been binary for some time – allow/disallow Google – it is becoming a new challenge. Something that Eoghan Henn had picked up on previously, and Chris found in his research.
We began our conversation by talking about how robots files are now being used to express intent about AI crawler access.
Chris responded that, firstly, there’s a need to be aware of the different crawlers, what their intention is, and essentially what blocking them might do, i.e., blocking some bots has bigger implications than others.
Secondly, platform providers need to actually honor those rules and treat these files appropriately. That isn’t always happening, and the ethics around robots and AI crawlers is an area that SEOs need to learn about and understand more.
Chris explained that although the Almanac report showed the symptom of robots.txt usage, SEOs need to get ahead and understand how to control the bots.
“It’s not only understanding what the impact of each [bot/crawler] is, but also how to communicate that to the business. If you’ve got a team who want to cut as much bot crawling as possible because they want to save money, that may drastically impact your AI visibility.
Similarly, you might have an editorial team that doesn’t want to get all of their work scraped and regurgitated. So, we, as SEOs, need to understand that dynamic, how to control it technically, but also how to put that argument forward within the business as well,” Chris explained.
As more platforms and crawlers are introduced, SEO teams will need to consider all the implications and collaborate with other teams to ensure the right balance of access is applied to the site.
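As a sketch of what this more granular bot management can look like in practice, a robots.txt can allow traditional search crawling while restricting specific AI crawlers. The user-agent tokens below are ones the respective providers document (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended as Google’s AI training opt-out); whether to block each one is exactly the business decision Chris describes, and the blanket disallows here are purely illustrative:

```text
# Allow traditional search crawling
User-agent: Googlebot
Allow: /

# Block an AI training crawler site-wide (illustrative choice)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Opt out of Google AI training without affecting Search rankings
User-agent: Google-Extended
Disallow: /
```

Note that directives like these rely on crawlers voluntarily respecting robots.txt, which is the compliance gap Chris raises.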
Llms.txt Is Being Used Despite No Official Platform Adoption
The first surprising finding of the report was that adoption of the proposed llms.txt standard sits at around 2% of websites in the dataset.
Llms.txt has been a heated topic in the industry, with many SEOs dismissing the value of the file. Some tools, such as Yoast, have incorporated the standard, but as yet, there has been no demonstration of actual uptake by AI providers.
Chris admitted that 2% was a higher adoption rate than he expected. But much of that growth appears to be driven by SEO tools that have added llms.txt as a default or optional feature.
Chris is skeptical of its long-term impact. As he explained, Google has repeatedly stated it doesn’t plan to use llms.txt, and without clear commitment from the major AI providers, especially OpenAI, it risks remaining a niche, symbolic gesture rather than a useful standard.
That said, Chris has seen log-file data suggesting some AI crawlers are already fetching these files, and in limited cases, they may even be referenced as sources. Green views this less as a competitive advantage and more as a potential parity mechanism, something that may help certain sites be understood, but not dramatically elevate them.
“Google has said again and again that they don’t plan to use llms.txt, which they reiterated in Zurich at Search Central last year. I think, fundamentally, Google doesn’t need it as they have crawling and rendering nailed. So, I think it hinges on whether OpenAI say they will or won’t use it, and I think they have other priorities than trying to establish a new standard.”
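For readers unfamiliar with the proposal, the llms.txt draft (from llmstxt.org) describes a Markdown file served at the site root that gives language models a curated summary and link list. A minimal sketch might look like the following; the site name, description, and URLs are all placeholders, not part of any real deployment:

```text
# Example Store
> A hypothetical ecommerce site selling handmade furniture.

## Docs
- [Shipping policy](https://example.com/shipping.md): delivery times and costs
- [Product guide](https://example.com/products.md): overview of the full catalog
```

As the report's 2% adoption figure and Chris's comments suggest, publishing such a file is cheap, but there is no confirmed commitment from major AI providers to consume it.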
Different, But Reassuringly The Same Where It Matters
I went on to ask Chris how SEOs can balance the difference between search engine visibility and machine visibility.
He thinks there’s “a significant overlap between what SEO was before we started worrying about this and where we are at the beginning of 2026.”
Despite this overlap, Chris was clear that if anyone thinks optimizing for search and machines is the same, then they aren’t aware of the two different systems, the different weightings, and the fact that interpretation, retrieval, and generation are completely different.
Although there are different systems and different capabilities in play, he doesn’t think SEO has fundamentally changed. His view is that SEO and AI optimization are “kind of the same, reassuringly the same in the places that matter, but you’ll have to approach it differently” because they diverge in how outputs are delivered and consumed.
Chris did say that SEOs will move more towards feeds, feed management, and feed optimization.
“Google’s universal commerce protocol, where you can potentially transact directly from search results or from a Gemini window, obviously changes a lot. It’s just another move to push the website out of the loop. But the information, what we’re actually optimizing, still needs to be optimized. It’s just in a different place.”
CMS Platforms Shape The Web More Than SEOs Realize
Perhaps the biggest surprise from Web Almanac 2025 was the scale of influence exerted by CMS platforms and tooling providers.
Chris said that he hadn’t realized just how big that impact is. “Platforms like Shopify, Wix, etc. are shaping the actual state of tech SEO probably more profoundly than I think a lot of people really give it credit for.”
Chris went on to explain that “as well-intentioned as individual SEOs are, I think our overall impact on the web is minimal outside of CMS platform providers. I would say if you’re really determined to have an effect outside of your specific clients, you need to be nudging WordPress or Wix or Shopify or some of the big software providers within those ecosystems.”
This creates opportunity: Websites that do implement technical standards correctly may achieve significant differentiation when most sites lag behind best practices.
One of the more fascinating insights from this conversation was how much of the web is broken and how little impact we [SEOs] really have.
Chris explained that “a lot of SEOs believe that Google owes us because we maintain the internet for them. We do the dirty work, but I also don’t think we have as much impact, perhaps at an industry level, as maybe some like to believe. I think the data in the Web Almanac kind of helped show me that there’s still a lot broken. The web is really messy. Really messy.”
AI Agents Won’t Replace SEOs, But They Will Replace Bad Processes
Our conversation concluded with AI agents and automation. Chris started by saying, “Agents are just misunderstood because we use the term differently.”
He emphasized that agents aren’t replacements for expertise, but accelerators of process. Most SEO workflows involve repetitive data gathering and pattern recognition, areas well-suited to automation. The value of human expertise lies in designing processes, applying judgment, and contextualizing outputs.
Early-stage agents may automate 60-80% of the work, similar to a highly capable intern. “It’s going to take your knowledge and your expertise to make that applicable to your given context. And I don’t just mean the context of online marketing or the context of ecommerce. I mean the context of the business that you’re specifically working for,” he said.
Chris would argue that a lot of SEOs don’t spend enough time customizing what they do for the specific client. He thinks there’s an opportunity to build an 80% automated process and then add your real value through human intervention that optimizes the last 20% of business logic.
SEOs who engage with agents, refine workflows, and evolve alongside automation are far more likely to remain indispensable than those who resist change altogether.
However, when experimenting with automation, Chris warned we should avoid automating broken processes.
“You need to understand the process that you’re trying to optimize. If the process isn’t very good, you’ve just created a machine to produce mediocrity at scale, which frankly doesn’t help anybody.”
Chris thinks that this can give SEOs an edge as AI is more broadly adopted. “I suggest the people that engage with it and make those processes better and show how they can be continually developed, they’ll be the ones that have greater longevity.”
SEOs Can Succeed By Engaging With The Complexity
The Web Almanac 2025 doesn’t suggest that SEO is being replaced, but it does show that its role is expanding in ways many teams haven’t fully adapted to yet. Core principles like crawlability and technical hygiene still matter, but they now exist within a more complex ecosystem shaped by AI crawlers, feeds, closed systems, and platform-level decisions.
Where technical standards are poorly implemented at scale, those who understand the systems that shape them can still gain a meaningful advantage.
Automation works best when it accelerates well-designed processes and fails when it merely scales inefficiency. SEOs who focus on process design, judgment, and business context will remain essential as automation becomes more common.
In an increasingly messy and machine-driven web, the SEOs who succeed will be those willing to engage with that complexity rather than ignore it.
SEO in 2026 isn’t about choosing between search and AI; it’s about understanding how multiple systems consume content and where optimization now happens.
Watch the full video interview with Chris Green here:
Thanks to Chris Green for offering his insights and being my guest on IMHO.
Featured Image: Shelley Walsh/Search Engine Journal
