Recent tests have shown a positive correlation between blocking web fonts and an increase in server response time.
An increase in the average response time can not only harm page loading performance but also have an adverse impact on the environment. If a server stalls while attempting to parse a request, the hardware may overheat and consume unnecessary energy, thereby increasing the concentration of CO2 in the atmosphere.
Although an ecological approach to SEO is often overlooked, it can deliver real environmental benefits.
I tested these assumptions over the last two months, and in this post I’m going to reveal whether blocking web fonts can ultimately impact the average response time of the requests needed to fetch a webpage.
By the way, Google has a great resource to help you improve the server response time.
So, go and check it out!
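To make “average response time” concrete before we dive in, here is a minimal sketch (my own illustration, not part of the test setup) that samples a page’s response time a few times with Python’s requests library and averages the results; the URL and sample count are placeholders:

```python
# Hypothetical illustration: sample a page's server response time.
# The URL and sample count are placeholders, not taken from the test.
import requests

URL = "https://example.com/"
SAMPLES = 5

timings_ms = []
for _ in range(SAMPLES):
    # response.elapsed measures the time between sending the request
    # and finishing parsing the response headers.
    response = requests.get(URL, timeout=10)
    timings_ms.append(response.elapsed.total_seconds() * 1000)

print(f"avg response time: {sum(timings_ms) / len(timings_ms):.1f} ms")
```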
Methodology
The test was conducted by following this statistical methodology.
H0 = Blocking web fonts does not correlate with an increase in avg. response time
H1 = Blocking web fonts correlates with an increase in avg. response time
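The post doesn’t describe a formal significance test, but one way to make the H0/H1 comparison concrete is a two-sample test on daily average response times from the blocked and unblocked periods. Below is a minimal sketch; the numbers are made-up placeholders, not the actual measurements:

```python
# Hypothetical sketch: compare daily avg. response times (ms) between
# the blocked and unblocked periods. The values below are placeholders,
# NOT the actual measurements from the test.
from scipy import stats

blocked = [410, 455, 430, 490, 520, 505, 540]    # fonts disallowed
unblocked = [350, 340, 365, 330, 320, 345, 310]  # fonts re-allowed

# Welch's t-test: does the blocked period show a higher mean response time?
t_stat, p_value = stats.ttest_ind(blocked, unblocked, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value would justify rejecting H0 (no correlation) in favor of H1.
```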
The test ran on seodepths.com from February 18th to April 18th and was divided into two installments.
- The first installment involved disallowing web fonts and took place during the first month.
- The second installment involved revalidating web fonts (i.e., removing the disallow rule from the robots.txt file) and occurred during the second month of the test.
Although the test focused on tracking the average response time the server takes to fetch website resources, it should be noted that the total download size and the number of crawl requests were used only to complement the checks, as they were not the focus of the test.
In fact, the total number of crawl requests is a vanity metric that does not necessarily correlate with improved crawlability but instead puts more strain on the server, which can result in higher costs.
The Crawl Stats section was used to track and monitor changes resulting from the block, which was implemented by accessing the robots.txt file and disallowing the /wp-content/ fonts URL string.
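For reference, such a rule might look like the snippet below. The exact paths and file extensions are my assumption; the post only mentions disallowing the /wp-content/ fonts URL string:

```
# Hypothetical robots.txt rules — exact paths and extensions are assumed
User-agent: *
Disallow: /wp-content/*.woff$
Disallow: /wp-content/*.woff2$
```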
💡 BONUS
I use RankMath as my main SEO plug-in. The web font block took literally three clicks with this handy plug-in.
Tracking the Avg. Response Time
Web fonts were first disallowed on the 18th of February and reinstated on the 17th of March.
Let’s take a brief look at the most significant stages of the test over time.
📌 25 Feb – 2 Mar: Uptick in Average Response Time
As soon as web crawlers received the web fonts blocking signal, the server response time increased and kept trending upwards. In turn, total crawl requests seemed to dip slightly.
📌 10 Mar – 17 Mar: Steady rise in the Avg. Response Time before the revalidation
As the server response time increased, crawl requests and total download size waned a little.
Shortly before reinstating the web fonts, there was a spike in the avg. response time.
📌 17 Mar – 25 Mar: Avg. Response Time started to wane for the first time
Shortly after the fonts were reinstated, the server response time started to slowly decrease.
📌 25 Mar – 1 Apr: Avg. Response Time saw a drastic drop
As total crawl requests experienced a sharp uplift, the average server response time dropped. We can assume the revalidation of web fonts played a role in this decrease.
N.B.: the increase in crawl requests was likely triggered by tweaks to the AMP configuration, as I served both an AMP and a non-AMP version of my website for nearly a week.
📌 1 Apr – 10 Apr: Avg. Response Time continued to wind down
Server response time continued the descending trend that began with the revalidation of web fonts, while total crawl requests saw a small uplift.
📌 10 Apr – 18 Apr: Avg. Response Time levelled off and kept steady
Things seemed to have improved, as the server was correctly returning resources to the browser.
Does Blocking Web Fonts Increase the Server Response Time?
The test tracking showed that the web fonts block extended the server response time required to fetch the resources requested by the browser. Once web fonts were revalidated, though, the flow of conversations between the server and the browser visibly improved.
In layman’s terms, after the server and the browser had a dispute around a Web Font, they weathered the storm and restored their friendship.
So far, the data would let us reject the null hypothesis (H0) in favor of (H1): an existing correlation between the average server response time and a block on page resources.
Not so fast.
There are a couple of complications revolving around Google Search Console’s data sampling procedures and the subsequent configuration of different data reports.
This could tempt me to walk back the correlation claim supported by (H1), but before panicking, let’s nail down the matter.
The Problem with GSC Data Sampling
Google Search Console was designed to help webmasters with reporting, and over the years many improvements have been made to both the UI and UX.
However, some data-minded individuals still complain about the intricacies of the platform’s data parsing procedures and setup process. Personally, I don’t have any complaints, as I understand that conveying data analysis to the public can be quite challenging.
Going back to our test, within the Crawl Stats you can access a report dedicated to diagnosing a specific type of Googlebot. This report is called Page Resource Load and provides information exclusively on the stylesheets, scripts, and web fonts from your website.
The output of this report differs significantly from the traditional Crawl Stats that have been discussed thus far.
When the web font block was imposed on February 18th, crawl requests and server response time for page resources disappeared, which made sense given that seodepths.com relies more on web fonts than on JavaScript. During this period, the server did not return any web fonts.
After web font revalidation on March 17th, indicators started to rise again, and the server received crawl requests for the reinstated woff2 web fonts. During the first few days, the overwhelming requests slowed down the server response, but the workload became more tolerable over time, and the trends eventually started to decline.
The ‘Page Resource Load‘ chart also shows when Google last crawled fonts and, thus, when they were allowed to be fetched again.
This information helps assess the impact of blocking web fonts on crawl requests.
Nevertheless, this tab has a significant drawback: resources are not always sampled accurately.
This can cause a great deal of stress for webmasters with limited knowledge of data science and especially technical SEO, as potential misinformation from Google Search Console can quickly spread.
The Problem with Search Console Samples
One of the underestimated features of GSC is its ability to store historical data for up to 90 days, providing a sample of issues from the cache.
This is a known concern among some technical SEO professionals, and the topic occasionally resurfaces in the industry.
Overall, we can approach the crawl stats section in two ways:
- Pre-action: if the crawled request samples are dated before a block was applied, Google will showcase only a sample of instances from the last 3 months. This is one of the most common scenarios, and it’s also what happened with my test.
- Post-action: if the crawled request samples are dated after a block was applied, chances are Googlebot is rendering the content because the required resources can be loaded from its cache.
HOWEVER,
⚠️ Neither of these cases implies that Googlebot is accessing your server.
If you have correctly applied the blocking restriction via robots.txt, Googlebot will not be able to crawl that resource.
Either way, I recommend running regular tests using the robots.txt tester tool to make sure disallow rules are effective on your website.
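If you prefer a local sanity check in addition to Google’s tester, Python’s standard-library urllib.robotparser can tell you whether a rule blocks a given URL. The rule and URL below are placeholders:

```python
# Quick local check that a disallow rule actually blocks a resource.
# The rule and URL below are placeholders for illustration.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# parse() accepts the robots.txt body as a list of lines,
# so rules can be tested before they are deployed.
rp.parse([
    "User-agent: *",
    "Disallow: /wp-content/fonts/",
])

font_url = "https://example.com/wp-content/fonts/main.woff2"
print(rp.can_fetch("Googlebot", font_url))  # False => the rule is effective
```

Keep in mind that the standard-library parser only does literal prefix matching and does not understand the * and $ wildcard extensions Googlebot supports, so wildcard rules like the one sketched earlier still need Google’s own robots.txt tester.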
Conclusion
Google Search Console is top-notch for pulling data straight from the horse’s mouth, but we need to pay attention to how the tool samples issues before reporting and making recommendations.
This case study highlights an important lesson:
Applied data literacy is the only SEO tool you will ever need.
Even first-party tools like Google Search Console and Google Analytics can mislead your decision-making if you can’t read between the lines of their reports.
It’s crucial to distinguish between the reports provided by the Crawl Stats section of Google Search Console.
In simple terms, if your audit’s scope is limited to investigating potential rendering bottlenecks, you should refer to the Page Resource Load report.
Conversely, if you need to monitor the overall server response time, the Crawl Stats report provides an overview of trends triggered not only by changes in page resource loads but also by anything else that may impact your server’s health (e.g., host status).
Since the test was performed and tracked on the largest available sample of events impacting the server response, I rejected the null hypothesis (H0) in favor of (H1):
Blocking web fonts correlates with an increase in average response time.
FAQ
Why does Search Console return incorrect resource samples?
Search Console uses historical data for up to 90 days to provide example resource samples. Depending on when changes were made to a webpage, Google may select samples of issues that date back up to three months or retrieve them from its cache.