So then that begs the question ... may I ask why you were initially hesitant to respond to me the other week? I'm just curious.

Just a correction to my last comment: 'I love being part of the DaniWeb experience,' not Daniwev. It was a typo I noticed immediately, but I couldn't edit the comment (the ability to edit would be a useful feature).

I was hesitant to respond because discussing SXG seemed to be significantly off-topic for this thread. Additionally, the fact that Lighthouse audit results are an efficient way to understand what affects CWV in a web app seemed so obvious to me that justifying it as an opinion felt weird. I'm not saying that Lighthouse audit results are a perfect representation of CWV; they aren't. However, using it on the same PC, location, connection, etc., provides one of the few relatively reliable tools we have to understand what changes to make and observe their impact. Consequently, improving those metrics will improve them for real Chrome users, thus affecting CWV (perhaps not by the same percentage for everyone, due to varying locations, mobile devices, PCs, connections, etc., but in the same general direction).
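For context, the kind of repeatable run I mean can be scripted; here's a minimal sketch using the lighthouse and chrome-launcher npm packages (the URL is a placeholder, and the setup is just my assumption of a reasonable baseline, not something from this thread):

```typescript
// Minimal sketch: a repeatable Lighthouse performance audit from the same
// machine/connection. Assumes `npm install lighthouse chrome-launcher`;
// the URL below is a placeholder.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

async function audit(url: string): Promise<void> {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, {
    port: chrome.port,
    onlyCategories: ['performance'],
    output: 'json',
  });
  await chrome.kill();
  const audits = result!.lhr.audits;
  // Lab metrics that track the same things CWV measures in the field:
  console.log('LCP (ms):', audits['largest-contentful-paint'].numericValue);
  console.log('CLS:', audits['cumulative-layout-shift'].numericValue);
  console.log('TBT (ms):', audits['total-blocking-time'].numericValue);
}

audit('https://example.com/').catch(console.error);
```

Running that unchanged before and after a tweak, on the same machine and connection, is what lets you attribute a metric change to the tweak rather than to the environment.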

Ultimately, I'm glad I responded. Your responses made me realize that I need to conduct more tests with SXG. I've observed the alt-svc header on DaniWeb and understand that you use different Cache-Control headers. I wasn't aware (though I should have suspected) that you use different caching expirations for older threads. However, I'm still confused about the "All of our pages (including this one!) are cached for non logged-in users, which includes Googlebot" part. If this page we're on is cached via SXG by Google, what will a non logged-in user see when they enter from a Google search result? (Will they miss new posts?)

This question is not limited to forum posts; it could be a product page (in a web app that uses SXG) that Google has cached with a different price and/or availability.

It was a typo I noticed immediately, but I couldn't edit the comment (the ability to edit would be a useful feature).

You should be able to click the little edit button for up to 10 or 15 minutes (I forget which) after posting in order to correct typos. We don't allow editing after that because we've run into problems in the past. For example, a student will post a question, get responses and help, and then want to cover their tracks because their professor told them they can't get online help with the take-home exam. So they go back and edit their first post to say "never mind" or something like that, making the entire thread useless for future visitors and disrespectful to the people who took the time to respond.

However, using it on the same PC, location, connection, etc., provides one of the few relatively reliable tools we have to understand what changes to make and observe their impact.

You are assuming that most people have reliable internet connections that allow them to perform the same Lighthouse test multiple times and get consistent results. Something else I forgot to mention is that a site like DaniWeb has complicated ads that are different each time the page loads. Some ads are well written and some are a huge drain on all CWV factors, so that also makes it really hard to compare trends with Lighthouse.
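That said, if you do want to compare trends with Lighthouse despite the noise from rotating ads, about the best you can do is run the audit several times and compare medians rather than single runs. A rough sketch (auditLcp is a hypothetical helper, e.g., a thin wrapper around a programmatic Lighthouse run like the one sketched above):

```typescript
// Sketch: damp run-to-run variance (rotating ads, network jitter) by taking
// the median of several runs instead of trusting a single one.
// `auditLcp` is a hypothetical helper that returns one run's LCP in ms.
declare function auditLcp(url: string): Promise<number>;

async function medianLcp(url: string, runs = 5): Promise<number> {
  const samples: number[] = [];
  for (let i = 0; i < runs; i++) {
    samples.push(await auditLcp(url));
  }
  samples.sort((a, b) => a - b);
  return samples[Math.floor(samples.length / 2)];
}
```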

Your responses made me realize that I need to conduct more tests with SXG.

So at DaniWeb, we use Cloudflare, and have simply enabled the SXG functionality through Cloudflare, which is just one click.
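If anyone wants to sanity-check whether a page is actually being served as a signed exchange, one rough way is to request it with the Accept header Googlebot sends and look at the response content type. A sketch with Node 18+ fetch (the URL is a placeholder, and whether the SXG variant comes back depends on the CDN's implementation):

```typescript
// Rough check: ask for the page the way an SXG-aware crawler would and see
// whether the response comes back as a signed exchange. The URL is a
// placeholder; serving behavior depends on the CDN's SXG implementation.
async function checkSxg(url: string): Promise<void> {
  const res = await fetch(url, {
    headers: { Accept: 'application/signed-exchange;v=b3;q=0.9,*/*;q=0.8' },
  });
  // A signed exchange response reports application/signed-exchange;v=b3.
  console.log('content-type:', res.headers.get('content-type'));
}

checkSxg('https://example.com/').catch(console.error);
```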

However, I'm still confused about the "All of our pages (including this one!) are cached for non logged-in users, which includes Googlebot" part. If this page we're on is cached via SXG by Google, what will a non logged-in user see when they enter from a Google search result? (Will they miss new posts?)

Whether it's cached via SXG for first-time visitors from Google, or for any non logged-in user who comes across DaniWeb (try logging out yourself), they may miss new posts and will see all cached content. However, as mentioned, it all has to do with setting appropriate Cache-Control headers based on the page, and how frequently we expect new content to be relevant.
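To make the "different expirations for older threads" idea concrete, here's a hypothetical sketch of how that kind of age-based Cache-Control logic could look (an Express example with made-up thresholds; it's not our actual stack or our actual numbers):

```typescript
// Hypothetical sketch: pick a Cache-Control header based on thread activity.
// The framework (Express), thresholds, and max-age values are all made up
// for illustration; they are not DaniWeb's actual configuration.
import express from 'express';

const app = express();

function threadCacheControl(lastPostAt: Date): string {
  const ageDays = (Date.now() - lastPostAt.getTime()) / 86_400_000;
  if (ageDays < 7) return 'public, max-age=300';     // active thread: 5 min
  if (ageDays < 365) return 'public, max-age=86400'; // quiet thread: 1 day
  return 'public, max-age=604800';                   // old thread: 1 week
}

app.get('/threads/:id', (req, res) => {
  const lastPostAt = new Date(); // placeholder: look up the thread's last post
  res.set('Cache-Control', threadCacheControl(lastPostAt));
  res.send('<html><!-- thread content --></html>');
});

app.listen(3000);
```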

This question is not limited to forum posts; it could be a product page (in a web app that uses SXG) that Google has cached with a different price and/or availability.

The same rule applies: set appropriate Cache-Control headers for your use case. If you have an e-commerce store that frequently holds sales where prices change, you might choose to cache for just an hour or a day at a time. If a product's price hasn't changed in 10 years, then caching for a week or a month should be fine. It's always going to be a balance between how frequently the content may change and how much traffic you get.
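In header terms, that choice mostly boils down to the max-age value; something like this (the values just mirror the examples above, not a recommendation):

```typescript
// Illustrative Cache-Control values mirroring the examples above.
// Which profile fits a given product is a judgment call, not a fixed rule.
const CACHE_PROFILES = {
  saleItem: 'public, max-age=3600',      // price changes often: 1 hour
  typicalItem: 'public, max-age=86400',  // occasional changes: 1 day
  stableItem: 'public, max-age=604800',  // unchanged for years: 1 week
} as const;
```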

For example, suppose a webpage gets updated frequently, so you decide to cache for only an hour at a time. If you only get 3 visitors per hour, the overhead of setting and fetching from the cache isn't worth it, because roughly every 4th visitor hits an expired cache and you end up regenerating the page anyway. But if you get 1000 visitors per hour, then it would be well worth it.
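As quick back-of-the-envelope math: with a 1-hour TTL, roughly the first visitor each hour repopulates the cache and everyone after that hits it, so the absolute savings scale with traffic:

```typescript
// Back-of-the-envelope: with a 1-hour TTL, about one request per hour misses
// and regenerates the page; every other request that hour is served cached.
function requestsSavedPerHour(visitorsPerHour: number): number {
  return Math.max(0, visitorsPerHour - 1);
}

console.log(requestsSavedPerHour(3));    // 2/hour saved: hardly worth it
console.log(requestsSavedPerHour(1000)); // 999/hour saved: clearly worth it
```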

I was hesitant to respond because discussing SXG seemed to be significantly off-topic for this thread.

I am all for letting discussions flow and evolve naturally, and what started as a topic about lazy loading evolved into a topic about everything that could be done to improve page loading times. However, I do agree that this information might be easier for others to find if it were in a separate topic about SXG instead of buried on the 2nd page of a lazy loading thread. Oh well. :-/
