Questions
Core Web Vitals and PageSpeed Insights Scores Not Matching
To my understanding, GSC reports based on "field data", meaning the aggregate scores of real visitors to a specific page over a trailing 28-day period. When you run PageSpeed Insights, you can see both field data and "lab data"; the lab data comes from that specific run. There are quite a few reasons why field data and lab data may not match. One is that recent changes to the page show up immediately in lab data but won't be reflected in field data until the next 28-day window is available. Another is that the lab device doesn't run at the same specs (or network conditions) as the real users behind the field data. The way I look at it: I use lab data (screenshotting my results over time, or using other Lighthouse-based tools like GTmetrix with an account) to assess incremental changes. But the goal is ultimately to improve the field data, since it represents your actual visitors, and since that appears to be what's used in the ranking signals, as best I can tell.
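If you'd rather pull both numbers programmatically than screenshot them, here's a minimal Python sketch (my own illustration, not part of the original answer) against the public PageSpeed Insights v5 API; the page URL is a placeholder:

```python
# Sketch only: query the public PageSpeed Insights v5 API and print the
# lab score (this run's Lighthouse result) next to the field data (28-day
# CrUX aggregates for real visitors, when Google has enough traffic data).
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_psi(url: str, strategy: str = "mobile") -> dict:
    # An API key is optional for occasional manual runs.
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        return json.load(resp)

report = fetch_psi("https://www.example.com/")  # placeholder URL

# Lab data: score from this specific run (0-1; multiply by 100 for the UI score).
lab = report["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Lab (Lighthouse) performance score: {lab * 100:.0f}")

# Field data: may be missing entirely for low-traffic pages.
field = report.get("loadingExperience", {})
print(f"Field overall category: {field.get('overall_category', 'n/a')}")
for name, data in field.get("metrics", {}).items():
    print(f"  {name}: p75={data['percentile']} ({data['category']})")
```

Logging that lab score on a schedule gives you the same incremental record as the screenshots, with the field data alongside it for comparison.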
On-Page / Site Optimization | seoelevated
Moving Images to Subdomain: SEO Impacts
Hi Jared, Webmaster Tools is indeed the only way to check what percentage of your traffic comes from image search (you can also see it in Google Analytics if you have linked both accounts, but it's the same data, since it's measured via Webmaster Tools). The solution you put in place seems like a very good idea. Rgds, Dirk
Vertical SEO: Video, Image, Local | DirkC
Maintaining Link Value Of Old URLs With 301 Redirects
We have changed the URLs a few times over the past decade. It's the super old backlinks to category and product pages that are getting harder to maintain. I appreciate you taking the time to answer.
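One concrete way to keep an eye on those super old backlinks: periodically audit the legacy URLs and confirm each still reaches a live page in a single 301 hop, since chained or broken redirects are where link value tends to leak. A rough Python sketch, with made-up legacy URLs:

```python
# Rough sketch: walk each legacy backlink URL hop by hop and report whether
# it lands on a live page, and in how many redirects. The URLs are made up.
import urllib.error
import urllib.request

LEGACY_URLS = [
    "https://www.example.com/shop/category/helmets",   # hypothetical old category URL
    "https://www.example.com/products/1234-old-item",  # hypothetical old product URL
]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes urllib raise HTTPError on 3xx instead of silently
    # following it, so we can count the hops ourselves.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect())

for original in LEGACY_URLS:
    url, hops = original, 0
    while hops <= 5:  # anything deeper is a chain worth flattening anyway
        try:
            resp = opener.open(url)
            print(f"{original} -> {resp.status} after {hops} hop(s)")
            break
        except urllib.error.HTTPError as err:
            if err.code in (301, 302, 307, 308):
                hops += 1
                url = err.headers["Location"]  # assumes an absolute Location header
            else:
                print(f"{original} -> broken ({err.code}) after {hops} hop(s)")
                break
```

Anything reported at more than one hop is a candidate for updating the old rule to point straight at the current URL.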
Technical SEO Issues | RMATVMC
URL Changes And Site Map Redirects
Don't worry too much about the sitemap. When your new URLs are live, upload your sitemaps with the new URLs and you're good. Between your redirects and your sitemap, Google won't have a problem knowing about the new URLs; how long it takes to crawl all of them may be a different story.
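For illustration, here's a minimal Python sketch of the sitemap side: generate a sitemap.xml that lists only the new URLs (the 301 redirects handle the old ones, which shouldn't appear in the sitemap at all). Both URLs are placeholders:

```python
# Minimal sketch: write a sitemap.xml containing only the new URLs.
# Old URLs are covered by the 301 redirects and should not be listed here.
import xml.etree.ElementTree as ET

NEW_URLS = [
    "https://www.example.com/c/helmets",  # placeholder
    "https://www.example.com/p/1234",     # placeholder
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for page in NEW_URLS:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```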
Technical SEO Issues | Chris.Menke
AJAX and High Number Of URLs Indexed
Axial Dev, thanks for responding. I have considered the robots.txt disallow; however, my worry has been several videos by Matt Cutts explaining that, now that Googlebot can make AJAX requests, best practice is to allow it to do so. That is why I have not thrown a blanket disallow into our robots.txt file. But Googlebot is clearly having trouble on our site distinguishing a server-side AJAX request from an actual URL that should be indexed. Below is Matt Cutts's plea to allow JavaScript to be crawled (there are a few others out there as well). Does anyone else have experience with server-side AJAX requests being indexed, and how did you combat the issue? watch?v=B9BWbruCiDc
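One middle path worth considering (my suggestion, not something from the thread): leave the AJAX endpoints crawlable, so Googlebot can still render the page per Matt Cutts's advice, but return an X-Robots-Tag: noindex header on their responses so the endpoints themselves drop out of the index. A hypothetical Flask sketch, assuming the AJAX calls live under an /ajax/ prefix:

```python
# Hypothetical sketch: keep AJAX endpoints fetchable for rendering, but mark
# their responses noindex so the endpoint URLs fall out of Google's index.
# The Flask app, route, and /ajax/ prefix are all made up for illustration.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/ajax/part-finder")
def part_finder():
    return jsonify(results=[])  # placeholder payload

@app.after_request
def noindex_ajax_endpoints(response):
    # Googlebot honors the X-Robots-Tag header; normal visitors are unaffected,
    # and crawling is never blocked the way a robots.txt disallow would block it.
    if request.path.startswith("/ajax/"):
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```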
Technical SEO Issues | RMATVMC