Questions
Mobile site scrolls past content straight to the products. Can this affect our SEO?
Hi JH,

Typically, collapsing lengthier content on a mobile site (with an option to expand/view the content if desired) in order to improve UX is not considered an issue for Google. However, any time you're using JS, there's a risk that search engine crawlers won't be able to see what you're doing. You may want to test with JS disabled in your browser to get an idea of what they might be seeing.

What is the reason for the length of the content at the top of the page? Are you creating that content for users or for search engines? I'm guessing it's not really for users if you're auto-scrolling them past it. That's often something Google can spot and discount the value of.

I would recommend (if possible) a "collapse and expand" approach to lengthy content at the top of the page, rather than auto-scrolling down to the products. Show the beginning of that content and offer a "click to expand" if users want to read the full text. You could use JS for the expand and have the non-JS default simply display the full content, to ensure that search engines see the full text. If the content includes images, consider removing them or shrinking them for the mobile version of the site. You may also want to test shortening the content on some of your pages and see whether that affects performance in one direction or the other.

Hope that helps!
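To illustrate that progressive-enhancement approach, here is a minimal TypeScript sketch. It assumes a server-rendered container with a hypothetical id of "intro-copy": with JS disabled nothing runs, so users and crawlers get the full text; with JS enabled, the script collapses the copy behind a toggle.

```typescript
// Minimal sketch, assuming a container with the (hypothetical) id "intro-copy"
// that is server-rendered in full. With JS disabled this never runs, so
// crawlers and users see the complete text; with JS enabled the content is
// collapsed behind a "Read more" toggle.
function makeCollapsible(id: string, previewHeight = 120): void {
  const el = document.getElementById(id);
  if (!el) return;

  // Collapse only via script, so the no-JS default stays the full content.
  el.style.maxHeight = `${previewHeight}px`;
  el.style.overflow = "hidden";

  const toggle = document.createElement("button");
  toggle.textContent = "Read more";
  let expanded = false;

  toggle.addEventListener("click", () => {
    expanded = !expanded;
    el.style.maxHeight = expanded ? "none" : `${previewHeight}px`;
    toggle.textContent = expanded ? "Read less" : "Read more";
  });

  el.insertAdjacentElement("afterend", toggle);
}

makeCollapsible("intro-copy");
```

Because the full text is already in the initial HTML, crawlers that don't execute JS still see and index it.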
Intermediate & Advanced SEO | bridget.randolph
Should we Nofollow Social Links?
I agree with Andreas' explanation of Google's rules on using nofollow.
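As a rough client-side illustration, the TypeScript sketch below adds rel="nofollow" to outbound links pointing at a few example social hosts; the host list is illustrative only, and in practice you'd normally set the attribute in your HTML templates so crawlers see it without executing JS.

```typescript
// Minimal sketch: mark outbound links to (illustrative) social hosts as
// nofollow. In production this would usually be done server-side in the
// templates, so crawlers see the attribute without running JS.
const SOCIAL_HOSTS = ["facebook.com", "twitter.com", "instagram.com", "linkedin.com"];

document.querySelectorAll<HTMLAnchorElement>("a[href]").forEach((a) => {
  const host = new URL(a.href, location.href).hostname;
  if (SOCIAL_HOSTS.some((s) => host === s || host.endsWith(`.${s}`))) {
    a.rel = "nofollow";
  }
});
```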
Technical SEO Issues | jasongmcmahon
Very Old Pages Creeping Up - Advice
Thanks for the reply; this makes a lot of sense. Very much appreciated.
Intermediate & Advanced SEO | JH_OffLimits
Advice - Removal/Disavow of Inbound Spam Links
Hi there,

It's not the nicest task in the world, but unfortunately, it's best to manually review websites before you add them to a disavow file. You can use metrics such as Moz Spam Score to filter them, and starting with the worst scores may speed up the process a bit.

You could also try to prioritise multiple URLs that are from the same domain. For example, if you have 50 URLs linking to you from one domain which looks suspect, you probably only need to review a few of those URLs in order to understand the quality/type of link they are and take action.

I'd avoid putting a hard rule in place, such as disavowing anything below a certain metric score. You may accidentally disavow links which are perfectly fine. There is also some debate as to the impact of the disavow tool unless you have a manual or algorithmic penalty in place. If you don't see any evidence of a penalty, my advice would be to only disavow links which are clearly low quality/spammy.

Hope that helps!

Paddy
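To make the "group by domain" suggestion concrete, here is a minimal TypeScript sketch (not an official tool) that buckets candidate URLs by host and emits Google's disavow-file syntax ("#" comment lines and "domain:" rules) only for hosts a human reviewer has already flagged:

```typescript
// Minimal sketch, assuming the URL list comes from a backlink export and
// flaggedHosts contains only domains a human has manually reviewed as spam.
function buildDisavowFile(urls: string[], flaggedHosts: Set<string>): string {
  // Bucket candidate URLs by host so each domain can be reviewed as a group.
  const byHost = new Map<string, string[]>();
  for (const u of urls) {
    const host = new URL(u).hostname;
    const group = byHost.get(host) ?? [];
    group.push(u);
    byHost.set(host, group);
  }

  // Emit disavow syntax: "#" comment lines plus domain-level rules.
  const lines: string[] = [];
  for (const [host, group] of byHost) {
    if (flaggedHosts.has(host)) {
      lines.push(`# ${group.length} URL(s) reviewed from this domain`);
      lines.push(`domain:${host}`);
    }
  }
  return lines.join("\n");
}
```

The key point from the answer stands: the script only formats the file; the review itself stays manual.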
Technical SEO Issues | Paddy_Moogan
Home Page Disappears From Google - But Rest of Site Still Ranked
The issue is now being reported as fixed, with Google’s John Mueller saying it was a technical issue on their end: https://searchengineland.com/google-de-indexing-issue-now-fixed-result-of-technical-issues-315058
Intermediate & Advanced SEO | Xiano
Canonical and Alternate Advice
That would normally be the case, but not tonight. LOL, I am picking up a lot of the UK Q&A. I will be at BrightonSEO and SearchLove London; if any of you guys will be in the area, I'd love to grab a pint. Sincerely, Thomas
Intermediate & Advanced SEO | BlueprintMarketing
301 Question - Issue
It's probably just taking Google a while to process all the changes.

Really, your 301s should point to the same content, not just all go to the homepage. If pages were showing on two sites, the pages 'really' exist on one site but weren't supposed to exist on the other. Correct the 301s so that they point from the URLs on the affected site to the exact same pieces of content on the site where they were originally (and were supposed to be) located.

If that fails, use X-Robots-Tag to fire the noindex directive from the HTTP header (rather than a meta noindex tag in the HTML) to tell Google not to index those URLs on the affected website. In conjunction with that, change the status code of all bogus URLs on the affected site to 410, which is stronger than 404: 410 means GONE, permanently removed, whereas 404 is non-committal about whether the resource might return.
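As one possible implementation of the above, here is a minimal Express (Node/TypeScript) sketch; Express itself, the redirect map, and the "/bogus/*" path are assumptions for illustration, since the thread doesn't say what stack the affected site runs on.

```typescript
// Minimal sketch, assuming an Express 4 app. The redirect map and the
// "/bogus/*" pattern are hypothetical placeholders.
import express from "express";

const app = express();

// URLs that really live on the other site: 301 each one to its exact
// counterpart, not to the homepage.
const redirects: Record<string, string> = {
  "/old-page": "https://www.original-site.example/old-page",
};

app.use((req, res, next) => {
  const target = redirects[req.path];
  if (target) {
    res.redirect(301, target);
    return;
  }
  next();
});

// Bogus URLs that were never supposed to exist here: noindex via the HTTP
// header (X-Robots-Tag) plus a 410 Gone status.
app.get("/bogus/*", (_req, res) => {
  res.set("X-Robots-Tag", "noindex");
  res.status(410).send("Gone");
});

app.listen(3000);
```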
Intermediate & Advanced SEO | effectdigital