Questions
App indexing with several subdomains
As far as I am aware, the association file just declares which paths the app can handle. One caveat: domains are matched individually, so registering only the root domain may not cover every sub-domain - you generally need to list each sub-domain (or a wildcard entry such as applinks:*.example.com) in the app's entitlements and serve the association file from each one, so do test this. When you actually link individual web pages to the corresponding view in the app, the link uses a fully-qualified URL with protocol and subdomain, so that part should work fine (see this documentation).
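For reference (not part of the original answer), the association file is a small JSON document served from the domain root; a minimal sketch, where the team/app ID and paths are placeholders:

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "TEAMID.com.example.app",
        "paths": ["*"]
      }
    ]
  }
}
```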
Web Design | bridget.randolph
AJAX requests and implication for SEO
Hi - right, if the URL changes for the user, you'll probably want to use the pushState method (linked above) to convey this to Google, since they likely can't see the URL change by default. You can check by crawling the site with Screaming Frog SEO Spider with the user agent set to Googlebot, then going to "outlinks" for the page with the facet links to see if they are listed. Hope that helps some more! Let me know if you need further direction. -Dan
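As a rough sketch of the pushState approach (names and URL structure here are illustrative, not from the thread) - the history-like object is injected so the logic can be exercised outside a browser, where you would simply pass window.history:

```typescript
// Sketch: update the visible URL when a facet filter is applied, without a
// full page reload. HistoryLike is injected so the function can be tested
// outside a browser; in production you would pass window.history.
interface HistoryLike {
  pushState(state: unknown, title: string, url: string): void;
}

function applyFacet(
  history: HistoryLike,
  basePath: string,
  facet: string,
  value: string
): string {
  const url = `${basePath}?${facet}=${encodeURIComponent(value)}`;
  // pushState changes the address-bar URL so the filtered state has a real,
  // crawlable address.
  history.pushState({ facet, value }, "", url);
  return url;
}

// In the browser: applyFacet(window.history, "/shoes", "colour", "red");
```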
Intermediate & Advanced SEO | evolvingSEO
Duplicate content on product pages
To be honest, this type of thing is a weak point in my knowledge, but if it were my site, I wouldn't be heading in this direction. What you're essentially doing is obscuring duplicate content from search engines while presenting it to users, which we know is a no-no. It may well be that search engines can't "see" that duplicate content just yet, but that doesn't mean they won't in the next update. More importantly, users aren't particularly engaged by seeing the same block of content over and over, so it's a waste of valuable screen real estate. One other question to consider: do users actually want to know about this manufacturing process? This isn't a leading question. What I'm getting at is that content should always cover what the user wants to know, not what the business wants them to read about. If this process is really just a side note for most users, risking content duplication to push it directly in front of them is a large and unnecessary risk. Of course, if the process is a unique selling point that may actually drive sales and/or build rapport, disregard this point.
Intermediate & Advanced SEO | ChrisAshton
List of Search Engines subscribing to the ajax crawling scheme?
Hi there, Bing supports it too, and so does Yandex. Unfortunately I can't confirm this for Baidu, and I haven't seen any online documentation. I'll let you know if I'm able to confirm! I hope this helps!
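For context, the scheme works by mapping a "hash-bang" (#!) URL to an _escaped_fragment_ URL that a supporting crawler requests instead of executing JavaScript. A minimal sketch of that mapping (encoding details simplified):

```typescript
// Map a hash-bang URL to the _escaped_fragment_ form that crawlers
// subscribing to the AJAX crawling scheme fetch instead.
function toEscapedFragmentUrl(url: string): string {
  const idx = url.indexOf("#!");
  if (idx === -1) return url; // not a hash-bang URL: nothing to map
  const base = url.slice(0, idx);
  const fragment = url.slice(idx + 2);
  // Append to an existing query string if one is present.
  const sep = base.includes("?") ? "&" : "?";
  return `${base}${sep}_escaped_fragment_=${encodeURIComponent(fragment)}`;
}
```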
Intermediate & Advanced SEO | Aleyda
Crawl Diagnostics - Historical Summary
Thank you Chiaryn, I will post in the Feature Request section. Thanks again, Dean
Moz Tools | FashionLux
Altering Breadcrumbs based on User Path to Product URL
Further update to this. Ran into a problem with option 3... this solution works really well when navigating the site internally; however, a user landing on one of these URLs directly (bookmark, social share, etc.) would get a slow-loading page, as (for non-default product variations) the page loads after the first request and then a second request to the server is needed to pull in the image via AJAX. Loading the other images, stock information, prices, copy, etc. into an array and doing the work on the client side wasn't an option, as the page would get too heavy. So option 3 is ruled out. Ultimately the goal was to reduce duplicate content on product pages, and none of the three options above does this without affecting page loading times. I did look at falling back on canonical tags; however, I've just found that Facebook uses this tag, so if a user wanted to share a 'red apple' when the canonical is 'green apple', Facebook would show an image of the 'green apple'... so at the moment that is ruled out also. I'll start a new thread on product page duplicates and the best solution - but if anyone has any ideas then please do let me know. Thanks Dean
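One possible workaround, purely as a sketch (the URLs are placeholders and this isn't from the thread): Facebook's scraper reads Open Graph tags before falling back on the canonical tag, so giving each variation its own og:url and og:image may decouple sharing from the canonical:

```html
<!-- On the red-apple variation page (placeholder URLs): -->
<link rel="canonical" href="https://example.com/apples/green-apple" />
<meta property="og:url" content="https://example.com/apples/red-apple" />
<meta property="og:image" content="https://example.com/images/red-apple.jpg" />
```

Worth verifying against Facebook's sharing debugger before relying on it, since scraper behaviour has changed over time.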
Intermediate & Advanced SEO | FashionLux
Whitespace INSIDE # tag harmful?
Thanks Lewis. I think this video is commenting more about whitespace outside tags than inside them, but I think the same principle applies. Unless I hear that it does cause issues, for now I'll leave it in and fight other battles with the dev team. Thanks
Technical SEO Issues | FashionLux
How to Block Google Preview?
You can turn off snippets with a meta tag in the page head: <meta name="googlebot" content="nosnippet">. HOWEVER, this will also prevent text snippets from being shown for your search results, so it's probably not what you want. The best thing to do would be to go into your analytics and determine how many pageviews Google Instant Preview is generating. If it's a significant amount, and you feel it's impacting CTR, then spend the effort to create a pretty JavaScript-free version. Otherwise, ignore it (less than 5% of our visitors use Instant Preview, for example). Here's Google's answer on the webmaster forums: https://sites.google.com/site/webmasterhelpforum/en/faq-instant-previews#11
Intermediate & Advanced SEO | TakeshiYoung
Http://us.burberry.com/: Big traffic change for top URL (error 593f1ceb2d67)
solved via GWT thread: https://groups.google.com/a/googleproductforums.com/forum/#!topic/webmasters/WlUzNLFQB54/discussion
International Issues | FashionLux
Do you bother cleaning duplicate content from Googles Index?
One tricky point - you don't necessarily want to fix the duplicate URLs before you 301-redirect and clear out the index. This is counter-intuitive and throws many people off. If you cut the crawl paths to the bad URLs, then Google will never crawl them and process the 301-redirects (since those exist at the page level). The same is true for canonical tags. Clear out the duplicates first, THEN clean up the paths. I know it sounds weird, but it's important. For malformed URLs and usability, you could still dynamically 301-redirect. In most cases, those bad URLs shouldn't get indexed, because they have no crawl path on your site - someone would have to link to them. Google will never mis-type a URL, in other words.
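A minimal sketch of the dynamic-301 idea (the function names and the specific normalization rules are illustrative assumptions, not from the answer): compute the canonical form of a requested URL, and respond with a 301 to it whenever the two differ.

```typescript
// Compute a canonical form of a requested URL. A real handler would issue a
// 301 redirect to canonicalize(requestUrl) whenever needsRedirect(requestUrl)
// is true; the normalization rules below are just examples.
function canonicalize(rawUrl: string): string {
  const url = new URL(rawUrl); // the URL parser already lowercases the host
  // Strip trailing slashes from the path (but keep the root "/").
  if (url.pathname.length > 1 && url.pathname.endsWith("/")) {
    url.pathname = url.pathname.replace(/\/+$/, "");
  }
  // Drop tracking parameters that spawn duplicate URLs.
  for (const p of ["utm_source", "utm_medium", "utm_campaign"]) {
    url.searchParams.delete(p);
  }
  return url.toString();
}

function needsRedirect(rawUrl: string): boolean {
  return canonicalize(rawUrl) !== rawUrl;
}
```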
Intermediate & Advanced SEO | Dr-Pete
Why has SEOmoz added G+ code to multiple pages?
There is no benefit to having it on every page. In talking with our Google rep, they stated they just needed it on our site's homepage, but since sometimes they don't know what they need, it was placed on every page via our CMS. At this point we haven't seen any negative effects from it, but we might try placing it only on the homepage to see if that does anything. Casey
Social Media | caseyhen
How to download an entire Website (HTML only), ready to rehost
Thanks for the detailed explanation. If you know of any software or techniques to crawl and download multiple (HTML) pages and images from a site, then please let me know. There are many programs designed to crawl websites and grab the HTML code - HTTrack and wget's mirror mode are two commonly used examples - and legitimate sites are often duplicated in this manner. You can try searching a couple of relevant terms or searching black hat SEO sites.
Moz Tools | RyanKent