Questions
-
To what extent is content considered unique or duplicate?
If they are exactly the same listings in exactly the same order, then yes, you probably don't need both of those URLs. I'd go back to the architecture, try to work out why so many duplicate URLs were created and what the logic behind them is, and fix it from the foundation. Messing around with tags that Google ignores half the time is seldom the answer. It 'seems' simple, but in reality it doesn't usually fix the main issues properly. Canonical tags, for example, do not consolidate backlink authority reliably. 301s are an option, but then you have to ask: why have I created a whole shadow section that just 301s to another section? By that point you begin to realise the ridiculousness of the structure and think about fixing it properly.
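To make "work out why so many duplicate URLs were created" concrete: a crawl audit can group URLs that serve identical listing content, which tells you which sections are pure shadows of each other. This is a minimal sketch with made-up URLs and content, not your actual site:

```python
import hashlib
from collections import defaultdict

def group_duplicates(pages):
    """Group URLs whose extracted listing content is byte-identical."""
    groups = defaultdict(list)
    for url, body in pages:
        digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
        groups[digest].append(url)
    # Only sets with more than one URL are true duplicate groups.
    return [urls for urls in groups.values() if len(urls) > 1]

# Hypothetical crawl output: (URL, extracted listing block) pairs.
pages = [
    ("/widgets?sort=default", "widget-a widget-b widget-c"),
    ("/widgets/all", "widget-a widget-b widget-c"),
    ("/gadgets", "gadget-a gadget-b"),
]
print(group_duplicates(pages))  # [['/widgets?sort=default', '/widgets/all']]
```

Each group that comes back is a candidate for a structural fix: decide which URL pattern should exist, and stop generating the others at source.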
Technical SEO Issues | | effectdigital0 -
Http > https Switch Before Platform Migration?
"The concern is that, due to the http>https 301 redirects that will be in place, are we putting ourselves at unnecessary risk by effectively carrying out 2 migrations in the space of a year (in terms of loss of potential authority caused by redirects)?" In February 2016, Google’s John Mueller announced that SEO equity or PageRank will no longer be lost when a 301 or 302 redirect is used in conjunction with an HTTP to HTTPS migration. While some of us doubted this statement, Gary Illyes tweeted the same thing in July 2016 and Barry Schwartz at Search Engine Land confirmed it. There is no loss of authority caused by redirects when you implement HTTPS. "Would we be better to wait, and implement https at point of platform migration instead?" I think the approach you're taking (convert to https first) is a good one. It affords you better control and is a good use of available resources.
Intermediate & Advanced SEO | | DonnaDuncan0 -
Googlebot being redirected but not users?
Can anyone offer any insight on this? The issue shows no sign of improving, and our organic traffic is tanking. If this really is an issue on Google's side, and it appears to be based on our technical set-up, how can I force Google to take note and reindex the pages? Here are some more examples, from across a number of categories.

Category A:
Page that has dropped out of the index: https://goo.gl/KEQ8Yh
Cached version of that page: https://goo.gl/DPzFWM
Page that has dropped out of the index: https://goo.gl/5KiQ4s
Cached version of that page: https://goo.gl/myRWNg

Category B:
Page that has dropped out of the index: https://goo.gl/pr3YQs
Cached version of that page: https://goo.gl/8SEYi5
Page that has dropped out of the index: https://goo.gl/LqzDrg
Cached version of that page: https://goo.gl/iwPs45

Category C:
Page that has dropped out of the index: https://goo.gl/YBZS7c
Cached version of that page: https://goo.gl/n33QzG
Page that has dropped out of the index: https://goo.gl/Ht4gfO
Cached version of that page: https://goo.gl/u81vbA
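If the suspicion is that Googlebot is being redirected while normal users are not, one way to gather evidence is to request the same URL with a Googlebot user-agent and a browser user-agent and compare the responses. This sketch uses a fake fetcher so the failure mode is visible; in practice you'd plug in a real HTTP client (and note that real Googlebot requests can only be fully verified by reverse DNS, so a UA swap is a hint, not proof):

```python
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def compare_responses(fetch, url):
    """Fetch a URL as Googlebot and as a browser; flag any divergence.

    `fetch(url, user_agent)` is a stand-in for your HTTP client and should
    return (status_code, final_url) without following redirects.
    """
    bot = fetch(url, GOOGLEBOT_UA)
    user = fetch(url, BROWSER_UA)
    return {"googlebot": bot, "browser": user, "diverges": bot != user}

# Simulated server that 301s only Googlebot -- the failure mode described above.
def fake_fetch(url, ua):
    if "Googlebot" in ua:
        return (301, url.rstrip("/") + "/some-other-page")
    return (200, url)

result = compare_responses(fake_fetch, "https://example.com/category-a/")
print(result["diverges"])  # True
```

If the responses diverge on your real pages, that's the evidence to take to your hosting/CDN provider, since a bot-only redirect at that layer would explain pages dropping out of the index.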
Intermediate & Advanced SEO | | Sayers0 -
Internal Duplicate Content - Classifieds (Panda)
TL;DR: You're right to be skeptical that this is an urgent issue (in my opinion), but it is something worth fixing at some point, for several reasons. I was far more concerned by search results, but I see you've added those to noindex/disallow in robots.txt, which is great. Not many people know that works!

I think it's very possible that Google understands the difference between a classified ad and an editorial content piece. They definitely treat products and content differently. That said, it's generally a good idea to avoid relying on Google's intelligence, as many have been let down by Google's failure to understand.

Duplicate content is generally something SEOs are overly concerned with. More often than not it triggers a filter - not a "penalty." I don't see it as the most dangerous thing you could be doing by any stretch of the imagination. That said, I've seen several classified sites do the following, which I'd recommend as a "best practice" approach. At one time Craigslist did this, and may still be doing it.

1. Accept non-spam ads with a pending status.
2. Check against listings in a given period of time for duplicates. This happens even if the ad is changed slightly, so there's some kind of semantic + image analysis going on.
3. If a duplicate is found under the same user name, inform them that they've already posted the ad. From here the rules are up to you. Many sites say the ad can't be posted again for 7 days (if the old ad is deleted) or 30 days (if not). They then encourage users to buy a featured listing that shows up higher than others.
4. If duplicates are found under different user names, give a warning that it's against your terms of service (make sure it is) to post duplicate ads from multiple accounts, that accounts can be banned, and have them certify the post is not the same.

You don't need to follow this exactly, but it's here to give you some ideas on having your users prevent duplicate content for you.
Given the generally positive architecture I've seen on the site, it looks like you know what to do with it better than I would. I don't think 250 out of 10k is bad; having consulted with a few local classified sites, that's actually quite low. But I do think there's something to be gained by detecting duplicates, to prevent users from gaining an unfair advantage over those playing by the rules. And if you sell featured listings, this is an excellent way to help those who are most desperate to sell while increasing revenue. I hope that helps. Obligatory disclaimer: this is merely free advice for your consideration, and not the Moz official stance. The consequences of any changes you do or don't make are ultimately your responsibility.
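To sketch what the duplicate-check step might look like at its simplest: compare each new ad against recent ads with a token-overlap similarity, so a listing that's been changed slightly still matches. This is a naive illustration (real sites likely use semantic and image analysis, as noted above); the ads, threshold, and field names are all invented:

```python
def jaccard(a, b):
    """Token-set similarity between two ad texts, from 0.0 to 1.0."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 1.0

def find_duplicate(new_ad, recent_ads, threshold=0.7):
    """Return the first recent ad that is a near-duplicate of the new one."""
    for ad in recent_ads:
        if jaccard(new_ad["text"], ad["text"]) >= threshold:
            return ad
    return None

# Hypothetical pending-queue check: one word changed, still caught.
recent_ads = [{"user": "alice", "text": "Red mountain bike for sale good condition"}]
new_ad = {"user": "alice", "text": "Red mountain bike for sale great condition"}
dup = find_duplicate(new_ad, recent_ads)
if dup and dup["user"] == new_ad["user"]:
    print("Already posted: apply your repost rules (7/30 days, featured upsell)")
elif dup:
    print("Cross-account duplicate: warn per terms of service")
```

From there the branching matches the workflow above: same user gets the "already posted" path, different users get the terms-of-service warning.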
Intermediate & Advanced SEO | | Carson-Ward1 -
Will it make any difference to SEO on an ecommerce site if they use their SSL certificate (https) across every page
As Chris says, this shouldn't be a problem at all. We've done this on a few stores and not seen any measurable impact in either direction from it. The only thing to be aware of is the possibility of introducing canonical URLs. http://example.com and https://example.com are different URLs. If both are accessible and return a valid header then both can be indexed. Always worth ensuring that you either have a redirect, a rel=canonical or robots.txt addressing that issue if you have https in place.
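For auditing that, two cheap checks per page cover most of it: does the http twin 301 to its https counterpart, and does the https page self-canonicalise? A rough sketch follows; the regex is a heuristic (it assumes `rel` appears before `href` in the tag), so treat it as illustration rather than a robust parser:

```python
import re

# Heuristic: matches <link ... rel="canonical" ... href="..."> with rel first.
CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', re.I)

def canonical_of(html):
    """Pull the rel=canonical href out of a page, if one is present."""
    m = CANONICAL_RE.search(html)
    return m.group(1) if m else None

def http_variant_ok(status, location, https_url):
    """True if the http twin permanently redirects to its https counterpart."""
    return status == 301 and location == https_url

page = '<html><head><link rel="canonical" href="https://example.com/product"></head></html>'
print(canonical_of(page))  # https://example.com/product
print(http_variant_ok(301, "https://example.com/product", "https://example.com/product"))  # True
```

If either check fails on a sample of URLs, that's the http/https duplication risk mentioned above, and a site-wide redirect rule or canonical tag fixes it.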
Technical SEO Issues | | matbennett1 -
Algorithm update last weekend?
Thanks FDFPres. Interestingly, all the sites that have improved over the weekend have previously been hit by Panda, which makes me think there may have been one of those mysteriously silent rolling Panda tweaks. Some of the sites that were worst affected, and were subsequently reduced to <1,000 visits a week, are now running around 90% up week on week organically.
Search Engine Trends | | Sayers0 -
Reducing onpage links - manufacture list
Use some nice AJAX script to list them in alphabetical order, or some other order? I assumed from the title you were wondering if you needed to crop the number of links; the answer is no, not if it's just a few pages and it's clearly just a directory in that part of the site. Those links won't pass huge value due to sheer volume, so Webmaster Tools may scream, but I wouldn't be horrendously bothered by them.
Intermediate & Advanced SEO | | SEOAndy0 -
Content in different languages
Hi Nicola, I was going to write you a really long reply, and then I remembered a post written by Google that goes into detail about multi-regional websites and all the bits and pieces. It's probably a really great read for you, more so than me leaving a long-winded comment: http://googlewebmastercentral.blogspot.com/2010/03/working-with-multi-regional-websites.html I hope that helps? Christopher
International Issues | | ChristopherM0 -
Ranking for competitor brand terms
Can I re-open this line of questioning? We are interested in a similar strategy in order to help educate customers who are falling victim to an unscrupulous operator in the market. We know we can get links to our site for their brand name, because there are a lot of scandals around this operator, with government bodies and charities trying to raise awareness of what it means to get involved with this player. They are very good at marketing to people who feel they do not have many options under their circumstances; however, alternatives exist and our goal is to raise awareness. Does anyone know the legality of using competitors' brand names on our site, providing the information is factual and transparent?
On-Page / Site Optimization | | TrueluxGroup1 -
Site being indexed by Google before it has launched
Google has ignored a robots.txt for my site before because it thought we were "hiding" important content. Depending on your site structure, putting a "noindex" tag on your pages is also a good idea. Doing that as pages are built is a more reliable way to prevent them from being indexed in the first place. Then you remove the tag when you are ready to go live. And submit your updated sitemap via GWT when you finish your move, so Google will know what pages to index.
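A quick pre-launch sanity check is to verify that every staging page actually carries the meta robots noindex directive (and that it's been removed once live). A minimal sketch, again using a heuristic regex and invented page snippets rather than a full HTML parser:

```python
import re

# Heuristic: matches <meta ... name="robots" ... content="..."> with name first.
META_ROBOTS_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)', re.I)

def is_noindexed(html):
    """True if the page carries a meta robots noindex directive."""
    m = META_ROBOTS_RE.search(html)
    return bool(m) and "noindex" in m.group(1).lower()

staging_page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
live_page = '<html><head><title>Launched</title></head></html>'
print(is_noindexed(staging_page), is_noindexed(live_page))  # True False
```

Run it over a crawl of the staging site before launch, and again over the live site afterwards to confirm the tag was removed everywhere.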
Technical SEO Issues | | josh-riley0 -
Site being indexed by Google before it has launched
Duplicate question, closing this question so all answers can be given at http://www.seomoz.org/q/site-being-indexed-by-google-before-it-has-launched-2
Technical SEO Issues | | KeriMorgret0 -
Malware ranking drops
Yep, we have thought about that, but we don't think it's an issue, as there weren't any major updates around the time it should have started to recover.
Technical SEO Issues | | Sayers0 -
Archive or no archive?... That is the question!
I suppose the main problem could be a lot of similar (even duplicate) content. To avoid this, after a certain time period it might be worth 301 redirecting the old pages to their most relevant category pages. If you 301 redirect to the new archive ad, and then 301 redirect that to the next archive ad, and so on, you'll build up endless redirect chains! Perhaps it isn't worth the effort of doing what I suggested initially, and it might be best to 301 straight away. It's worth testing on separate categories if you have the time.
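If chains have already built up, they can be collapsed in one pass: rewrite the redirect map so every old ad URL points straight at its final target instead of hopping through intermediate archive ads. A minimal sketch with made-up paths:

```python
def flatten_redirects(redirects):
    """Rewrite a redirect map so every source points straight at its final target."""
    def resolve(url):
        seen = set()
        # Follow the chain; `seen` guards against accidental redirect loops.
        while url in redirects and url not in seen:
            seen.add(url)
            url = redirects[url]
        return url
    return {src: resolve(dst) for src, dst in redirects.items()}

# Hypothetical chain: old ad -> newer ad -> newer ad -> category page.
chain = {"/ad-1": "/ad-2", "/ad-2": "/ad-3", "/ad-3": "/category/bikes"}
print(flatten_redirects(chain))
# {'/ad-1': '/category/bikes', '/ad-2': '/category/bikes', '/ad-3': '/category/bikes'}
```

That's effectively the "301 straight away" option applied retroactively: every expired ad ends up one hop from its category page.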
Intermediate & Advanced SEO | | Alex-Harford0 -
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
Hey, so I guess you're wanting to achieve this (http://extensions.ecommerce-team.com/apparel?___store=shopbypro&reset_filter=1) - or something similar using AJAX, so the page updates without reloading? I guess you need to use tokens in the URL path rather than a dynamic URL. It's a big job, as you'll have to define some rules to say colour must come first, brand second, range last (or something similar), so you end up with domain.com/brown/levis/jackets/ - is that right? Essentially it's working as a site search with some defined rules, so you'd have to think about the following:

404s - if your pages are built on queries, a user could potentially create a link to a page like domain.com/brown/jackets/f**k/; obviously this would need to return a 404, or it could be indexed.

Canonicals - the previous point brings about duplicate content issues. If you don't define the rules, then you could have /brown/jackets/levis/ vs. /brown/levis/jackets/ showing the same content, so there needs to be some canonical management.

So you can use AJAX to call a particular URL. You would need to use the filters as links (<a> tags) so the pages are crawled and pull that content (search results) back. The URL wouldn't change though, unfortunately, but it would still be linked to from the <a> tag.

Is there a specific reason why you don't want to use a hashbang?
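The token-ordering rules and the 404/canonical handling can live in one routing function: recognised tokens get re-ordered into the canonical colour/brand/range sequence, and any unknown token means a 404. A rough sketch (the token sets are invented; yours would come from your catalogue):

```python
# Hypothetical facet vocabularies -- in practice these come from your catalogue.
COLOURS = {"brown", "black", "blue"}
BRANDS = {"levis", "wrangler"}
RANGES = {"jackets", "jeans"}

def canonical_path(segments):
    """Re-order recognised tokens into colour/brand/range; None means 404."""
    colour = [s for s in segments if s in COLOURS]
    brand = [s for s in segments if s in BRANDS]
    rng = [s for s in segments if s in RANGES]
    if len(colour) + len(brand) + len(rng) != len(segments):
        return None  # unknown token: serve a 404 so junk URLs can't get indexed
    return "/" + "/".join(colour + brand + rng) + "/"

print(canonical_path(["levis", "brown", "jackets"]))  # /brown/levis/jackets/
print(canonical_path(["brown", "jackets", "junk"]))   # None
```

When a request arrives in non-canonical order, either 301 to the canonical path or point rel=canonical at it, so /brown/jackets/levis/ and /brown/levis/jackets/ never compete as duplicates.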
Intermediate & Advanced SEO | | ChapterEight1 -
Classifieds and Google Panda
Hello David, What if your website allows people to post ads, but they may have posted those same ads on other websites too; would that affect the website? What do you suggest? Regards
Search Engine Trends | | helpgoabroad0