Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
One more redirect question
This would be a good place to utilize a canonical. You can specify which one you want to be the main source, and which is the duplicate.
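As a sketch, the canonical tag sits in the `<head>` of the duplicate page and points at the version you want treated as the main source (example.com is a placeholder):

```html
<!-- In the <head> of the duplicate page; href is the preferred "main source" URL -->
<link rel="canonical" href="https://www.example.com/main-version/" />
```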
| WhoWuddaThunk0 -
What's a Google penalty or why ignorance is not bliss - A tale of two web sites.
It really depends on the extent of the damage whether you need a pro or not. If it's all self-contained, as it sounds like, then it will just take identifying the duplicates and fixing the linking problems. To find duplicate articles, just copy a part of the content and Google it with quotes around it. This searches for that exact text and lets you find the instances where it shows up on your websites. Fix accordingly. As for the links, it's pretty much just a matter of using some link research tools, such as Open Site Explorer and Majestic SEO, to find all the instances where you have outbound links from one site to another. For the time being I would just separate them completely. After they recover you can go back to connecting them. I just did this for a company out of Florida. They created their own link network, and it ended up penalizing them. They recovered in 3 weeks. They had a penalty with a lowercase "p" (less destructive), not a Penguin or Panda.
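If you would rather sanity-check passages locally before Googling them in quotes, here is a rough sketch using Python's standard difflib; the sample passages are invented:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a rough 0..1 similarity ratio between two text passages."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Invented sample passages standing in for text pulled from two pages
page_a = "Our widgets are hand-made from the finest materials."
page_b = "Our widgets are hand-made from the finest materials available."

score = similarity(page_a, page_b)
print(f"similarity: {score:.2f}")  # anything close to 1.0 is worth investigating
```

A quoted Google search is still the authoritative check, since it also catches copies on sites you don't control.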
| WhoWuddaThunk0 -
Which one is worse, Panda or Penguin?
Thanks for your responses Matt and Andy, very useful information, and I will go into it deeper to work out what is going on with my site. It's interesting what Andy says about older sites getting hit the most, as we are seeing new sites with high rankings, when in the "old days" it would take ages for these sites to rank for very competitive keywords. We have not received a Google Webmaster penalty, but have been hit by one of the black and white animals. We have disavowed many links and have seen some recovery, but need to see more progress. I'll check out the links above and see if they help me understand what to do. Thanks again. P.S. if anyone has any more info, do let me know. Tai
| Taiger0 -
Is it appropriate to use review markup for testimonials without numerical ratings?
Hello Oren, Most of the information I have seen requires some kind of rating number or data in order for reviews to show in the SERP. If you look at Google's guidelines for setting up structured data for an individual review, https://support.google.com/webmasters/answer/146645?hl=en#individual , you will see that 'rating' is one of the properties. If you set this up without the rating property and then run it through the Rich Snippet Testing Tool, you will see an error, and the data will therefore not be displayed in the SERP. Also, in my experience, ratings structured data can be a bit finicky: if the markup is not 100% accurate, it does not display. Hope that helps.
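For illustration only, a sketch of an individual review marked up in JSON-LD with the rating property included; every name and value below is invented:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "LocalBusiness", "name": "Example Dental" },
  "author": { "@type": "Person", "name": "Jane Doe" },
  "reviewBody": "Great service, would recommend.",
  "reviewRating": { "@type": "Rating", "ratingValue": "5", "bestRating": "5" }
}
</script>
```

Dropping the `reviewRating` block is exactly the case that fails in the rich snippet testing tool.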
| Whebb0 -
Specific question about pagination prompted by Adam Audette's Presentation at RKG Summit
Hi Dana, The problem when it comes to passing authority internally is that properly paginated and crawled listing pages can be one of the primary routes via which Google finds and assigns authority to internal pages. Unless those products are linked to elsewhere, they're not going to be found if they cannot be found on a URL like http://www.ccisolutions.com/StoreFront/category/audio-technica?page=2, ?page=3 etc. The lack of a unique URL with content changed dynamically also means that there never could be a good flow of authority through the site as Google does not have new pages to crawl and new outbound links to index / follow on those pages. Your diagram is correct - the second option (Page 1 ---authority---> page 2 ----authority---> page 3... ) is what you're looking for with pagination.
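As a sketch of what crawlable pagination markup for that category could look like, here is the `<head>` of ?page=2 using rel prev/next; the URLs are the ones from the thread, but treat the exact pattern as illustrative:

```html
<!-- In the <head> of .../audio-technica?page=2 (a middle page carries both links) -->
<link rel="prev" href="http://www.ccisolutions.com/StoreFront/category/audio-technica" />
<link rel="next" href="http://www.ccisolutions.com/StoreFront/category/audio-technica?page=3" />
```

The key point is that each page of results has its own crawlable URL for these links to point at.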
| JaneCopland0 -
Bing is not Indexing my site.
Thanks everybody for your responses. I set up a Bing Webmaster account four days ago and am waiting for it to be updated with my website's data. I hope it will be able to provide me some insight. Thanks again.
| saurabh19050 -
What do you think about my new site?
Hi Mike, Well, the site is in Italian, and about football (which I don't really follow as a sport), so I can't really read what is there, but there are two points that I would make. The first: I had only been there 5 seconds, and then I lost the page to a huge mid-window popup. I hate these. The second is that you need to create a favicon. You can do that here: http://www.favicon.co.uk/ -Andy
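Once the favicon file is generated, it is typically referenced from the `<head>` like this (the path is a placeholder):

```html
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />
```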
| Andy.Drinkwater0 -
Link building with AddThis URL
Mike, this comment you made is correct: "my understanding is that Google disregards everything after the "#" so there shouldn't be a duplicate content issue." If you do somehow see one of these getting indexed in Google then you have an issue, but I have not seen this happen.
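You can see the same behaviour locally with Python's standard library: the fragment is never part of the URL a crawler actually requests. The URL below is invented, in the AddThis style:

```python
from urllib.parse import urldefrag

# Invented URL with an AddThis-style "#" share token appended
url = "https://www.example.com/article/#.UcQzLz_hVxo"
base, fragment = urldefrag(url)

print(base)      # https://www.example.com/article/  (what gets requested/indexed)
print(fragment)  # .UcQzLz_hVxo  (client-side only)
```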
| KaneJamison0 -
Why Custom Post Types Don't Get Ranked Well
There is nothing about custom post types that is inherently different from posts. The biggest issue is making sure you properly set up the site structure so that they are interlinked well with the rest of the site. Depending on your SEO plugin, make sure the right URLs are in your sitemap as well. Other than that, you could be having issues with newer content taking longer to rank, but it's not an issue of custom posts vs. posts.
| KaneJamison0 -
Two Dentists, Same Address, Same Phone, Different Business Names
Thanks, Miriam. Agreed on all of the above.
| danatanseo1 -
Assessing Link Profiles
Hi Neil, Through no fault of their own, some sites end up being blacklisted, so it is always worth checking this. A useful service is www.urlvoid.com. This will tell you if the site has any worrying markers attached to it. Marty has also given you some good pointers there. On top of this, check to see if the page is actually indexed by Google. Doing a "site:...." search can often show interesting results. If a page isn't indexed by Google, there is normally a good reason why. -Andy
| Andy.Drinkwater0 -
Multiple sub domain appearing
Just to let you know, it was an issue in the DNS server's A records: a wildcard (*.) entry. All sorted now!
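For anyone hitting the same issue, the culprit is typically a wildcard A record like the first entry below, which makes every subdomain resolve; the names and IPs are illustrative:

```
; problematic: any subdomain of example.com resolves to the site
*.example.com.    IN  A  203.0.113.10

; safer: list only the hosts you actually use
example.com.      IN  A  203.0.113.10
www.example.com.  IN  A  203.0.113.10
```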
| nezona0 -
Need help please with URL guidelines.
Good Morning, Is there any way I can check which internal URL is getting link juice from backlinks? Thanks Abie
| signsny0 -
How to avoid duplicate content when blogging from a site
Jane, Thank you very much. That is what I try to do: blog on a subtopic to add more depth to the page topic. I have just been paranoid after Penguin about trying to avoid similar keywords and phrases that appear on the page, but it's impossible for a blog to make sense without some repetition. Dr. Seckel
| wianno1680 -
URL Structure
Hi, I agree with your suggestion. This is the best way of structuring your new URLs: /shopping/feeding/ /baby-list/feeding/ /ask/feeding/ Thanks, Tahir
| signsny0 -
Value in Consolidating Similar Sites / Duplicate Content for Different URLs
**We're thinking of consolidating the smaller sites into our most successful site (www.product1.com) in order to save management time and money, even though I hate to lose the product-specific URLs in search results. Is this a wise move?** As a general answer, I would say that it is a wise move. In my opinion, in most instances the only time you should have two sites competing with one another is when one site is totally dominant in its business niche and you have the time to build unique content on a second site. Even then, it might be more profitable to spend that same effort adding additional products to your main site or launching a new site in a new niche. Keep in mind that this is a general answer, and a detailed study should be done to determine if closing some of your sites is a smart move. **If we proceed, all of the products will be available on both our company site and our most successful site (www.company.com & www.product1.com). This would unfortunately give us two sites of duplicate content, since the products will have the same pictures, descriptions, etc. The only difference would be the URL. Would we face penalties from Google, even though it would make sense to continue to carry our products on our company site?** If this were my company, I would be investigating closing four sites and doing 301 redirects to the one that remains (assuming that a detailed study supports that and all of the sites are in superb health with great link profiles). But you ask about duplicate content. If you have that, then one site could be dropped from the SERPs. If the two sites have links that connect them, there is a very high probability that Google will kill one. Google has been able to detect that and kill one of the sites for about ten years. Disclaimer: what I wrote here is generalized opinion. Detailed information could change my mind... and I don't have time to do that type of evaluation even if the data were presented here. It is very time-consuming.
If these were my sites I would make up my mind about this type of move over a period of months and not in a few moments.
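If consolidation goes ahead, the mechanics are usually server-level 301s on each retired domain. A rough Apache sketch, where www.product1.com is the surviving site from the question and product2.com stands in for a retired one:

```apache
# .htaccess on a retired domain: send every URL to its counterpart
# on the surviving site (assumes mod_rewrite is enabled)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?product2\.com$ [NC]
RewriteRule ^(.*)$ https://www.product1.com/$1 [R=301,L]
```

Path-for-path redirects (rather than sending everything to the homepage) preserve the most value from the old product-specific URLs.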
| EGOL0 -
Product page Canonicalization best practice
Awesome Reed. Thanks for the hat-tip on Twitter too. I appreciate that. Cheers!
| danatanseo0 -
2 sets of stats for same site
Hi there, Basically, "www." is a subdomain like any other - it's just a subdomain named "www." and happens to be extremely common due largely to tradition and the set-up of most content management systems. Instead of being named "uk." or "en." (like at Wikipedia), "help.", "analytics." (in analytics.moz.com), etc., it's called www. The screenshot example would be easier to make sense of if you were comparing the domain http://yoursite.com/ and a subdomain like http://blog.yoursite.com/ - the first shows the authority pointing to your domain's root. The second shows only the blog subdomain's authority. This is the same thing, but with "blog" replacing "www". It looks here like there is a split between how many incoming links you have to the root and the www. That is, links point to both "versions". If you try to load http://www.yoursite.com and http://yoursite.com/, do both pages load with the same content? If so, one either needs to 301 redirect to the other, or you need to place the canonical tag on the version you do not wish to be indexed and well-ranked, pointing to the other. If you do not do this, Google finds two versions of the same page (or entire site, if ALL your pages load twice, once with www and once without). Both versions have inbound links, so Google doesn't know which one is meant to be the boss. Worst case scenario is that it ranks both versions poorly as a result, so either redirection from the non-preferred version to the preferred version, or canonicalisation, are the way to go. You can also set your preferred domain in Webmaster Tools under Settings (see screenshot): http://i.imgur.com/sgrvPKo.png This may be all basic to you, but hopefully it helps explain why there are two sets of numbers between the "root" and the "same" URL with www. attached. Let me know if this isn't clear. Best, Jane
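For the 301 redirect option Jane describes, a minimal Apache sketch (assuming mod_rewrite, and that the www. version is the preferred one; yoursite.com is a placeholder):

```apache
# Redirect the bare domain to the www. version with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

After this, the non-www "version" of every page permanently redirects, so links to both hostnames consolidate onto one.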
| JaneCopland0 -
Pages with a short life time... example Flash Sales Ecommerce Sites?
Hello Ahalliday, First, apologies for the delay. Thank you very much for the answer you have put here... it makes sense for sure. If I may extend the problem: what about at the architecture level? I do not want to cause indexation bloat, so I only want the right pages indexed, not all of them. I want to put a policy in place using rel=nofollow or exclusion with robots.txt, and I also need to clarify some rel=canonical matters. Attached is an image of the architecture I have in mind. Let me know what you think about it. Regards, Talha edit?usp=sharing
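For what it's worth, one common pattern for short-lived pages like flash sales (not necessarily what was recommended above) is a meta robots tag rather than a robots.txt block, since robots.txt stops Google from ever crawling the page and therefore from seeing a noindex on it. The URLs below are invented:

```html
<!-- On each short-lived flash-sale page: keep it out of the index,
     but let crawlers follow its links and consolidate signals upward -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://www.example.com/sales/shoes/">
```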
| MTalhaImtiaz0