Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi Stefania! Yes, you have correctly installed the code snippet for Google Authorship across your site using a footer widget, and linked to your blog from your personal Google+ profile. You can keep this setup as long as this remains a one-author blog. Google Webmaster Tools has a structured data testing tool that is very useful for determining if Authorship is working for any page. It shows that Authorship is working for your first blog post, and gives you a nice preview of what the search result will look like if Google decides to show your author photo with it - see http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fjijil.pasadena2shop.com%2F. Remember that correctly setting up Google Authorship does not guarantee that your author photo will show in search results. Focusing on creating high-quality content on both your blog and Google+, and becoming an authority on the topics you write about, however, will increase your chances of standing out in search results with Google Authorship. Please let me know if you have any questions, and best of luck with your new blog! Christy

    | Christy-Correll
    1

  • That is a weird one; even Googling the URL, it's not showing up. It stinks of a penalty, but you have no message in GWT and the backlinks look clean (even a backlink from RTE, which should be golden). And it seems Google is caching it OK: http://webcache.googleusercontent.com/search?q=cache:ZXKkuLuk6UkJ:www.yourdairygold.ie/+&cd=1&hl=en&ct=clnk&gl=ie The only thing I noticed is that www.butteritwithdairygold.ie is 301 redirecting to it, and I thought that maybe that domain is burnt, but again its backlink profile seems clean.

    | PaddyDisplays
    0

  • I don't deal with implementation or specifics of our site search, but we're on Magento and I know we've used Google site search and Solr for one of our sites. We haven't been able to find one we're really happy with.

    | Kingof5
    0

  • Hey Andy, Thanks for reaching out! The Duplicate Content errors are actually aggregated using similar criteria to Google's: a 95% similarity at the code level. So even if something like the title is different, if the overall code is 95% the same, the pages will be flagged and could potentially be penalized by the search engines. Using third-party duplicate content checkers on the URLs you provided, it does look like all but one combo meets the 95% threshold (a rough way to reproduce the check yourself is sketched below this answer). In that one case, we may have found them to be 95% similar at the time of the crawl, but they aren't any longer, so that will be reflected in the next crawl. To get more information on Duplicate Content, check out our Help Hub. I hope that makes sense. Let me know if you have any other questions and have a great day!

    | SamWeber
    0
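
A rough way to reproduce the 95% code-level check described above: a minimal Python sketch, where the URLs are hypothetical placeholders and difflib's ratio is only a stand-in for the crawler's unpublished similarity algorithm.

    # Fetch two pages and compare their raw HTML source.
    # difflib's ratio() approximates "code-level similarity"; the exact
    # algorithm used by crawlers and search engines is not public.
    import difflib
    import urllib.request

    def page_similarity(url_a, url_b):
        html_a = urllib.request.urlopen(url_a).read().decode("utf-8", errors="replace")
        html_b = urllib.request.urlopen(url_b).read().decode("utf-8", errors="replace")
        return difflib.SequenceMatcher(None, html_a, html_b).ratio()

    # Hypothetical URLs; substitute the page combos flagged in your crawl report.
    score = page_similarity("https://example.com/page-a", "https://example.com/page-b")
    print(f"Code-level similarity: {score:.1%}")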

  • Sorry, I should have clarified. The redirect needs the full path of the image, and because the pattern is a regex it needs RedirectMatch rather than Redirect. The actual rule would be something like: RedirectMatch 301 ^/images/15985\.jpg(.+)$ /images/15985.jpg depending on where the image actually lives.

    | WilliamKammer
    0

  • Thanks for the response. The issue of "hiding" the content with the randomization was a fear of mine. Believe me, I don't like the rotating content design, but it's where we're at right now. We have more than 3 search results (think specific businesses), but for user experience, only 3 will be shown at once. This is not something that can be changed, unfortunately. If more than 3 are in that specific business category, we'll be rotating them out (which I don't like) upon refresh. The only solution I can think of is to have the top 3 remain static and allow the user to click a "Show more" button which loads more results beneath (or replaces the original 3). Either way, Google shouldn't have an issue with that, correct? I know there are "better" ways to accomplish what we're asking, but the site is custom built and nearly 95% complete. We are also taking a unique approach to the way we display results and serve them to our clients, so the most optimal way is not achievable at this point. It's basically finding the most optimal approach given what we can do, if that makes sense. Thanks for understanding!

    | kirmeliux
    0

  • Actually don't bother with this one, I figured out what I needed to know. Thanks!

    | IceIcebaby
    0

  • So glad this helped!

    | MiriamEllis
    0

  • Hi Phil, Thanks for the reply. The quote (I've since sourced it) comes from Forrester (http://blogs.forrester.com/interactive_marketing/2009/01/the-easiest-way.html), but in checking, even IF there was any truth to it back then (2009), there doesn't seem to be any now. Regardless, your insights on video are helpful, and along with the info on SERP Turkey, I've got a bit more direction to go in. Thanks again.

    | Gordon_Hall
    0

  • Go to WMT for whatever dev site you want to remove from the index. Use the URL removal tool, but in the box just enter /, nothing else. This will remove the entire site.

    | Kingof5
    0

  • I've seen this happen several times. What probably has happened is that your penalty has expired. All manual penalties do expire; for some it can take months and for others it can take years. When it expires you get no notification that anything has happened, other than the manual spam actions viewer showing no webspam actions.

    You still want to make sure that you have done as thorough a job as possible. If you happen to get another manual review and you still have unnatural links, then you'll get penalized again, and often the second penalty is worse than the first. Also, if you had not done enough cleanup to pass previously, it means there are still unnatural links that could affect you via the Penguin algorithm.

    Now, one other possibility: I have seen a couple of instances where there is some incongruity between the manual spam actions viewer and the messaging system, and one changes a day or two before the other. It's more common for it to happen the other way, where you get the message that your penalty has been revoked and then it takes a few days for the viewer to show it, rather than the viewer showing no penalty and the message appearing afterwards. But I don't think this is the case for you guys, as it sounds like you were not waiting for a response from Google.

    | MarieHaynes
    0

  • Hello WMCA, Does BV want you to add that rel="canonical" tag to the main product page, or just to the paginated review pages? If the former, I say no way. If the latter, we should discuss further. It could be advantageous, but I'd rather send the authority to the main product page instead.

    | Everett
    0

  • Hello, I have one more question. When the writer of an article cites a source, how should they place the link in the article? I see that the site name is most often used as the anchor text, but what should go in the anchor: the root domain (www.google.com), the deep link (www.google.com/car-is-driving), or a combination of both? Thank you

    | Ivek99
    0

  • Agree with Lesley - there's little to no benefit in stuffing keywords into a URL (which was a "traditional" reason why people added multiple subcategories), and excessive categorisation / siloing shows diminishing returns. I would stick to as flat a structure as possible whilst keeping a sensible hierarchy of information. Cheers, Jane

    | JaneCopland
    0

  • Any chance you can screenshot and redact the domain - I am not sure that this is being read right...

    | rishil
    0

  • My personal rule of thumb: as few redirect jumps as possible. Three main reasons:

    1. User journey + browsers - when there are too many redirects, some browsers struggle to follow through and will simply not load the page. Even with only 2-3, the page may load, but users on slower connections may find it tiresome waiting for content.
    2. As ThompsonPaul highlights, you COULD lose some link value to dilution through 301 redirects.
    3. Multiple 301 redirects are often used by spammers, and I foresee these causing a lot of ranking headaches in the near future.

    The older the site, the longer the chain might end up. For example, imagine you had a product at https://domain.com/product1, and links to that page exist at domain.com/product1. The journey would be:

    domain.com/product1 > http://domain.com/product1 > https://domain.com/product1

    Now imagine a year down the line, product 1 is discontinued and you decide to redirect https://domain.com/product1 to domain.com/product2. Imagine your journey now:

    domain.com/product1 > http://domain.com/product1 > https://domain.com/product1 > domain.com/product2 > http://domain.com/product2 > https://domain.com/product2

    This could carry on indefinitely over the lifetime of the site... Best solution: decide which version of the site you want to use and try to use only one redirect, not a chain. Periodically check for chained redirects and resolve them as you go along; I try to do this bi-annually, and a quick way to script the check is sketched below this answer.

    | rishil
    0
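
For the periodic chain check recommended above, a minimal Python sketch using the requests library (the URL is a hypothetical placeholder, reusing the answer's domain.com example):

    # Follow a URL through its redirects and print the full hop chain.
    # response.history holds each intermediate redirect response in order.
    import requests

    def redirect_chain(url):
        response = requests.get(url, allow_redirects=True, timeout=10)
        return [r.url for r in response.history] + [response.url]

    chain = redirect_chain("http://domain.com/product1")
    print(" > ".join(chain))
    if len(chain) > 2:  # more than one redirect before the final URL
        print(f"{len(chain) - 1} hops detected; consider collapsing to a single redirect.")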

  • So here is your issue: run a review and start removing and disavowing all the bad links (the disavow file format is sketched below this answer). A lot of penalties went out in April, which is why MozCast was so volatile; there wasn't a major update, IMHO.

    | rishil
    0
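
On the disavow step mentioned above: Google's disavow file is a plain UTF-8 text file with one "domain:" entry or full URL per line and "#" for comments. A minimal Python sketch that writes one (the domains and URLs below are hypothetical placeholders):

    # Emit a disavow.txt in the format Google's Disavow Links tool accepts.
    bad_domains = ["spammy-directory-example.com", "paid-links-example.net"]
    bad_urls = ["http://link-network-example.org/footer-links.html"]

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("# Links identified during the link review\n")
        for domain in bad_domains:
            f.write(f"domain:{domain}\n")  # disavow an entire domain
        for url in bad_urls:
            f.write(f"{url}\n")  # disavow a single URL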

  • Hi Tamara, Thanks for the clarification. There are special considerations when a business is a brick-and-mortar concern, but the conclusion is pretty much the same in most cases: the single site approach is simply better most of the time, enabling you to amass a great deal of authority in one place rather than having it spread thin over multiple places. You've asked a smart question!

    | MiriamEllis
    0

  • Hi Dan - thanks for looking into this. Our traffic from organic search has indeed dropped (Google only; rankings and traffic from Bing/Yahoo have remained stable). Hopefully we've taken care of all the shady backlinks via disavow. Like you said, however, it could be a while before we know if this has had any effect. Most of the links you referenced, and most of the ones that needed to be eliminated, came from websites linking to content that existed on our domain prior to the agency purchasing it almost 10 years ago. You're right about the unusually high number of indexed pages. The inflation is from our blog "tag" pages. We've put a dofollow/noindex on all of these pages. They're pretty deep on the site, though, so I expect it will take a while for them to be crawled again for de-indexing. We actually had a 2-day recovery just over a week ago. Then, as quickly and inexplicably as the recovery came, we again lost rank on our generic terms. I'm going to add some of this info to the main post now. It certainly is bizarre, so I'm hoping someone might be able to identify what might have caused the site to recover and then drop again over the course of 48 hours.

    | f1_path
    0

  • I wouldn't freak out too much over the crawl rate immediately. Wait a few weeks and see how things go. It sounds like you did the right thing and should see the benefits over the next few weeks.

    | katemorris
    0