Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Are they two truly different stores, as in two different companies?  If so, a redirect would seem odd to users.  If they are shopping on one site, they expect the products listed there to be sold there, and when they click a product link they expect to stay on that site.  Breaking those expectations by listing products the site doesn't offer, or redirecting people to a different site, would be confusing and could cause them to leave. I like Schwaab's alternative suggestion: at first, put a message on the product pages indicating that the product is no longer available through that site but can be purchased on your sister site.  After a while, for the sake of visitor usability/experience, I would still remove the product pages (and internal links to them) entirely from the site that no longer offers the products. Kurt Steinbrueck, OurChurch.Com

    | Kurt_Steinbrueck
    0

  • It took us 4 reconsideration requests before our unnatural link building penalty was lifted. In order to have the manual penalty revoked, we downloaded all our backlink lists from Moz, Bing, and Google Webmaster Tools, imported them into a Google Doc, and removed the duplicates. We then contacted every site and made a note of the contact details, along with the date, in a new column on the Google Doc; we would then revisit the sites the following week and email them again. After 3 attempts we would note "unable to contact webmaster for link removal" and add the URL to the disavow list. Once we had gone through the whole list, which did take a long time, we submitted the disavow list first, then submitted our reconsideration request explaining what we had done since our last request and included a link to the Google Doc, as the spam team can access this and see all the hard work you have done. Good luck with it, and I hope you recover soon.

    | madaboutmedia
    0
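The merge-and-dedupe step described in the answer above can be sketched in a few lines of Python. This is an illustrative sketch, not part of the original answer; the export contents and URLs are made up, and deduplication here is done per linking domain:

```python
# Hypothetical sketch: merge backlink exports from several tools and
# remove duplicates by linking domain. File contents/URLs are made up.
from urllib.parse import urlparse

def merge_backlink_lists(lists):
    """Combine lists of backlink URLs, keeping one URL per root domain."""
    seen = {}
    for urls in lists:
        for url in urls:
            domain = urlparse(url.strip()).netloc.lower().removeprefix("www.")
            if domain and domain not in seen:
                seen[domain] = url.strip()
    return sorted(seen.values())

moz_export = ["http://example.com/page-a", "http://www.example.com/page-b"]
bing_export = ["http://spammy-links.example.net/dir"]
merged = merge_backlink_lists([moz_export, bing_export])
# merged now holds one URL per unique linking domain.
```

Outreach tracking (contact dates, attempt counts) would then live alongside each deduplicated URL, as the answer describes with the Google Doc columns.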

  • Even though there are fewer pages indexed compared to those that are blocked, you still have a significant increase in indexed pages as well.  That is a good thing!  You technically have more pages indexed than before.   It looks like you possibly relaunched the site or something?  More pages blocked could be an indexing problem, or it might be a good thing - it all depends on what pages are being blocked. If you relaunched the site and used a great new whiz-bang CMS that created an online catalog giving your users 54 ways to sort your product catalog, then the number of "pages" could increase with each sort.  Just imagine: sort your widgets by color, or by size, or by price, or by price and size, or by size and color, or by color and price - you get the idea.  Very quickly you have a bunch of duplicate versions of a single page.  If your SEO was on his or her toes, they would account for this using a canonical approach, a meta noindex, a change to robots.txt, etc.  That would be good, as you are not going to confuse Google with all the different versions of the same page. Ultimately, Shailendra has the approach that you need to take.  Look in robots.txt, look at the code on your pages.  What happened around 5/26/2013?  All those things need to be looked at to try and answer your question.

    | CleverPhD
    0
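The canonical approach mentioned above can be illustrated with a small sketch that collapses all sorted variants of a catalog URL to one canonical URL. The parameter names (sort, color, size, price) and the shop URL are hypothetical:

```python
# Hypothetical sketch: strip sort/filter query parameters to derive the
# rel=canonical target shared by all sorted variants of a catalog page.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

SORT_PARAMS = {"sort", "color", "size", "price"}  # assumed parameter names

def canonical_url(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in SORT_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

# Every sorted variant collapses to the same canonical URL:
canonical_url("http://shop.example.com/widgets?sort=price&color=blue")
# -> "http://shop.example.com/widgets"
```

The returned URL is what would go into each variant's `<link rel="canonical">` tag, so the 54 sort combinations no longer look like 54 separate pages to Google.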

  • Hi Gianluca, Thanks for your answer. And sorry for the delayed response. I'll ask John Mueller for an answer. Thanks again.

    | Online_Supply
    0

  • Let us know how it goes, either with a response here, or a case study on YouMoz. Best of luck!

    | KeriMorgret
    0

  • Something else I would suggest: UserTesting.com.  You set up scenarios for users to walk through your site and can specify the age, sex, location, and income of the users.  You get back video with commentary on your site.  You can do 3 users for less than $200, and you get some really useful information that will really help bolster whatever input you get from the conversion companies.  Think of it as a way to put together a quick focus group, and you often get results back in less than 1 hour, so it is fast, too.

    | CleverPhD
    0

  • Hi Mike, It's quite difficult to say, but I would offer this bit of solace: when I've seen sites that have been hit with a Panda penalty in the past, it has happened a day or 2 after receiving a similar warning to the one you have reported. And similarly, although not the same penalty, people have reported that they have received an unnatural link warning and then 2 or 3 days later have seen the big drop in rankings and traffic. Both of these instances indicate that the notification of a possible penalty comes before the penalty itself.  If (and it is an 'if') this is the case with your site, then the big drop in traffic can be attributed to the penalty that you thought was coming, rather than the action you took yourself. It does sound like the crawl and the warning would point to potential duplicate content issues or low quality content issues, aka a Panda penalty.  I'd do your due diligence and make sure that you have not blocked any key landing pages via the robots.txt file or by any other method - always a good idea to double check.  However, if that's not the case, I would say that the drop is because of the penalty that was likely coming, rather than what you've done. The next step would be to wait until the algorithm refreshes - Panda updates roll out at least twice a month now, so hopefully your efforts should be rewarded soon enough and the traffic will return. However, it is worth considering that just because the Penguin algorithm updates on a given day does not mean its impact is limited to that day or the days that follow.  What I think we've seen with Penguin 2.0 is that the impact of the algorithm updating has been spread out over refreshes, so we can't completely rule this out as Penguin action as well.  SERP volatility checkers like MozCast seem to suggest that this probably isn't the case - all was quiet on August 13th on that algorithm - but it's worth bearing in mind.
I hope this helps - do take my advice with a pinch of salt as it's an educated guess from my experience rather than any definitive evidence.

    | TomRayner
    0

  • Hi Nowspeedtom, This is a good question. While there is no documentation of direct ranking benefits from participating in Google Business Photos, it stands to reason that enhancing your content may have some benefits in terms of user behavior and possibly conversions. However, bear in mind that one of the 2 main places that Google displays these tours (the Google+ Local page) has just been downgraded in visibility. Recent changes in Google's display are making it harder and harder for the average user to actually access the Google+ Local listing, so I'm guessing that if any users would get to the virtual tour, it would have to be through Google Maps. Something to think about.

    | MiriamEllis
    1

  • Thanks, this is what I seem to be coming round to. I'm very wary about doing too many redirects, though, as in my experience it's a quick way to ruin perfectly good rankings. I'm thinking loads more cross linking would help, and we'll probably start with a 'related products' section on each page.

    | Blink-SEO
    0

  • Todd - Thanks for your message.  On the bright side - a quick response to my request. Today I received a message back that Google removed the manual penalty for outbound links.  Apparently they agreed with us. Again, many thanks. M

    | seoagnostic
    0

  • I agree with Phillip. If your site is established and ranking well for brand terms, then making this change shouldn't affect the overall branded rankings, especially seeing as you're not taking the brand out completely. The keywords should ideally be the main focus of the page, thus having your keywords at the front of the title should boost rankings for those targeted terms. I have always used and recommended: Main Keyword | Secondary Keyword - Brand Name. On a side note: something I recently started noticing (I don't have any resources to refer to) is that quite a few of our competitors don't use the "Keyword | Keyword - Brand" formula; instead they use "Great [Keyword] deals from leading [keyword] suppliers - Brand Name" and they seem to be ranking really well. (More descriptive words that entice users.) Perhaps Google is starting to prefer "natural" looking titles (including stop words, adjectives, etc.) and not putting much extra value on "optimized" title tag formulas (Keyword | Keyword - Brand). This is just a thought, and I haven't researched it much. Greg

    | AndreVanKets
    0

  • This is done automatically by WordPress.

    | driftingbass
    0

  • Were you requiring reciprocal links for a link in your directory? If so, I'd burn the whole thing to the ground and ask for removal of links where possible. If you really think it helps users, you could add nofollow to the links instead. Without knowing more about the directory, how it was run, and your link profile, it's hard to give solid advice.

    | Carson-Ward
    0

  • To answer your question directly: don't worry about it. I mean that both ways. Don't worry about the fact that you've not linked out to a lot of sites in the past. Google's not going to drop you down simply because you don't have a lot of outward-pointing links. Second, much like Chris said, don't make a point of avoiding external links.

    | Carson-Ward
    0

  • Hello John, This is a very good question, and something people don't often think about when blocking the navigational paths on their site from being crawled. Depending on how fast your category pages load and how many products are on each of them, you may consider a View All canonical page: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html There are many different ways to handle faceted navigation problems, including JavaScript, GWT parameter handling, robots meta, robots.txt, rel canonical... and combinations of these. The right approach should be customized for your specific needs. When possible, I prefer to allow Google to crawl and index down to a certain level of faceting, similar to allowing them into sub-categories (though it depends entirely on your taxonomy) but not tertiary (i.e. sub-sub) categories. For the next couple of levels I might allow them to crawl, but not index. And once it gets down to 4 or 5 levels deep (e.g. /?category=1&size=5&color=blue&price=low&this=that&so-on=so-forth...) I just block them from being both indexed and crawled (i.e. meta NOINDEX,NOFOLLOW or a robots.txt block) to save crawl budget by avoiding spider traps. With all of that said, if you are giving Google an XML sitemap that contains the indexable URLs for all of your products, they should have no problem indexing them, regardless of whether or not they can crawl all the way through your faceted navigation.

    | Everett
    0
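The tiered crawl/index policy described in the answer above can be sketched as a function that picks a robots meta directive from the faceting depth of a URL. The depth thresholds below are illustrative assumptions, not a universal rule:

```python
# Illustrative sketch: choose a robots meta directive from how many
# facets (query parameters) a URL carries. Thresholds are assumptions.
from urllib.parse import urlsplit, parse_qsl

def facet_policy(url):
    depth = len(parse_qsl(urlsplit(url).query))
    if depth <= 1:
        return "index,follow"    # shallow facets: crawl and index
    if depth <= 3:
        return "noindex,follow"  # mid-depth: crawl, but keep out of the index
    return "noindex,nofollow"    # deep combinations: block to save crawl budget

facet_policy("/shop?category=1")                              # -> "index,follow"
facet_policy("/shop?category=1&size=5&color=blue&price=low")  # -> "noindex,nofollow"
```

In practice the same tiers could be enforced with robots.txt patterns or parameter handling instead of meta tags, as the answer notes; the right mix depends on the taxonomy.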

  • I've used Site Tuners as well.  They have an affordable quick review where they go over your page and call you with their recommendations.  Very effective.  I always saw conversions increase after following their advice.

    | dseckler
    0

  • Instabill, Broadshout gives you the best answer in my opinion - use a subdirectory format. Note: if your site is large and has a lot of subdirectories, it could be problematic. If your blog is broken into categories and you then have blog/category/post/, you do not want them to have to go any deeper. So, make sure that getting to the blog is at most a one-step process from the home page. You said: "Will changing the domain really be that helpful (i.e. will the change get our blog on page one for the term instabill)..." Well, the answer is... it is likely, but will depend on all the other factors as well. It will improve your ability to rank for that term better than what you have, all other things being equal. Good luck. Robert

    | RobertFisher
    0
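The "at most one step from the home page" advice above can be checked with a tiny helper that counts directory levels in a URL path. The URLs below are placeholders:

```python
# Illustrative helper: count how many directory levels deep a URL path is.
from urllib.parse import urlsplit

def path_depth(url):
    return len([seg for seg in urlsplit(url).path.split("/") if seg])

path_depth("http://example.com/blog/")                # -> 1
path_depth("http://example.com/blog/category/post/")  # -> 3
```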

  • Just to add on to Mike's response, it depends on how the description tabs are created. If each tab is created on a different page, then naturally Google will treat them as separate pages. However, if all the tabs are created on the same page, but CSS/AJAX is used to display each tab separately, then Google will still consider all the tabs to come from the same page. Besides Googling, you can also check the page source code. If the content of all the tabs appears in the source code, it will all be crawled as a single page.

    | ReferralCandy
    0
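The source-code check suggested above can be automated with a simple sketch: if a tab's text appears in the raw HTML, it is served with the page and crawled as part of it; if it is loaded later via AJAX from a separate URL, it will not be there. The sample HTML and snippets below are made up:

```python
# Illustrative sketch: tab content that appears in the raw HTML source is
# crawled as part of the page; content fetched later via AJAX is not.
# The sample HTML and snippets are made up.

def tabs_in_source(html, tab_snippets):
    """Return the tab snippets that are present in the raw HTML."""
    return [s for s in tab_snippets if s in html]

sample_html = """
<div class="tabs">
  <div id="description">Hand-made ceramic mug.</div>
  <div id="shipping" style="display:none">Ships within 3 days.</div>
</div>
"""
# Both tabs are in the source here, so both are crawled with the page:
tabs_in_source(sample_html, ["Hand-made ceramic mug.", "Ships within 3 days."])
```

For a live page, the same check would run against the HTML fetched from the server (before any JavaScript executes), which is exactly what "view source" shows.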

  • Thanks Takeshi. I strongly agree with creating a blog and resource center. The issue is that the priority now is to create those unique product and category descriptions (just for the record, I am in South America), and the opportunity and priority is to first secure unique descriptions and later move to other value-added content/inbound alternatives. That's why I am looking for expertise/training in writing remarkable product and category descriptions. Thanks! Finn

    | insite360
    0

  • Thanks, Phil, I'll go check those out, as soon as I put out a couple of fires.

    | Doc_Sheldon
    0