Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • A couple of things come to mind: why don't you want the product pages in the index, and why the concern about a penalty? As to your question: are you signed into Google when you are searching? Google will show you these types of results in the SERPs, but they are not necessarily shown to customers. (If you have Google Desktop installed, it will even show documents on your own machine in the SERPs if it finds them relevant to what you are looking for.) Two things to check: if the URLs have parameters, you can set those in GWT and tell Google how to handle them as it crawls the site; and, where possible, canonicalize the pages you don't want in the index to the static pages.

    | Shawn_Huber
    0
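
  The canonical suggestion above boils down to one tag in the `<head>` of each parameterized page. A minimal sketch (the URLs are hypothetical placeholders):

  ```html
  <!-- On the parameterized product URL, e.g. https://example.com/widget?sessionid=123 -->
  <!-- Point search engines at the static page you do want indexed -->
  <link rel="canonical" href="https://example.com/widget" />
  ```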

  • Check Google Webmaster Tools to see whether you actually got a manual penalty, and what their stated reason was.

    | NowHealth
    0

  • To my knowledge, Pinterest is a very high-quality linking source. Having several links back and forth should mean that you have an equal amount of images, products, and/or pages. No matter how good the source is, when you overuse something it doesn't look natural. From what I understand, Penguin 3.0 really targeted linking "neighborhoods", which means you could see an effect from the link associations your linking sources have. That could be Pinterest, but my gut tells me it is probably some other source; social media sites are pretty safe and usually have ways to prevent bad associations. Have you used Open Site Explorer to compare against what you have in GWT? I would take a look at the quality of your links and see if there is any low-hanging fruit you should disavow. You want high-quality linking sources; these are the safest. I would try to eliminate links from low-authority domains without many linking root domains behind them. You might be suffering the effects of someone else's bad links. Have you received any manual notices in GWT?

    | MonicaOConnor
    0

  • Hi Derek, sorry for reviving a very old thread, but I was wondering how it went with this project? Thanks, Ricardo

    | rflores
    0

  • Duplicate content isn't a big deal for ecommerce sites; you're unlikely to be penalized for it.

    | Kingof5
    0

  • Hi, as far as I know there is no way to do this in Webmaster Tools. You can test your robots.txt file with the robots.txt Tester, but you need to actually update the live file to block URLs from being crawled. At any rate, you normally would not block 404s from being crawled: Google will either stop crawling them on its own, or, if they are indexed, leaving them crawlable lets them drop out of the index.

    | evolvingSEO
    0
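
  For reference, the live-file update mentioned above is just a plain-text rule. A minimal sketch, assuming a hypothetical /old-section/ directory you actually did want to keep crawlers out of:

  ```text
  User-agent: *
  Disallow: /old-section/
  ```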

  • Good plan, thanks for this and all your help.

    | Wagada
    0

  • Hi Jim, thanks for the response, although it's a little assumptive. As we all know, the landscape has changed from stuffing keywords into the title (which I'm sure you're not implying) to more descriptive titles about what we actually are, so we're happy. The creative industry, or the way you approach this, may be different in Canada. We leave our homepage to explain exactly what we are. On .co.uk you'll find our main competitors for the term "creative digital agency", which is essentially what we are too: creative in our approach, process, and designs as a digital agency. Take a look, for instance, at the homepage title of one of the most successful agencies in the country, www.hugoandcat.com (net worth: £3,375,636), which is similar. What would you suggest we add within the page title? The page title has to provide context for what follows on the page, which isn't much, as it's mainly design rather than content (copy) focused. Arguably, if we were a smaller web design and development agency we'd focus on a more optimised page title, and maybe add more content conducive to SEO; however, we don't rely so heavily on search for specific terms. If we're pulled into the SERPs over time for semantic (less competitive) variations rather than exact-match anchor texts, then all the better, just so long as we're happy targeting a more niche sector. You are correct that we do provide SEO, as part of a broader digital marketing service which covers UX, CRO, etc. We're not an SEO agency, so we don't have to rely as heavily on search. We do provide punchier page titles for our clients depending on the market/industry they operate in and the level/strength of competition. Thanks for your input.

    | Tangent
    0

  • If this is not really a manual penalty, then disavowing will take a long time, and seeing the rankings and traffic come back will take months. The reason I suggest not 301ing to the subcategory of the main site is that if it really was flagged, the redirect will hurt the main site instead of helping it. What you ideally should do is make sure you disavow all the bad links, then create some high-quality links to different parts of the website and see if that helps your rankings and traffic. Once you see the rankings and traffic have recovered, then, if you want, a 301 is a safe bet. Hope this helps!

    | MoosaHemani
    0
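
  If, after recovery, you do go ahead with the 301 suggested above, an Apache .htaccess sketch would look roughly like this (domains and paths are hypothetical placeholders):

  ```apacheconf
  # In the old site's .htaccess: redirect every URL to the matching path
  # under a subcategory of the main site -- hypothetical targets
  RedirectMatch 301 ^/(.*)$ https://www.mainsite.com/subcategory/$1
  ```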

  • What's the exact day of the drop? Normally I wouldn't be too concerned about a drop in Webmaster Tools impressions, especially after a bump up, but since you mentioned you also lost traffic on Bing, Yahoo, and social, it does raise some flags. If it happened around the final three days of the month, then you were affected by the update I mentioned, which nobody seems to be talking about. Anyway, did you check crawl errors in GWT? Did you crawl the site separately? Did you see any problems with index/noindex tags? As for injected code, the easy way is to load the pages you lost traffic on in Google and check the saved cache; if there is injected code, it will show up there. There's really not much we can do but assume things. I do hope you can find the reason for the problem.

    | DennisSeymour
    0

  • The problem you're describing is almost exactly the reason why canonical URL functionality exists. Just pick your canonical (with or without slash - it doesn't matter) and make sure you roll it out consistently across your website and sitemap. Regards, George

    | webmethod
    0
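
  Alongside the canonical tag, the consistency mentioned above is often enforced with a server-side 301. An Apache mod_rewrite sketch, assuming the no-slash version is the one chosen (swap the rule around if you prefer the slash):

  ```apacheconf
  RewriteEngine On
  # Don't rewrite real directories
  RewriteCond %{REQUEST_FILENAME} !-d
  # 301 any URL ending in a slash to its slash-less twin
  RewriteRule ^(.+)/$ /$1 [R=301,L]
  ```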

  • I appreciate that, thank you.

    | McCaldin
    1

  • Hi, thank you very much for your input. I think Linda meant that adding "something about how each alternate choice is similar to/different from the original boat" would be labour-intensive (which is correct for 2k makes and 8m models!), but we already have logic to find "similar boats", so we'll be able to auto-populate these pages with some content.

    | pbscreative
    0

  • The quantity of links isn't as important as the quality of those links. If there are several links to and from low-quality websites, it is only a matter of time before your competitors get smacked by an algo update. Sure, blog backlinks take time to build, but they also appear natural and, more importantly, related to the content on your site, which means that, theoretically, they are good for searchers. This is the best strategy. Link building should really consist of high-authority directory listings, competitive link analysis, branded comment backlinks, social bookmarking, and guest blogging. You should never add more than 10% of your existing link profile at a time; doing more looks unnatural, and unnatural link building is not the way to go. Buying links is kind of an SEO death wish.

    | MonicaOConnor
    0

  • Thanks for the opinions. I hate how the fear of Google makes me potentially do, or not do, things that I think are to the benefit of users. By the way, since forums are such a pain to manage, the forum I'm thinking of using is the new Discourse that a lot of large companies (like Dropbox, for example) are using. Because it runs on a complex environment I don't want to host personally, I'd host it somewhere like DiscourseHosting.com for something like eighty bucks a month (Discourse's own hosting is too expensive, since they are just going after big sites). Technical question: because it will be hosted away from my own server, I need to map it to a subdomain, forum.mysite.com, rather than mysite.com/forum (live example: http://bbs.boingboing.net/). There's no disadvantage to this, is there? I have not heard of one. Also, I'd be interested in any opinions on this choice of forum platform, in case anyone has used it or is interested in checking it out. It works differently than the usual forums.

    | bizzer
    0
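
  On the forum.mysite.com point above: mapping a subdomain to an external host is normally a single DNS record (a sketch; the target hostname is a hypothetical placeholder your hosting provider would supply):

  ```text
  ; In the mysite.com DNS zone
  forum  IN  CNAME  your-instance.discoursehosting.com.
  ```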

  • Duplicate content is duplicate content, whether you cite the original author or not. I would immediately reconsider your blogger's approach. Depending on the citation they use, they are actually creating backlinks from your site to the original source. That can help the source, but link juice doesn't work both ways. All of your content should be unique, expertly written, trustworthy, and authoritative. You want to sound like an expert in your field, not duplicate someone else's expert opinions. Content is king, and duplicate content will definitely affect your SEO rankings.

    | MonicaOConnor
    0

  • IT IS NEVER SAFE!!!! GOOGLE WILL HUNT YOU LIKE A DOG HUNTING ANOTHER DOG HUNTING A SQUIRREL! I am so funny. If the pages that link to other pages are related, in the "OMG bro, I can't believe just how topically related these pages are!" kind of way, then I'm sure you'll be alright. Edit: if everything is under the same IP and the same registrant, then I'm somewhat sure the links will be discounted for the purposes of link juice/PageRank. However, if you're funneling people to the good stuff, who cares? The sites might not rank any higher, but if the sites are good in the first place, you may increase revenue across sites. Or you may decrease revenue. You kind of have to experiment with that sort of thing.

    | Travis_Bailey
    0