Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I use them a lot. I have not done detailed studies comparing with/without on the same page. The pages where I use them compete powerfully in very difficult root-keyword SERPs and also pull in tons of long-tail traffic. So, speaking from my gut, in my opinion they are second only to the <title> tag in their on-page optimization power. I am using them a lot, but I am not using them enough.

    | EGOL
    1

  • Hi Thomas, Did you agree a list of keywords to rank for with the SEO people? I just ran a simple search with some of those keywords on google.co.uk, and these were the rankings I found:

    aluminium school signs: 5
    safety signs: 50+
    school sign company: 7
    school signs: 50+
    school signs in aluminium: 5
    we love school designs: 50+

    So you are at least on the first page for some of the keywords in your title tags that describe your business. This is not what you would expect to see if your website had dropped to the 10th page of Google's results. Try downloading a program called "free google monitor" from Cleverstat, and use it to find your rankings for the keywords that are important to you. You probably still rank well enough on them. In my opinion the SEO company is giving you a vague list of what they will do. The activities they list are not wrong, but since they lack detail I would not be comfortable giving my business to them. I think they are not being straightforward with you about what happened, why it happened, and what exactly they will do about it. You should probably get quotes from some other (quality) SEO companies. Cheers, Jorge

    | Masoko-T
    0

  • Hi Luke, Generally I wouldn't worry too much about that; the main thing is to avoid users landing on these 404 pages, which it sounds like you are doing. For reference, here is a helpful article from Google regarding their stance on 404s. Ideally, if there is a relevant page that you can 301 redirect to, you could do this for certain pages. But it sounds like this may not be possible, as the URLs that Google has now crawled never existed in the first place. So I'd recommend that you 301 redirect to another relevant page where possible (but don't mass-redirect them all to a single page or your home page); if there are no relevant pages, leave them as 404s. It is unlikely that they will hurt your rankings. Cheers, Paddy
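
    For example, on an Apache server a one-off 301 is a single .htaccess line per URL (the paths below are invented for illustration):

    ```apacheconf
    # Send an old URL to its closest relevant replacement:
    Redirect 301 /old-category/blue-widgets /blue-widgets
    # URLs with no good replacement are simply left to return 404.
    ```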

    | Paddy_Moogan
    0

  • I tend to agree with Federico's concerns. If the 301 transfers a penalty, the impact could be long-term, and it could be harder to rescue site B. The short-term ranking gains may not be worth it. Google hasn't been clear on how this operates with 301 redirects. John's suggestion to disavow on both sites seems safe. Worst case, it's wasted effort, but it's not much effort (once you've built one file, building two is easy). Still, you've got to wait for that to process, and if the algorithmic penalty is something like Penguin, then you'd have to wait for a data refresh. This could take months, so I'd be really hesitant to risk site B until you've cleaned up the mess. Once you disavow to site A, the 301-redirect should be fairly safe, but it does depend on the extent of the penalty. The risk/reward trade-off is definitely a "devil is in the details" sort of situation.
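
    For reference, Google's disavow file is just a plain-text list, one entry per line, and you'd upload a copy under each site's Webmaster Tools property; once the file for site A exists, the one for site B is a copy-paste job (the domains below are made up):

    ```text
    # Comment lines start with "#" and are ignored by Google.
    # Disavow one specific URL:
    http://spam-directory.example.com/listing/1234
    # Disavow every link from an entire domain:
    domain:paid-links.example.net
    ```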

    | Dr-Pete
    1

  • Here's a video from Matt Cutts himself discussing how many links you should have on your page, and if there is a limit.

    | Millermore
    0

  • This may be an issue. Because these are extremely technical sites I have never done anything with Google+ but it might be time to revisit that.

    | waynekolenchuk
    1

  • Hi Andrea! You will not be "penalized" for duplicate content, but Google probably won't index pages seen as duplicates in its search results. If you are optimizing those category pages for specific keywords, then each category page will need content that is mostly original and unique. If you really only have one of those pages optimized and the others are there mainly to help users on your site, you can use a rel=canonical tag to "tell" Google which version of the page (the specific category page) you wish to display in search results. Let me know if that helps! If not, I'll try to elaborate.
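
    The tag itself is a single line in the <head> of each duplicate category page, pointing at the one version you want indexed (the URL here is made up):

    ```html
    <link rel="canonical" href="https://www.example.com/category/blue-widgets/" />
    ```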

    | RickyShockley
    0

  • This is the page http://www.over50choices.co.uk/Health/Compare-Health-Insurance.aspx KW: Compare Health Insurance

    | AshShep1
    0

  • Thanks Jeff, it definitely is. I guess having the "rel=" at the end and not the start doesn't really matter.

    | infinart
    0

  • Yes, if you can write unique meta descriptions quickly, you can start doing that immediately, and if the site is not big and you manage to finish in 1-2 weeks, there is no need to delete the meta descriptions first. If you see that this will take more than 2 weeks, it is better to delete the duplicated meta descriptions. But the best solution is to write unique meta descriptions immediately for the home page and the other important pages that rank right now.
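
    To find which pages share a description in the first place, a quick script can group pages by their meta description and report the duplicates. A minimal Python sketch, assuming you already have each page's HTML (the sample pages below are hypothetical):

    ```python
    from html.parser import HTMLParser
    from collections import defaultdict

    class MetaDescriptionParser(HTMLParser):
        """Pulls the content of <meta name="description"> out of an HTML page."""
        def __init__(self):
            super().__init__()
            self.description = None

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "meta" and a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    def find_duplicate_descriptions(pages):
        """pages: dict of URL -> HTML source.
        Returns {description: [urls]} for any description shared by 2+ pages."""
        seen = defaultdict(list)
        for url, html in pages.items():
            p = MetaDescriptionParser()
            p.feed(html)
            if p.description:
                seen[p.description].append(url)
        return {d: urls for d, urls in seen.items() if len(urls) > 1}

    # Hypothetical pages for illustration:
    pages = {
        "/a": '<html><head><meta name="description" content="Buy signs"></head></html>',
        "/b": '<html><head><meta name="description" content="Buy signs"></head></html>',
        "/c": '<html><head><meta name="description" content="About us"></head></html>',
    }
    print(find_duplicate_descriptions(pages))  # → {'Buy signs': ['/a', '/b']}
    ```

    Any description that comes back with more than one URL is a rewrite (or delete) candidate.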

    | nurzhyk
    1

  • What do you mean by "search engine simulators"? I tested crawling your site with Googlebot as the user agent and it worked just fine. Google and other engines are capable of running JavaScript and AJAX, so that shouldn't be an issue. What I would suggest is to look at your page speed. Your homepage loads a TON of external files, about 50 requests for JS and CSS files. You should really consider combining all of that code into a single JS file and a single CSS file instead; over 50 calls (plus the extra AJAX calls) are WAY too many, not to mention the hundreds of lines of inline JS and styles you have: http://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fwww.ecigexpress.com%2F&tab=desktop
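
    Concatenating the assets into one bundle at build time is one way to cut the request count. A minimal Python sketch of the idea (the filenames are hypothetical, and a real build would also run a minifier):

    ```python
    import tempfile
    from pathlib import Path

    def bundle(files, out_path):
        """Concatenate several asset files into one bundle to cut the request count."""
        parts = [f"/* --- {f.name} --- */\n{f.read_text()}" for f in files]
        Path(out_path).write_text("\n".join(parts))
        return Path(out_path)

    # Hypothetical stylesheets, written to a temp directory for the demo:
    tmp = Path(tempfile.mkdtemp())
    (tmp / "reset.css").write_text("body{margin:0}")
    (tmp / "theme.css").write_text("h1{color:#333}")

    bundled = bundle([tmp / "reset.css", tmp / "theme.css"], tmp / "site.min.css")
    print(bundled.read_text())
    ```

    The page then references one stylesheet (and one script) instead of dozens.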

    | FedeEinhorn
    0

  • I've only had a quick look, but here is my theory. If you noticed a sharp drop in Google in October of 2013 then there's a good chance that the Penguin 2.1 update on October 4 affected the site.  I do certainly see links that the Penguin algorithm would look upon unfavorably.  For example: http://www.westdigitalmarketing.com/blog/funding-solutions-uk-digital-strategy/ http://www.startupoverseas.com/news/invoice-finance-for-an-exporter--cash-flow-and-peace-of-mind.html http://21stguru.com/index.php?s=A&c=39&p=141 But, as Penguin was only a Google thing, it wouldn't affect your rankings in Bing and Yahoo.  Those unnatural links were probably supporting your rankings in those search engines. So, when you removed bad backlinks in order to clean up for Google's sake, you removed links that were supporting you in Bing and Yahoo and thus those rankings dropped. There could be other reasons too, but that's my theory.

    | MarieHaynes
    0

  • Unless they have fixed it in recent months, xml-sitemaps does not generate correct video sitemaps.

    | ChristopherGlaeser
    0

  • I'm not aware of any off the shelf tools that will go back several years, so you'll probably have to use a more advanced, do-it-yourself solution. Almost any sort of scraper technology should work, from Screaming Frog, Scrapebox, SEOTools for Excel, etc. Here's an article explaining how to use SEOtools for Excel to do almost what you are asking.  http://wiep.net/talk/diy-link-building/branded-mentions-link-building/ The difference being that this article uses Mentions.net to find recent mentions, whereas you'll likely want to go to Google directly to collect your data - probably by using a custom scraper like Screaming Frog or Scrapebox. I don't have detailed instructions as I've never done it before (and it would take like 3000 words) but the basic idea is to scrape Google for your mentions using the tools described above, then use SEOtools for Excel to extract non-linking mentions. Hope this helps! Best of luck

    | Cyrus-Shepard
    0

  • Thanks. I didn't think it would make too much of a difference but I wanted to get another opinion. I'm not worried about spammy URLs. Everything will be off of a short .edu - they look pretty clean with multiple subdomains.

    | SEI
    0

  • Webmaster Tools isn't telling you that it thinks your website is about Dollars; it's telling you that it sees that word mentioned often on your site. You really don't need to fix anything, but perhaps adding more content about your target keyword will help push that down a bit. Are you saying something costs "88 Dollars" all over your website, or actually using a $? If it's the latter, that would be very interesting. If it's the former, remove "Dollars" and use $ instead. Cheers!

    | jonnyholt
    0

  • Hi Steve, I think that sounds like a good plan.  If the recipe site is gaining links naturally and isn't affected by Penguin then it might be better to work on that one and drive traffic to the main site with it.  I also like the idea to keep working on cleaning up the e-comm site in the meantime. Good luck! Marie

    | MarieHaynes
    0

  • Hi Jason, I wouldn't think the duplicate content is the issue, especially assuming the main keyphrase you are after has the homepage ranking for it (which doesn't have the duplication issue), right? Moz is not showing anywhere near as many links to the domain as Ahrefs is. I would look at GWT as well and have a good look through all your backlinks; at first glance there seems to be a chunk of obviously spammy SEO-type links in your profile that are a more likely cause of the trouble. If you are not getting any more of those and have a marketing plan in place, then it might just be a matter of keeping on doing what you are doing, and you will start ranking again. But you will want to make sure you know where those dodgy links came from and that you are not going to be getting many more.

    | LynnPatchett
    0