Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • With the canonical/alternate approach, where you're still serving two URLs, the META data is still useful for the mobile crawler (and potentially for mobile search). I'm not sure it's technically "necessary", but I think it's safer all-around. If the mobile site's URL structure is the same as the desktop site, you probably don't need a separate sitemap, but that sitemap could help you diagnose problems and aid in Google's ability to discover pages for mobile users. I think it boils down to how important the mobile audience is for you and how much you want to invest in it. Not having a separate sitemap isn't going to get you into trouble, it just may be slightly less optimal.
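
    For reference, the standard bidirectional annotations for separate mobile URLs look roughly like this (the example.com URLs are just placeholders). On the desktop page:

      <link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page">

    And on the corresponding mobile page:

      <link rel="canonical" href="http://www.example.com/page">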

    | Dr-Pete
    0

  • If you'd like to Private Message me the campaign, I can take a look. I noticed you have more than one, and I can't find one with 62 notices. The "Notice" level is our lowest-level warning, so sometimes it's just a heads up and isn't cause for concern. The "Tag Value" thing sounds odd, though, so I don't want to tell you that it's fine without taking a closer look.

    | Dr-Pete
    0

  • Bleed is just slang meaning that a portion of the PageRank is lost. If a page has multiple outgoing links, the PageRank is divided among them. If some of those links are nofollow, the portion of the PageRank associated with those links is not only not passed on, it is lost. In other words, the PageRank is divided among all outgoing links, both follow and nofollow.
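
    To put numbers on it: suppose a page has 10 outgoing links and 4 of them are nofollow. Each link is still allotted 1/10 of the PageRank the page can pass, so the 6 followed links pass along 6/10 of it, while the 4/10 assigned to the nofollow links simply evaporates rather than being redistributed to the followed links.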

    | ChristopherGlaeser
    0

  • The idea in engaging with people in forums is to drive traffic, become a source of information, and eventually achieve sales. Been there, done that. Don't focus solely on link building; SEO is now more of a complete digital marketing package than link building alone. Unfortunately I don't remember the source, but I'm 99.9% sure that signature links may be penalised by Google, so I wouldn't go in that direction (should you insist on pure link building, of course).

    | artdivision
    0

  • Thanks for the responses! I think we're going to have to go down the route of styling the heading to look like an h1, and then using the heading that relates to the content as the actual h1, to avoid duplication and make the best use of the header tag. Really good to get other opinions!

    | J_Sinclair
    0

  • Hey there, Our crawler should definitely pick up any changes that were made before the most recent crawl began, so this definitely seems a bit strange. I am going to run your site through our Crawl Test tool to see if the pages are still being reported with the Missing Meta Description error, but I need to know which campaign you are having issues with. If you don't want to include the campaign name in this public forum, you can just let me know the initials of the campaign name. Also, it would be helpful if you can let me know when the meta descriptions were added to your site so I can check whether or not the crawl had already begun by that time. I look forward to hearing back soon. Chiaryn, Help Team Ninja

    | ChiarynMiranda
    0

  • The rel="next"/rel="prev" markup is not for duplicated content; it just shows Google how the parts relate to the whole. An alternative to rel="next"/rel="prev" is the "Classic Pagination for SEO" approach that uses noindex, covered in another article by Adam: http://searchengineland.com/the-latest-greatest-on-seo-pagination-114284 If you have a duplicate issue, this would solve it, as you would noindex all the duplicate pages. What you need to do (and I can't do this for you) is look at all the crawl paths you are providing Google. As I mention above, you are not doing any favors to Google or to your site when you show Google an infinite number of paths to the same content. It just wastes Google's time, and you don't want to do that when Google also has to crawl the rest of the internet. If you solve this issue, you will solve your duplicate issue. AJ Kohn just posted an article on the concept of crawl budget that talks about this. I think the article is quite good, and it explains why we need to look at the topics of noindex, nofollow, robots, canonical, and rel="next"/rel="prev" together: http://www.blindfiveyearold.com/crawl-optimization
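
    For a concrete picture, the annotations on page 2 of a paginated series would look something like this (URLs are placeholders):

      <link rel="prev" href="http://www.example.com/articles?page=1">
      <link rel="next" href="http://www.example.com/articles?page=3">

    And the noindex alternative from the "Classic Pagination" article is just a robots meta tag on the paginated pages:

      <meta name="robots" content="noindex, follow">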

    | CleverPhD
    0

  • Thanks so much for the quick responses, really appreciate it!

    | J_Sinclair
    0

  • Agree with Chris. A new account is the best option.

    | Clickatell2
    0

  • If it's a commonly used acronym, then both the full keyword phrase and the acronym should be included in the optimization process.
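
    For example, a title tag might work both in naturally (hypothetical page):

      <title>Search Engine Optimization (SEO) Services | Example Co</title>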

    | irvingw
    0

  • This post is actually an April Fool's Day joke.

    | Christy-Correll
    0

  • If just your meta descriptions are showing up as duplicates, that could be the issue. Are your meta descriptions showing up as duplicates, or are your product descriptions? Either way, neither is good. Run the site through Moz using the campaign tool, and while you're waiting for that crawl to happen, use the Screaming Frog SEO Spider (Google "SEER Screaming Frog guide" for a good walkthrough). It's free for up to 500 pages and will let you target the pages you're trying to decide are duplicates or not; it will give you the answers you're seeking. You can also use Copyscape, which has a new beta tool that looks inside your own website.

    | BlueprintMarketing
    0

  • Hello, Using a "span" inside an "anchor" is perfectly valid code, and it should not affect SEO in any way. A similar question has been posted before: http://moz.com/community/q/span-tags-inside-a-tags-is-this-bad -- Jørgen Juel
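
    For example, this is perfectly valid markup:

      <a href="/pricing"><span class="button-label">See our pricing</span></a>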

    | jorgen_juel
    0

  • Not having the numbers would probably be less confusing than having numbers that might later duplicate. If you can produce the news sitemaps (and keep them accurate and up to date), the unique numbers are not technically required for Google News listings. So if they are hard to add and you are confident in your sitemaps, it shouldn't be an issue. If you are less confident in them, and being listed on Google News is a big deal for you, then try to get them added.
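
    For reference, a minimal Google News sitemap entry looks something like this (publication name and URL are placeholders):

      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
        <url>
          <loc>http://www.example.com/news/article-title.html</loc>
          <news:news>
            <news:publication>
              <news:name>Example Times</news:name>
              <news:language>en</news:language>
            </news:publication>
            <news:publication_date>2013-04-01</news:publication_date>
            <news:title>Article Title</news:title>
          </news:news>
        </url>
      </urlset>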

    | matbennett
    0

  • Thanks for all the responses. It is a large site that runs on WordPress. For the login and registration pages we are currently forcing HTTPS, and the HTTPS versions of those pages currently show in the SERPs. I just want to be sure that redirecting the entire website to HTTPS wouldn't hurt my rankings.

    | Clickatell2
    0

  • If you delete the page (URL) with the link (the clickable words that take the visitor to another URL), Google will eventually recognize that the URL no longer exists and drop it from the index. That may take anywhere from a few days to a month or more. Once the URL is out of the index, whatever links were on that URL won't count against you. You can use the URL removal tool in Google Webmaster Tools to try to speed up the process of removing the URL from the index. There is no need to use the disavow tool for a URL that resolves to a 404 page.

    | Chris.Menke
    0

  • I agree with Chris. I thought I'd add some personal experience here. I worked on an unnatural links penalty for a site with a horrendous backlink profile. After several weeks of work we filed for reconsideration. Previously, every time the site owner applied for reconsideration he had failed; this time we got the same message as you. At first I felt sick that we had spent so much time (and he had spent so much money) trying to get rid of a penalty that had expired on its own. But I now know that if the webspam team revisits your site and sees that you have not done the work to clean it up, they will re-penalize you, and often the second penalty is worse than the first. Also, the backlink cleanup was necessary for Penguin reasons.

    | MarieHaynes
    0

  • Are you continuously adding new pages to the website each month? If so, you could be diluting your domain authority. So when Google recrawls and caches your website, the pages have less authority than the last time, resulting in a drop in rankings.

    | Jonathan_Hatton
    0

  • Excellent! Thanks guys! That was really stressing me out, trying to figure out the best thing to do with those pages... 404 it is.

    | k9byron
    0