Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • If you got a partial action warning after you redirected Site A to Site B, then there are a couple of possible reasons why. If Site A has a lot of unnatural links pointing to it, those links are now suddenly pointing at Site B. It's also possible that the unnatural links pointing at Site A are only part of the problem, and that the redirect uncovered an existing issue with unnatural links pointing at Site B itself. If you know that Site B has had no manipulative link building done, then a quick solution would be to remove the redirect from Site A to Site B and file for reconsideration. However, if Site B has unnatural links as well, then Google won't remove the warning unless you clean up those links too. If removing the redirect is not an option, there are a few things you can do:
    - Do a link audit of all of the links pointing to Site A AND Site B, attempt to manually remove as many links as you can, document your efforts, disavow the remaining unnatural links, and then apply for reconsideration. The process of filing for reconsideration is extensive.
    - There are also ways to redirect Site A to Site B and NOT pass any of the link equity. You can do so by redirecting via an intermediate page that is blocked by robots.txt. This is a complicated thing to do, but it can be done. It would remove the link equity from ALL links that pointed at Site A so that none of it passes to Site B, including the equity from any good links that were there.
    My guess is that if you got a warning for Site B, Site B may also have unnatural links, so you may have quite a job on your hands in cleaning this up. With all of that being said, there is some debate as to whether you need to heed partial match warnings. In some cases Google says you can ignore the warning because Google is simply not counting the unnatural links. But in every case that I have examined where a site got this warning, its rankings started to drop within a few weeks.
    In my opinion, these warnings should always be cleared.
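For reference, the disavow file mentioned above is a plain UTF-8 text file: one URL or `domain:` entry per line, with `#` lines as comments. A minimal sketch (all domains here are hypothetical placeholders):

```text
# Outreach sent 2014-01-15 and 2014-02-03, no response
domain:spammy-directory.example
domain:paid-links.example

# Single bad page; the rest of this site's links are fine
http://blog.example/low-quality-guest-post.html
```

A `domain:` line disavows every link from that domain, while a bare URL disavows only links from that specific page. Document your removal attempts separately; the disavow file itself is not the place for your evidence.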

    | MarieHaynes
    0

  • I remember seeing this back in the day; Matt Cutts' take: https://www.youtube.com/watch?v=4eYJuT0yGrI&list=UUWf2ZlNsCGDS89VBF_awNvA I should think you will be fine for just 4 days. James

    | Antony_Towle
    0

  • Great insight, guys. I knew I would be going with dashes all along, but I was having a friendly debate within the office. Thanks again!

    | tbinga
    0

  • If you use a gTLD with sub-directories, you can then use WMT to set geotargeting preferences. See a further discussion here: https://support.google.com/webmasters/answer/182192?hl=en#2 From my understanding (as a hosting provider, not an SEO), signals coming from on-page markup or WMT are increasingly far more important than server location, IP, or other hosting-related information. This makes sense with more and more CDNs in use. You may also want to see: http://googlewebmastercentral.blogspot.com/2013/04/x-default-hreflang-for-international-pages.html In terms of what to optimize, that is up to you, but the important part is to give Google clues as to which pages you want ranked where and how.

    | jeff-rackaid.com
    0

  • Yeah, I'll have my developer look at the issue for me. Thank you.

    | thewebguy3
    0

  • A few rules about sitemaps: you should only include pages you also want crawled and indexed, and they should not contain URLs that 404 or are blocked by robots.txt. My guess is there are too many URLs in the sitemaps, since I'd guess the website is not over 2 million actual "real" pages. Also, I randomly clicked on a URL in one of the sitemaps and it 404'd: http://www.eumom.ie/forums/topic/oakhill-school-leopardstown-/ This is probably causing a lot of the errors you see. It's honestly not a 5-minute fix, but if it were my site, I would be using the Yoast SEO plugin and the sitemap feature within Yoast. It makes it very easy to include/exclude certain pages, and it updates automatically. There must be a way to tell your current plugin what to include/exclude from the sitemap, but I don't have as much experience with it. Generally: only include pages you want crawled and indexed, and don't include pages that 404.
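To find 404s like the one above at scale, you can pull every `<loc>` out of the sitemap XML and HEAD-request each URL. A minimal standard-library sketch (the sitemap URL you feed it is up to you):

```python
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace (sitemaps.org)
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_locs(xml_bytes):
    """Extract every <loc> URL from a standard XML sitemap."""
    root = ET.fromstring(xml_bytes)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)]

def status_of(url):
    """HEAD-request a URL and return its HTTP status code."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
```

Fetch the sitemap, run `sitemap_locs` on its body, then report any URL where `status_of` is not 200; those are the entries to drop before resubmitting.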

    | evolvingSEO
    0

  • The navigation doesn't have so much to do with it being a page or a post. You can place any page in the menu by going to Appearance -> Menus and dragging and dropping. Google does not look at pages vs posts differently - it's generally a content-type decision, not so much an SEO decision. -Dan

    | evolvingSEO
    0

  • Have you set up the https version of the site in Webmaster Tools as a separate site? If not, I would, and I would then submit an https sitemap on that profile. I would also 301 redirect http to https across the site. You may find this previous question useful: http://moz.com/community/q/best-method-of-redirecting-http-to-https-on-homepage
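Once the redirect is in place, it's worth confirming each http URL really returns a 301 (not a 302) to its https twin. A small standard-library sketch that inspects the redirect instead of following it:

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can inspect them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def https_target(http_url):
    """The https URL an http URL should 301 to."""
    prefix = "http://"
    assert http_url.startswith(prefix)
    return "https://" + http_url[len(prefix):]

def check_redirect(http_url):
    """Return (status code, Location header) without following the redirect."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(http_url)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")
```

For each URL you care about, check that `check_redirect` returns `301` and that the Location equals `https_target(url)`.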

    | Matt-Williamson
    0

  • Can you show some of the code that the sitemap displays? I would think the tags have to exist on the page somewhere in order for the sitemap to be pulling them in. Could they possibly come from a sidebar or module-position header that renders the same H tags? Try running the different pages through SEO Browser to see whether the pages are physically the same. After entering your URL, click "simple": http://seo-browser.com/
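Another way to spot a shared sidebar or template header is to dump the heading tags of two pages and compare them. A minimal sketch using only the standard library:

```python
from html.parser import HTMLParser

HEADINGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

class HeadingCollector(HTMLParser):
    """Collect (tag, text) for every h1-h6 in an HTML document."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._current = None  # (tag, text parts) while inside a heading

    def handle_starttag(self, tag, attrs):
        if tag in HEADINGS:
            self._current = (tag, [])

    def handle_data(self, data):
        if self._current:
            self._current[1].append(data)

    def handle_endtag(self, tag):
        if self._current and tag == self._current[0]:
            self.headings.append((tag, "".join(self._current[1]).strip()))
            self._current = None

def headings_of(html):
    parser = HeadingCollector()
    parser.feed(html)
    return parser.headings
```

Run it on the source of two different pages; identical output strongly suggests the H tags are coming from a shared template element rather than the page content.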

    | David-Kley
    0

  • Thanks for all the responses. I did end up fixing it with a rewrite rule after setting up cheap hosting on GoDaddy, and I made sure the domain was redirecting with a 301. This is how it was working before. What had changed was that the client wanted everything on Yahoo because of his email; Yahoo told him it would fix his flaky email issues. He has resigned himself to transitioning his email away from Yahoo, and all is well again. Who knew that Yahoo only does 302 redirects! Seriously?!

    | cindyt-17038
    0

  • I would go with the Event schema, as you're just telling Google about the structured data of the event, even if you're not the organizer. If you implement the data correctly, you shouldn't get into trouble.

    | Martijn_Scheijbeler
    0

  • When you say panoramic, do you mean side-scrolling? If so, you should be in the clear; Google is OK with that. People have used negative positioning with sliders and the content associated with them for years, and it gets indexed just fine. If the site is using parallax side-scrolling and is pretty much a single-page site, I think your issues will be more closely related to getting a single-page site to rank.

    | LesleyPaone
    0

  • Most plumbing companies just can't devote the resources to rebrand. What you said was a permutation of what I said, so I can't disagree. Though getting a smaller business to rebrand is terribly difficult. We all know the benefits. But are there any tactics you know of that would help?

    | Travis_Bailey
    0

  • I haven't heard of penalties with no messages or emails (I'd double-check the email settings and spam folders just to be certain). It definitely seems like an odd and less-than-ideal situation. As for the age of the penalty, Google has only been notifying users of "partial" penalties for a year or so, and each penalty has an expiration date. (The expiration date isn't revealed to the website owner; only Google knows it.) There is a chance that if you pose this question on the Google Webmaster Forum, an actual Googler may be able to answer directly.

    | Cyrus-Shepard
    0

  • Upload the file with all of your backlinks you want disavowed, including the old links.

    | KeriMorgret
    0

  • This is a pretty thorough outline of what you need to do: http://moz.com/blog/web-site-migration-guide-tips-for-seos My steps are usually:
    - Identify pages that get significant organic traffic by pulling the Organic Traffic report in Google Analytics for the past year or so.
    - Identify pages that have a significant number of links (or have links from high-traffic sources) in Open Site Explorer.
    - Map where that content should be now, and 301 redirect to the new pages.
    - Completely remove all old pages from the index by 404ing them and making sure that no links on new pages point to old pages.
    Sounds quick and simple, but this definitely takes time. Good luck!
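The redirect-mapping step above boils down to a lookup table from old URLs to new ones, plus a check for anything important you haven't mapped yet. A tiny sketch (all paths are hypothetical placeholders):

```python
# Old URL -> new home, built from the traffic and link reports.
REDIRECT_MAP = {
    "/old-services.html": "/services/",
    "/blog/2012/widget-guide": "/guides/widgets/",
}

def unmapped(old_urls, redirect_map):
    """Old URLs with traffic or links that still lack a 301 target."""
    return [url for url in old_urls if url not in redirect_map]
```

Feed `unmapped` the URL lists you pulled from Google Analytics and Open Site Explorer; anything it returns either needs a mapping or a deliberate decision to let it 404.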

    | KristinaKledzik
    0

  • That's super helpful, thank you. My gTLD would be a brand term; perhaps the closest description would be 'B.bc' (if that existed) or 'The.times' (again, if that existed). In these examples, would the URL be interpreted as 'BBC' and 'thetimes', or just 'B' and 'The'?

    | ecommercebc
    0

  • Dear Martijn, Is there any fix for this? It is hard not to have scripts and input fields that contain these sorts of things!

    | urgiganten
    0

  • Have you implemented the hreflang mark-up? Remember, Googlebot crawls from a US IP, so it will probably always see the general .com version of the site. If you implement hreflang, you are telling it which URLs to show depending on the country the visitor performs a search from.
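hreflang annotations are just `<link rel="alternate">` tags listing each language/country alternate of a page. A small generator sketch (the locale codes and URLs here are illustrative, not from the question):

```python
def hreflang_tags(alternates):
    """Render hreflang link tags from {hreflang code: absolute URL}."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in alternates.items()
    )

# Every page in the set should list ALL alternates, including itself,
# and an x-default entry for the fallback version.
tags = hreflang_tags({
    "en-us": "https://example.com/us/",
    "es-es": "https://example.com/es/",
    "x-default": "https://example.com/",
})
```

The same full set of tags goes into the `<head>` of each alternate page (or into the sitemap), so that the annotations are reciprocal.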

    | gfiorelli1
    0

  • Hi Jimmy, Great to hear from you. My initial reaction is that it might be trickier to rank with the .co.uk as well. Context-wise, I hope this helps: this would be for a B2B website that sells into various countries, including the likes of Spain, Italy, and Germany, as well as China, Vietnam, Thailand, and more. Primarily we'd be looking to rank for our brand name in various different languages, e.g. simplified Chinese, Thai, German, all from a .co.uk. My initial suspicion is that a .co.uk may be trickier than getting a .com or country-specific TLDs ranking (e.g. brandname.cn, brandname.de), especially with Baidu, which doesn't seem to have a lot of love for .co.uks. Based on this, would you stick with .co.uk, go global with a .com, or go country-specific with ccTLDs for each target market? Holly

    | ecommercebc
    0