Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Thank you, gentlemen, for your answers. As per the post, the website is built with the ModX CMS, and the difficulty has come about because the client is using one database to serve 4 websites. One of the sites has been configured incorrectly and is creating pages that relate to listings on the other sites, passing traffic via 302 redirects. We're working on fixing the configuration so that new pages and 302 redirects are not created. I want to remove from the index the 2,050 302-redirecting URLs we've discovered to date. GWT allows you to remove URLs one at a time, although the removal request procedure does not seem appropriate in this situation. Due to an NDA I cannot publish the URL.

    | tinbum
    0
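A first step toward a bulk cleanup like the one described above is isolating the offending URLs from a crawl export. This is a minimal sketch, assuming a hypothetical export of (url, status, target) rows from whatever crawler is in use; the filter keeps only 302s that redirect to a different host:

```python
# Sketch: filter a crawl export down to the 302 redirects that cross sites,
# so they can be fixed or submitted for removal in bulk. The (url, status,
# target) row format is a stand-in for a real crawler's export.
from urllib.parse import urlparse

def cross_site_302s(crawl_rows):
    """Return (url, target) pairs for 302s pointing at a different host."""
    offenders = []
    for url, status, target in crawl_rows:
        if status == 302 and urlparse(url).netloc != urlparse(target).netloc:
            offenders.append((url, target))
    return offenders

# Placeholder data illustrating the multi-site misconfiguration.
rows = [
    ("http://site-a.example/listing/1", 302, "http://site-b.example/listing/1"),
    ("http://site-a.example/about", 200, "http://site-a.example/about"),
    ("http://site-a.example/listing/2", 302, "http://site-a.example/listing/2?ref=1"),
]
offenders = cross_site_302s(rows)
```

Same-host 302s (like the third row) are left alone, since the problem described is cross-site traffic passing.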

  • You started by saying the problem is duplicate content. Are those pages with the various parameter strings basically duplicate content? Because if they are, no matter what you do you will probably not get them all to rank; the URL is not your main problem in that case. (Though you still should do something about those parameter strings.)

    | Linda-Vassily
    0

  • We're using campaign URLs to track all traffic in Analytics from the alternate domain. The issue is that when people search for {domain} rather than typing in {domain}.com, nothing comes up. I think there's an issue with how the 301 is being served: it serves a 301 immediately rather than letting Google see the page and a tracking pixel, and then serving the redirect.

    | Dom441
    0

  • It's my understanding that you treat each subdomain as a unique site. So each subdomain should have its own unique XML sitemap and robots.txt file, and should be submitted separately to Google Webmaster Tools. To answer your inter-linking question, I would avoid including the other domains' URLs in those files (XML and robots); only include the URLs for that particular domain and/or subdomain. That said, I would inter-link them somewhere on the actual site: maybe in the HTML sitemap, navigation, footer, or even naturally throughout your body content where appropriate.

    | Ryan-Bradley
    0
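The one-file-per-subdomain setup described above looks like this in practice. A sketch, with a placeholder hostname; each subdomain serves its own robots.txt that references only its own sitemap:

```
# Served at http://blog.example.com/robots.txt (hypothetical subdomain)
User-agent: *
Disallow:

Sitemap: http://blog.example.com/sitemap.xml
```

The main domain's robots.txt and sitemap would list only the main domain's URLs, with cross-links living in the page markup instead.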

  • Those pages will eventually drop out of Google's index, but if there are still sites (either pages within your own site or others) that are linking to any of those pages you will continue to see 404 error codes. I'm working on fixing the same issue on a site that I just started optimizing. The best thing you can do is a 301 redirect from each of the old .php pages to a similar, relevant page that currently exists on the site. This will fix the 404 codes and also pass any page authority from the old page to the new page that it is being directed to. Here's some helpful info from Moz on 301 redirects: http://moz.com/learn/seo/redirection Hope that helps!

    | garrettkite
    0
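The 301 mapping suggested above can be sketched as server configuration. Assuming an Apache server (the paths and domain are placeholders, not from the original post):

```apache
# .htaccess sketch: map each old .php URL to its closest current page.
# Simple one-to-one mapping via mod_alias:
Redirect 301 /old-page.php http://www.example.com/new-page/

# Or, with mod_rewrite enabled, a pattern-based version:
RewriteEngine On
RewriteRule ^old-page\.php$ /new-page/ [R=301,L]
```

Each old URL should point at the single most relevant live page rather than everything redirecting to the homepage.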

  • I agree with the earlier responses left by EGOL and David-Kley. You asked about must-haves and best practices for creating a blog. Coincidentally, one of our fellow Mozzers wrote and published a how-to-launch-a-successful-blog post on October 13 that I think will answer your questions and work as a great how-to example at the same time. That same post talks about blog promotion. Neil Patel and Aaron Agius at Sprout Social have also written a very helpful (and free!) Complete Guide To Building Your Blog Audience that you might find helpful. Good luck.

    | DonnaDuncan
    0

  • It can take a while, sometimes up to a few months. Google doesn't work fast, and it looks like you have already resubmitted the sitemap and fetched the pages. When we launched our new site, we saw a drop for a month, then we started to come back. Now after 6 months, we are higher than we ever were. "Normally a traffic drop like this is because someone failed to do 301 redirects of the old pages to new pages" Agreed. Was the new site launched on the same domain?

    | David-Kley
    0

  • Hi, That should be enough to stop the search engines crawling and indexing the test site. Remember to take it off when you go live though.

    | Houses
    0
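For reference, a typical way to keep a test site out of the search engines (assuming this is the kind of block that was set up) is a site-wide robots.txt disallow on the test host only:

```
# robots.txt on the test site only -- remember to remove before go-live
User-agent: *
Disallow: /
```

The go-live reminder above matters: if this file ships to production, the live site stops being crawled too.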

  • Jesse, Thank you so much for the response. Much appreciated. It really narrowed down the problem for me. I will begin auditing the backlinks and adding them to the disavow tool. I would really appreciate any suggestions for a recovery. Cheers!

    | ScorePromotions
    0

  • So, there isn't any way to pull search volume directly into Excel? I'm looking at a data set of 400K keywords.

    | nicole.healthline
    0

  • Hi, Thanks. I will do some testing to confirm that this behaves how I would like it to.

    | ntcma
    0

  • Thanks, Marie. That post is really helpful. Just to clarify one point: I've read that Google may take as long as six months to recrawl a website. Does that mean that many of the links on our disavow list uploaded in September might not have actually had the (invisible) nofollow tag applied yet, and in that case may still be harming our website as far as Penguin is concerned? When I read that Google was processing disavow requests with the Penguin update, I thought that meant the usual wait wouldn't apply, and that everything would be recrawled with the Penguin refresh, if that makes sense. I'm trying to convince myself that our work on removing/disavowing links hasn't fully taken effect yet, and that we'll see a bounce in our rankings with the next Penguin update, whenever that may be. I'd rather not take the lack of improvement we've seen this time around as a sign that we're never going to make a recovery. I certainly can't see how we could do much more work in terms of removing links; we were pretty thorough.

    | mgane
    3

  • Hi, Thank you all for your knowledge.  I now feel better equipped. Cheers

    | McCaldin
    0

  • I know it might sound subjective, but we can't assume that a nofollow link doesn't provide value, given that Google can still choose to crawl and credit the link. I wouldn't say it does you any harm, as the link was placed for a user. If someone found it worthwhile to place the link, then maybe there are other opportunities to meet their need for other resources.

    | FarkyRafiq
    0

  • Two of my sites were hit yesterday, and they are not showing anywhere on Google. They are still indexed, but there is no manual action in Webmaster Tools. I think they are not coming back.

    | samafaq
    0

  • Hi, First off, it would be useful to know which version of Drupal you are using, which modules you have installed to control meta tags, and the version of that module (versions can be found under /admin/build/modules). It sounds like you might be running Drupal 7 and are perhaps using the Meta Tags module? The page at /admin/config/search/metatags allows you to set up meta tag defaults for different content types using tokens, such as the node's title, site name, etc. However, you should be able to override this on a per-node (page) basis. Take a look at this video and it will guide you through it: https://www.youtube.com/watch?v=SviVqtAinSk#t=5m56s. If that doesn't help, let me know what happened, along with all the version details, and we can explore further.

    | MatShepSEO
    1

  • The rel=canonical tag would be a great way to start off. This should send all the right signals to Google about which site to show. I'd make the change and then wait a few weeks to see if this happens. Then go from there.

    | EricaMcGillivray
    0

  • The SEO was right to use the 301, with the caveat that a 301 will not pass 100% of the ranking authority of the original URL; it will drop between 1 and 10%. The next bit sounds complicated, so to keep it simple: keep the info on both sites, but put canonical tags on the pages with the duplicate data. This is the preferred approach from Google's perspective, as it tells Google which content is the duplicate. If you intend to remove the data from the original locations, then rel=canonical won't be needed. Google does not want duplicate data, so as good practice you should either use the canonical tag or delete the duplicate data from the other sites. Hope that is of use, Bruce

    | BruceA
    0
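The canonical tag mentioned above is a single line in the head of the duplicate page, pointing at the version that should be indexed (the URL here is a placeholder):

```html
<!-- In the <head> of the duplicate page -->
<link rel="canonical" href="http://www.example.com/original-page/" />
```

Every duplicate should point at the same preferred URL, and the preferred page itself can carry a self-referencing canonical.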

  • I just wanted to add to this discussion to say that I created a tool that helps me create really good spreadsheets for link auditing. It aggregates links from a number of sources, reduces the list down to one link from each domain, and marks the nofollows. It also tells you which links are from domains that are on my blacklist of domains that I almost always disavow; the blacklist contains over 14,000 domains at this point and is growing. And it tells you which links are from domains that I usually ignore, such as DMOZ scrapers and domain stats pages, where we know the link is not one made for SEO purposes. I'm not a fan of tools that automate the decision-making process, because I've seen so many of them mark fantastic links as bad ones and miss a whole bunch of really spammy links. If you're trying to escape Penguin, you have to be way more accurate than this. It's still in a beta phase right now as I am working on making it as useful as possible, but you can see the details here: http://www.hiswebmarketing.com/manual-link-audits/

    | MarieHaynes
    0
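The aggregation steps described above (merge sources, one link per domain, flag blacklisted domains) can be sketched in a few lines. This is an illustrative sketch with placeholder data, not the author's actual tool:

```python
# Merge link lists from several sources, dedupe to one URL per domain,
# and flag domains that appear on a disavow blacklist.
from urllib.parse import urlparse

def audit_links(sources, blacklist):
    """Return {domain: {"url": first_seen_url, "blacklisted": bool}}."""
    seen = {}
    for source in sources:
        for url in source:
            domain = urlparse(url).netloc
            if domain not in seen:  # keep only the first link per domain
                seen[domain] = {"url": url, "blacklisted": domain in blacklist}
    return seen

# Placeholder link exports and blacklist.
sources = [
    ["http://spammy.example/page1", "http://good.example/post"],
    ["http://spammy.example/page2", "http://other.example/article"],
]
blacklist = {"spammy.example"}
result = audit_links(sources, blacklist)
```

The manual judgment the post argues for happens after this step: the sketch only organizes the data; a human still decides what actually gets disavowed.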

  • Yes, I would give it a try. As long as you follow all their guidelines and only submit news to them, they might approve you.

    | Martijn_Scheijbeler
    0