Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi Chandu, There's not much value in these for links. The primary value of video networks (assuming you mean networks like YouTube and Vimeo) is the ability to get your video in front of a wider network of people. You could make the argument that nofollow link diversity has some value, but it would be very low on my list of priorities.

    | KaneJamison
    0

  • Let me answer your question directly; I apologize for going on so long. It depends on how many total inbound domains you have: if you have a lot, take the report with a grain of salt; if you have very few, take it very seriously. Also, if you could fill me in on exactly why the tool felt these domains were bad or negative, that would be a huge help, because I'm really flying blind here. I have no URL and nothing to go by beyond being told that a tool flagged 70 links. It's a fine question, and I don't want to say it shouldn't be asked; everything should be asked. However, I failed by not asking you for a lot more information about your website. Can you give me your domain URL? Can you give me the results of your link tools tests? Do they actually match up? Then I can give you an honest answer that won't risk hurting your website. I'm sorry if I sound crazy, but there's a lot that goes into this, and I don't want to be responsible for hurting your website. I hope you understand that. Respectfully, Thomas

    | BlueprintMarketing
    0

  • Thanks - I hadn't encountered this before, so I immediately panicked. Having spent the last several hours doing research, I've calmed down a little and will work around it as best I can. Really appreciate the help!

    | LeahHutcheon
    0

  • At the moment, it doesn't seem that rel=publisher is doing all that much for sites (aside from sometimes showing better info in the Knowledge Graph listing on brand searches), but personally I believe its functionality and influence are going to be greatly expanded fairly soon, so it's well worth doing. As far as it contributing anything to help speed up indexing... I doubt it. P.

    | ThompsonPaul
    0

  • Thanks Cyrus, that makes a lot of sense - one of those strange intricacies!

    | EdelmanDigital
    0

  • Yes, that all makes total sense. It's a real shame that Google is so harsh on new sites. New sites need traffic from Google like a baby needs oxygen, and without it they might not be able to survive. Happier is self-funded and about 6 months old, and still losing money every week. I've thought about pulling the plug a fair few times. I'm not sure how much more money to invest; I might just be throwing good money after bad. Sorry, whinge over.

    | julianhearn
    0

  • Hi Tom, Thanks for your reply and useful help. Your advice about the 301 tallies with what others have said about the matter. With regard to your indexing question, only the ranking URL is being indexed. Thanks again for your help.

    | sicseo
    0

  • Wikipedia updates content all the time, and they seem to rank rather well. From Google's perspective, they would rather rank up-to-date content, so yes, it's got to be a good idea to update. An old page might have links to it and history with Google, so with up-to-date content it's got to be better than a brand-new page.

    | julianhearn
    1

  • It looks like the sites are just taking the content from their site, and putting it onto other blogs to generate backlinks. It's hard to imagine links from all that duplicate content helping much, but apparently it's working well enough for them. What keywords are they ranking for? To answer your question, though, you absolutely should not give up on these niches. These sites barely have any PR. You should be able to easily outrank them with a decent site and a few authoritative links.

    | TakeshiYoung
    0

  • I'm going to throw in a completely different option, because in my opinion, messing with this kind of multiple-version situation is going to put your huge website at massive risk of screwed-up rankings and lost traffic no matter how tricky you get.

First, I'm assuming that significant high-level load testing has been done on the dev site already. If not, that's the place to start. (I'm suspecting a Joomla site with 40 million visits a month will have lots of load balancing in place?)

Since by all indications the sites will be identical to the visitor, I'd suggest switching to the new site but keeping the original site immediately available in near-line status. By setting the TTL of the DNS to a very short duration while in transition, the site could be switched back to the old version within a minute or two just by updating the DNS if something goes pear-shaped on the new site. Then, while the old site continues to serve visitors as it always has, devs can fix whatever issue was discovered on the new site.

This would mean keeping both sites' content updated concurrently during the period of the changeover, but it sounds like you were going to have to do that anyway. There's also the small risk that some visitors would have cached DNS on their own computers and so might still get sent to the new site for a while even after the DNS had been set back to the old site, but I'd say that's a vastly smaller risk than screwing up the rankings of the whole site.

Bottom line: there are plenty of load-testing, quality-assurance, and server over-provisioning methods for making virtually certain the new site will be able to perform before going live. The backup site should be very short-term insurance rather than a long-term duplication process.

That's my perspective, anyway, having done a number of large-site migrations (though certainly nothing approaching 40M visits/month). Paul

Just for reference, I was involved in helping after just such a major migration where the multiple sites did get indexed. It took nearly a year to rectify the situation and get the rankings/traffic/usability back in order.
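To sketch the low-TTL idea (the hostname, IPs, and 60-second TTL below are placeholders, not a recommendation for your actual zone), the switchover and rollback are just a one-record change:

```text
; Zone file sketch: lower the TTL well before the cutover so changes propagate fast
www.example.com.   60   IN   A   203.0.113.10    ; points at the NEW site

; Rollback, if something goes pear-shaped, is repointing the same record:
; www.example.com. 60   IN   A   198.51.100.10   ; back to the OLD site
```

With a 60-second TTL, most resolvers should pick up the rollback within a minute or two; only clients with locally cached DNS would linger on the new site.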

    | ThompsonPaul
    0

  • Sorry for the late response, but I can sum up my answer in a brief way. I agree with you. If SEOmoz is finding the meta refresh, it's likely Google is too. And why the meta refresh? Why not a 301 redirect? Could you provide an example of one of your search pages? Feel free to private message me if you don't want to share publicly.
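If the server happens to be Apache, a hypothetical replacement for the meta refresh (the `/old-search` path and domain are placeholders) could be a one-line server-side 301 in .htaccess:

```apache
# Sketch only: replace the meta refresh with a proper 301 redirect
# /old-search and example.com are placeholder names
Redirect 301 /old-search https://www.example.com/search
```

A 301 passes the redirect signal cleanly to crawlers, which a meta refresh may not.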

    | Cyrus-Shepard
    0

  • Yeah, I think canonical is fine for "index.php" variants. One important addition, though. Check your internal link structure. Many sites link to "index.php" via their "Home" link or logo. I'd suggest changing that to an absolute URL or just "/". That way, you're not creating the non-canonical version in the first place.
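As a sketch (example.com is a placeholder), the canonical tag plus the root-relative home link would look like:

```html
<!-- In the <head> of the index.php variant, pointing at the canonical URL -->
<link rel="canonical" href="https://www.example.com/" />

<!-- Home link / logo using "/" so the index.php variant is never created -->
<a href="/">Home</a>
```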

    | Dr-Pete
    0

  • If Page.fr is in French, I'd use rel=alternate/hreflang and not the canonical tag. It should allow both versions to rank for the appropriate audience without causing issues with duplication: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077 These aren't "true" duplicates and rel=canonical could disrupt the ranking ability of the French site.
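For illustration (the example.com URLs are placeholders), the annotations would go on both pages, each referencing the other:

```html
<!-- Placed in the <head> of BOTH the English and French versions -->
<link rel="alternate" hreflang="en" href="https://www.example.com/page" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/page.fr" />
```

That lets Google serve each version to the matching audience without treating them as duplicates.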

    | Dr-Pete
    0

  • "I work for a non-profit association. We currently use a .com as our primary, but also own the .org. Should we switch to the .org address? What would the benefits be?" You own them both. That's the important thing. I would carry on as you are doing and 301 the .org to the .com.
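Assuming the .org is hosted on Apache (example.org/example.com are placeholders), a hypothetical .htaccess doing the 301 might look like:

```apache
# Sketch: send every .org URL to its .com equivalent with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.org$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```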

    | EGOL
    0

  • In terms of what you've written: blocking a page via robots.txt doesn't remove it from the index. It simply prevents the crawlers from reaching the page. So if you block a page via robots.txt, the page remains in the index; Google just can't go back to the page and see if anything has changed. And if you block the page via robots.txt and also add a noindex tag to it, Google won't be able to see the noindex tag, because the page is blocked from crawling.

If you moved all of your old content to a different folder and blocked that folder via robots.txt, Google won't remove those pages from the index. To remove them, you would have to go into Webmaster Tools and use the URL removal tool on that folder; if Google sees it's blocked via robots.txt, then and only then will it remove the content from the index. The folder has to be blocked via robots.txt first for the URL removal tool to work on the whole folder. I'm not sure, though, how this would work going forward: if you removed a folder from the index and then later moved more previously indexed content into it, I'm not sure what would happen to that content. Either way, Google will have to come back and recrawl a page to see that it has moved to the new folder and then remove it from the index, so the content will only be removed once Google recrawls it.

So I still think a better way to remove the content from the index is to add the noindex tag to the old pages. To facilitate the search engines reaching these old pages, make sure there is a path they can take to get to them.

Another good idea I saw on a forum post here a while ago: create a sitemap containing all of the old, noindexed pages you want removed, and submit it through the Webmaster Tools sitemap interface. You'll then be able to monitor the progress of deindexation over time by checking how many pages in that sitemap are reported as indexed initially, and then seeing later how many are still indexed. That's a good indicator of the progress of the deindexation.
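To make the crawl-block vs. deindex distinction concrete (the `/old-content/` folder is a placeholder name), the two mechanisms look like this:

```text
# robots.txt - this only blocks CRAWLING; already-indexed pages stay indexed:
User-agent: *
Disallow: /old-content/

# To DEINDEX instead, leave the pages crawlable (no robots.txt block)
# and add this to each old page's <head>:
#   <meta name="robots" content="noindex">
```

The key point: the noindex tag only works if Google can still crawl the page to see it.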

    | Mark_Ginsberg
    0

  • Can you provide keyword phrase for us?

    | DarinPirkey
    0

  • I think we recently dropped; I pulled all these numbers from SEOmoz's rankings reports. Thanks for the tip, though. I have definitely made the mistake of being logged into a Google account while checking my rankings in the past!

    | Travis-W
    0

  • I submitted my site map to Bing Webmaster tools yesterday and it's still flagged as 'pending'. Does it normally take that long?

    | VERBInteractive
    0

  • What do you mean you had no other choice? What forced you to add these tags to your client's site?! Because they were updating it? ...uhh.. This thread is bizarre. Anyway sounds like all of it was a non-issue. Mark is absolutely correct about your real issue though. Get some redirects in there asap.

    | jesse-landry
    0