Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I found this resource a few days ago; have a look, as I think it answers your question. http://www.forbes.com/sites/ciocentral/2013/01/23/html5-vs-native-mobile-apps-myths-and-misconceptions/

    | echo1
    0

  • I'm working to remove low-quality pages from a directory while still allowing a few high-quality pages in the same directory to be spidered and indexed. To do this, I placed a robots noindex tag on the low-quality pages we don't want indexed. These noindex tags were implemented yesterday, but the low-quality pages aren't going away. I even used "Fetch as Googlebot" to force a crawl of a few of the low-quality pages. Maybe I need to give them a few days to disappear, but this got me thinking: "Why would Google ignore a robots noindex tag?"

    Then I came up with a theory. I noticed that we include a canonical tag by default on every page of our site, including the ones I want to noindex. I've never used a noindex tag in conjunction with a canonical tag, so maybe the canonical tag is confusing the SE spiders. I did some research and found a quote from Googler JohnMu in the following article: http://www.seroundtable.com/archives/020151.html

    It's not an exact match to my situation, because our canonical tag points to itself rather than another URL, but it does sound like using them together is a bad idea. Has anyone used or seen canonical and noindex tags together in the wild? Can anyone confirm or deny the theory that the canonical undermines the efficacy of the meta robots tag?
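
As a rough illustration, here is a small stdlib sketch for flagging pages that carry both directives at once; the HTML fed to it below is a made-up example, not a real page:

```python
from html.parser import HTMLParser

class HeadTagAudit(HTMLParser):
    """Collect robots meta content values and canonical hrefs from page markup."""
    def __init__(self):
        super().__init__()
        self.robots, self.canonicals = [], []

    def handle_starttag(self, tag, attrs):
        a = {k: (v or "") for k, v in attrs}
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots.append(a.get("content", "").lower())
        elif tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonicals.append(a.get("href", ""))

def noindex_canonical_conflict(html):
    """True if the page declares noindex while also carrying a canonical tag."""
    parser = HeadTagAudit()
    parser.feed(html)
    return any("noindex" in c for c in parser.robots) and bool(parser.canonicals)
```

Running this across the directory's pages would list exactly which ones combine the two tags, which at least makes the theory testable.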

    | davidfricks
    1

  • I'd want to step back a bit, strategize, and set some objectives before proceeding. Determine what outcomes are expected, and while you're at it, keep in mind how changes could improve user experience. You might end up consolidating, if that makes sense as a way to improve rankings, possibly retiring some of the domains by moving their content over to the main site and 301 redirecting everything. It all depends on the specifics, of course: how well do all the domains rank now (including the main one), what content is housed where, how does it all inter-relate logically, how much traffic does each currently get, what is the quality of that traffic in conversion terms, what are the trends (are any of them moving up or down lately), and what inter-site referral volumes are happening. On the duplicate-content point: absolutely do not simply copy from one place to another. If any similar content lives in more than one place, hire a writer if necessary to add value by at least offering a different point of view on the material.

    | peterthistle
    0

  • If you are using WordPress you can use "J translate" which allows you to have multiple versions of the copy per language. That is what we use for our foreign clients or websites that require multiple languages.

    | BeardoCo
    0

  • It seems like Google would want unused/unwanted pages removed: less "junk" on the web and less irrelevance to add to their already huge index. It's a long-standing problem, I think. I will sometimes put NOINDEX, NOFOLLOW on pages I want removed from the index, then delete them after a few months. Thanks for the feedback!

    | spkcp111
    0

  • Silos do several things that make them valuable, IMO:
    - They allow you to control how PageRank (ranking potential) flows throughout your site.
    - They let users navigate your site much more easily (imagine trying to navigate Amazon if it weren't in silos).
    - They make internal linking easier, because you don't have to stretch to find a legitimate way to link all the pages within a silo.
    - You can focus on specific keywords or keyword categories.

    Those are a few of the benefits, so yes, I think they are still really valuable. P.S. The majority of updates have been punishing people who are trying to game the system through content farms, anchor texts, and other link-building tactics. Good information architecture is still good information architecture.

    | chris.kent
    0

  • The cache date and crawl date need not be the same. This was pretty evident until late 2006; after that, Google changed it to a great extent, but there are still a lot of technical limitations where it is not possible, or rather not feasible or necessary, to update the cache date every time they crawl a page (Google also uses the "If-Modified-Since" status to know whether a page has been updated or not). Here are a few URLs that will give you more info in this regard:
    1. http://www.mattcutts.com/blog/video-crawl-dates-in-the-google-cache/
    2. http://googlewebmastercentral.blogspot.in/2006/09/better-details-about-when-googlebot.html
    3. Most importantly: http://www.seroundtable.com/google-cache-update-12686.html

    We see these cache-date anomalies daily in our job, and personally I have seen them for over a decade. Once you hit mainstream SEO, where you start working for clients in real time, you will see them first hand. Regards, Devanur Rafi.
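
For anyone curious what the "If-Modified-Since" mechanism mentioned above amounts to, here is a minimal sketch of the server-side decision; the dates in the usage below are made up:

```python
from email.utils import parsedate_to_datetime

def conditional_get_status(last_modified, if_modified_since):
    """Decide the response to a conditional GET: 304 tells the crawler its
    cached copy is still fresh; 200 means the page changed and is re-sent."""
    if not if_modified_since:
        return 200  # no conditional header, serve the full page
    changed = parsedate_to_datetime(last_modified) > parsedate_to_datetime(if_modified_since)
    return 200 if changed else 304
```

A 304 lets Googlebot confirm a page is unchanged without re-downloading it, which is one reason a crawl visit need not refresh the cached copy or its date.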

    | Devanur-Rafi
    0

  • Yes, I bet.  I just want to be able to know what I'm asking.  lol Thanks again for your help!  

    | cyberlicious
    0

  • I can say that I've "claimed" maybe a couple dozen businesses for clients over the years using my account, just as anyone else can. Google Places for Business is just a dashboard where you can claim and request updates for a business record, and a place where Google can keep the person who requested the changes informed of the status of those changes. When you "claim" a business, what you're really doing is telling Google that you claim you're authorized to make changes, and that when Google sends the PIN by phone or mail to the company's phone number or address, someone at the company is going to share that PIN with you so that you can enter it in the dashboard and make the changes take effect. If you accomplished all that and then got fired the next day, someone else (or the business owner) would simply be able to go to their own dashboard, add the business, make whatever changes were needed, wait for the validation PIN to be sent by phone or mail, enter the PIN into their dashboard, and then whatever changes they made would take effect. Business pages don't belong to accounts; they're tied directly to the name, address, and phone number of the business itself.

    | Chris.Menke
    0

  • This happened to me, too. Google has been finding these links since last October, and I just keep adding the domains to my disavow list. My rank has slipped a bit (from 2 to 4), but it's hard to know if these links are the reason. Probably not. When the links were first pointed at my site, Google moved me from 2nd to 1st place for my top keywords. The bad links seemed to give me a temporary boost of about one week before we settled back to 2nd. Google HAS to be aware of this. Their silence on this issue is deafening. Cutts gave it a little bit of lip service, but there must be tens of millions of these junk links being added daily, judging from all of the people selling 100k bad links on Fiverr for five bucks. So far I'm mostly annoyed by these bad links rather than hurt by them. This has really messed up all the intense work I did to scrutinize and analyze my link profile. A nice feature for SEOmoz would be to let us UPLOAD our DISAVOW list so we can get some of our reports with the junk scrubbed out. After all, if I tell Google to ignore these 1,000 links, and presuming they actually do ignore them, then it would be more useful to get SEOmoz reports with that data removed as well.
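
On the upload idea: the disavow file itself is just plain text, so generating one from an audited domain list is straightforward. A minimal sketch, with placeholder domains standing in for real spam sources:

```python
def build_disavow(bad_domains, bad_urls=(), note="generated from manual link audit"):
    """Assemble the plain-text disavow format Google accepts: '#' comment
    lines, 'domain:example.com' lines, and bare URLs, one entry per line."""
    lines = ["# " + note]
    lines += ["domain:" + d for d in sorted(set(bad_domains))]
    lines += sorted(set(bad_urls))
    return "\n".join(lines) + "\n"
```

The same list could then be fed to any reporting tool willing to filter those domains out, which is exactly the feature being asked for here.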

    | DarrenX
    0

  • Thank you so much, Mike. This helps.

    | INN
    0

  • What's the URL to your site? I do not see one posted above.

    | itrogers
    0

  • Hi Brian, There is so much confusion surrounding this very common topic because Google has repeatedly changed their stance. The thread Amber has pointed to on Linda's forum quotes the most current Google policy on this. Linda's thread doesn't link to the source, so I will here: http://productforums.google.com/forum/#!category-topic/business/S6APN5ijnOQ

    And here is the complete language of what Google staffer Jade wrote: "Verified business owner of a page, and is your business moving locations? Here's what you do. Edit your address in Google Places for Business or in the Google+ page admin area, whichever you are using to manage the page. This will either make a new page or edit the address on the existing page. It may take a week or two after editing your address before you see an update. At this point, you may need to go through a verification process again. Don't worry -- this is normal. If you see a page that's still got the old address, click on Report a problem and mark that location as closed. Provide the link to the new address or information about the new location if possible. You can find more instructions on closing a location here: http://goo.gl/YZIjq"

    Now, remember, the above steps must be followed up by a citation cleanup campaign. You need to find all web-based references to your old address and edit them so that they reflect the new address. This is very important.

    | MiriamEllis
    0

  • The same thing is happening to me right now. I have two projects, one in English and one in German, and they are getting totally different results. I have tried pretty much everything: Wistia, Vimeo, and Dailymotion. I got all of the videos into my sitemap, BUT they were indexed at different times: first two of them, and then the last one. I have noticed that sometimes Google shows the video snippets and sometimes they are ignored. The rankings for those keywords are also very unstable.

    I also tried putting Flash videos on my German-language project, and they got indexed as well because I put them in my video sitemap. Just now I was testing whether putting a video on a page that was in first position could get a higher CTR with the video snippet, but guess what: my #1 ranking is currently out of the top 100 positions. CHAOS! I guess at this moment Google is trying to figure out whether or not to show a video snippet for our page for a keyword in a language where there was never a video snippet before. So confusing! I also found that rankings that were in the top 3 positions can definitely drop, even to the second or third page, though at the moment they are recovering their past positions. I will have to keep a close eye on this.

    If you guys have more insight or ideas for getting my video snippets to show with more stability, let me know! P.S. In case you want to check the keyword and URL that was #1 and dropped out of the top 100 results, it's the German keyword "Peru Reisen" and my page is http://www.viventura.de/reisen/peru Thanks
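
For reference, the video sitemap entries under discussion can be generated with the standard library. A minimal sketch: the URLs and titles below are placeholders, and Google's full schema has more optional fields than the four shown here:

```python
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
VID = "http://www.google.com/schemas/sitemap-video/1.1"

def video_sitemap(entries):
    """Serialize a minimal video sitemap: one <url> per page, each carrying a
    <video:video> block with thumbnail, title, description, and content URL."""
    ET.register_namespace("", SM)
    ET.register_namespace("video", VID)
    urlset = ET.Element("{%s}urlset" % SM)
    for e in entries:
        url = ET.SubElement(urlset, "{%s}url" % SM)
        ET.SubElement(url, "{%s}loc" % SM).text = e["loc"]
        video = ET.SubElement(url, "{%s}video" % VID)
        ET.SubElement(video, "{%s}thumbnail_loc" % VID).text = e["thumbnail"]
        ET.SubElement(video, "{%s}title" % VID).text = e["title"]
        ET.SubElement(video, "{%s}description" % VID).text = e["description"]
        ET.SubElement(video, "{%s}content_loc" % VID).text = e["content"]
    return ET.tostring(urlset, encoding="unicode")
```

Keeping the generator in one place at least rules out per-language markup differences as the cause of the inconsistent snippets.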

    | viventuraSEO
    0

  • Wow, I just spent some time trying to figure the footer links out and it is just a mess! I don't understand what Magento is trying to do but it sure isn't working. Thank you for pointing that out.

    | t_parrish
    0

  • Hi, I've been using WordPress for a while now and am just starting to look at this. I have two questions: 1. What are the benefits of adding more update services (as mentioned above)? 2. Is it as simple as adding these in (comma separated) under Settings > Writing > Update Services, or do you need to register with each one? Thanks in advance, Dean

    | Deanknight
    0

  • What's the end goal here? Are you actively trying to block all bots? If so, I would still suggest "Disallow: /". The other syntax may also work, but if Google suggests using the trailing slash, you should probably use it.
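
A quick way to sanity-check a rule before deploying it is the standard library's robots.txt parser; example.com below is a placeholder host:

```python
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt, user_agent, url):
    """Check whether a robots.txt body permits the given bot to fetch a URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

# The block-everything rule under discussion:
BLOCK_ALL = """User-agent: *
Disallow: /
"""
```

With BLOCK_ALL, is_allowed comes back False for any bot and any path, which is exactly what "Disallow: /" is meant to do; an empty "Disallow:" line, by contrast, allows everything.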

    | Igal_Zeifman
    0

  • First, welcome to the community, David :). Very interesting video; I hadn't seen this! Additions like this show why this community is so great: we are constantly helping each other expand our knowledge. I think you will agree that a 301 is still the answer to this question and the best method for moving domains while taking the value of your old domain with you in terms of the search engines and rankings...

    | Matt-Williamson
    0

  • I personally do everything manually. I think that link-removal tools can work great for some sites, but your best chance at identifying the bad links and keeping the good ones is to look at them manually. 2,500 domains is a lot, but not impossible. I'm currently working on an account of about that size, and it will take me about 10-14 days to go through them all. Once you get going, you will recognize patterns and it will go faster.

    I used to handle the emails on my own, but I have just hired someone to do this for me; I find that the automated tools miss a lot of them. I was considering hiring from oDesk or Mechanical Turk, but in my situation, because my business is expanding and most of what I do is penalty removal, it's worth my while to hire and train someone to do this for me. By the way, if you've got 2,500 domains, you won't have 2,500 emails: many will be offline, nofollowed, or perhaps even natural. EzineArticles links definitely need to be removed if they are followed links. Oftentimes those links are nofollowed, but if you have a high enough account level there, then they are followed and need to go.

    A few other points:
    - Yes, you're right. It's not enough to just disavow. Google is going to want to see evidence that you've tried hard to remove links.
    - Lately I have only been using links from WMT and not other sources like Majestic and Ahrefs. That may cut down on the number of domains you have to deal with. So far it is working for me.

    Hope that helps!
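
On making the pass through 2,500 domains go faster: bucketing the raw backlink URLs by host first means each linking domain gets reviewed once rather than once per URL. A small sketch, with placeholder URLs:

```python
from collections import defaultdict
from urllib.parse import urlsplit

def group_by_domain(backlink_urls):
    """Bucket backlink URLs by hostname (dropping a leading 'www.') so each
    linking domain becomes a single audit work item."""
    buckets = defaultdict(list)
    for url in backlink_urls:
        host = urlsplit(url).hostname or ""
        if host.startswith("www."):
            host = host[4:]
        buckets[host].append(url)
    return dict(buckets)
```

Sorting the resulting buckets by size also surfaces the sitewide-link patterns mentioned above early in the review.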

    | MarieHaynes
    0