Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Hey Spencer, Normally you'd use the meta fragment directive you mention on pages that don't have #! in the URL (see section 3 here: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started) to indicate to crawlers that the site is AJAX. When crawlers encounter the #!, they search for the 'crawl friendly' version of that URL, which is specified by the '_escaped_fragment_' URL parameter. The directive above tells crawlers that even though they don't see a hash, they are on an AJAX page. The #! approach was an interim method that sites used, and it is gradually being replaced by the alternative approach that HTML5 pushState allows. If you're still confused, I think the easiest solution would be to get some example URLs for your site (or at least the pattern of the URLs), along with what markup they have and whether they are indexed. Hope this helps! -Tom

    | Tom-Anthony
    0
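The #! to _escaped_fragment_ mapping Tom describes can be sketched in a few lines of Python. This is a rough illustration of the legacy AJAX crawling scheme; the function name and example URL are made up for the sketch:

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Return the 'crawl friendly' URL a crawler would request
    for a #! (hashbang) URL, per the legacy AJAX crawling scheme."""
    if "#!" not in url:
        return url  # not a hashbang URL; nothing to rewrite
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="=&")

print(escaped_fragment_url("http://example.com/page#!key=value"))
# http://example.com/page?_escaped_fragment_=key=value
```

The server is expected to answer the `_escaped_fragment_` form with a static HTML snapshot of the AJAX state.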

  • Hey Gerry, Sorry for the delayed reply. The Q&A questions only get directly assigned to the Help Team if they are marked as a product support question. I just looked into the campaign associated with this account and I don't see that we have been reporting any 404 errors on this site, but I do see that the campaign was just created on Nov 26th, so it seems that you may have deleted and recreated the campaign. Since we are no longer reporting any 404s, I can't look into the issue directly now, but if you run into this issue again, I would recommend sending an email directly to help@moz.com so we can investigate what may be causing the issue. Let me know if you have any other questions in the meantime. Chiaryn Moz Help Team

    | ChiarynMiranda
    0

  • Have you tried this? Add your list of users or followers in column A, then put this formula in column B: =ImportXML(A1, "//span[@class='bio']")

    | storemachine
    0
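For anyone who'd rather run the same XPath outside a spreadsheet, here is a rough Python equivalent using the standard library's ElementTree. The markup below is a made-up stand-in for a profile page; real pages are rarely well-formed XML, so a proper HTML parser would be safer in practice:

```python
import xml.etree.ElementTree as ET

# Hypothetical, well-formed stand-in for a profile page.
page = """<html><body>
  <span class="bio">SEO consultant from Austin</span>
  <span class="name">not a bio</span>
</body></html>"""

root = ET.fromstring(page)
# Same XPath predicate the ImportXML formula uses: //span[@class='bio']
bios = [span.text for span in root.findall(".//span[@class='bio']")]
print(bios)
# ['SEO consultant from Austin']
```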

  • My SERPs for a competitive term I felt I was underperforming for dropped about 10 spots overnight after I added "noindex,follow" to the product pages. From the 3rd page to the 4th page, so it's not like I had a lot to lose. My SERPs for less competitive long-tail keywords, which is where I'm getting most of my traffic, have dropped slightly or stayed the same. Should I cross my fingers and hope for a recovery? Revert the product pages back to "index, follow"? Any thoughts?

    | znagle
    0

  • If you want nothing on that test subdomain indexed, verify that subdomain as its own site in Google Webmaster Tools, exclude that subdomain from being indexed in robots.txt, then request removal of that site (subdomain) in GWT. And consider setting up a page monitor like https://polepositionweb.com/roi/codemonitor/index.php on the robots.txt of your test site (and live site). It'll check the contents of those pages once a day, and email you if there's a change. Handy if there are multiple people working on the site.

    | KeriMorgret
    0
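For the "exclude that subdomain in robots.txt" step, the file served at the root of the test subdomain (e.g. test.example.com/robots.txt) could simply be:

```
User-agent: *
Disallow: /
```

Note this blocks crawling of everything on that host; the GWT removal request then handles getting already-indexed URLs out of the index.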

  • Hi James, While I'll fully disclose I'm not intimately familiar with the ins and outs of responsive design, I do believe that instead of CSS hiding/showing the same links repeated three times in the HTML, ideally the links would appear in the HTML once and CSS would restyle them based on the screen size. That said, Google does not mind so much about the "too many links on a page" situation. They are good enough now to likely figure out what is going on with your menu, and will try to take that into account when assessing the page and when assigning PageRank through each link. So while it would be ideal not to repeat every link, it's not going to be detrimental if you had to leave it that way. The only exception, I think, would be if this is a very large site (100,000+ pages), but I don't get the impression this site is quite that big. -Dan

    | evolvingSEO
    0
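The "links once in the HTML, restyled by CSS" approach Dan describes might look something like this (class name and breakpoint are hypothetical):

```css
/* One nav in the markup; CSS adapts it per viewport. */
.site-nav a {
  display: inline-block;
  padding: 8px 12px;
}

@media (max-width: 600px) {
  /* Same links, restyled as a stacked mobile menu -
     no duplicate markup needed. */
  .site-nav a {
    display: block;
    width: 100%;
  }
}
```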

  • Nope, I just put all of them in the same folder, but I would like to change it.

    | AymanH
    0

  • I want to say I agree with Tom. I focused strictly on your CMS, and as Tom said, essentially you have been hurt by your lack of relevant root domains pointing to your site. Unfortunately, after a penalty, even following an excellent cleanup, Google is a little more wary of everything you do. So when you try to earn the new links you will need in order to compete for "specialist paints", an extremely competitive keyword, be extremely careful that you do not cross any boundaries. Take a look at the information below as well; I ran quite a few scans on your site, and hopefully that will be of some help to you. Respectfully, Thomas

    | BlueprintMarketing
    0

  • Are you looking to remove every single video snippet, or just the snippets from some pages? It seems fairly drastic to remove every one, so I'd suggest you delve a bit deeper and work out exactly which pages you don't want the snippet on any more. That way, you can submit a sitemap with some URLs, but not all. Additionally, note that it's much easier to get the thumbnail changed than the snippet removed entirely. Could you try split testing thumbnails to see if it's just suboptimal pictures that are causing you problems here, rather than the snippets as a whole? If you can't take the videos down, I honestly think you'll struggle to get the snippets removed. Google are not on your side here, unfortunately.

    | PhilNottingham
    1
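For the "sitemap with some URLs, but not all" idea, a video sitemap entry for a page you want to keep (with an updated thumbnail) follows Google's video sitemap format, roughly like this. The URLs and titles below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/keep-this-one.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/new-thumb.jpg</video:thumbnail_loc>
      <video:title>Example video</video:title>
      <video:description>Page whose snippet should stay.</video:description>
    </video:video>
  </url>
</urlset>
```

Only the pages listed this way get the video markup signal; pages you omit are candidates for losing the snippet over time, though Google makes no hard guarantee.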

  • I would think that using subOrganization schema markup to denote the "secondary" location, and to imply which location is the "primary" business location that should be served/favored as the default, may help take it a step further.

    | BlueTent20
    0
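A sketch of that subOrganization markup in JSON-LD, per the schema.org vocabulary. The names and URLs are invented for illustration:

```json
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Acme Co (primary location)",
  "url": "http://www.example.com/",
  "subOrganization": {
    "@type": "Organization",
    "name": "Acme Co (secondary location)",
    "url": "http://www.example.com/second-location/"
  }
}
```

schema.org also defines the inverse property, parentOrganization, which could be added on the secondary location's own page.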

  • There was a big local shake-up around 10/25 - were you potentially ranking in local markets for some of these phrases (they don't seem like local queries, but I thought I'd ask)? I'd have to agree with Marie that the timing suggests this wasn't Penguin, but Google hasn't been very forthcoming on Penguin data refreshes. Your link profile looks pretty clean to me, and your site isn't really large enough to have large-scale content issues. Those services pages look a little keyword-targeted and border on thin, but if you're talking about a handful of them, I doubt it's enough to cause you serious problems. If you had rolled out hundreds of them that were all just variations on the same core keyword phrases, that would be different. On the other hand, if those pages specifically target the terms that dropped, it's hard to ignore that fact. Did you do any targeted link-building to those pages? I'm seeing one weird thing - OSE is showing a ton of recent links from Scoop.it, across a large number of pages - but when I check the source site, I'm not seeing any links. It's possible you got a temporary boost from some links that got removed, but that's really hard to track.

    | Dr-Pete
    0

  • Thanks! Appreciate your help, Martjin!

    | Ideas-Money-Art
    0

  • I have heard conflicting things from people at Google, including Matt Cutts, throughout the years. I have also seen some first-hand anecdotal evidence that subdomains inherit some benefit from the parent domain, though much less evidence that parent domains gain anything from the subdomains. It is my opinion that the devil is in the details here. If, for instance, you have a site with dramatically different topics on various subdomains, as if they were completely different websites, I think it is more likely that Google will treat them as such. However, if you have a site with a subdomain for a certain section, such as a forum or blog that is of the same, or very similar, topic, with lots of interlinking between the two - as if they were different parts of the same site - I think it is more likely that Google will treat them as such. So I don't think the answer is as cut-and-dried as people often like to think, which is why it is so difficult to find consistent proof of one theory or the other. In the end, do what's best for your users and developers. Most of the time that means a subdirectory, but every once in a while it means a subdomain. I'll leave this open for other opinions. Maybe someone has some empirical proof that we can look at.

    | Everett
    0

  • According to Google you have over 2,000 indexed pages. However, your site does not offer any particularly intuitive way for someone to navigate to those pages; I only see the search directory. I would contact the developer and go through your entire site to make sure nothing is preventing Googlebot from crawling it. http://marketing.grader.com/report/http://www.casacol.co/#seo http://www.internetmarketingninjas.com/seo-tools/google-sitemap-generator/tmp/c706f9f5cdb4a88879c39ea6efaf8ab8.html

    | BlueprintMarketing
    0

  • It's difficult to say whether these links are helping or hindering at the moment. By the sounds of it, the report the SEO agency ran was a Link Detox report, and those results are purely automated; while I have a lot of faith in the tools from LRT, I wouldn't be 100% confident in the results, as some of the links may be false positives. However, you will most likely only be talking about a small ratio of them. Ultimately, even if you haven't received a warning yet, there is a potential problem, so I would look to remove as many of these links as possible - as soon as possible.

    | ChrisDyson
    0

  • Hi Jen: That is very helpful, thanks!! The one point I did not understand is the last one, regarding checking to see if the c-blocks are varied. Could you please elaborate? Also, do you think it would be risky for me as an amateur to do this on my own - that link removal would be better left in the hands of a professional? I am working with a reputable SEO firm, but they are requesting almost $3,800 to identify and remove approximately 225 domains that have toxic links to my site. If I use a professional SEO firm, I would probably want to conserve my resources for link building ($2,500/month). But I don't want to be penny wise and pound foolish. So do you think I could disavow bad links on my own? Also, would you suggest any software or tools for doing so? Thanks so much. Alan

    | Kingalan1
    0
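For anyone weighing the DIY disavow route Alan asks about: the file Google's Disavow Links tool accepts is plain text, one entry per line, with `#` comments, `domain:` lines for whole domains, and bare URLs for individual pages. The domains and URL below are invented examples:

```
# Domains flagged as toxic by the agency's report
domain:spammy-directory.example
domain:paid-links.example

# A single bad URL rather than a whole domain
http://blog.example/low-quality-post.html
```

Disavowing tells Google to ignore those links when assessing your site; it does not physically remove them, so outreach for actual removal is still worth attempting first.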

  • If anyone has successfully switched a Blogger blog from blog.site.com to site.com/blog, please let me know or point me in the direction of a good tutorial. I want to make sure I'm doing this correctly. Would also like to know if Blogger redirects those old URLs to the new site. Thanks.

    | ChaseH
    0
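One caveat on the redirect question: hosted Blogger gives you no server-level redirect control, so the old blog.site.com URLs only 301 to the new paths if that subdomain ends up pointing at a server you run. Assuming an Apache server answering blog.site.com, a sketch (hostnames hypothetical, matching the question) might be:

```apache
# Hypothetical Apache vhost for the server now answering blog.site.com,
# 301-redirecting every old post URL to the matching /blog/ path.
<VirtualHost *:80>
  ServerName blog.site.com
  RedirectMatch 301 ^/(.*)$ http://site.com/blog/$1
</VirtualHost>
```

If Blogger's and the new platform's post URL patterns differ, the regex would need per-pattern rules rather than this one-to-one mapping.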

  • Hi EGOL, Wow, thank you so very much. This is one of the best answers I've ever received, probably the best, here in Q & A. Your thoughtful comments and suggestions are so appreciated. Honestly, you gave me a check list of things that have potential to be pure gold for us if we act on them. Yes, you are correct, this is the site that had many issues with content being under tabs. It's also got a tremendous amount of duplicate and thin content issues, in addition to orphaned pages. Progress has been coming along, slowly and surely, but having your comments, and having them be so specific, pointed and concise are something I can take to my team and say "Here's an awesome check list of  things that we can actually address right now, without re-platforming the site [you know, there are always people who think that the root of all a site's problems is the platform that it's on...pure mythology]." I hope many others find your check list useful. Combined with Annie's audit spreadsheet in Google docs, I feel like I have the tools I need to go to battle and help this site fulfill its potential. Nearly every point you mentioned struck a chord. Better yet, now that I know my way around the "guts" of this homegrown CMS, I feel like I can actually make the necessary changes. Egol, I really can't thank you enough.

    | danatanseo
    1

  • Any time! It's definitely wise to err on the side of caution. If you ever want to chew the fat on SEO/digital stuff I'd be happy to help.

    | TomRayner
    1

  • Thanks to everyone for the info. The main URL they use is http://tinyurl.com/l5sl4to - however, this is just an iframe to the .biz URL. Neither URL has many links, and the links all look fine. I'm thinking it could be because of the iframe?

    | WillWatrous
    0