Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Google's guide for AJAX crawling is here. It's a considerable amount of work, though, and honestly I wouldn't recommend it. Google is getting better every day at crawling JavaScript content. You can test whether it's indexing this content by running some very specific Google searches for things rendered only by JavaScript, and seeing if your pages rank for those searches. If they do, there's a good chance that Google is reading and rendering your JavaScript properly. Even if they don't now, it's likely only a matter of time before they do.

    I don't imagine that Rogerbot has an army of engineers at Moz trying to figure out how to render JavaScript outside of a browser, so I wouldn't expect this to come to Moz crawls anytime soon. I also doubt that Rogerbot would understand what's going on in Google's guide, as I wouldn't expect many sites to have actually gone through this process.

    Why is most of your main content generated by JavaScript? It sounds like you should be rendering some of it on the page in good ol' fashioned HTML. A lot of the time this doesn't require an entire redo of a website: it can just be a matter of loading some default HTML with the page and then updating it with JavaScript, instead of rendering all of it with JavaScript (see the quick sketch below). It would be easier to tell if you share the site (if not, I understand).
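
    As a rough illustration of that default-HTML approach (the element ID and content here are hypothetical):

        <!-- the default content ships in the HTML the server sends, so crawlers can read it -->
        <div id="product-description">
          <p>Basic, crawlable product description rendered as plain HTML.</p>
        </div>
        <script>
          // JavaScript then replaces or enriches the default content client-side
          document.getElementById('product-description').innerHTML =
            '<p>Richer description rendered dynamically by JavaScript.</p>';
        </script>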

    | john4math
    0

  • Hi Robert, Thanks for taking the time to provide the valuable insight and information. A couple of points to add:

    “So that I am clear, you are currently in City A which has good ranking/traffic for city A plus keyword.” This is partially the case. What I am also seeing is that the main URL (www.url.com) is ranking in the 7-pack for many head terms, not including the geo terms, and those seem to be driving a large percentage of the traffic. So it seems there may not be much we can do about this, as that 7-pack ranking would disappear over time. (This is my main concern, because it seems to be driving most of the traffic.)

    Also, in the analytics platform I am seeing that a large majority of the searches are being made near the center of the city, very close to our current office. Since so many searches come from that central location, would it be worth maintaining another small office in the area? From reading through some of your points, it seems that even if we don't have an office in the area we can still rank a local page? But this wouldn't be in the 7-pack, hence a drop in CTR?

    Finally, I recently spoke with someone who had made a similar move. He told me he communicated directly with Google to change the address and get the reviews pushed to the new location, and that he saw a considerable drop in traffic and rankings in a short period of time, which I'm sure could be due to numerous factors. I'm just not sure if you have had a similar experience.

    Thanks for your assistance. Sam

    | PRKEL
    0

  • True, and they don't have a blog! I may be able to convince them to start one. Question: how do I organize new articles in a 2013 manner when they've already got an article section with 30 long articles? Here's what I'm talking about: http://www.nlpca.com/DCweb/nlparticles.html

    | BobGW
    0

  • Can you provide one example of each? The answer at this point is: It depends on what the order page looks like. My gut instinct is that the best solution will be "something else". But it would help to see the pages so we can assess the value and uniqueness of each (i.e. are they both worthy of being in the SERPs; do visitors need to see them both; are they both 100% unique, etc...).

    | Everett
    0

  • Hello Tesbat, It sounds like you are using a content delivery network (CDN) to serve images, which is pretty standard. However, there are some best practices that I like to follow, and if your traffic from images is really that low, it could be that you aren't yet following them.

    First, I like to have a CNAME set up so the CDN exists on your own site's subdomain (e.g. cdn.yourdomain.com) as opposed to another server (e.g. yourdomain.akamai.com); see the sketch at the end of this answer. Every CDN company and host will differ on the details of how to set this up, but if you tell them you want the images to exist on a subdomain of yoursite.com, I'm sure they can figure it out, if you don't already have it set up that way.

    Second, don't change the filenames or basic folder structure. For example, if your images followed this format (/images/image1.jpg) and you went to a CDN that either renamed the images or put them in a totally different folder structure (e.g. /images/0000123.jpg or /CDN/assets/image1.jpg), you could lose some of the momentum you built up over time, especially if those image files are not redirected. Here's a case where that happened to someone.

    Third, set up and verify the new CDN subdomain in Google and Bing Webmaster Tools, and add geotargeting for the country you are targeting in case the IP address of the CDN is in another country. Typically this is not an issue for most sites, but it is worth considering if the CDN is hosted in another country.

    Fourth, continue to optimize your images with appropriate file names and alt attributes, and with the title attribute (as you do) when linking to larger versions of the image. Don't worry about those links; I'd rather link to a bigger image than just have the thumbnail sitting there by itself. Just make sure you're actually linking to the image file, as opposed to another web page on which the larger image is embedded (e.g. image nodes in Drupal, or attachment pages in WordPress).

    Here are a few pages from Barry Schwartz worth reading: How Barry set up his Amazon S3 CDN | Does Google Like CDNs? | More from Google on CDNs and SEO
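
    And here is that CNAME sketch; the hostnames and image path are made up for illustration. The DNS record points your own subdomain at the CDN:

        ; hypothetical DNS zone entry: your subdomain is an alias for the CDN hostname
        cdn.yourdomain.com.    IN    CNAME    yourdomain.cdnprovider.net.

    The image markup then keeps the original filename and folder structure, just on the new host:

        <img src="http://cdn.yourdomain.com/images/image1.jpg" alt="a descriptive alt text">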

    | Everett
    0

  • Hello Andrew, It sounds like you're asking how you should format your on-page factors to rank for searches in which people type a specific year or a range of years. If that's the case, I think you will need to include each of the years.

    When I search Google for a date range as part of the query (not as part of an advanced search; more on that below), it shows me exactly what I typed in. So if I search for "shoes 2005-2009" I get pages for that exact date range first, with some single-year results for 2005 and/or 2009, but without any 2006, 07, or 08 results. If I search for "shoes 2006" I don't get any of those 2005-2009 results, and instead get results from only 2006, or those with 2006 as part of the listed range (e.g. 2006-2007, 2004-2006, etc.). In other words, when searched like that, Google treats the date(s) as part of the query like any other number, meaning you'd have to include that date in the on-page optimization (there's a quick example at the end of this answer).

    However, a searcher can also search within a custom date range on Google by going to Search Tools, changing "Any time" to "Custom range", and selecting the range from the calendar. This isn't going to show the pages that are "optimized" for the dates, but rather the ones that were discovered by Google on those dates.

    If you give us a little more detail about what you are trying to accomplish, what types of products you have, etc., we may be able to provide more assistance with regard to best practices surrounding date-based archives/campaigns/products vs. "evergreen" landing pages that are frequently updated. Good luck!
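
    Here's the quick example; the page and store name are hypothetical, but the point is that the year range appears in the usual on-page elements:

        <title>Shoes 2005-2009 | Example Store</title>
        <h1>Shoes 2005-2009: Our Full Archive</h1>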

    | Everett
    0

  • I agree with you: from an SEO standpoint it quickly seems like overkill. From a content marketing perspective, though, the amount of content you could produce does seem endless. In my own research I found a very helpful blog post with some ideas: http://moz.com/blog/how-to-build-a-content-marketing-strategy I think what I'm lacking is a more directed approach to the entire strategy.

    | AaronHenry
    0

  • Hi, Following some testing, you can actually use the XML layout update feature for both categories and products individually. The trick is to first remove the existing rel=canonical that is inserted when you have the canonical settings turned on in the Magento admin. So:

        <reference name="head">
            <!-- remove the canonical link Magento inserted automatically -->
            <action method="removeItem">
                <type>link_rel</type>
                <name>http://www.domain.com/oldurl</name>
            </action>
            <!-- add the canonical link you actually want -->
            <action method="addLinkRel">
                <rel>canonical</rel>
                <href>http://www.domain.com/new-url</href>
            </action>
        </reference>

    You will need to identify the canonical URL currently being inserted into the code, put it in the oldurl bit above, and then put in the new URL. You can do this for products, and also for categories via the custom design tab and then the custom layout update box. Hope that helps!

    | LynnPatchett
    0

  • Hi, This one is pretty good: http://www.noupe.com/php/htaccess-techniques.html And the boilerplate .htaccess file often gives a good starting point (and it's commented): https://github.com/h5bp/html5-boilerplate/blob/master/.htaccess For redirects, Lesley is right: they can get complicated and often need a bit of fiddling to avoid server 500 errors.
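
    For reference, the simplest kind of 301 looks like this (the paths are hypothetical); loops, where a rule ends up matching its own destination, are a common source of those 500 errors:

        # 301 redirect a single moved page (mod_alias)
        Redirect 301 /old-page.html http://www.example.com/new-page/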

    | LynnPatchett
    0

  • Thanks Lesley, I'll take a look at this and report back. Best, Dan

    | LeDanJohnson
    0

  • So, the thumbnail implementation seems fine, but there are errors in other parts of the schema markup, which may be why your video isn't getting indexed. Additionally, you may find that you can't get an embedded YouTube video indexed, since the way Google treats YouTube videos is different from other platforms: you often won't get a rich snippet for your page, and the YouTube.com version of your video may end up outranking you and claiming the video snippet. So, firstly, I recommend not using YouTube for this; then you need to ensure you're implementing the rest of the Schema.org markup correctly (a bare-bones example is below). Hope that's useful!
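
    Here's that bare-bones schema.org/VideoObject block, with placeholder URLs and values; every itemprop shown is a standard VideoObject property:

        <div itemscope itemtype="http://schema.org/VideoObject">
          <meta itemprop="name" content="My Video Title">
          <meta itemprop="description" content="A short description of the video.">
          <meta itemprop="thumbnailUrl" content="http://www.example.com/thumbnail.jpg">
          <meta itemprop="contentUrl" content="http://www.example.com/video.mp4">
          <meta itemprop="uploadDate" content="2013-05-01">
          <!-- ISO 8601 duration: 1 minute 33 seconds -->
          <meta itemprop="duration" content="PT1M33S">
        </div>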

    | PhilNottingham
    0

  • If your redirect is a 301, there is no need to change anything; the link juice will be passed through. To be formally correct and to tidy things up a little, it would be better to 301 http://example.net to http://example.net/

    | dotfly
    0

  • That logic seems incorrect since it doesn't account for root domain links that point to the subdomain. This would only apply if all the inbound links were from other root domains. For example, my blog on the subdomain has 1.4M inbound links, 1.35M of which come from the root domain. I'm guessing this is because it's a footer link. So, the PR6 of the blog seems largely inherited from the root domain, which has a PR of 6. Were you just trying to oversimplify it?

    | brad-causes
    0

  • Hi, CleverPhD has some interesting ideas with robots.txt and Google Webmaster Tools, but simply password protecting all dev pages should keep them out of Google's index. There's no single best practice here, since a password wall will keep Googlebot out on its own. To be doubly safe, you can also include a meta noindex tag on dev pages (see the sketch below). Keep in mind that once a page is in Google's index, it's going to take a while for it to leave (unless you use CleverPhD's method). But having a blank page in Google's index really isn't all that bad: it's there, but it won't rank for much. Hope this helps, Kristina
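
    Here's that sketch of both layers; the .htpasswd path is hypothetical:

        # .htaccess on the dev site: HTTP Basic Auth keeps crawlers (and everyone else) out
        AuthType Basic
        AuthName "Development Site"
        AuthUserFile /path/to/.htpasswd
        Require valid-user

    And the belt-and-braces tag in each dev page's <head>:

        <meta name="robots" content="noindex">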

    | KristinaKledzik
    0

  • Thanks for the link Guillermo. Dr. Peter's response helped out. I guess I'll keep an eye on it over the next couple of months, and if it doesn't improve, I'll do some 301 redirects. Thanks guys.

    | farmiloe
    0

  • Hi Steve, htaccess can be tricky! That link above is a good overview of the options. In your case, I think you want to make sure the specific individual page URL rewrites come before the generic domain rewrite, so they are triggered first and the generic redirect only fires if none of the individual page URLs above it are hit.
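
    A rough sketch of that ordering (all URLs hypothetical), using mod_rewrite, where the L flag stops processing at the first rule that matches:

        RewriteEngine On
        # specific page redirects come first...
        RewriteRule ^old-page\.html$ http://www.newdomain.com/new-page/ [R=301,L]
        RewriteRule ^old-about\.html$ http://www.newdomain.com/about/ [R=301,L]
        # ...so the generic catch-all only fires if nothing above matched
        RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]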

    | LynnPatchett
    0

  • No, as long as none of the links are broken, I don't see how this would have a negative effect on SEO. If the user experience doesn't change, it generally won't affect rankings (minus markup data).

    | OlegKorneitchouk
    0

  • The keyword in your domain won't really add much through the anchor text itself, if that's what you're inferring. The link's placement and relevance are the more important factors, always. And even if it did, the last thing you want is a keyword-heavy anchor text profile. That's the short version of the way I see it.

    Anyway, I guess I still don't understand what you're asking in this topic. In the original post you mentioned that the "franchisor will not allow the keyword-domain.com to be their primary domain." But then you said people will be landing on keyword-domain.com? So is it the other way around: the franchise-name.com will be redirecting to keyword-domain.com? If this is the case, have at it! What I'm saying is that I advise against building links to the domain that's being redirected; instead, all links should be built to the final destination. But it sounds like that's what you are doing. Forgive my confusion here, I've had a stuffed head all week!

    Yeah, I don't love exact match domains, and I can all but guarantee their effectiveness will be going away; Google has addressed this multiple times publicly and made no effort to conceal the fact. Nonetheless, I think the most important factor is the quality of the site's content and the authority/relevance of the links coming in. I wish you luck!

    | jesse-landry
    0