Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!
Site not showing data in Open Site Explorer
Thanks for your help. They have each been online for at least six months and both have been crawled.
Moz Tools | SocialKwan
How to associate landing pages to specific key phrases on my report cards
Generally speaking, index pages have a lot more trust and link juice than sub-pages. That's why part of link building is making sure you have some quality links pointing to second- and third-tier pages as well as the home page. For example, if your home page is about "Basket Weaving" and a sub-page is about "Underwater Basket Weaving," you need to make sure the "Underwater" page has a number of relevant links; otherwise, your "Basket Weaving" page is going to trump it. Can you PM me your specific URL and the keyword examples? I can give you a more specific answer then.
Moz Pro | GeorgeDavis
Very well established blog, new posts now being indexed very late
The robots.txt file is designed to completely block content. Normally, if your robots.txt file were a factor, your content would not appear in SERPs at all. It is possible for blocked content to appear in SERPs if it is linked from other sources, but since this is new content, that is unlikely unless you are sharing links immediately and Google is discovering those links within the time frame you described.

The first place I would look is your sitemap, or whatever tool is used to inform Google that you have new content. When you publish a new blog article, your software should ping Google to announce the new content. That is where any investigation should begin. The next step is to check your server logs to see how long Google takes to respond to the alert. If it takes them 12 hours, then there is nothing further you can do about it.

I would be interested in a lot more detail. How many articles have you confirmed as being affected by this issue? Exactly how did you confirm it?

As a side note, your robots.txt file is bloated and doesn't adhere to any standard I have seen. How exactly was it created? Did someone go in and make manual modifications to the file?
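If you want to rule robots.txt out quickly, you can test whether a given URL is actually blocked using Python's standard library. This is only an illustrative sketch: the rules and URLs below are placeholders, not your site's actual file.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; for a live site, use
# parser.set_url("https://example.com/robots.txt") and parser.read() instead.
rules = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether Googlebot is allowed to fetch specific paths
print(parser.can_fetch("Googlebot", "https://example.com/blog/new-post"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))   # False
```

If your new posts come back as fetchable, robots.txt is not the culprit and the sitemap/ping path is the next thing to check.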
Technical SEO Issues | RyanKent
When do I see the new Linkscape?
The new Linkscape index is now live! We hit a small snafu rolling it out. Sorry for the mix-up.
Moz Pro | Cyrus-Shepard
Help with canonical tag
Yes, as long as the "www" is included in the URL of the canonical tag.
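A minimal sketch of what that looks like, assuming www.example.com is the preferred version of the page:

```html
<head>
  <link rel="canonical" href="http://www.example.com/page.html">
</head>
```

The key point is that the canonical URL must match the version you want indexed exactly, including the "www" prefix.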
Intermediate & Advanced SEO | RyanKent
What's the best research tool for measuring blogger outreach success?
Thanks EGOL. I agree social sharing features make sense. But we don't have our own blog yet - my question is about finding the best tool for tracking the number and quality of inbound links that result from our blogger outreach efforts.
Moz Tools | MJOshea
I have a page where you can download a PDF of the material - should I exclude the PDF from the search engines?
Thank you! This is exactly the kind of information I needed! I was thinking of contacting the webmasters who published the original article to tell them about mine. But now, perhaps what I will do is not just contact them but also attach a copy of the PDF for them to use.
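If you do decide to keep the PDF version out of the index, one common approach is an X-Robots-Tag response header. This is a sketch assuming an Apache server with mod_headers enabled; the file pattern is a placeholder:

```apache
# Keep PDF copies out of search results while leaving the HTML page indexable
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Unlike robots.txt, this lets crawlers fetch the PDF but tells them not to index it, so the HTML page remains the only indexed copy.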
Content & Blogging | MarieHaynes
SEO Correlation Between Code and Search Engine Rankings
I have a lot of thoughts on this subject. If I were to write a blog entry on this topic, it would span multiple pages or have to be broken down into sub-topics. I do think there is a correlation between good code and search engine rankings. I do not think there is a correlation between a W3C-validated page and search engine rankings. The validator is neither current enough nor flexible enough to accommodate the real-world situations websites encounter.

Example A: HTML5 is recognized by all major browsers, but W3C validation of HTML5 is still experimental. A specific example that applies to SEO is the canonical tag. According to the W3C validation site, the canonical tag is not currently valid. Take a snippet of HTML5 code which passes validation, add a canonical link tag to the head, and the code will no longer pass validation. This is a direct conflict between best practices and validation.

Example B: The world's most popular web page, google.com, does not pass validation. Matt Cutts discussed the topic: in short, Google had a choice between providing code which validated and code which worked, and they chose the working code.

Example C: The standard Facebook widget code, YouTube embed code, and other popular snippets do not pass validation. Whenever I design a website, I check the code in the validator to look for errors. Initially, I will find numerous errors in code outside of my control, such as social sharing widgets or YouTube videos; once I remove that code, the page often validates. I have researched the issue, and it is possible to modify the Facebook or YouTube code so that it still functions and passes validation. Doing so requires extra effort, provides zero benefit other than being able to say "hey, I pass validation," and often has drawbacks, such as adding extra JavaScript to your site that can otherwise be viewed as unnecessary code.
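To make Example A concrete, here is a minimal HTML5 document of the kind described: it renders fine in every major browser, and the canonical link is an SEO best practice, yet per the answer above the validator of the time would flag it. The URL and title are placeholders.

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Underwater Basket Weaving</title>
  <!-- Best-practice tag that, per the answer, failed W3C validation at the time -->
  <link rel="canonical" href="http://www.example.com/underwater-basket-weaving/">
</head>
<body>
  <p>Page content…</p>
</body>
</html>
```

This is the core of the argument: "valid per the W3C" and "correct for search engines" are different tests, and when they conflict, the working code wins.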
Intermediate & Advanced SEO | RyanKent