Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Rel="canonical" for PDFs?
I stand corrected on that point. Thank you, Jassy, for sharing the link. I was not aware Google had made that change.
| RyanKent0 -
Moving domain.com to subdomain.domain.com
I own the SERP for my brand name, and I also have a lot of mini-domains like brandtrends.com. My goal is to increase the authority of my main domain and make it more reputable; we write on different niches, we have journalists, etc. The content is completely different and completely white hat. I'm thinking maybe I can host brandtrends.com as trends.brand.com, under the brand.com domain, which holds the top 5 spots in the SERP. What do you think, Davis? Is it better to do this or not? My mini-domains each have a PA of 40 minimum, and I think having them on subdomains would be better. Tell me your opinion. Thanks!
| leadsprofi0 -
Toolbar says 33 PA & DA, does Google have the same values?
Theoretically, yes. The 301 should tell search engines that the site has moved, so all factors previously attributed to Site A should now instead be attributed to Site B (with a minor loss). Although it's already done, you could double-check that you followed the best practices for moving a site laid out here. If you're waiting to see toolbar PR, though, you'll probably be waiting a while to see any updates. Is the site ranking for anything the old site used to? If so, you should be fine; if not, double-check the redirect and see if there's anything else you can do.
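For reference, a site-wide 301 of the kind described is usually a couple of lines of server config. Here is a minimal Apache sketch, assuming the old site runs Apache and using placeholder domain names (old-site.com / new-site.com are illustrative, not from the thread):

```apache
# Hypothetical site-wide 301: send every request on the old domain
# to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-site.com/$1 [R=301,L]
```

The per-path capture (`$1`) matters: redirecting every old URL to the new homepage, instead of to its equivalent page, loses most of the page-level signals the 301 is supposed to carry over.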
| StalkerB0 -
Duplicate Page Content and Title for product pages. Is there a way to fix it?
When you say "Google Webmaster Tools," is it inside the SEOmoz Pro dashboard? Or do I need to create a Google Webmaster account?

If you have a website, you should definitely have a Google WMT account. Go to www.google.com/webmasters and register your site. It is a must for anyone who cares about SEO for their site and related analytics.

For the SEOmoz report, I asked that same question of SEOmoz help. The reply I received was: "It sounds like you made a change to the way your site is indexed by Google directly in Webmaster Tools. Since we don't have access to GWT, we can't account for any changes that you've made in there, unfortunately. Furthermore, these issues are still present for crawlers other than Googlebot, so if you want to make sure those fixes are indexed by Bing/Yahoo, make sure to submit the fixes to their index tools or make the changes directly on your site. On the other hand, if you don't care about those search engines, feel free to just ignore our warnings. Our software is over-inclusive of errors so that you can choose which errors/warnings/issues to act on and how. We'd rather give you too much information than too little!"

With the ?pg=2 parameter, is there a way for Google not to treat it as two URLs with the same title and at the same time make both appear in search results?

In Google WMT you can set the Parameter Action to "Don't Ignore". You would then need to fix the duplicate title issue on your site. I am not advising this approach, but merely responding to your question of how to do it.
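On the duplicate-title half of the question, one common approach is to give each paginated page a unique title and add rel="prev"/rel="next" hints in the head. A sketch with illustrative URLs (the /widgets path is my own example, not from the thread):

```html
<!-- Hypothetical <head> of http://www.example.com/widgets?pg=2 -->
<title>Widgets - Page 2 | Example Store</title>
<link rel="prev" href="http://www.example.com/widgets?pg=1">
<link rel="next" href="http://www.example.com/widgets?pg=3">
```

This keeps each page indexable in its own right while telling search engines the pages form a series, rather than collapsing them with a canonical tag.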
| RyanKent0 -
Meaning of "agnostic"?
Hi Andrew, you can read the answer here in the new search ranking factors: http://www.seomoz.org/article/search-ranking-factors#metrics-2 — it's explained pretty well there.
| petrakraft0 -
Is there any issue with using the same structured data property multiple times on the same page?
I am basing my response on an example offered on the schema.org site. In that example, the "range" itemprop was set multiple times for the same item. It seems that is how the author situation would be handled. With respect to the ISBN, I would assign both. This is a new system and the above are my best guesses based on what I have seen. I am certain these questions will be addressed in detail quite soon. If I were coding today, the above is how I would proceed. If it is later found to be incorrect, it should be easy to remove the extra line of code.
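To make the "repeat the itemprop" suggestion concrete, here is a hypothetical microdata sketch (the book, authors, and ISBNs are all made up for illustration):

```html
<!-- Hypothetical schema.org/Book markup: the same itemprop
     is simply declared once per value. -->
<div itemscope itemtype="http://schema.org/Book">
  <span itemprop="name">An Example Title</span>
  by <span itemprop="author">Author One</span>
  and <span itemprop="author">Author Two</span>
  <meta itemprop="isbn" content="978-0-00-000000-1">
  <meta itemprop="isbn" content="978-0-00-000000-2">
</div>
```

Nothing in the microdata syntax forbids repeating a property; consumers are expected to treat the repeated values as a list.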
| RyanKent0 -
Very well established blog, new posts now being indexed very late
The robots.txt file is designed to completely block content. Normally, if your robots.txt file were a factor, your content would not appear in SERPs at all. It is possible for content to appear in SERPs even though it is blocked by robots.txt if it is linked from other sources, but since this is new content, that is less likely unless you are immediately sharing links and Google is seeing those links within the time frame you shared.

The first place I would look is your sitemap, or whatever tool is used to inform Google that you have new content. When you publish a new blog article, your software should ping Google and inform them there is new content. That is where any investigation should begin. The next step is to check server logs to see how long it takes Google to respond to the alert. If it takes them 12 hours, then there is nothing further you can do about it.

I would be interested in a lot more detail. How many articles have you confirmed as being affected by this issue? Exactly how did you confirm the issue?

As a side note, your robots.txt file is bloated and doesn't adhere to any standards I have seen. How exactly was it created? Did someone go in and make manual modifications to the file?
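To make the server-log check concrete, here is a minimal sketch that finds when Googlebot first requested a new post. It assumes Apache's combined log format; the path and helper name are my own, not from the thread:

```python
import re
from datetime import datetime

# Roughly matches Apache combined-log lines, capturing the timestamp,
# request path, and user-agent string.
LOG_RE = re.compile(
    r'\[(?P<ts>[^\]]+)\] "GET (?P<path>\S+) HTTP[^"]*" \d+ \d+ "[^"]*" "(?P<ua>[^"]*)"'
)

def first_googlebot_hit(log_lines, path):
    """Return the datetime of Googlebot's first GET of `path`, or None."""
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        if m.group("path") == path and "Googlebot" in m.group("ua"):
            # Apache's default timestamp, e.g. 10/Oct/2011:13:55:36 +0000
            return datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
    return None
```

Compare that timestamp against the post's publish time: if Googlebot fetched the post quickly but it still indexed late, the delay is on Google's side and the ping/sitemap setup is probably fine.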
| RyanKent0 -
Domain Authority and Page Rank concerns when using CNAME
Thank you, Daniel. I know it's been a while, but I'm just now getting back to this. You pretty much substantiated what I thought to be true for the first part. I have heard of some 301 workarounds when going from something like Blogger to a self-hosted version, and I do believe that WordPress.com has a paid option to redirect all traffic if you move to a self-hosted blog. Thanks again!
| WilliamBay0 -
Could Having Blog Posts as Home Page Cause Keyword Dilution?
Thanks for that EGOL. That does make sense.
| WilliamBay0 -
Best SEO strategy for a site that has been down
What most likely happened: your site was offline, search bots crawled it and found nothing, receiving a 404 error or, worse, a 500-level error. There is a good chance your site was crawled multiple times, so the search bots deemed it gone and your rankings fell.

The first and foremost thing you should do (and I assume you have) is restore your website. Being offline for only a few days should not have a long-term effect on your rankings. It will simply take time for the search bots to recrawl all the links you have built and determine that there is, in fact, a website on the other end of them again. You should check Webmaster Tools for any errors and possibly ping your sitemap against the engines. Otherwise, I think it is best to carry on as normal; the rankings should return within a few weeks.

If I may be so bold, I would suggest looking into a different hosting company. Unless it was billing related, there is no excuse for going down for more than a couple of hours in today's market.
| knichol0 -
301 redirect from domain to home.aspx
I'll take a stab. You can use a 301 redirect to enforce your canonical URL. In this case, it may be that the site decided /home.aspx is the canonical and wanted to be sure any links out there go to it. This is a strong signal to the search engine that if it has both www.domain.com and www.domain.com/home.aspx indexed, /home.aspx is the URL that should get the juice. Now, that assumes this was done on purpose and not just some silly mistake.

I've seen sites with four URLs for the main page:

- www.domain.com
- www.domain.com/home.aspx
- domain.com
- domain.com/home.aspx

You can read up on Google's guidance, but one way to fix this problem is to 301 three of the four to the canonical URL.
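Since the site in question runs .aspx pages, it is probably on IIS, where this kind of canonical 301 is typically done with the URL Rewrite module in web.config. A hedged sketch (rule name, domain, and chosen canonical are all illustrative, and the block belongs inside `<system.webServer>`):

```xml
<!-- Hypothetical IIS URL Rewrite rule: 301 requests for the bare
     domain root to the canonical www.example.com/home.aspx. -->
<rewrite>
  <rules>
    <rule name="CanonicalHome" stopProcessing="true">
      <match url="^$" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^(www\.)?example\.com$" />
      </conditions>
      <action type="Redirect" url="http://www.example.com/home.aspx"
              redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

Whichever of the four URLs you pick as canonical, the key is to be consistent: the other three should all 301 to it, and internal links should point at it directly.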
| PhiloMedia0 -
How valuable or not is javascript linkback from a competitor?
Contrary to Theo, I'd say: A. Positive. I'm pretty sure Google has been following plain JS links for many years now. I'd be reasonably confident that even if you did something tricky like href="javascript:CleverFunction('http://example.com')", where CleverFunction was designed to add nofollow or open in a new window, Google would still be able to follow it from the URL within the href. You could quite easily set up a test with a page that only has a link like this pointing to it and see if it gets indexed. The only thing I'm not totally sure of is whether it passes juice, but I think as of mid-2009 it does (since I'm sure Google no longer endorses JS links as a method for preserving link juice).
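The suggested test is easy to set up. A minimal sketch, assuming a throwaway target page (/js-link-target and CleverFunction are hypothetical names):

```html
<!-- Hypothetical test page: the ONLY link anywhere to /js-link-target
     is inside this javascript: href. If /js-link-target later shows up
     in Google's index, the JS link was followed. -->
<a href="javascript:CleverFunction('http://example.com/js-link-target')">target</a>
<script>
  function CleverFunction(url) {
    window.open(url); // opens in a new window; no rel attribute involved
  }
</script>
```

The important part is isolation: the target URL must not be linked from anywhere else (including your sitemap), or the test tells you nothing about the JS link itself.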
| StalkerB0 -
No Following Existing Non-SEO Pages A Good Idea?
I block my Terms of Use and Privacy pages with robots.txt. I use a tool on my site which adds any search terms a user entered into a search engine to reach the page: "Users found this page by searching for..." and then the box fills up with search terms when the page is reached from SERPs. I noticed my Terms and Privacy pages were getting hits. I don't want those pages competing with the rest of my site's content in Google. If you allow those pages to be indexed, you run the risk of them appearing instead of a more relevant page. This can cause searchers to skip over your result, or click on the result and bounce off your site.
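The robots.txt for this is only a few lines. A sketch with illustrative paths (your actual URLs will differ):

```text
# Hypothetical robots.txt: keep utility pages out of crawlers' reach
User-agent: *
Disallow: /terms-of-use
Disallow: /privacy-policy
```

One caveat, consistent with what's noted elsewhere in this thread: robots.txt blocks crawling, not indexing, so a blocked URL can still appear in SERPs if it's linked from other sources. If you need the pages fully out of the index, a meta robots noindex tag on the pages themselves is the more reliable option.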
| RyanKent0 -
Geolocation has changed targeted domains, or has it?
Thank you, I will try it and let you know.
| Vovia0 -
How to setup tumblr blog.site.com to give juice to site.com
I looked at WordPress, but I am having trouble getting it to play nicely with MVC.
| oznappies0 -
Crawl Tool Producing Random URLs
The same thing is happening for one of my campaigns, specifically for a 302 redirect to the homepage. My guess is I need to update it to a 301, but I'm not 100% sure whether that would solve the issue.
| kchandler0