Posts made by DeptAgency
-
RE: Too many on page links
Since creating more than one link to the same URL on a single page does not increase the link juice passed to that URL, I reckon crawlers understand that these are identical links. This isn't easy to verify, though...
-
RE: Big site SEO: To maintain html sitemaps, or scrap them in the era of xml?
Although users might not go to the sitemap very often, it is usually a very easy way to make sure some link juice is passed to all pages. Especially if the sitemap is linked to from a lot of pages, it usually has quite a bit of juice to pass on. However, the sitemap should never be the only way you link to deeper pages.
-
RE: What's the best way to tackle duplicate pages in a blog?
Usually, creating a 301 redirect in your .htaccess file is an effective way of dealing with duplicate content. You could use a free redirect generator if you're not too familiar with writing .htaccess files.
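For example, a minimal sketch of such a rule (assuming an Apache server; the URLs are made up for illustration and should be replaced with your own duplicate and canonical URLs):

# Example only: permanently redirect one duplicate URL to its canonical version
Redirect 301 /blog/old-duplicate-post/ http://www.example.com/blog/canonical-post/

A rule like this sends both visitors and crawlers to the canonical URL, so the duplicate no longer competes with it.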
-
RE: Can the template increase the loading time of the site?
Google PageSpeed provides very helpful information on which factors are important for your site. It determines what is slowing your pages down and what you can do to fix it, and it also distinguishes between various degrees of importance. You can check out the speed test at: https://developers.google.com/pagespeed/.
Good luck!
-
RE: SEO backlink tracking
Google isn't too eager to show information about links. For example, you could search in Google using the 'link:' operator, but the list it returns is far from complete.
'link:www.seomoz.org' returns about 1,700 results, but I think we all know they have quite a few more than 1,700 backlinks. Google Webmaster Tools doesn't show a complete overview either, but it's better than using the search engine.
I'd say the best you can do is keep a close watch on analytics and see where traffic is coming from.
Obviously you can use third-party tools like opensiteexplorer.org and Majestic for tracking links to a website. Usually Google picks up on links well before they do (in my experience).
Good Luck!
-
RE: How to handle Not found Crawl errors?
Hi Ben,
I agree with you that some links are not worth redirecting. However, in my experience a dead link never comes alone. Often there is some reason the link was created, and there might be others you don't know about.
For this reason I usually recommend redirecting all broken links, even if an individual link doesn't seem worth the effort. Obviously there are exceptions to this rule, but most of the time it pays off.
Sven
-
RE: Http://www.xxxx.com does not re-direct to http://xxx.com
I'm pretty sure it doesn't redirect twice. I double-checked it with httpFox; it only redirects once.
If you want to see for yourself:
(https://addons.mozilla.org/en-US/firefox/addon/httpfox/)
Good luck with your website!
-
RE: Http://www.xxxx.com does not re-direct to http://xxx.com
As far as I can see the redirect points to http://earthsaverequipment.com/, so multiple redirects are probably not the issue. Very strange though!
-
RE: Http://www.xxxx.com does not re-direct to http://xxx.com
I do not get a 404 error when I go to http://www.earthsaverequipment.com. Also, a redirect is not handled through your robots.txt; that file is used to specify which locations a robot is allowed to enter.
You can use http://web-sniffer.net/ to see which headers a URL returns, and you will see that your website returns a 301 redirect, as it should.
The robots.txt file can also be used to specify the location of your sitemap, which yours currently doesn't do. I suggest you add this information.
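For reference, a robots.txt along these lines would do it (a minimal sketch; the sitemap URL is hypothetical and should point to wherever your XML sitemap actually lives):

User-agent: *
Disallow:
# The Sitemap line tells crawlers where to find your XML sitemap
Sitemap: http://www.example.com/sitemap.xml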
-
RE: Managing international sites, best practises
Recently there was a post published on the YouMoz blog which is worth reading. I think the main idea of that post is that the best strategy for international SEO depends heavily on the company and the resources it is willing to invest.
The main distinction is made between:
subfolders of one international domain (.com/us/, .com/au/, etc.):
This is the easiest and cheapest way to maintain different country versions, but it also makes it hardest to rank the right page in the right country.
separate country-specific domains (.co.uk, .au, .com):
If there are resources available for maintaining separate websites, building domain authority and creating content for all of them, this would be the best option.
subdomains (au.blindbolt.com etc.):
This is the middle way: harder to maintain than subfolders, but not necessarily more expensive.
I'd advise you to read the post carefully and also to study up on best practices in international SEO, since a lot of people have difficulty ranking the right pages in the right countries.
-
RE: Reporting Low internal links to Homepage
Well, at first I figured the internal links were simply not indexed, which would mean Open Site Explorer would not show info for pages low in your hierarchy. When I checked, it did, so I was initially wrong. However, I also noticed a lot of external links from etoyszone.co.uk, so I checked that site, only to find myself redirected to your current site.
I think this problem will solve itself over time, although the deeper pages will probably not be crawled very often, so it may take a while. That said, I would always advise double-checking with crawler software (like Xenu) to map your internal link structure.
Cheers!
-
RE: Reporting Low internal links to Homepage
It seems that Open Site Explorer still has an old domain in its index (http://www.etoyszone.co.uk/), which currently redirects to your new page, and it still counts these links as external.
Also, you could use Xenu to crawl your website and see whether you are linking to (redirected) duplicates of your homepage: http://download.cnet.com/Xenu-s-Link-Sleuth/3000-10248_4-10020826.html
By the way, you might want to use your HTML head for more than just the title. For example, you currently specify the character set in the body, while it belongs in the head.
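As a rough sketch of what I mean (the title text is just a placeholder), the charset declaration would move into the head like this:

<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>Your page title</title>
</head>

That way browsers and crawlers know the encoding before they start parsing the body.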
Good Luck,
Sven
-
RE: Redirecting users based on location
Hi SaraSEO,
I don't think redirecting visitors based on country is wise, for the following reasons:
- Search engine crawlers are not necessarily located in the country they crawl for, and might therefore not be able to reach all language versions.
- Redirecting users but not crawlers could be considered cloaking.
- There might be German-speaking people in Sweden who get very annoyed at not being able to see the German version.
- Google explicitly advises against this:
"Make sure each language version is easily discoverable
Keep the content for each language on separate URLs. Don’t use cookies to show translated versions of the page. Consider cross-linking each language version of a page. That way, a French user who lands on the German version of your page can get to the right language version with a single click.
Avoid automatic redirection based on the user’s perceived language. These redirections could prevent users (and search engines) from viewing all the versions of your site."
source: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192&topic=2370587&ctx=topic
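As an illustration of that cross-linking advice (this goes a step beyond the quoted text: rel="alternate" hreflang annotations are one common way to do it, and the URLs here are purely hypothetical):

<link rel="alternate" hreflang="de" href="http://www.example.com/de/" />
<link rel="alternate" hreflang="sv" href="http://www.example.com/sv/" />
<!-- plus visible links on the page itself, so a user who lands on the wrong version can switch with one click -->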
Greets,
Sven
-
RE: Universal Local Results & Venice Update - What's what?
Hi Harald,
Thanks for your reply!
Both the blog posts you mention refer to 'standard organic results' being shown due to 'geographical factors'. However, before the Venice update the term 'Local Universal Results' was quite often used to describe Places results in the standard SERP. Examples: http://blumenthals.com/blog/, http://flashious.com/how-to/how-can-an-out-of-town-company-compete-with-local-competition/
I guess we can thus say that the SEO community uses this term for both Google Places results and standard organic results that are shown due to geographical factors.
Can I safely assume that Google shares our interpretation of this term? Alternatively, if 'Local Universal Results' were synonymous with 'Google Places results', then the Venice update would only impact queries that trigger Places results.
-
Universal Local Results & Venice Update - What's what?
There is some buzz in the community regarding the Venice Update of the Google Algorithm. Google has said the following about this:
"Improvements to ranking for local search results. [launch codename “Venice”] This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal."
Source: http://insidesearch.blogspot.com/2012/02/search-quality-highlights-40-changes.html
A lot of blog posts refer to this update and discuss the effect of the location of the googler on the ranking of "standard/traditional" results. My question is:
Does the term 'Local Universal Results' include "standard/traditional" results or is it used by Google to refer to 'Google Places' Results in the standard SERP?
So, are the results that are impacted by Venice recognizable as 'Local Results' (by the teardrop, Google Maps, and so on), or does it influence 'standard/traditional' results as well, beyond simply pushing them down by inserting Google Places results into the SERP?
Thanks a lot!
-
RE: Run a batch of keywords 1 by 1 through Adwords Keyword tool
No, but I will close this question anyway

-
RE: How to get a quick idea of competition for large numbers of keywords
I found Market Samurai to be fairly useful for getting competition data on a few thousand keywords. However, it is not as in-depth as the SEOmoz Keyword Difficulty tool.