Best posts made by HiveDigitalInc
-
RE: Any opinions about the common anchor text?
You are going to see very little benefit from optimized anchor text on internal links. Use the anchor text that will most encourage your users to keep clicking through. This doesn't necessarily mean "read more", though. Think about calls to action in anchor text that are more compelling.
-
RE: International SEO - auto geo-targetting
Perhaps I have misread, but what is the problem with doing...
brand.ccTLD/productA/
Where all ccTLDs point to the same server, and the only difference between versions is that, when languages differ, the site pulls from a separate database table and language file based on the ccTLD. This would allow you to keep just one server, still have keyword-optimized content, etc.
You wouldn't really be able to build off of the domain authority, but separating into subdomains would essentially segregate the authority as well.
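Something like this minimal sketch is all the application-side logic you'd need (the hostnames and table names here are hypothetical, purely for illustration):
# Pick the language table based on the ccTLD the visitor requested.
LANGUAGE_TABLES = {
    "brand.de": "strings_de",
    "brand.fr": "strings_fr",
    "brand.com": "strings_en",  # default / fallback
}

def table_for_host(host: str) -> str:
    return LANGUAGE_TABLES.get(host.lower(), LANGUAGE_TABLES["brand.com"])

print(table_for_host("brand.de"))  # -> strings_de
print(table_for_host("brand.it"))  # unknown ccTLD falls back to strings_en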
-
RE: Is Remove Em A Fantastic con?
Hi Steve,
I am Russ Jones, an active user here at SEOMoz and my company, Virante, owns Remove 'em. I would be happy to explain to you the issue you are discussing.
When you run the tool on the home page, which gives an estimate of the number of links you have to remove, it is looking at the total number of links. This means that if you bought one sitewide link and it was discovered by our predictor tool, every page on which that link appears would be counted.
However, when you run the full Remove'em tool, we find the unique linking domains, specifically because you only need to contact the webmaster once to get all of the links removed. The sitewide link I mentioned above would show up only once, rather than perhaps hundreds of times in the predictor.
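To illustrate the difference with a toy example (the URLs are made up):
from urllib.parse import urlparse

# One sitewide link shows up once per page it appears on.
backlinks = [
    "http://spammy-site.example/page1",
    "http://spammy-site.example/page2",
    "http://another-site.example/post",
]

total_links = len(backlinks)                              # what the predictor counts
unique_domains = {urlparse(u).netloc for u in backlinks}  # what the full tool works from
print(total_links, len(unique_domains))  # 3 links, but only 2 webmasters to contact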
We have had discussions internally about whether we should change the predictor tool to reflect unique linking domains, but unique linking domains reflect more the number of webmasters you may need to contact and not necessarily the breadth of the penalty you face. We will certainly take this example into consideration in the future.
Thanks again for using Remove 'em!
-
RE: How does the number of obls on a page affect link juice?
If we go by the original PageRank algorithm, then you can essentially divide the PageRank of the page (or MozRank or ACRank) by the number of links on the page (external or internal) and that is the amount of PR or mR passed by each link.
However, there are probably several modifiers now that have been added to tweak the flow of PR in terms of rankings...
1. Where does the link occur on the page in relation to other links (i.e., is it the first or last link on the page)?
2. Is this the only link to your domain on the page, or are there others?
You do not want to use DA or PA as a measurement of link quality, as they are machine-generated scores created by SEOMoz to indicate the rankability of a page or domain. Instead, you should be looking at the MozRank and MozTrust of the page your link is on.
Moreover, when using MozRank / MozTrust, you can't simply divide it by the number of links. You would need to first convert it to raw MozRank or raw MozTrust (a logarithmic transformation), then divide by the number of links, then convert back into "pretty" MozRank or MozTrust.
This is because mR and mT (just like PageRank) are logarithmic scales from 0 to 10. Each integer increase actually represents about an 8x improvement. Having one link from a PR8 page that has 7 other links (8 total) is not PR8 divided by 8 = PR1; it is actually about a PR7-level link!
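Here is that math as a quick sketch, assuming the roughly 8x-per-integer figure above (the real base isn't public):
import math

BASE = 8  # assumed ~8x improvement per integer step on the 0-10 scale

def value_passed_per_link(pretty_score, total_links):
    # Convert to the raw scale, split across links, convert back.
    raw = BASE ** pretty_score
    return math.log(raw / total_links, BASE)

print(value_passed_per_link(8, 8))  # 7.0 -- a PR8 page with 8 links passes ~PR7 per link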
-
RE: Considering redirecting my site from .com/us to just .com. What could the possible SERP consequences?
The biggest concern would be potentially losing some of the PageRank passed. However, in my experience, these kinds of internal redirects tend to go off without a hitch if well executed.
1. Make sure that you have a good understanding of all URLs that are to be redirected and that each redirect actually fires correctly (see the sketch after this list).
2. Watch analytics, GWT, and server logs like a hawk for the first 48 hours after the redirect to make sure you aren't missing anything.
3. Consider doing Sitemap Assisted Redirects (Google it) to speed up the process.
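For step 1, a quick script along these lines can verify every redirect before and after launch (the URL map is hypothetical; you'd export your own):
import requests

redirect_map = {
    "https://example.com/us/about": "https://example.com/about",
    "https://example.com/us/contact": "https://example.com/contact",
}

for old, expected in redirect_map.items():
    r = requests.get(old, allow_redirects=False, timeout=10)
    ok = r.status_code == 301 and r.headers.get("Location") == expected
    print(old, "->", r.status_code, r.headers.get("Location", ""), "OK" if ok else "CHECK")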
-
RE: Creating Duplicate Content on Shopping Sites
Yep, it is definitely best practice to come up with unique titles and descriptions for the shopping sites. If you have a large number of products, you might want to use a service like TextBroker or ContentWriters.us to handle the work for you.
-
RE: Google results inverted in a different country
There are myriad factors involved in geographical search engine rankings, including links from ccTLDs (other .uk or .au domains), localized language usage, settings in Google Webmaster Tools, etc.
Your competitor simply has better metrics for .au than you do. Go get some clean .au links.
-
RE: LinkDetox Versus Removeem??
TAGFEE: My company owns Remove'em and I helped build the bad link detection algorithm for it.
I would say that Link Detox and Remove'em are very different. Link Detox, from my estimation, focuses very much on perfecting the bad link detection mechanism. In fact, many Remove'em users upload Link Detox lists to Remove'em just to use our outreach functionality.
Remove'em intends instead to be a tool that covers everything from "Discovery to Recovery". We don't disclose the same granular level of data on why a link is identified as concerning; instead, our focus is sorting and flagging the links in a way that most efficiently allows you to do link removal outreach and build a disavow file. We get contact info for you, manage emails, track links for removal status, and even build a progress page so that you can show Google your work, so to speak, in a reconsideration request. We deliberately have a bias towards action.
That being said, from all I have heard, Link Detox is an excellent product and, as mentioned, many of our customers use Link Detox as the scalpel, so to speak, in identifying bad links, and then upload them to Remove'em for managing the link removal process.
Hope that helps!
-
RE: Making deep links to your page on BBB or DMOZ - Effective or waste of time?
Google crawls based on a priority-system that is tied, at least in part, to the value of each page. A high-value page will be updated more often than a low-value page, theoretically. Of course, the regularity with which pages are updated on the site will have an impact on the crawl rate as well.
Both DMOZ and BBB are updated fairly regularly, so my vote would be to spend your time building links to your own site rather than to those directory pages. If they haven't been spidered by Google yet, I would try tweeting them out to a few friends who, in turn, retweet them, to see if you can convince Google that they are worth a look. Beyond that, don't waste your time helping someone else's page rank; build the links directly to your own site.
-
RE: Optimal / Best Practice Title tag
Agreed with Ryan for the most part here - especially for competitive short-tail terms like "web design" or "web development". If you must, at least make it readable... "Graphic Web Design & Online Marketing in Ireland"
-
RE: Microsoft SEO Toolkit vs MoZ
At Virante, we really like the IIS toolkit, but it has some drawbacks. The biggest drawback is that it doesn't take into account canonical tags when looking at things like duplicate content. It also makes assumptions that are simply overkill (like lumping nearly every redirect into being an unnecessary redirect). In the end, we actually use both.
Moz gives you regular crawl data over time, which is valuable for discerning the cause of an issue because you can correlate it with changes you made on the site between crawls. The time-series data is all there in front of you.
IIS can give you some other data and, more importantly, give you the raw crawl data like the export of all internal links.
-
RE: How well does Google's crawlers understand foreign websites?
A lot of "semantic" search technology is multilingual, as it looks at factors like word proximity and collocation, which still matter regardless of language. Moreover, I am fairly certain Google has teams of developers dedicated to search in multiple languages. While you can expect that it won't be quite as sophisticated in a non-English search engine, whatever gains you might find from Google being less "smart" simply mean that more people can effectively compete against you.
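For instance, a collocation count like this sketch works identically on text in any language, which is why these signals transfer:
def cooccurrence(words, w1, w2, window=5):
    # Count how often two terms appear within `window` words of each other.
    hits = 0
    for i, w in enumerate(words):
        if w == w1 and w2 in words[max(0, i - window): i + window + 1]:
            hits += 1
    return hits

print(cooccurrence("la tour eiffel domine la ville".split(), "tour", "eiffel"))  # 1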
-
RE: LinkDetox Versus Removeem??
Hi Alan, thanks for your questions. In the interest of respecting the Moz forums, perhaps it would be better if I let the support team handle them; they are great. As much as I'd love to brag about the tool here, it just doesn't seem right. I'll see to it that support gets back to you shortly.
-
RE: Keyword stuffing
Nope. Don't worry about it. I just ran your page through a more comprehensive internal tool we have, which looks not only at keyword occurrences but also at the other sites ranking for that term, to see if you are out of sync with them. You are within reasonable enough levels of keyword usage on the page not to be overly worried.
-
RE: Moz is advising that a page has too many of the same keywords.
Depending upon the keyword and context, this could be an issue. The biggest concerns are obviously the appearance of keyword stuffing, and the percentage of content on the page that is not that keyword.
Is the keyword necessary in the product name? E.g. A category of Rome Tours, and you list all of the tours as "Rome Tour of Pantheon", "Rome Tour of St. Peter's Square", "Rome Tour of Spanish Steps", etc...
I could probably be more helpful if you could provide an example...
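In the meantime, a rough way to gauge the share yourself (a simplified sketch; the sample text is made up):
import re

def keyword_share(text, keyword):
    # Rough share of the page's words taken up by the keyword phrase.
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    hits = sum(1 for i in range(len(words) - len(kw) + 1) if words[i:i + len(kw)] == kw)
    return hits * len(kw) / max(len(words), 1)

page = "Rome tours by locals. Our Rome tours cover the Pantheon and more."
print(round(keyword_share(page, "rome tours"), 2))  # 0.33 -- likely far too high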
-
RE: Can 302 chains (affiliate links) from "toxic" sources hurt you? Or are you "shielded"?
We know that over time Google will treat a 302 redirect as permanent if it remains in place. The 302 is the default redirect type, so Google is often forced to determine whether a person intended to use a 302 or simply did it out of laziness. We recommend that you go ahead and use robots.txt to block the affiliate URL parameters so that you don't ever have to worry about it at all.
Is the affiliate program on your site, or are you using a third party? If you are using a third party, they might already block Google from crawling those URLs. An easy way to check would be to follow the redirect chain and take note of the domains, then check whether any use either robots.txt or an X-Robots-Tag header to block Googlebot. You can also check your own links in GWT to see if they show up.
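Here's a sketch of that check (the starting URL is hypothetical); it follows the chain and notes, per hop, whether Googlebot appears to be blocked:
import urllib.robotparser
from urllib.parse import urlparse
import requests

def audit_affiliate_link(url, ua="Googlebot"):
    r = requests.get(url, timeout=10)  # follows redirects
    hops = [h.url for h in r.history] + [r.url]
    for hop in hops:
        parts = urlparse(hop)
        rp = urllib.robotparser.RobotFileParser(parts.scheme + "://" + parts.netloc + "/robots.txt")
        rp.read()
        blocked = not rp.can_fetch(ua, hop)
        xrobots = requests.head(hop, timeout=10).headers.get("X-Robots-Tag", "none")
        print(hop, "| robots.txt blocks", ua + ":", blocked, "| X-Robots-Tag:", xrobots)

audit_affiliate_link("https://example.com/go/affiliate-123")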
-
RE: AdWords Editor auto-correcting keywords
The Keyword Planner is a great tool for getting keyword ideas, but it's not the most accurate keyword information you can get. It's meant for getting ideas and planning landing pages/ad groups, and if your specific term doesn't have enough volume, results will not be shown. A better way to get accurate keyword information is to use Webmaster Tools in conjunction with Analytics to see the actual terms searched to reach your website, which is more up to date. Keep in mind, though, that Search Console has a roughly two-day lag for updates.
Over the summer, Google updated the Keyword Planner to combine more terms, which made it less accurate. You can no longer search for exact matches, misspellings, or plurals using the tool; you have to take the aggregate data. Hopefully Google will build a new tool to fill this need that many marketers have.
- Will Hare
-
RE: Anybody tried the content syndication network SYNND?
This is shady, but it would probably leave only minor footprints. It also would likely deliver few if any quality links unless you create quality content. Good link building services come from good link building agencies. Go take a peek at the SEOMoz recommended list - http://www.seomoz.org/marketplace/companies/recommended
-
RE: Lots of overdynamic URL and crawl errors..
You should look into using robots.txt to filter out specific query-string parameters.
You can use wildcards (*) to match URL patterns like the one above, for example:
User-agent: *
Disallow: /index.php?*a=reg
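Googlebot treats * as "match any characters". Here's a simplified matcher (a sketch, not the full spec) you can use to sanity-check which paths a rule would block:
import re

def robots_rule_matches(rule, path):
    # Googlebot-style matching, simplified: * = any chars, $ anchors the end.
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, path) is not None

rule = "/index.php?*a=reg"
for path in ["/index.php?a=reg", "/index.php?b=1&a=reg", "/index.php?b=1"]:
    print(path, "blocked" if robots_rule_matches(rule, path) else "allowed")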