Questions
-
Adding the link masking directory to robots.txt?
Irdeto, I do this on a few of my sites and it works out well. It saves on crawl budget, keeps Google from accessing my affiliate links, and keeps any PageRank from passing through those links, which keeps me in line with Google's webmaster guidelines on links. One thing to keep in mind is that Google may still show those URLs in the search results with a message that they can't show the content because it has been blocked. If you want to keep this from happening, you may have to remove that directory in Google Webmaster Tools (GWT) using the URL Removal tool. If they get re-indexed in 90 days (or whatever the reset time frame is), you will have to do it again. Hopefully that won't be an issue once you get them all removed the first time and block the folder. Adding rel="nofollow" attributes to the hyperlinks pointing into that directory wouldn't hurt either.
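As a rough sketch of what's described above, assuming the masked affiliate links live in a directory called /go/ (the actual directory name is just an example), the robots.txt rule could look like this:

```
# Block all crawlers from the link-masking directory
User-agent: *
Disallow: /go/
```

And the hyperlinks pointing into that directory would carry the nofollow attribute, along the lines of `<a href="/go/partner" rel="nofollow">Partner offer</a>` (URL and anchor text are hypothetical).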
Search Engine Trends | | Everett0 -
Your advice regarding thin content would be really appreciated
"You are better off putting up a separate review page for every review that page gets, but still, I would choose putting it on the same page to take full advantage of it." Yep, this is exactly what I want to do. Laura's idea sounds amazing, and I need to do some more research on how to design a page that works this way. Thanks heaps, guys!
Intermediate & Advanced SEO | | irdeto0 -
URL construction in 2014
Yes, don't use underscores. Underscores make your URLs look spammy, and so do long URLs. Go with the first example for sure.
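To sketch the convention being recommended, here is a minimal slug generator that lowercases a title and joins words with hyphens rather than underscores (the function name and example title are my own, not from the thread):

```python
import re

def slugify(title):
    """Turn a page title into a short, hyphen-separated URL slug."""
    # Lowercase, collapse any run of non-alphanumeric characters into
    # a single hyphen, and trim stray hyphens from the ends.
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

print(slugify("URL Construction in 2014"))  # url-construction-in-2014
```

The same logic with `"_"` in place of `"-"` would produce the underscore style the answer advises against.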
Intermediate & Advanced SEO | | ataraxis20000 -
High PR Web 2.0 sites
Keep in mind that that is the toolbar PageRank of the home page. If you can get a link on www.wordpress.com, that's great! But you're not going to get the same value out of brandnewsubdomain.wordpress.com.
Link Building | | KeriMorgret0