Posts made by Syed1
-
RE: Does Anchor Text in Links Count When Google Looks at Overall Keywords on the Page?
Awesome. That's a fantastic way to check. Thank you.
-
Does Anchor Text in Links Count When Google Looks at Overall Keywords on the Page?
So, on our site, we have a list of related blog posts on a page that focuses on bamboo flooring.
These blog posts have anchor text like "What's the best bamboo flooring?", "How to install bamboo flooring", "Yada yada bamboo flooring yada".
Because the main keyword for the page is bamboo flooring, would the words in that anchor text be counted as keywords on the bamboo flooring page, affecting how that page is evaluated and possibly tipping it into keyword stuffing?
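For illustration, the related-posts block in question looks roughly like this (the URLs are made up):
<code><a href="/blog/best-bamboo-flooring">What's the best bamboo flooring?</a>
<a href="/blog/how-to-install-bamboo-flooring">How to install bamboo flooring</a></code>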
-
RE: Creative way to secure local addresses for Google Places?
Thumbs up'd. Love the idea! Not super easy to execute, but it could very well be worth it.
-
RE: Are Exact Match Domains on Blogspot or Typepad Effective?
Exact match on subdomains carries very, very little weight. Anyone can create thousands of exact-match subdomains with little effort or cost - in fact, the weight of exact match has been declining even at the root-domain level.
-
RE: Can visitors duration time affect Google Rankings?
It could. It shouldn't weigh in heavily, since 'time on site' does not always equate to the quality or even the relevance of a page. Some pages are more useful when they are short, and the 'action' may not take much time. I assume Google looks at industry or "website type" averages, sets benchmarks, and weighs time on site accordingly. This is what I 'think' - not something I have successfully tested and proven.
-
RE: Should links to tech data sheet be no follow?
I would simply leave all PDFs as dofollow - unless they are not unique, or unless any of the PDFs have content that has already been published elsewhere on the site (as an HTML page).
Do all 52 PDFs have the same content? If so, I would let one of them get indexed and apply noindex/nofollow to the rest (or block them via robots.txt), and maybe even use a 'canonical' so the others don't get indexed and cause a potential duplicate content issue.
A good reference for you:
http://www.seomoz.org/q/can-pdf-be-seen-as-duplicate-content-if-so-how-to-prevent-it
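Since PDFs can't carry a meta robots tag, one way to apply noindex/nofollow to the duplicate copies is an X-Robots-Tag response header. A minimal sketch for Apache, assuming mod_headers is enabled and the duplicate PDFs sit in their own folder (the folder is just an example):
<code># .htaccess in the folder holding the duplicate PDFs
<FilesMatch "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch></code>
Google also accepts rel="canonical" sent as an HTTP Link header, which covers the 'canonical' option for files that can't hold a link tag.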
-
RE: Best Strategy to Rank Wordpress Category
Killer question. Once I had a coder do a quick mod that put a fixed/static block of content at the top of the category page and showed all the posts under it, and it worked. I didn't carry that project very long and didn't try other alternatives - I'd love to know if there are better ways to go about it too!
-
RE: Should links to tech data sheet be no follow?
It doesn't matter much, since even if you use nofollow you still lose a certain amount of ranking "juice". If I were you I'd leave them as they are - but if you are using the same content from the PDFs on other web pages, I would nofollow/noindex them.
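If you do decide to nofollow the links to those duplicate PDFs, it's just a rel attribute on the anchor - a minimal sketch with a made-up file name:
<code><a href="/datasheets/model-x200.pdf" rel="nofollow">Model X200 data sheet (PDF)</a></code>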
-
RE: On page SEO? (This is good! I promise)
Headers are not as important as they used to be, since they can be easily manipulated, so I wouldn't put too much time and effort into them. That being said, it's good practice to use them.
There is no penalty for using the same headers, but it's better to use unique headers for each page so they are not competing with each other.
Meta keywords: they have no value in Google and the other major engines, at least - and they could work against you, since competitors can look at them and easily see what you are targeting.
Bold text: it used to matter, but much less now - most likely a negligible effect at this point.
Internal link anchors can be duplicated - if you have a site-wide text link menu, wouldn't those links technically be 'duplicates' throughout the site? No penalties/issues. My suggestion, though, is to vary the anchors occasionally, even internally, to get the most out of them.
Image alt attributes - no issue in having the same alt text for the social buttons on all pages. In fact, you don't want to tweak them to try to rank for some keyword. They are just social buttons...
Competitor: they are using a DIV scroll - I would be very careful about how I used it. It doesn't seem user-friendly, as the box is too small and there is a lot of content in it. I am not sure whether Google can discount it algorithmically, but if their staff see it in a manual review, it could be an issue. There is no 100% sure way to know the outcome of using a div scroll the way they are using it. For now they probably do get a benefit from all that text, but cramming a lot of content into a little box like that is not a sound long-term strategy, given the risk of a manual review and G-bots getting smarter.
-
RE: Big Link Network Taken Down
I always assumed Google would have one or more staff dedicated to manually hunting for link building networks. It wouldn't take more than a couple of hours to compile a list of dozens of these popular networks where hundreds or thousands of people buy and exchange links. If anything, I am surprised it took Google so long to catch up with them.
What would be even more interesting to see is whether they are able to track down all the 'participants' in this network and discount all of those links.
-
RE: SEO Conferences - Which One(s) do you Attend ?
I will be going to Pubcon and Mozcon.
I went to SMX Seattle and IMC last year - would suggest SMX as long as you go for the main event and select the 'right' workshops.
-
RE: Using mlm and 'scammy' websites to identify brand/reputation management opportunities
Nope, in most cases these blogs are owned by people who simply sell links. To be able to sell links, they have to have some content around them, so they start publishing crap - content quality and readers' opinions are not at all important here.
This is where the shady companies you mentioned come in - they create mini "review" sites and get links to them from these junk blogs. Obviously these are shady companies with something to hide, which is why they spend so much effort burying the negative reviews rather than going after the root cause of the negative publicity.
-
RE: Using mlm and 'scammy' websites to identify brand/reputation management opportunities
A rather not-so-exciting, very long-term strategy recommendation (for a legit biz/site) would be to 'feel the pain' of negative reviews - leave the wound out in the open air and let it heal. Don't wrap it up just because that's the easy, lazy thing to do; it only numbs the pain and hides the problem - the cause of the negative reviews still lingers, uncared for.
But that's my suggestion to a business owner, not to the SEO, who has no control over the 'quality' of the offer and cannot improve customer satisfaction.
Regarding where they get their links: from what I have seen, most of them have links coming in from blogs that are full of garbage - posts of rewritten content that doesn't make much sense (probably auto-generated). It's surprising, but such links still have the power to push up rankings, even if only for a temporary period. When a site's links get caught in a filter, i.e. get discounted and the site de-ranked, they probably have a bunch of other blogs to rely on. There is not much risk for them, since they are not playing with the main site.
-
RE: Wordpress noindex
I don't know of any plugin that does this but why not simply do it via robots.txt:
<code>User-agent: Googlebot
Disallow: /page/specificfolder/</code>
where 'specificfolder' could be '2' as in your example (domain.com/page/2/) - it would block Googlebot from crawling anything under that specific folder.
Also, if you want to noindex specific pages/posts, you could do it by using this plugin: http://yoast.com/wordpress/meta-robots-wordpress-plugin/
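For reference, the plugin route works by printing a robots meta tag into the head of the pages you want excluded - roughly like this (a sketch, not the plugin's exact output):
<code><meta name="robots" content="noindex,follow" /></code>
noindex keeps the page out of the index while follow still lets the links on it pass equity, which is usually what you want for /page/2/ style archive pages.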
-
RE: Google unranks target keywords
It just means Google will weigh the easily manipulated on-page and off-page factors less than it did earlier. The anchor text might not carry as much importance as it used to - perhaps the context of the link will matter more than the anchor. It could be a factor - not necessarily THE factor ~ I mention that because you tied 'anchor text' to 'rankings', and the anchor factor is not as heavily weighted now...
"anchor text having the url as the landing page for the page you want to rank for."
-
RE: If two websites pull the same content from the same source in a CMS, does it count as duplicate content?
If they pull the same content, even from different sources, there could still be a duplicate content issue. Most likely the higher-authority site will be treated as the 'original' source.
The content needs to be written uniquely - or at least rewritten extensively enough (new words, different phrasing, etc.) that it passes a Copyscape test.
-
RE: How much to change to avoid duplicate content?
Changes should be at the deepest level - words and phrases. Rephrasing alone is not enough if the words used are exactly the same.
Also, to get the most bang from SEO campaigns/traffic, I'd avoid having content that is 're-written' from just one source. I would use many sources and give a sort of 'new take' in the newly merged, comprehensive, consolidated piece. As I mentioned in another post, a big content piece with a "new take" is more likely to be considered original and unique, and it is also much more likely to attract links than a plain re-written piece would.
-
RE: Why do rankings show differently when checked from different computers
Are you and your client in the same location/ city? If not, that could just be the answer as Google gives different results based on location.
Also, are you 'logged in' to a Google account while searching? Personalized (logged-in) search results are different from the results that show when you are not logged in. For example, if you searched for a phrase and clicked on your site, then the next time you search a similar phrase your website will show up higher for you than it actually does for the majority of users (including your client).
-
RE: Community site for link building?
It really depends on how you approach the advertising on the forum/community. If you do it in a positive way - by having helpful/relevant/subtle/non-intrusive ads, it could be a great strategy in the long run.
If you simply throw in some keywordy text links, the user experience is degraded and you also risk being 'reported' by competitors (who will be watching your 'new' links if the niche is competitive).
It could be an effective strategy as long as you are very careful about how you carry it out - right from the initial strategy (where you host) to the ads and link implementation (how you monetize).
-
RE: Do search engines penalize for too many domain aliases?
I don't think there will be a "penalty" per se, but those links could definitely be discounted or filtered out if they are hosted on the same server (IP range).
We have many such domains pointing to our main domain and there is no issue at all. Do make sure that those 20 domains are clean and not in a 'bad neighbourhood' (bad links, hosted with bad sites, etc.), as a penalty can pass through a 301 redirect.
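For anyone setting this up, a minimal sketch of the 301 from an alias domain to the main site in Apache .htaccess (mod_rewrite assumed; the domain name is a placeholder):
<code>RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.mainsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mainsite.com/$1 [R=301,L]</code>
Each alias then returns a permanent redirect to the same path on the main domain instead of serving a duplicate copy of the site.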