Category: White Hat / Black Hat SEO
Dig into white hat and black hat SEO trends.
-
My website is coming up under a proxy server "HideMyAss.com." How do I stop this from happening?
That's very frustrating! I've never had a problem with HMA specifically, but that script usually works. You may want to try searching for things like "break out of frame proxy" or "break out of frame php" to see if anyone out there has come up with a better solution.
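For reference, the kind of frame-breaking script being discussed is usually a couple of lines of JavaScript in the page head. A minimal sketch (note that proxy services which rewrite JavaScript can defeat it):

```html
<script>
  // If this page has been loaded inside another site's frame
  // (e.g. a web proxy's wrapper), replace the top window with it.
  if (window.top !== window.self) {
    window.top.location.replace(window.self.location.href);
  }
</script>
```

A more robust, server-side alternative is sending an `X-Frame-Options: SAMEORIGIN` response header, which tells browsers not to render the page inside a third-party frame at all.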
| TakeshiYoung -
Looking For SEO expert
Outside of what I said above, I really couldn't mention anyone here, I'm afraid. It would be a little awkward to single out one name over another. -Andy
| Andy.Drinkwater -
Do you know any popular websites using fragment identifier or hashbang (#!) in URLs?
You may find the following helpful if you want to use them for something other than black hat SEO: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=174992 and https://developers.google.com/webmasters/ajax-crawling/docs/faq#whereinresults. You will also see that the second URL has some examples.
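For background, those documents describe Google's (since deprecated) AJAX crawling scheme, under which a crawler maps a `#!` URL to an `_escaped_fragment_` query parameter and requests that instead. A rough sketch of the mapping (the percent-encoding here is simplified relative to the full spec):

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Map a #! ("hashbang") URL to the _escaped_fragment_ form that
    Google's AJAX crawling scheme used to request a crawlable snapshot."""
    if "#!" not in url:
        return url  # not an AJAX-crawling URL; leave it alone
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='=&')}"

print(escaped_fragment_url("https://example.com/page#!state=photos"))
# -> https://example.com/page?_escaped_fragment_=state=photos
```

The site then had to serve an HTML snapshot of the page's state at the `_escaped_fragment_` URL for the crawler to index.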
| GPainter -
What are the advantages and disadvantages of having a dynamic website in terms of SEO? and a static website?
For SEO there is no huge benefit either way, so let's look at it differently: user experience. At the end of the day that's what we are all aiming to make great. If you have a website that's great to use, people will want to use it and, fingers crossed, share it. Dynamic sites are great because they adapt to the user's setup: whatever resolution they have, or however they have sized their window, the site lets them use it the way they want. It can take a bit more time and knowledge to create a dynamic site, but it's worth it in the long run. Best of luck!
| GPainter -
Duplicated content
As Billy said, Copyscape is a great option, but you can also take sections of the text, put them in Google and see what comes up. If you can't trust your copywriter, they may not be the right fit for you. When hiring someone new, you can always ask to see previous work and check whether it is duplicated as well.
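Putting sections of the text into Google works best as quoted, exact-match searches. A small hypothetical helper like this splits copy into quotable chunks ready to paste into the search box:

```python
def exact_match_queries(text: str, words_per_query: int = 8) -> list[str]:
    """Split copy into short word runs and wrap each in double quotes,
    ready to use as exact-match Google searches for duplicate content."""
    words = text.split()
    return [
        '"' + " ".join(words[i:i + words_per_query]) + '"'
        for i in range(0, len(words), words_per_query)
    ]
```

Runs of roughly 8-12 words are long enough to be distinctive but short enough to match verbatim copies.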
| GPainter -
Does this URL need rewriting?
I'm actually not going to rewrite the URL since, based on the responses, it's not too spammy. I was just wondering if it was spammy since it was long and jumbled. But what an awesome post by Paul Thompson! Taught me a couple of things! Thanks!
| BobGW -
Which is better paginated URL? domain.com/directory/1 or domain.com/directory/#page-2
Whether you should use a folder structure or a #-fragment implementation depends entirely on the content, page design and user intent. A fragment implementation is useful when all of the content needs to be rendered on the same page.

Advantages of the fragment approach:
- You don't have to wait for a new page to load.
- You can keep all of the content on a single page.

Disadvantage:
- You cannot create multiple pages, which some believe is important from an SEO perspective, i.e. having more pages in Google's index.

With dedicated pages, the advantages and disadvantages are reversed. Again, I would suggest focusing on user intent and design rather than SEO. Lastly, if you go with a folder structure, please do not forget to add the pagination tags recommended by Google.
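The pagination tags referred to are the `rel="next"` / `rel="prev"` link elements Google recommended for paginated series at the time. Using the hypothetical domain.com URLs from the question, page 2 would carry:

```html
<!-- In the <head> of https://domain.com/directory/2 -->
<link rel="prev" href="https://domain.com/directory/1">
<link rel="next" href="https://domain.com/directory/3">
```

The first page of the series omits `rel="prev"` and the last page omits `rel="next"`.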
| SajeetNair -
Redirecting location-specific domains
I think the main point of concern would be to avoid using the NAP on the landing pages, so as not to cause confusion with the primary domain. Maybe embed the physician's NAP in an image on the landing page...
| SCW -
Buying a domain vs. renting a domain
Great to hear, Andrew. Have a great weekend. -Andy
| Andy.Drinkwater -
Cloaking for better user experience and deeper indexing - grey or black?
I wish I could place this accurately on a scale for you. In my opinion this is white hat. You have no intent of manipulating search results here; this is completely a usability issue, and this is the obvious fix. Yes, I certainly would.
| Vizergy -
Update: Copied Website
Hi Brant - Since it looks like they didn't really "copy" the site so much as mirror it (all the content is coming from your site directly), maybe you can ask your host to look into blocking their IP from accessing the content. Ken
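If the host can identify the mirroring server's address in the access logs, the block itself is often a one-rule change. A sketch for Apache 2.4, using a placeholder IP you would replace with the address from your logs:

```apache
# .htaccess - deny one address (203.0.113.42 is a placeholder)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.42
</RequireAll>
```

Bear in mind the mirror may rotate addresses, so this can turn into a game of whack-a-mole.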
| CandymanKen -
Local Map Pack: What's the best way to handle twin cities?
Hi, Two key factors are location and competition. If you are already ranking for your primary area, great. If you are ranking for other locations as an outlier, even better. Getting the outliers to rank higher is tricky, precisely because of location and competition: if there are many other businesses in the area where you are an outlier, location is a much stronger metric than most others; if competition were low, your chances would be better. Here are some ideas:

- Go after all the local citation directories, plus creative options like citations in your relevant YouTube videos and other assets. Basically, build a stellar citation profile. If this doesn't help much, or at all, it means location and competition are in the way and are giving precedence to other businesses.
- If you are showing up in local results as an outlier, eyeballs are already on you. You may sit lower than other businesses, but if you can collect significantly more high-end reviews, that could set you apart and win the conversion despite the outlier position.
- Try ranking in organic results for the city + industry you are targeting; it might not be too competitive, and that can be a better position than an outlier map listing.
- Increase social signals to your website and your Google local page.

Hope this helps!
| vmialik -
Bot or Virus Creating Bad Links?
Appreciate the different responses. Sounds like not much to do but keep an eye on it and see where it goes. May have to disavow a bunch of links, but if that is the worst of it, then not too bad. Thanks!
| Whebb -
Can a domain name alone be considered SPAM?
In my opinion, Google will look at domains the way we humans do. If a domain looks spammy to us, I'd tend to think Google may have the same opinion (or at least treat it as a ranking signal). However (a big caveat), the site may host the best content in the world, have some immensely strong backlinks and be an authoritative site, in which case it should have no trouble ranking. A domain name in itself wouldn't stop Google letting it into the index; what matters is the structure of the site (make sure you're not blocking crawlers in robots.txt, etc.) and the quality of the information hosted on it. So, in short answer to your question: NO!
| louisrix -
Do Wikipedia links add value?
If you have good content on your site, there's a good chance that someone will add a link to your site from Wikipedia if it's appropriate for the topic. That happened to me and my site, and I can tell you that the links from Wikipedia helped; I even see traffic from Wikipedia. I wouldn't be concerned about whether the links are going to hurt your site: they will help establish trust. If your business is established enough and is a public company, for example, there's a good chance that you either already have a Wikipedia page for your company or should start creating one. You'll need plenty of neutral, informational mentions (news articles, etc.) that establish your business's credibility. One important thing to note is that Wikipedia feeds into the Knowledge Graph, and it's important to get your site/business/link mentioned there. Wikipedia is one way, but participating in Freebase will help as well.
| billhartzer -
Ajax Pagination on Ecommerce category pages - Good or Bad?
Hi, Generally, if it's good for the user, in most cases it's good for the search engines and for SEO. This is also a technical issue, though, and there is great material on it, including:
http://searchengineland.com/the-latest-greatest-on-seo-pagination-114284
http://moz.com/blog/pagination-best-practices-for-seo-user-experience
http://moz.com/community/q/what-is-the-best-seo-solution-for-pagination
Hope this helps!
| vmialik -
Obscene anchor text linking to non-existent pages on my site
Thanks, Oleg. I'll do that.
| MartinDS -
Do search bots understand SEF and non-SEF URLs as the same ones?
Google considers the two URLs to be different. In your use case, I would use a robots.txt file to prevent Google from crawling the site/pages. When the site is released and the URLs are permanent, you can remove the robots.txt. More about robots.txt: http://en.wikipedia.org/wiki/Robots_exclusion_standard
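A minimal robots.txt for the pre-release period is the standard block-everything form:

```
User-agent: *
Disallow: /
```

One caveat worth knowing: robots.txt stops compliant crawlers from crawling, but a blocked URL can technically still appear in the index if other sites link to it; a `noindex` meta tag on the pages is the stricter option if that matters here.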
| Crocodesign