Posts made by de4e
-
RE: Responsive web design and SEO
Agree with Marcus. Using different CSS for different devices while serving the same content helps avoid content duplication and many other SEO issues.
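For example, the same HTML document can be restyled per device with a media query — a minimal sketch; the 600px breakpoint and class name are just illustrations:

```css
/* One URL, one HTML document; only the styling changes per device.
   The 600px breakpoint and .sidebar class are arbitrary examples. */
.sidebar { float: right; width: 300px; }

@media (max-width: 600px) {
  /* On small screens, stack the sidebar under the content. */
  .sidebar { float: none; width: 100%; }
}
```

Because the content and URL stay identical, there is nothing for Google to see as duplicated.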
-
RE: Worldwide CDN with mainland China support.
Thanks for your answer, but unfortunately Azure doesn't support mainland China users (it's the only exclusion - http://bquot.com/9j8).
Anyway, I'll try the trial version and test the delivery speed.
-
Worldwide CDN with mainland China support.
Hello Mozers,
Can you please recommend a Content Delivery Network (CDN) that works both in mainland China and worldwide?
Many people suggest ChinaCache, which provides services only for Asian countries, but I'd prefer to find one ultimate worldwide solution.
Any thoughts will be appreciated.
Thanks.
-
RE: How is this achieved - SPAM
Freelance.com has 12,300,000 pages in the index, and most of them are pages of this type, so it's very hard to monitor all keywords manually. If even part of these pages works, the bounce rate on the rest doesn't matter at all; by the way, they have a "/jobs/iPad/" page too.
User relevance is still the main goal for Google, but statistical algorithms have some limitations, especially for such rare queries. For more frequent and competitive keywords this tactic will not work.
Personally, I think it's black hat with so many internal links and auto-generated pages, because it hurts user experience, but using 1-3 such internal links is OK and can positively affect positions in the SERP.
-
RE: How is this achieved - SPAM
It's done to get long-tail keyword traffic. When competition is very low, internal links are enough to rank on the 1st page. Below I've tried to describe how they achieve this:
1. Create a list of keywords from keyword tools or site content data mining.
2. Create a custom URL structure for these keyword pages: /job-search/keyword/
3. Automatically create related links from all relevant pages with exact anchor text.
4. The content on these aggregated pages is highly relevant to the query and gets enough internal links from other pages with high relevance. All such pages are unique. Also, the quantity of content is much greater than for separate items, so page indexing is easier.
5. Profit!
PS: It works rather well for sites with a large number of pages in the Google index and large clusters of closely related pages, like freelance.com.
One more point - they use RSS search because Google likes fresh content, and in this case the newest pages are on top.
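Steps 1-2 above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical keyword list and the /job-search/keyword/ URL pattern mentioned in the post:

```python
import re

def slugify(keyword):
    """Lowercase the keyword and collapse non-alphanumeric runs into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", keyword.lower()).strip("-")

def landing_url(keyword, base="/job-search"):
    """Build the landing-page URL for one keyword."""
    return "{}/{}/".format(base, slugify(keyword))

# Hypothetical keyword list from a keyword tool.
keywords = ["iPad app development", "PHP freelancer", "logo design"]
urls = [landing_url(k) for k in keywords]
print(urls[0])  # /job-search/ipad-app-development/
```

The site then only has to render an aggregated page for each such URL and link to it from the relevant item pages with the keyword as anchor text.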
-
RE: Hotspot area for SEO
Just today I read a good related article: http://www.seobythesea.com/2012/01/sets-semantic-closeness-segmentation-and-webtables/
Personally, I recommend starting to use HTML5 semantic tags to help Google better understand the structure of your content. Even if they are not ranking factors at this time, once HTML5 becomes a standard they will be.
Also agree with Egol - stuff at the top of the page (in the HTML code) is a lot more important now.
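A minimal sketch of what that semantic markup can look like - the element names are standard HTML5, the content is just placeholder:

```html
<!-- Standard HTML5 sectioning elements; placeholder content. -->
<body>
  <header>
    <nav>...site navigation...</nav>
  </header>
  <article>
    <h1>Page title</h1>
    <section>...main content, kept high in the source...</section>
    <aside>...related links...</aside>
  </article>
  <footer>...copyright...</footer>
</body>
```

These tags tell the crawler which block is the main content and which is navigation or boilerplate, instead of leaving it to guess from nested divs.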
-
RE: Google is not Indicating any Links to my site
Hi,
I've got an idea.
You said:
These pages linking to our new domain are indexed
The links existed the day the site was launched so when the new pages were crawled they existed.
So the question is: were the pages linking to the new domain re-indexed after you added those links?
If not, then just submit them to Google's "Add URL" form for re-indexing.
Also, the operator is "link:", not "links:".
-
RE: Report site for duplicate content
I think this site violates Google's Advertising Policies - http://bquot.com/997 - so you can send a message to Google support here: https://support.google.com/adwords/bin/request.py?hl=en&display=categories
Hope it helps.
-
RE: Facebook question
Agree; also, only unique people count. So if the same user shares 2 posts, only 1 will count, but if 2 different people comment on those shares, that will be 3 "talking about this" people.
-
RE: I want tips on how to start a Lead Mangement For a Human Resource website?
Hi,
Can you please clarify what a lead means in your case? Is it a company searching for employees, an HR agency that wants to advertise its vacancies, a specialist searching for a job, or maybe all of them? And what should they do to become a lead?
-
RE: Site wide search v catalogue search
Hi,
I personally recommend Google Custom Search. It works great for me.
You can also find many search scripts in Google for the most popular platforms (PHP, ASP.NET, etc.). They will help if you don't want to add your pages to the Google index, or if you have private sections on your site (personalized, by the way).
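For reference, the basic Custom Search embed is just two snippets - a sketch; YOUR_ENGINE_ID is a placeholder for the ID you get after creating an engine at cse.google.com:

```html
<!-- Google Custom Search embed; YOUR_ENGINE_ID is a placeholder. -->
<script async src="https://cse.google.com/cse.js?cx=YOUR_ENGINE_ID"></script>
<div class="gcse-search"></div>
```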
-
RE: Why Do Links, DA and PA have no affect on this search result?
Agree, I see the same picture. Also, almost all of the http://www.plumbingcourse.org.uk/ backlinks have this key phrase in the anchor text.
-
RE: How should I structure my product URLs?
Hi Dru,
Look at this from the visitor's point of view. realthread.com/products/american-apparel-2001 is much easier to read and looks better in the SERP and in backlinks.
As we know, Google's goal is searcher satisfaction, so what is good for visitors is good for Google.
-
RE: Canonical
Watch this video: http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
The topic is covered pretty well there.
-
RE: Changing preferred domain
The way you propose is good if you can ask for most of the external links to be changed. If not, some link weight will be lost after the redirect.
I can propose an alternative solution. You can create a special subdomain or folder for the country with such issues and redirect visitors by IP. Of course, in this case you should use the canonical tag to avoid duplicate content. As a result, you will keep the primary page URL in the SERP, but all visitors from the specific country will be redirected to the working pages.
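The canonical tag on the country-specific copy would look like this - the domain names are placeholders:

```html
<!-- On the country-specific copy (e.g. http://uk.example.com/page/),
     pointing back to the primary URL. Domains are placeholders. -->
<link rel="canonical" href="http://www.example.com/page/" />
```

This way the duplicate copy passes its signals to the primary URL, which is the one shown in the SERP.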
Cheers,
Vladimir
-
RE: Frequent server changes
Agree with Istvan; I didn't see any negative effect on ranking - only a positive one if the new hosting is faster.
-
RE: Getting 403 error in forum
Hi,
As I see it, access to this type of page is allowed only for registered users. Is that right?
If so, don't worry about ranking - just disallow pages like this in robots.txt.
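For example, assuming the members-only pages live under a /private/ path (the path is a placeholder for your actual forum URLs):

```
User-agent: *
Disallow: /private/
```

Crawlers then skip those URLs entirely, so the 403 responses never enter the picture.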
-
RE: Crawl Diagnostics finding pages that dont exist. Will Rel Canon Help?
Hi,
Yes, the canonical tag will work great.
But if people share this page, and there is currently only a small number of links to www.completeoffice.co.uk/Products/Products.php, I recommend creating a 301 redirect for a better-looking URL.
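In Apache, that 301 can be a one-liner in .htaccess - a sketch; the /products/ target path is a placeholder for whatever the cleaner address is:

```
# Permanently redirect the old PHP URL to a cleaner address.
# The target path /products/ is a placeholder.
Redirect 301 /Products/Products.php http://www.completeoffice.co.uk/products/
```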
-
RE: Robots.txt for subdomain
Robots.txt works only for the subdomain where it is placed.
You need to create a separate robots.txt for each subdomain; Drupal allows this.
It must be located in the root directory of your subdomain (e.g. /public_html/subdomain/) and will be accessible at http://subdomain.root.nl/robots.txt.
Add the following lines to that robots.txt file:
User-agent: *
Disallow: /
As an alternative, you can use the robots <META> tag on each page, or redirect to a directory root.nl/subdomain and disallow it in the main robots.txt. Personally, I don't recommend that.
-
RE: Changing URL Structure
You should create 301 redirects for all old pages. By the way, you can use the mod_rewrite module for Apache.
If it's hard to determine exact rules, you can redirect object pages to the new category page, but of course that is a less efficient way.
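A minimal mod_rewrite sketch for .htaccess - the old and new URL patterns below are made-up placeholders, so adapt them to your actual structures:

```
# Send old /item.php?id=123 URLs to new /products/123/ pages
# with a permanent (301) redirect. Patterns are placeholders.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^item\.php$ /products/%1/? [R=301,L]
```

The %1 backreference carries the id captured from the query string into the new path, and the trailing ? drops the old query string from the target.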