Agree with Marcus. Using different CSS for different devices with the same content will help you avoid content duplication and many other SEO issues.
de4e
@de4e
Job Title: Head of WW SEO/SEM
Company: Veeam Software
Website Description
I do marketing consulting (SEO/PPC/CRO/Analytics)
Favorite Thing about SEO
Search algorithms, statistical models.
Latest posts made by de4e
-
RE: Responsive web design and SEO
-
RE: Worldwide CDN with mainland China support.
Thanks for your answer, but unfortunately Azure doesn't support mainland China users (it's the only exclusion - http://bquot.com/9j8).
Anyway, I'll try the trial version and test the delivery speed.
-
Worldwide CDN with mainland China support.
Hello Mozers,
Can you please recommend a Content Delivery Network (CDN) that works both for mainland China and worldwide regions?
Many people suggest ChinaCache, which provides services only for Asian countries, but for me it would be preferable to find one ultimate worldwide solution.
Any thoughts will be appreciated.
Thanks.
-
RE: How is this achieved - SPAM
Freelance.com has 12,300,000 pages in the index and most of them are pages of this type, so it's very hard to monitor all keywords manually. If only part of these pages works, the bounce rate on the others doesn't matter at all; by the way, they have a "/jobs/iPad/" page too.
User relevance is still the main goal for Google, but statistical algorithms have some limitations, especially for such rare queries. For more frequent and competitive keywords this tactic will not work.
Personally I think it's black hat with so many internal links and auto-generated pages, because it hurts user experience, but using 1-3 such internal links is OK and can positively affect positions in the SERP.
-
RE: How is this achieved - SPAM
It's done to get long-tail keyword traffic. When competition is very low, internal links are enough to rank on the 1st page. Below I've tried to describe how they achieve this:
1. Create a list of keywords from keyword tools or site content data mining.
2. Create a custom URL structure for these keyword pages: /job-search/keyword/
3. Automatically create related links from all relevant pages with exact anchor text.
4. Content on this aggregated page is highly relevant to the query and has enough internal links from other highly relevant pages. All such pages are unique. Also, the quantity of content is much greater than for separate items, so the page is easier to index.
5. Profit!
PS: It works rather well for sites with a large number of pages in Google's index and large clusters of closely related pages, like freelance.com.
One more point - they use RSS search because Google likes fresh content, and in this case the newest pages are on top.
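The steps above can be sketched in code. This is a minimal, hypothetical Python example; the slug format and the /job-search/ URL pattern are my assumptions, not freelance.com's actual implementation:

```python
# Hypothetical sketch of steps 1-3: turn a keyword list into
# /job-search/<slug>/ landing-page URLs and exact-anchor internal links.
import re

def slugify(keyword):
    # Lowercase and replace runs of non-alphanumerics with a single hyphen.
    return re.sub(r"[^a-z0-9]+", "-", keyword.lower()).strip("-")

def landing_url(keyword, base="/job-search/"):
    # Step 2: custom URL structure for the keyword page.
    return base + slugify(keyword) + "/"

def internal_link(keyword):
    # Step 3: internal link with exact-match anchor text.
    return '<a href="{0}">{1}</a>'.format(landing_url(keyword), keyword)

for kw in ["iPad developer", "PHP freelance jobs"]:
    print(internal_link(kw))
```

The aggregated page itself would then list every item matching the keyword, which is what makes step 4 work.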
-
RE: Hotspot area for SEO
Just today have read good related article http://www.seobythesea.com/2012/01/sets-semantic-closeness-segmentation-and-webtables/ .
Personally I recommend starting to use HTML5 semantic tags to help Google better understand the structure of your content. Even if they are not ranking factors at this time, once HTML5 becomes a standard they will be.
Also agree with Egol - content at the top of the page (in the HTML code) is a lot more important now.
-
RE: Google is not Indicating any Links to my site
Hi,
I've got an idea.
You said:
These pages linking to our new domain are indexed
The links existed the day the site was launched so when the new pages were crawled they existed.
So the question is: were the pages linking to the new domain re-indexed after you added those links?
If not, then just submit them to Google's "Add URL" for re-indexing.
Also, the operator is "link:", not "links:".
-
RE: Report site for duplicate content
I think this site violates Google's Advertising Policies - http://bquot.com/997 - so you can send a message to Google support here: https://support.google.com/adwords/bin/request.py?hl=en&display=categories
Hope it helps.
-
RE: Facebook question
Agree. Also, only unique people count. So if the same user shares 2 posts, only 1 will be counted, but if 2 different people comment on those shares, that makes 3 "talking about this" people.
-
RE: I want tips on how to start a Lead Mangement For a Human Resource website?
Hi,
Can you please clarify what a lead means in your case? Is it a company searching for employees, an HR agency that wants to advertise their vacancies, a specialist searching for a job, or maybe all of them? And what should they do to become a lead?
Best posts made by de4e
-
RE: Robots.txt for subdomain
Robots.txt works only for the subdomain where it is placed.
You need to create a separate robots.txt for each sub-domain; Drupal allows this.
It must be located in the root directory of your subdomain, e.g. /public_html/subdomain/, and can be accessed at http://subdomain.root.nl/robots.txt.
Add the following lines to the robots.txt file:
User-agent: *
Disallow: /
As an alternative you can use the Robots <META> tag on each page, or redirect to the directory root.nl/subdomain and disallow it in the main robots.txt. Personally I don't recommend that.
-
RE: Canonical Tag - Question
To answer these questions we need to understand why Google implemented the "canonical" tag.
Before, to determine whether content was duplicated or not, Googlebot downloaded the page content and compared it via a complex algorithm with other pages in the index. As I think, there is a special bot running through the indexed pages database and searching for duplicates (that's why copy-paste sites get banned not right after indexation but some time later).
The "canonical" tag makes this task much easier: Googlebot doesn't need to download the page with duplicate content, it just needs to check the <head> section, and maybe a hash or something like a "hashsum" of the content. So there is no need to download and store the same data several times (deleting stored data is hard for high-load data centers). It's a more effective and fast way to crawl large data sets like the web. Also, link- and URL-related data should, I think, be added to the primary page's data set.
I've made a test on this: Google downloads much less data if the page has rel="canonical" to another page, compared to the primary page.
So according to this, the answers to your questions are:
1. Links just flow as usual; all link data for duplicated pages merges with the data for the primary page. So PR may slightly decrease in some cases, for example if you have links from the same pages to both the primary and the duplicated pages. But the impact is not critical, almost similar to a 301.
2. No, because Googlebot checks not only the canonical. About this I have one more point: Google is a statistical SE and rates pages on topics, so in your case, even if a canonical is added to the pages, it will not help you rank better for both terms.
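As a rough illustration of the hashing idea above (purely my speculation about how such a dedup bot could work, not Google's actual algorithm):

```python
# Simplified, hypothetical sketch: instead of storing full duplicate pages,
# a crawler could keep a hash ("hashsum") of the normalized content and map
# every duplicate back to one primary URL.
import hashlib

def content_hash(html_text):
    # Normalize whitespace and case so trivial formatting differences
    # don't produce different hashes.
    normalized = " ".join(html_text.split()).lower()
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()

index = {}  # content hash -> primary URL

def crawl(url, html_text):
    h = content_hash(html_text)
    if h in index:
        # Duplicate: no need to store the same data again.
        return ("duplicate of", index[h])
    index[h] = url
    return ("primary", url)

print(crawl("http://example.com/a", "<p>Hello  World</p>"))
print(crawl("http://example.com/b", "<p>hello world</p>"))
```

The second call is recognized as a duplicate of the first, even though the raw bytes differ.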
-
RE: Report site for duplicate content
I think this site violate Google Advertising Policies - http://bquot.com/997 , so you can send a message to Google support here https://support.google.com/adwords/bin/request.py?hl=en&display=categories
Hope it helps.
-
RE: "And" vs "&"
They are both stop words, and most SEs ignore them; they apply in the SERP only if you search for exactly "and".
So (Holiday Inn and Suites) = (Holiday Inn & Suites), but if someone searches for the quoted phrase ("Holiday Inn and Suites"), they will find only pages with that exact phrase.
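To illustrate the equivalence, here is a hedged toy sketch; the stop-word list and matching logic are simplified assumptions, not how any real engine tokenizes queries:

```python
# Toy query normalizer: lowercase the query and drop stop words,
# so both brand forms reduce to the same terms.
STOP_WORDS = {"and", "&"}

def normalize(query):
    return [w for w in query.lower().split() if w not in STOP_WORDS]

print(normalize("Holiday Inn and Suites"))  # ['holiday', 'inn', 'suites']
print(normalize("Holiday Inn & Suites"))    # ['holiday', 'inn', 'suites']
```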
-
RE: How is this achieved - SPAM
It's done to get long-tail keyword traffic. When competition is very low, internal links are enough to rank on the 1st page. Below I've tried to describe how they achieve this:
1. Create a list of keywords from keyword tools or site content data mining.
2. Create a custom URL structure for these keyword pages: /job-search/keyword/
3. Automatically create related links from all relevant pages with exact anchor text.
4. Content on this aggregated page is highly relevant to the query and has enough internal links from other highly relevant pages. All such pages are unique. Also, the quantity of content is much greater than for separate items, so the page is easier to index.
5. Profit!
PS: It works rather well for sites with a large number of pages in Google's index and large clusters of closely related pages, like freelance.com.
One more point - they use RSS search because Google likes fresh content, and in this case the newest pages are on top.
-
RE: How should I structure my product URLs?
Hi Dru,
Look at this from the visitor's point of view: realthread.com/products/american-apparel-2001 is much easier to read and looks better in the SERP and in backlinks.
As we know, Google's goal is searcher satisfaction, so what is good for visitors is good for Google.
-
RE: A tool to post to all bookmarking sites at once
You can try http://www.socialmarker.com/; as far as I know it's free. I prefer syndd for bookmarking to the main sites, because it has an integrated spinner and other great features.
But do you really need to add your pages to ALL social bookmarks?
-
RE: Frequent server changes
Agree with Istvan; I didn't see any negative effect on ranking, only a positive one if the new hosting is faster.
-
RE: Hotspot area for SEO
Just today have read good related article http://www.seobythesea.com/2012/01/sets-semantic-closeness-segmentation-and-webtables/ .
Personally I recommend starting to use HTML5 semantic tags to help Google better understand the structure of your content. Even if they are not ranking factors at this time, once HTML5 becomes a standard they will be.
Also agree with Egol - content at the top of the page (in the HTML code) is a lot more important now.
-
RE: Canonical URL's - Do they need to be on the "pointed at" page?
Agree.
There are many possible variations of the same URLs that are not under the site owner's control - different ?parameters etc. So it's better to add a canonical to each page.
I graduated as a radio engineer with a deep math and physics background. Working as a marketing manager and consultant for a few companies of different sizes.