Questions
-
www.site.com linking to pages on www10.site.com
Hi Abe, the links are definitely following these rules unless I'm really not understanding something. I'll send help@ an email with the exact site and situation. Thanks, Victor
Other Research Tools | | Motava1 -
Periodic DNS Switching for Major Website Updates - Any Downsides?
I don't understand how we'd lose traffic... some visitors would see the old site and some would see the new site until it fully propagated, right?

The problem with changing DNS is an initial traffic drop as routers/hubs get the update. Quote (REF: http://www.mattcutts.com/blog/moving-to-a-new-web-host/):

Step 3: Change DNS to point to your new web host. This is the actual crux of the matter. First, some DNS background. When Googlebot(s) or anyone else tries to reach or crawl your site, they look up the IP address, so mattcutts.com would map to an IP like 63.111.26.154. Googlebot tries to do reasonable things like re-check the IP address every 500 fetches or so, or re-check if more than N hours have passed. Regular people who use DNS in their browser are affected by a setting called TTL, or Time To Live. TTL is measured in seconds and it says "this IP address that you fetched will be safe for this many seconds; you can cache this IP address and not bother to look it up again for that many seconds." After all, if you looked up the IP address for each site with every single webpage, image, JavaScript, or style sheet that you loaded, your browser would trundle along like a very slow turtle.

If you read this page you'll see Matt Cutts tested mattcutts.com himself and did not see any major impact. However, Matt Cutts has a high-profile domain, since he is well known for talking about his experience within Google.

The point is the test environment works perfectly right now. If the files are migrated over to the live environment, then we could have issues. But if we simply switch the DNS to the test environment, we know that it will work fine. I would concede this point if the major updates were operating in a different test environment than the live environment. By environment I mean different server architecture, like different PHP/ASP versions or database types/versions that the current live server cannot or will not be updated to.
When you create a test environment you generally want to duplicate the live environment, so you can simply push the test elements live once complete. If the server architecture is part of the test, then I can't argue with the logic.
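The TTL caching behavior Matt describes (an IP is reused until its TTL in seconds expires, then looked up again) can be sketched as a tiny cache. This is only an illustration, not Googlebot's actual logic; the resolver stub, class name, and clock injection are all hypothetical.

```python
import time

class DnsCache:
    """Minimal sketch of TTL-based DNS caching: a resolved IP is reused
    until its TTL (in seconds) expires, then it is looked up again."""

    def __init__(self, resolver, clock=time.monotonic):
        self.resolver = resolver   # resolver(hostname) -> (ip, ttl_seconds)
        self.clock = clock         # injectable for testing
        self.cache = {}            # hostname -> (ip, expires_at)

    def lookup(self, hostname):
        entry = self.cache.get(hostname)
        if entry and self.clock() < entry[1]:
            return entry[0]        # still fresh: no DNS query needed
        ip, ttl = self.resolver(hostname)
        self.cache[hostname] = (ip, self.clock() + ttl)
        return ip
```

With a TTL of 300, every lookup within those 300 seconds returns the cached IP without querying DNS, which is exactly why a DNS switch takes a while to reach all visitors.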
Intermediate & Advanced SEO | | donford0 -
Other domains hosted on same server showing up in SERP for 1st site's keywords
For those that are interested, we figured out the root of the problem and the fix. Maybe this will help someone down the road. Thanks to all that responded. Since all the domains/accounts hosted on our VPS are top level, we couldn't use robots.txt to fix this.

The problem:
- We have a handful of websites using a shared IP address; only sites with an SSL cert get a dedicated IP.
- Some of these websites have additional A record DNS names (mail.domain1, ns5.domain2, etc.) that all resolve to that shared IP address.
- Within your browser, if you went to one of those records it would pull up the website (account) that is first alphabetically within that shared IP pool.
- It's assumed that a website (or multiple websites) somewhere on the web must have linked these domain names somewhere on their webpages, so search engines crawled them for the content of that first alphabetical website.
- Search engines then indexed the content of that site under these other DNS A records and displayed the alternate URLs in the SERPs.

The fix: edit Apache's httpd.conf file on our VPS, adding a new (first-listed) virtual host for that shared IP address that points to the server's cPanel default CGI page instead of listing one of our hosted accounts first. This page describes what we needed done and what fixed it for us: http://forums.spry.com/cpanel-whm/1568-how-can-i-change-default-page-ip-address-points.html Thanks again!
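The fix above relies on Apache's rule that the first VirtualHost for an address is the default for any hostname no later VirtualHost claims. A hedged sketch of what such an httpd.conf change might look like (the IP is from the documentation range and all paths and hostnames are placeholders, not the poster's actual config):

```apache
# First-listed catch-all: any stray hostname on the shared IP
# (mail.domain1, ns5.domain2, ...) lands here instead of on the
# first alphabetical hosted account.
<VirtualHost 192.0.2.10:80>
    ServerName default.placeholder
    DocumentRoot /usr/local/apache/htdocs   # e.g. the cPanel default page
</VirtualHost>

# Real accounts follow; each answers only for its own hostnames.
<VirtualHost 192.0.2.10:80>
    ServerName www.domain1.com
    DocumentRoot /home/domain1/public_html
</VirtualHost>
```

Because Apache matches ServerName/ServerAlias first and falls back to the first VirtualHost for the address, the stray A records stop serving (and getting indexed as) another account's content.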
Intermediate & Advanced SEO | | Motava0 -
Secondary Menu - nofollow or other strategy?
Good video! Definitely going with regular links for this secondary menu.
Technical SEO Issues | | Motava0 -
How do I find the most linked to page of a site?
In OSE I get the same result for both of these (Show ALL links, from External Pages Only, to):
1. all pages on the subdomain
2. all pages on the root domain
Moz Tools | | Motava0 -
Registering a domain for multiple years
lol... Alan, you really surprised me with this answer! Maybe CRS should be added to all of the SEO glossaries?
Technical SEO Issues | | EGOL0 -
Tool/Method to find users on Twitter from a CSV file
Hi Keri, No luck with any other tools, unfortunately. It's still something on the to-do list. Ryan's comment is helpful but still a lot of manual work involved unfortunately. Victor
Social Media | | Motava0 -
50+ duplicate content pages - Do we remove them all or 301?
I would keep them on the same URLs and explain the duplicate-content risk to the client. Then the client can decide to upgrade them himself, have you source new content, or let them run "as is". I know of a lot of sites that have pages like this that are working well and making nice money.
Technical SEO Issues | | EGOL0 -
URL with two forward slashes //
The // is most likely the result of a faulty setting in the (URL routing) system and should be something that can be fixed. As for both the // and the /cpage: is it optimal? Most certainly not, neither for search engines nor for visitors. Will it harm your rankings badly? I don't think so. Even though it is something you'd rather not have in your URL, all other things being perfect, you probably won't notice it in your rankings.
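If the routing system can't be fixed directly, one common cleanup is to collapse repeated slashes into a single canonical path and 301-redirect the // variant to it. A minimal sketch (the function name is hypothetical; this only shows the normalization, not the redirect wiring):

```python
import re
from urllib.parse import urlsplit, urlunsplit

def canonical_path(url):
    """Collapse runs of slashes in the path so /cpage//about and
    /cpage/about share one canonical URL; the '//' variant can then
    be 301-redirected to the result."""
    parts = urlsplit(url)
    clean_path = re.sub(r"/{2,}", "/", parts.path)
    return urlunsplit((parts.scheme, parts.netloc, clean_path,
                       parts.query, parts.fragment))
```

Only the path component is touched, so the `//` after the scheme and any query string are left intact.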
On-Page / Site Optimization | | Theo-NL0