Questions
Removing old URLs from Google
I had kind of a 'derp moment'. If you control the same domain, take a look at your historic traffic (if possible) and definitely do a link audit on the error pages. You can 301 the old pages to their newer, relevant counterparts. I would take that tack before removal.
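The 301 approach above can be sketched as a simple lookup table (the paths here are hypothetical; a real site would configure this in the web server or CMS rather than application code):

```python
# Hypothetical old-to-new path mapping for retired URLs.
REDIRECT_MAP = {
    "/old-widgets.html": "/widgets/",
    "/2010/sale-page": "/current-offers/",
}

def resolve(path):
    """Return a (status, location) tuple for an incoming request path."""
    if path in REDIRECT_MAP:
        # Permanent redirect to the newer, relevant counterpart.
        return 301, REDIRECT_MAP[path]
    return 200, path
```

Anything not in the map falls through and serves normally, so you only need entries for the error pages your link audit turns up.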
On-Page / Site Optimization | | Travis_Bailey0 -
Does a sub-domain benefit a domain...
I have heard conflicting things from people at Google, including Matt Cutts, throughout the years. I have also seen some first-hand anecdotal evidence that subdomains inherit some benefit from the parent domain, though much less evidence that parent domains gain anything from the subdomains. It is my opinion that the devil is in the details here. If, for instance, you have a site with dramatically different topics on various subdomains, as if they were completely different websites, I think it is more likely that Google will treat them as such. However, if you have a site with a subdomain for a certain section, such as a forum or blog on the same, or a very similar, topic, with lots of interlinking between the two - as if they were different parts of the same site - I think it is more likely that Google will treat them as such. So I don't think the answer is as cut-and-dried as people often like to think, which is why it is so difficult to find consistent proof of one theory or the other. In the end, do what's best for your users and developers. Most of the time that means a subdirectory, but every once in a while it means a subdomain. I'll leave this open for other opinions. Maybe someone has some empirical proof that we can look at.
Technical SEO Issues | | Everett0 -
How to separate your - keywords - and | Brand name in the Title Tag
Hi Switch, thanks for your question! Chris is correct in that it is best practice to keep your title brief and readable, with your keyword on the front end and your brand name on the back end. I personally feel that separating the page/article title from the brand name makes the tag easier to read. As Google now truncates title tags based on the total width of the characters (vs. the number of characters), I prefer using the pipe separator for this purpose. Hope that helps!
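The keyword-first, brand-last pattern with a pipe separator can be sketched like this (the 60-character budget is a rough proxy; as noted above, Google actually truncates by pixel width, not character count):

```python
def build_title(page_title, brand, max_chars=60):
    """Join a page title and brand with a pipe separator, trimming the
    page title if the combined tag runs long."""
    title = f"{page_title} | {brand}"
    if len(title) > max_chars:
        # Leave room for the separator, brand, and an ellipsis.
        room = max_chars - len(f" | {brand}") - 1
        title = f"{page_title[:room]}\u2026 | {brand}"
    return title
```

Keeping the brand on the back end means the keyword survives even when the tail gets truncated in the results page.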
On-Page / Site Optimization | | Christy-Correll0 -
Paid for News Media backlinks
It all depends on how the site works with Google. Google has a "First Click Free" concept: https://support.google.com/webmasters/answer/74536?hl=en http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html Basically, it allows Google (and everyone else) to have access to paywalled content the first time they click through to it. This was later modified to 5 free clicks in 24 hours: http://paidcontent.org/2009/12/01/419-google-modifying-its-first-click-free-program-for-subscription-sites-5/ According to this article, http://www.theguardian.com/media/greenslade/2013/mar/26/telegraph-paywall, The Times is behind a full paywall. You need to do some digging; heck, ask the Times. If they are not letting Google crawl it, or a modified portion of it, you may not get SEO benefit "directly" (a la M. Cutts). That said, there is a ton of indirect SEO benefit that you can get from it. If there are a bunch of people reading the paid version of the Times, they are going to tweet about it, talk about it on FB, mention it on blogs. You will get direct referral traffic from it as well. So, is it the same as a link from a publication that is completely free and online? No. But I would not say that makes it worthless. Look at your analytics: how much traffic is coming from the Times? Does it convert well? If the answers are "lots" and "yes", then keep doing it! The other indirect benefits that I mention are gravy.
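The First Click Free idea above can be sketched roughly as a referrer check plus a per-day click counter (a simplification: the host check and counter here are hypothetical, and real implementations keyed on more signals than this):

```python
from urllib.parse import urlparse

# The 2009 revision capped free clicks at 5 per 24 hours.
DAILY_FREE_CLICKS = 5

def allow_free_access(referrer, clicks_today):
    """Grant paywall-free access to readers landing from a Google
    search result, up to the daily cap."""
    host = urlparse(referrer or "").netloc
    from_google = host == "www.google.com" or host.endswith(".google.com")
    return from_google and clicks_today < DAILY_FREE_CLICKS
```

A direct visitor (no search referrer) or a reader past the cap hits the paywall, while Googlebot and search visitors see the full article, which is what lets the paywalled content rank at all.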
Link Building | | CleverPhD0 -
25 domains, same content for each country, YIKES!
Hi, 25 domains is quite a lot and the short answer is that if they have essentially the same content in the same language then you are likely to have a duplicate content issue of some sort. I think the best way to approach it starts from the top. You need to identify your main keywords by language/country, see what kind of traffic these keywords are (or could be) bringing to your sites and weigh this information against the amount of resources needed to properly build and support all these different domains in their own language. Once you have decided on an operational level which domains/languages should be your primary focus then the technical way to implement in terms of avoiding duplicate content issues and declaring to the search engines which tld/version is aimed at which target audience is pretty straightforward. This recent mozinar runs through a lot of these issues: http://www.seomoz.org/webinars/foreign-language-seo And all of the posts in the international issues category are relevant and worth a read: http://www.seomoz.org/blog/category/international-issues Hope that helps!
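Declaring which TLD targets which audience is usually done with hreflang annotations; a minimal sketch, assuming a hypothetical locale-to-domain mapping, might look like:

```python
def hreflang_tags(page_path, domains_by_locale):
    """Emit the <link rel="alternate" hreflang="..."> tags that tell
    search engines which country/language each domain targets."""
    return [
        f'<link rel="alternate" hreflang="{locale}" href="https://{domain}{page_path}" />'
        for locale, domain in sorted(domains_by_locale.items())
    ]
```

Each of the 25 domains would carry the full set of tags for the equivalent page, so the engines can route each country's searchers to the right version instead of flagging them as duplicates.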
On-Page / Site Optimization | | LynnPatchett0 -
Webmaster Tools finding phantom 404s?
Okay, thanks, and you are quite right, a whole morning is more than enough energy... gotta love SEO work!!!!
Technical SEO Issues | | Switch_Digital0 -
URLs - category naming conventions
This isn't an answer to your question, but it's adding one more thing to think about. Your URL structure can be helpful when it comes to using Google Analytics. There's a good post on this at the Lunametrics Blog that you might want to check out. From a metrics perspective, it can be helpful to have that category in the URL -- you can look at the analytics for that entire folder at once, which isn't possible with your preference (I'm assuming that there would be multiple contact lens manufacturers in your case).
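The folder-level aggregation described above boils down to pulling the first path segment out of each URL (the example paths are hypothetical):

```python
def top_folder(path):
    """Pull the first folder out of a URL path so page views can be
    aggregated per category in analytics."""
    parts = [p for p in path.split("/") if p]
    return parts[0] if parts else ""
```

With category-in-URL structures, every manufacturer page under `/contact-lenses/` rolls up into one bucket; with flat URLs there is no such segment to group on.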
On-Page / Site Optimization | | KeriMorgret0 -
Temp redirects of homepage URLs
Hey Justin- Check out this article right here on SEOmoz: http://www.seomoz.org/learn-seo/redirection You are correct ... no juice passed -John
On-Page / Site Optimization | | blu42media0 -
URL length... is >115 now >255?
Although the specification of the HTTP protocol does not specify any maximum length, practical limits are imposed by web browser and server software. Extremely long URLs are usually a mistake. URLs over 2,000 characters will not work in the most popular web browsers. Don't use them if you intend your site to work for the majority of Internet users. If you must have long URIs, ask your CMS provider if there's a way they can set clean URLs instead of the long versions.
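A quick sanity check against that practical limit can be sketched as (the 2,000-character threshold comes from the browser behavior described above, not from any spec):

```python
# Beyond this length, some widely used browsers break, even though the
# HTTP specification itself imposes no maximum.
MAX_SAFE_URL_CHARS = 2000

def url_is_safe_length(url):
    """Flag URLs that exceed the practical browser limit."""
    return len(url) <= MAX_SAFE_URL_CHARS
```

Running a check like this over a crawl export is an easy way to catch the CMS-generated monsters before users do.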
On-Page / Site Optimization | | jim_cetin0 -
Site relaunch for client - switching from .co.uk to .com
It has been my experience that using a 301 redirect does not pass 100% of the 'Google Juice'. However, if your existing backlinks are 'not substantial' then I would not expect the domain authority to suffer from your change. When performing similar projects I have gone through the existing backlinks and contacted the webmasters for links I deemed worthy and alerted them to our change, asking them to update their link. I would suggest you do the same. It can be a bit tedious but I consider it a best practice.
Link Building | | Vizergy0 -
Telephone numbers on page getting classed as 404s by SEOMoz
Hey gentlemen, I'm sorry you're running into this issue with your crawls. :[ We would definitely love to look into this for you guys, but it's a little difficult for us to do that through the Q&A forum. If you write in to help@seomoz.org with your campaign information, like SEO Consult suggested :], we can look into this issue and work on getting a fix for you. We look forward to hearing from you both at the help desk. Chiaryn Miranda SEOmoz.org
On-Page / Site Optimization | | ChiarynMiranda0 -
20 x '400' errors in site but URLs work fine in browser...
Most major robots obey crawl delays. You could check your errors in Google Webmaster Tools to see if your site is serving a lot of error pages when Google crawls. I suspect Google is pretty smart about slowing down its crawl rate when it encounters too many errors, so it's probably safe to not include a crawl delay for Google.
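Checking what crawl delay a robots.txt actually declares for a given bot can be done with Python's standard-library parser (the robots.txt content below is a made-up example):

```python
from urllib.robotparser import RobotFileParser

def crawl_delay_for(robots_txt, user_agent):
    """Read the Crawl-delay directive (if any) that applies to a given
    user agent from robots.txt text."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.crawl_delay(user_agent)
```

If the function returns `None` for Googlebot, no delay applies to it, which matches the suggestion above of omitting a crawl delay for Google and letting it self-throttle on errors.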
On-Page / Site Optimization | | Cyrus-Shepard0 -
Google UK search volumes
You are correct. It will be, for all practical purposes, a US search. I have no evidence to suggest otherwise.
Search Engine Trends | | AlanMosley0 -
Woah, my A-grade optimized pages that were on the first page have all vanished outside the top 50 in Google... is this Panda?
Problem solved - well, more like diagnosed! Although the pages were A-grade optimised, filters on the page (i.e. sort by high price, low price, new products) were seen as duplicate pages, as the CMS would take the static page and add some dynamic content for the filter. Rel canonical has sorted it, but we shall see next time Google updates.
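The rel=canonical fix described above amounts to stripping the filter parameters so every sorted view points back at the one static page (the parameter names here are hypothetical; use whatever your CMS appends):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical filter/sort parameters the CMS appends to category pages.
FILTER_PARAMS = {"sort", "order", "price"}

def canonical_url(url):
    """Strip filter/sort query parameters so every filtered view of a
    category declares the same canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))
```

The resulting URL goes into `<link rel="canonical" href="...">` on every filtered variant, which consolidates the duplicates back onto the A-grade page.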
On-Page / Site Optimization | | Switch_Digital0