What is Google's minimum desktop responsive webpage width?
Fetch as Google for desktop is showing a skinnier version of our responsive page.
Hi,
We are considering using separate servers depending on whether a Bot or a Human lands on our site, to prevent overloading our servers. Just wondering if this is considered cloaking if the content remains exactly the same for both the Bot & the Human, just served from different servers.
And if this isn't considered cloaking, will this affect the way our site is crawled? Or hurt rankings?
Thanks
Delete everything under the following directives and you should be good.
User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
Disallow: /page/
As a rule of thumb, it's not a good idea to use wildcards in your robots.txt file - you may inadvertently exclude an entire folder.
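If you want to see just how much those wildcards sweep up before you delete them, here's a minimal sketch in Python that mimics Google-style pattern matching (the sample paths are hypothetical - swap in real URLs from your own site):

```python
import re

def blocked_by(pattern: str, path: str) -> bool:
    """Googlebot-style matching: '*' matches any characters, '$' anchors the end."""
    regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(regex, path) is not None

# The Disallow patterns from the robots.txt above.
rules = ["/*/trackback", "/*/feed", "/*/comments", "/?", "/*?", "/page/"]

# Hypothetical paths - note how many ordinary pages get caught.
for path in ["/blog/my-post/", "/blog/my-post/feed", "/search?q=widgets", "/page/2/", "/about/"]:
    hits = [r for r in rules if blocked_by(r, path)]
    print(path, "->", f"BLOCKED by {hits}" if hits else "allowed")
```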
No, it's still a redirect. See the attached image clearly showing a 302 Temporary Redirect from http://www.eco-environments.co.uk/solar-power/ to http://www.eco-environments.co.uk/solar-power/default.phuse
If your developer still doesn't believe you, have them verify it themselves with this web-based HTTP header check tool ~> http://www.webconfs.com/http-header-check.php
Yes, using a 302 Temporary Redirect is hurting your page authority because these types of server response codes do NOT pass any link juice. To preserve all inbound/internal link equity you want to use a 301 Permanent Redirect instead. With a 301 redirect you retain roughly 90% of the link value.
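If you'd rather confirm the response code yourself instead of relying on a web tool, here's a quick sketch with Python's standard library (the host and path are placeholders):

```python
import http.client

# http.client never follows redirects, so you see the raw status code.
conn = http.client.HTTPConnection("www.example.com")  # placeholder host
conn.request("HEAD", "/solar-power/")                 # placeholder path
resp = conn.getresponse()
print(resp.status, resp.reason, "->", resp.getheader("Location"))
# 302 means temporary (no equity passed); you want to see 301 here.
```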
I'm not sure to what extent your website is being blocked with the robots.txt file but it's pretty easy to diagnose. You'll first need to identify and confirm that Googlebot is being blocked by typing into your web browser ~> www.mywebsite.com/robots.txt
If you see an entry such as "User-agent: *" or "User-agent: Googlebot" being used in conjunction with "Disallow" then you know your website is being blocked by the robots.txt file. Given your situation you'll need to go through a three-step process.
First, go into your WordPress plugin page and deactivate the plugin which generates your robots.txt file. Second, log in to the root folder of your server and look for the robots.txt file. Lastly, change "Disallow" to "Allow" (or simply remove the Disallow lines) and that should work, but you'll need to confirm by loading the robots.txt URL again.
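You can also confirm it programmatically. Here's a minimal sketch using Python's built-in robots.txt parser (the domain is a placeholder; note the stdlib parser does plain prefix matching and ignores Googlebot's wildcard extensions):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.mywebsite.com/robots.txt")  # placeholder domain
rp.read()  # fetches and parses the live file

# Check a few representative URLs against the Googlebot rules.
for url in ["http://www.mywebsite.com/", "http://www.mywebsite.com/blog/"]:
    verdict = "crawlable" if rp.can_fetch("Googlebot", url) else "BLOCKED for Googlebot"
    print(url, "->", verdict)
```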
Given the limited information in your question I hope that helps. If you run into any more issues don't hesitate to post them here.
I believe one of the biggest differentiating factors here is relevance. This takes into account things like anchor text and the content of both the link source and the link destination - just to name a few.
For example, it will look natural for a cooking utensil website to link to similar verticals such as a local bakery or recipe website. On the other hand, if your website is about the advancement of nuclear fusion, there is no real reason for you to link out with the anchor text "buy discount bath robes". This will definitely raise a red flag on the search engine's side.
In a nutshell, a link exchange between real businesses will "work" if they are contextually relevant; those that aren't will be detected and subsequently devalued.
Hello Knut,
Below are a few articles and Whiteboard Fridays to give you a quick primer regarding SEO. There are definitely more of these out there so don't hesitate to ask Google!
WBF - International SEO: Where to Host and How to Target
YOUmoz - International SEO Part 2
mozBlog - Geolocation & International SEO FAQ
I've had a little experience SEO-ing websites in Japanese and the landscape is completely different. For starters, Yahoo is actually the dominant search engine there, but they use Google's algorithm - so just focus on Google's main ranking factors.
Since you don't know any Japanese you'll need someone VERY fluent in the written language so that you can account for both Kanji AND Kana. You'll need to make a business decision on whether you want to write keywords in one form or the other - keyword research would definitely help here.
Don't be surprised if most of your visitors come from mobile - that's just how the technological culture is in Japan. Most people surf the web on their cell phones (since they are light years ahead of us) and not so much from their computers.
Last but not least, create great content to attract links. Your easiest links will come from those your website/business already has a relationship with. This mirrors the Chinese concept of "guanxi", which literally means "relationships" and runs throughout the culture.
I hope that helps you get started and good luck!
Hi Darren,
To answer your question on how you can leverage microdata for your client's website: in a nutshell, just do it. Feel free to refer to Schema.org for documentation and examples.
As far as implementing microdata goes, I highly doubt it will help you "win" in SEO. Why? Mainly because it's a signal that helps the major search engines understand the content, context and relevancy of the page - not a crucial ranking factor. Amazon uses microdata but that's not the reason they dominate the SERPs.
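To make that concrete, microdata is just extra attributes (itemscope, itemtype, itemprop) layered onto the HTML you already have. Below is a minimal sketch - the Product values are made up - plus a few lines of stdlib Python to sanity-check which itemprop attributes actually made it into the markup:

```python
from html.parser import HTMLParser

# A hypothetical schema.org Product snippet - see Schema.org for the full vocabulary.
SNIPPET = """
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <span itemprop="price">19.99</span>
</div>
"""

class ItempropCollector(HTMLParser):
    """Collects itemprop attribute values so you can verify your markup."""
    def __init__(self):
        super().__init__()
        self.props = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "itemprop":
                self.props.append(value)

collector = ItempropCollector()
collector.feed(SNIPPET)
print(collector.props)  # ['name', 'price']
```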
Let us know if you have any other questions and we'll be glad to help!
Hi Anchorwave,
You can check out a post published earlier this year by SEER Interactive - it's my favorite article of all time for determining whether a link is quality or not: the 25 ways to qualify a link.
Cheers!
No, unfortunately there is no way to prevent search engine indexation from within the <body> tags of your web page. As you mentioned earlier in your question, you can either utilize the meta robots exclusion tag or the robots.txt file.
If you are REALLY intent on blocking indexation of your promotional page and can only use the <body> section, perhaps you can consider using an <iframe>? For example, create a totally new page with your promotional copy that is blocked by robots.txt, while ensuring you have NO links pointing to it. Then on your promotional page use the <iframe> tag to pull in the content from the robots.txt-blocked copy.
Honestly, I'm not sure if it'll prevent indexation since I've never tried it before, but it's just an idea.
Good luck and tell us how it goes if you do! =]
And if for some reason you don't happen to like Chrome you can check out Page Speed Online by Google.
I know this may sound obvious but I thought I would ask anyways: are you sure your page was indexed?
To check if this is the case go to Google or Bingahoo and type in **site:websiteURL**. If your page in question does NOT show up then you don't have a problem.
However, if it does then I would urge you to quickly register your client's website with GWT and request a URL removal. Also, if you want the page to get de-indexed "faster" I would recommend taking down the page altogether and implementing a 301 Permanent Redirect to a relevant page. If you don't have a relevant page then serve up a header response of 404 Not Found.
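Once the redirect (or 404) is in place, you can verify it with a couple of lines of Python (the URL is a placeholder):

```python
import urllib.error
import urllib.request

url = "http://www.example.com/old-promo-page"  # placeholder for the removed page
try:
    resp = urllib.request.urlopen(url)  # follows a 301 to its destination
    print("Landed on", resp.geturl(), "with status", resp.status)
except urllib.error.HTTPError as e:
    print("Status:", e.code)  # expect 404 if you chose the Not Found route
```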
Of course, if that is too technical and you don't have development resources then you can just delete all the content on the page (or insert a "coming soon" image) and no one would be the wiser. =]
I hope that helps!
Hi Rick,
Great job taking the initiative in trying to fill the information gap around an (unfortunately) commonly overlooked disorder. It's good to know that you are pushing out some quality content on the web. Now, let's talk about SEO.
Before you make ANY changes I would strongly urge you to first check out your web analytics and get a good grasp of your inbound traffic. You'll need to create some advanced segments and do some deep dive analysis to address some important questions...
Essentially, what you are doing with this in-depth analysis is determining what you already do well and where you can improve. Your website has already been live for 6+ months, so you don't want to lose traffic in areas where you already have traction. From there, you can make a smart, data-driven decision on which pages need their title tags changed, where to add on-page copy, what new videos/content you should create, etc.
As for your question about the video categorization, I would keep the videos under Noah's Minute and sub-categorize them. The main reason is that NoahsDad.com is associated with the name Noah's Minute, which in essence brands your website. Maybe you can even ask your existing followers to see if they are okay with this?
Regarding misspellings, I do not think that is a good idea. If you want to portray your website as an authoritative source to users as well as search engines, everything should be written correctly. Search engines can auto-correct misspellings so you don't have to worry about that. Here is an example for "downe syndrom videos".
Lastly, ranking for highly competitive keywords is never impossible - you just need to create extremely valuable content and gain lots of links to it. For example, you can create a category for "down syndrome facts and information". Push out some high quality content that includes some myth busting, then link to your Noah's Minute subcategory videos and you'll start building out a robust internal link structure. From there, any incoming link equity will boost your entire website.
At any rate, I hope this gives you a good head start on where to look first, but there are a TON of other things we still haven't covered. I apologize in advance if some parts don't make sense, but please don't hesitate to ask if you have questions.
Good luck! =]
Yes, absolutely but you'll need to utilize the SEOmoz Linkscape API and generate your own access key. You can check out the Excel spreadsheet BusinessHut created at http://www.businesshut.com/seo/using-seomoz-free-api-excel/
I use it all the time. 
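If you'd rather skip Excel, here's a rough sketch of a raw Linkscape API call in Python. The access ID, secret key and Cols bit-flag below are placeholders - pull the real values and flag definitions from the SEOmoz API docs:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse
import urllib.request

ACCESS_ID = "member-xxxxxxxxxx"  # placeholder - generated in your SEOmoz account
SECRET_KEY = "your-secret-key"   # placeholder

target = "www.example.com"        # the URL you want metrics for
expires = int(time.time()) + 300  # the signature is only valid for a short window

# Linkscape signed authentication: HMAC-SHA1 over "AccessID\nExpires", base64-encoded.
raw = f"{ACCESS_ID}\n{expires}".encode()
signature = base64.b64encode(hmac.new(SECRET_KEY.encode(), raw, hashlib.sha1).digest()).decode()

params = urllib.parse.urlencode({
    "AccessID": ACCESS_ID,
    "Expires": expires,
    "Signature": signature,
    "Cols": 68719476736,  # illustrative bit-flag; the docs map bits to specific metrics
})
endpoint = ("http://lsapi.seomoz.com/linkscape/url-metrics/"
            + urllib.parse.quote_plus(target) + "?" + params)
print(urllib.request.urlopen(endpoint).read().decode())
```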
SEOmoz only has one API, which provides access to their Linkscape index - it does not include social data such as Likes, +1's and Tweets.
Also, it seems like you have development resources available to utilize the APIs of the major social platforms. Have you considered going straight to the source? (There's a rough sketch of the general request pattern after these links.)
Facebook OpenGraph API - http://developers.facebook.com/
Google+ API - https://developers.google.com/+/api/
Twitter API - https://dev.twitter.com/docs
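The endpoints and auth schemes differ per platform (and change over time), so treat this as a pattern sketch only - the endpoint URL and response shape below are placeholders, not a real API:

```python
import json
import urllib.parse
import urllib.request

# Placeholder endpoint - substitute the real path and auth parameters from
# the platform docs linked above.
API_ENDPOINT = "https://api.social-platform.example/v1/shares"

def share_count(page_url: str) -> int:
    """Fetch a share count for a URL; assumes a JSON response like {"count": 42}."""
    query = urllib.parse.urlencode({"url": page_url})
    with urllib.request.urlopen(f"{API_ENDPOINT}?{query}") as resp:
        return json.load(resp).get("count", 0)

print(share_count("http://www.example.com/"))
```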