Try to fetch your site as Google in GWT to see what it says. If there are DNS errors, it should give you some indication of where the error is.
Google Support article to do some more research - https://support.google.com/webmasters/answer/2598813
Being in the travel industry, I've noticed the same things. My theory behind a few of them:
For your second question, I know one online travel agency put a lot of money and time into creating dynamic pages for flight searches between city 1 and city 2. You can alter the URL by changing the airport code and city name, and the copy on the page stays the same except for the city names. There are also companies that will build out actual pages with any URL structure you want.
It was discovered that when you search our brand and our city, "alaska airlines seattle", the brand knowledge graph panel shows a building at the University of Washington that is named after us as part of a sponsorship deal.
It seems logical that it would do that, since the building is branded Alaska Airlines and it is in Seattle. The problem is that the listed information confuses customers: they call the posted number expecting our customer service and instead reach the University.
I admit I am not too familiar with local SEO so any help is greatly appreciated.
noindex only tells search crawlers not to include the page in the index, but still allows them to crawl it. nofollow tells crawlers not to follow the links on the page; it does not, by itself, stop the page from being crawled.
A robots.txt disallow will block crawling of the page entirely, but I think using all of these together would be overkill.
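For reference, here's how those directives look in practice. This is just a minimal sketch; the meta tag goes in the page's head:

```html
<!-- Keep the page out of the index, but still let crawlers follow its links: -->
<meta name="robots" content="noindex, follow">

<!-- Tell crawlers not to follow the links on this page
     (the page itself can still be crawled and indexed): -->
<meta name="robots" content="nofollow">
```

A robots.txt Disallow rule, by contrast, blocks crawling of the URL entirely, which is why it's usually redundant alongside the meta tags (and a crawler that's blocked can't even see the noindex tag).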
I'm getting a ton of duplicate content errors because we use some tracking parameters for navigation tracking. Is there a way to exclude these from RogerBot? If not, any suggestions on how to keep these errors from showing up in my crawl report?
I agree with Arjen: on-site SEO is going to improve your rankings for the keywords that matter. Your main keyword, "Name Necklace", doesn't really appear on the homepage as a keyword for that page. It only appears as the anchor text pointing to your Name Necklaces category page, and that page also uses it in its page title, which is why Google ranks that page for the keyword instead.
Another thing I noticed: your canonical tag has the O capitalized but your URL does not. Keep these consistent, as search engines will treat them as different URLs.
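To illustrate what I mean (the URL here is just a placeholder, not your actual page): if the page is served at the lowercase URL, the canonical tag should match it exactly, including case:

```html
<!-- Page is served at http://www.example.com/name-necklaces -->
<link rel="canonical" href="http://www.example.com/name-necklaces">

<!-- Not this - search engines treat the capitalized path as a different URL:
<link rel="canonical" href="http://www.example.com/Name-necklaces">
-->
```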
If you have a mobile equivalent of a desktop page, make sure to redirect your mobile users to that specific page. Do not redirect all mobile users to the m. homepage; that creates a poor customer experience, which Matt Cutts has said will result in poor rankings for your mobile site.
Also make sure you use the correct canonical tagging between the mobile and desktop pages. Google's developer documentation covers the correct use of these tags here: https://developers.google.com/webmasters/smartphone-sites/details
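As a sketch of what that documentation describes (example.com is a placeholder domain): the desktop page points to its mobile equivalent with rel="alternate", and the mobile page points back with rel="canonical", so search engines understand the two URLs are the same content:

```html
<!-- On the desktop page (www.example.com/page): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page (m.example.com/page): -->
<link rel="canonical" href="http://www.example.com/page">
```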
Do search engines pay attention to periods in abbreviated queries? If I use Mt. Bachelor all over my site, would SE's not rank my site well for queries that use Mt Bachelor?
That does answer my question partly. How do you handle the cached URL for the original 301 that points to the invalid URL?
Example. www.bob.com/hello points to www.bob.com/directory/folder/file.aspx
It needs to now point to www.bob.com/directory/folder2/file2.aspx
Since a 301 is meant to be permanent, browsers and search engines cache it, so visitors who have already hit the first URL will never be passed along to the new destination.
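One way to handle the scenario above (a sketch assuming an Apache server with mod_alias; adapt to whatever server you run): update the original vanity rule in place so it points straight at the new URL, and add a second redirect for the stale intermediate URL, so visitors with the old 301 cached still land in the right place:

```apache
# Update the existing vanity rule in place - /hello now points straight at the new target
Redirect 301 /hello /directory/folder2/file2.aspx

# Catch visitors (and crawlers) that cached the old destination
Redirect 301 /directory/folder/file.aspx /directory/folder2/file2.aspx
```

This avoids chaining the old destination into a second hop for new visitors, while still covering anyone who cached the original redirect.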
We have a vanity URL that, as recommended, uses a 301 redirect. However, it has been discovered that the destination URL needs to be updated, which creates a problem since most browsers and search engines cache 301 redirects.
Is there a good way to figure out when a vanity URL should be a 301 vs. a 302/307?
If all vanity URLs should use a 301, what is the proper way of updating the destination URL?
Is it a good rule of thumb to use a 302 when the vanity URL is only temporary and could get a new destination URL down the road, and a 301 for all others?
Cheers,
We have recently installed an application firewall that is blocking rogerbot from crawling our site. Our IT department has asked for an IP address or range of IP addresses to add to the list of acceptable crawlers. If rogerbot has a dynamic IP address, how do we get it added to our whitelist? The product IT is using is F5's Application Security Manager.