Hi Brian,
Just sent you a copy of the guidelines I downloaded yesterday from Searchengineland.
Yeah, it shouldn't be like this, but it's probably due to the plugin you're using. The best way to kick off the new year in style would be to start using the WordPress SEO plugin by Yoast. It will automatically take care of updating your sitemaps whenever you post new content, and on top of that you can also filter out individual posts if needed.
Hi Phillipp,
You almost got me with this one, but it's fairly simple. In your question you're pointing at the robots.txt of your HTTP pages, but it's mostly your HTTPS pages that are indexed, and if you look at that robots.txt file it's pretty clear why these pages are indexed: https://www1.swisscom.ch/robots.txt. All the pages that are indexed match one of your Allow statements rather than the blanket Disallow. Hopefully that provides you with the insight you need to fix your issue.
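If it helps to sanity-check this kind of thing, Python's standard library can evaluate a robots.txt against specific URLs. The rules and URLs below are made-up placeholders, not Swisscom's actual file; also note that Python applies rules in file order, while Google uses the most specific matching rule, so this is a rough check rather than an exact replica of Googlebot's behavior:

```python
# Sketch: checking which URLs a given robots.txt would block, using
# only Python's standard library. Rules and URLs are placeholders.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Allow: /en/
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Python evaluates rules in file order (Google uses the most specific
# match), so the Allow line is listed before the catch-all Disallow.
for url in ("https://example.com/en/page", "https://example.com/de/page"):
    status = "crawlable" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", status)
```

Swap in the real robots.txt contents and the URLs you see indexed, and the pattern of Allow hits should become obvious quickly.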
Hi Raymond,
I seriously hope they meant this in a different context than just saying PHP is bad for SEO, because it's absolutely not true. At least 20% of the web runs on WordPress, which is nothing more than PHP, and besides that millions of other sites use PHP as well. So I wouldn't worry about this advice, and if they literally said this, look for another SEO company, as they're probably not worth the risk.
Hi Carl,
Ouch, this is probably not a use case you ever wanted to fix for your clients. However, I would suggest filing a DMCA request based on the copyright of the texts/images used on your client's site. This will, hopefully, at least remove the copied site from Google's index.
Filing such a request can be done here: http://support.google.com/bin/static.py?hl=en&ts=1114905&page=ts.cs
Good luck!
Hi Eric,
The user agent of SEOmoz is: rogerbot.
I would check the service providers first, just to know for sure they're all coming from the same provider. You can check this by visiting the Audience > Technology > Network report on the left side of your Google Analytics. If you see the same network and browsers being used, I would use a filter (only if you're really determined/100% sure that it's bot traffic) to get them completely out of your Google Analytics view.
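If you also have access to your server's access logs, a quick scan for rogerbot's user agent can confirm whether the visits really are bot traffic before you filter anything in GA. A minimal sketch — the log lines below are made-up examples, so read your real access log instead:

```python
# Sketch: counting requests whose user-agent string mentions rogerbot.
# These sample lines are placeholders for real access-log entries.
log_lines = [
    '1.2.3.4 - - [01/Jan/2013] "GET / HTTP/1.1" 200 "rogerbot/1.0"',
    '5.6.7.8 - - [01/Jan/2013] "GET /blog HTTP/1.1" 200 "Mozilla/5.0 Chrome/23.0"',
]

bot_hits = [line for line in log_lines if "rogerbot" in line.lower()]
print(f"{len(bot_hits)} of {len(log_lines)} requests came from rogerbot")
```

In practice you'd replace `log_lines` with the lines of your actual log file and cross-check the IPs against the network you found in the GA report before you commit to a filter.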
Using your keyword twice is definitely not what I would call keyword stuffing. But I would make sure you create quality content for this post, as the EMD (exact match domain) could get you in trouble someday if you don't.
For anyone coming into this thread later: I got in touch with the site owner, and it turned out that the GA tracking code was missing on the thank-you page.
No, it won't help you at all, as it's not a valid extension that they will use. What you can do is link to the HTML sitemap from multiple pages on your site, so you provide an efficient way for Google to access it and use it to crawl the other pages on your site.
Hi Jason,
I wouldn't worry about changing this at all; in the end, the 50K URL limit that has been put on sitemaps is an arbitrary one, so if you keep your sitemaps well under that it doesn't really change anything. The files themselves are not a ranking factor; they're used to make search engines aware of URLs they haven't discovered on the site yet, or to notify them of URLs that have been updated (through the lastmod attribute). So changing it to 15K shouldn't harm you.
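For reference, this is what a single entry with the lastmod attribute looks like in a sitemap file (the URL and date are just placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-page/</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
</urlset>
```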
Martijn.
No, because if you're thinking about it, you're clearly not writing content that is intended for a user. I've worked with dozens of editors at publishers who really never think about keywords or keyword density, and they run some of the best ranking sites in the world.
Hi,
It can be found in Google Webmaster Tools and can be used to fetch a page to see what Google sees when it visits your page.
It's not a complete way of preserving your shares, pins & likes for all of your content, but in this post Mike King explains how to still get the numbers for your old pages: http://searchenginewatch.com/article/2172926/How-to-Maintain-Social-Shares-After-a-Site-Migration
Are you normalizing it somehow? Some of the other charts that are available show increases too, but not on a daily basis anymore. At some point the new reality should become the baseline, right?
Hi Atul,
Data from Google Webmaster Tools can be found in the new version of Google Analytics. You'll have to follow this menu structure to get to the data: Traffic Sources > Search Engine Optimization > Queries. This data is provided by Google Webmaster Tools.
It could take a couple of hours before your data shows up.
Happy analyzing!
Hi Jordan,
I would only stay away from tags related to frames and iframes, as they're not good for SEO. Besides that, you'll be OK with every kind of tag, including the span tag. It's ridiculous that somebody would argue that using these tags is bad for SEO, as they provide a lot of opportunity to style elements within an element.
What you could do is check how the message was sent by the user and over what connection. It could say, for example, that it was sent using the Chrome browser or another application that supports this.
Hi Iris,
The 'issue' for rel canonicals is just a warning that Moz found a rel canonical on your site, so it's not saying it's an actual issue of some sort. But if you want to be sure, post the URL and we'll take a look.
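For context, a rel canonical is nothing more than a link element in the head of a page pointing at the preferred version of that URL (the address below is a placeholder):

```html
<link rel="canonical" href="http://www.example.com/preferred-page/" />
```

Moz flags it simply because it exists, since a misconfigured canonical can point rankings at the wrong page.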
Hi Jorge,
If you could send me your email address by PM, I'll make sure you receive the handbook. Because I've received more questions about this, that would be the best option.