If you're all good I'd mark the question as answered, and mark any answers you found helpful as good answers - this helps all of us here in the community!
Thanks,
Mark
Google can read your site - they may be reading the content and changing the titles they display: instead of showing your title tag, they show one they believe to be more relevant.
In Webmaster Tools, you can fetch the page as Googlebot and see what they can see - if you want me to help you via your Webmaster Tools account, contact me by private message and we can take it from there. But you should be able to do this on your own - once you know what Google can see, you can diagnose whether they can actually crawl your page, or whether they are rewriting the tags for other reasons.
That said, the 301 redirect you put in place looks OK to me.
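For reference, a 301 redirect in Apache's .htaccess typically looks like this (the paths below are examples, not your actual URLs - adjust them to your setup):

```apache
# Permanent (301) redirect from an old URL to its new location.
# Example paths only - replace with your own old and new URLs.
Redirect 301 /old-page.html http://www.example.com/new-page/
```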
Not sure why you're saying Google can't access your site and your pages have been removed - you can see from this search query they are still in the index - https://www.google.com/search?hl=en&source=hp&q=site%3Awww.thegrooves.net
I would definitely improve your site - you have lots of empty gallery pages, each showing a single image - I would create a gallery showing multiple images per page - better for the search engines, better for users.
It looks like your site could use a refresh - it also doesn't look like you have Google Analytics installed - this would be a very powerful resource for knowing who is visiting your site, what they're doing, how visitors are converting, etc. I would definitely consider modernizing and fixing up your site.
Mark
Hey,
Glad to hear it worked out
Mark
Hi Ron,
I'm happy to take a look at the site here on the forum - just provide us with your URL and we'll see if we can figure out why the site isn't showing up in the SERPs.
Mark
Use Google Analytics for this - set up your conversions as goals, and then you can track via various reports the conversion process - what did they do, where did they go, how did they flow through your website from initial landing page to full conversion.
Google Analytics is very powerful - it does multi-touch attribution: multiple visits until your conversion, how they arrived at your site each time, etc.
It really is a wonderful tool - here is a guide to setting up goals - https://support.google.com/analytics/answer/1032415?hl=en
Once the goals are set up and the data is flowing, the fun really begins - you can do lots of actionable analysis to improve your site based on it.
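One common pattern worth knowing: if your conversion doesn't land on its own thank-you URL, you can fire a "virtual pageview" with classic async Google Analytics (ga.js) and match a URL Destination goal to it in the GA admin. This is a sketch - the property ID and goal path below are placeholders, not real values:

```javascript
// Classic async Google Analytics (ga.js) snippet - push a virtual
// pageview onto the command queue when a conversion completes, then
// set up a URL Destination goal matching "/goals/signup-complete".
// UA-XXXXXX-1 and the goal path are hypothetical placeholders.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-1']);
_gaq.push(['_trackPageview', '/goals/signup-complete']);
```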
Good luck,
Mark
Blocking with robots.txt doesn't remove pages/files from the search engines, it only prevents them from crawling the subdomain. If they have already crawled those resources, as they have in your case, the robots.txt will just block them from visiting again, but will not remove them.
What you should do is authenticate the subdomain with webmaster tools, and then remove the subdomain via the url removal tool, as you asked. This is the way to go about it.
To verify the subdomain, there are multiple options, including via your host, or modifying the DNS to prove you own the subdomain - it really depends on your setup and what you need to do.
But to remove these results permanently, with your robots.txt block, you should do it via the URL removal tool.
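For reference, the robots.txt served at the root of the subdomain (a hypothetical sub.example.com below) would block all crawling like this - but remember, this only stops future crawls; the actual removal happens through the URL removal tool:

```
# robots.txt at the root of the subdomain you want removed
# (e.g. http://sub.example.com/robots.txt - example name).
# Blocks crawling only; pair it with the URL removal tool in
# Webmaster Tools to drop already-indexed pages.
User-agent: *
Disallow: /
```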
Good luck,
Mark
According to what I saw here, http://www.guardian.co.uk/help/insideguardian/2013/may/24/theguardian-global-domain, they'll be working with the team at Yoast, who are some of the best in the biz at onsite SEO, so I think the Guardian should be in good hands.
OK - if this is the case, to force Google's hand (so to speak), I would canonical these brand pages over to the target pages you want your visitors to land on - this won't remove them from display on the site like a 301 redirect would, but it will signal to the search engines that those target pages should be shown in the SERPs instead of the brand pages. That's what I would do to solve your problem.
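A canonical tag in the head of a brand page, pointing at the preferred target page, would look roughly like this (example URL, not your actual one):

```html
<!-- In the <head> of the brand page; the href is the page you
     want the search engines to show instead (example URL). -->
<link rel="canonical" href="http://www.example.com/target-page/" />
```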
Mark
Hi Jonathan,
Could there be something technical hurting that page? A meta robots block? A robots.txt block? Perhaps overoptimization of the page?
If you can share the URL, that will be helpful to try and provide some ideas for how to fix the situation.
Mark
I don't believe Yoast's Wordpress SEO plugin will cover Magento files. The Wordpress SEO plugin is built into Wordpress, and pulls the page, post, category, taxonomy, etc. data from Wordpress in order to build the sitemap. The plugin creates a sitemap for each page, post type, etc., and is highly configurable. It then uses a sitemap index file to connect everything together.
But since Wordpress doesn't have the Magento information - they are separate CMS's with separate databases - the Yoast Wordpress SEO plugin won't build a sitemap file for information in the other CMS, as it has no access to it and no knowledge of it.
You can use other tools to create sitemaps for the site, either through manual crawls or software installed on your server, but the Yoast Wordpress plugin won't do it.
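To illustrate how a sitemap index connects everything together, here's a rough sketch (example filenames, not Yoast's exact output) - a separately generated Magento sitemap can simply be listed alongside the Wordpress ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: one child sitemap per content type.
     Filenames below are examples, not Yoast's exact output. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/post-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/page-sitemap.xml</loc>
  </sitemap>
  <!-- a sitemap generated by another tool for the Magento store -->
  <sitemap>
    <loc>http://www.example.com/magento-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```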
Mark
Thanks Paul - I can't take most of the credit - got to give it up to the guys at Screaming Frog - love the spider
You didn't provide the address of your site, but on larger sites with user generated content, or sites that have some type of forum, I've seen this type of anchor text be indicative of a larger problem - either you have problematic pages on the site with content relating to child porn, or your site has been hacked and code/links have been injected. Thus, the spammers are linking to you with that anchor text, and your site is part of a larger chain.
I would take a close look at the landing pages for those links - are they spammy forum pages/user profiles, or are they linking to a page on your site that was hacked?
To check if your site was hacked, I would view Google's cache of the page/site. If it was, you can often see the stuff they put on your page - if it doesn't show up in the regular view of the cache, try clicking the link for the text-only version. While this may be negative SEO, I've seen many more cases where it's the result of a site being hacked, or of user generated content sections that weren't moderated and have run completely wild.
Mark
Basically, SEOMoz is crawling your site and telling you it found these errors. These errors were found by following internal links.
I took a quick look at the code on your site, and in the social icon buttons in the header of your pages at the top right, for the Twitter icon, you are linking to yourself, to piensa_piensa - here is the code
<span class="social-icon"><a target="_blank" href="Piensa_Piensa"><img src="http://piensapiensa.com/wp-content/themes/modernize-v3-11/images/icon/dark/social/twitter.png" alt="twitter"/></a></span>
Your href is piensa_piensa - this is appending that text to the URL of every page - correct this link and the error will be fixed: either use # until you're ready to link to your Twitter account, or link to the proper address of your Twitter account. This will solve your problem.
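The corrected markup would look roughly like this (assuming the Twitter account lives at twitter.com/piensa_piensa - verify that's the right handle):

```html
<!-- Corrected: href points at the full Twitter profile URL
     instead of the bare text "Piensa_Piensa". -->
<span class="social-icon">
  <a target="_blank" href="https://twitter.com/piensa_piensa">
    <img src="http://piensapiensa.com/wp-content/themes/modernize-v3-11/images/icon/dark/social/twitter.png" alt="twitter"/>
  </a>
</span>
```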
Good luck,
Mark
For Webmaster Tools, you authenticate each site separately. These are different country-level top-level domains, and will be separate entries in Webmaster Tools. In addition, if you have different subdomains on the sites - www, my.piensa, news.piensa, or anything else - you set each of those up as a separate entry in Webmaster Tools.
You can geotarget the different sites/subdomains in webmaster tools as well - it really depends on the setup of your sites and the targeting you want to implement.
Mark
Hey SaraSEO,
This might be a good solution for you - I haven't tried it, but it seems like what you need - a sitemap creator that can strip out parameters and follow canonical tags - http://www.inspyder.com/products/SitemapCreator/Default.aspx
Good luck,
Mark
These pages, which are search results pages, are often viewed by the search engines as thin pages - I would recommend adding the noindex tag to your search results pages, so that these pages won't dilute the index with thin, low quality pages that are yummy food for the algorithms.
By adding the noindex tag, you'll be fine in terms of duplicate title tags.
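The tag goes in the head of each search results page (and only those pages):

```html
<!-- In the <head> of search results pages only: keeps them out of
     the index but still lets the bots follow links on them. -->
<meta name="robots" content="noindex, follow">
```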
As an aside, you also have two versions of the meta description tag on the page - I would modify this so that for each page on your site you only provide one meta description.
Mark
I would try creating an open Google doc, and then listing all of the sites in the network. Kind of like the reconsideration request method, where you link to an open Google doc with all of the details of the webmasters you contacted, responses, success rate, etc
Hi Matthew,
Google has a specific form for reporting webspam - you can find the spam report here - https://www.google.com/webmasters/tools/spamreportform?hl=en
Before you submit a competitor, make sure your own site/s are clean - you don't want them looking too closely into your SERPs and your sector and finding a problem with you as well.
Mark
I don't think they're "gaming" Googlebot - I think they're trying to help the bots properly crawl the site and index the relevant content, without creating hundreds of thousands of empty pages that would simply dilute their index and lower the overall value of the site in the search engines' eyes. I think they're trying to keep the Panda hungry and not provide it with lots of yummy food for its low-quality-content-hungry stomach.
This is why they are noindexing the pages - not to game the system, but to actually play by the system's rules.