Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
If my website does not have a robots.txt file, does it hurt my ranking?
Hi, No, your website will work just fine without a robots.txt file. Without one, search engines have free rein to crawl and index anything they find on the site. That's fine for most websites, but it's good practice to at least point out where your XML sitemap is, so search engines can find new content without having to crawl slowly through every page and bump into it days later. One related note: if mywebsite.com/robots.txt doesn't exist, a request for it shouldn't redirect to the homepage; it should return your custom 404 error page. Hope this helps. Thanks
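For reference, a minimal robots.txt that blocks nothing and simply advertises the sitemap location might look like this (the sitemap URL is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```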
Intermediate & Advanced SEO | | Alick3000 -
HTTPS and HTTP both exist! How to handle?
Hey there, the sooner you redirect HTTP to HTTPS, the better. You'll keep accumulating backlinks and rankings over time, so the switch will only get harder the longer you wait. Also, try to convince the six websites that already point to HTTP to change their links to HTTPS. And then, of course, redirect each page individually, as you said. Hope it helps, Martin
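As a sketch, a site-wide, page-by-page 301 from HTTP to HTTPS on Apache might look like the .htaccess rule below (assuming mod_rewrite is enabled; adapt to your own server and hostnames):

```apache
RewriteEngine On
# If the request did not arrive over HTTPS, 301-redirect it to the same path on HTTPS
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```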
On-Page / Site Optimization | | benesmartin0 -
Recent 2017 Disavow Experience - How long is it taking?
Hey there, it usually takes up to one month, though often only about two weeks. Once you disavow the bad links, they will still appear in Search Console; Google just won't count them against you. Have you been successful with disavowing since May? Cheers, Martin
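For anyone unfamiliar with the file format, a disavow file is plain text with one entry per line, uploaded via the Search Console disavow tool; the domains below are placeholders:

```
# Spammy directory links (lines starting with # are comments)
domain:spammy-directory.example
http://bad-site.example/links/page.html
```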
White Hat / Black Hat SEO | | benesmartin0 -
Rich Snippets in wordpress websites
Hi, here are six of the best rich snippet plugins for WordPress: http://www.wpsuperstars.net/rich-snippets-schema-plugins-for-wordpress/ Hope this helps! Thanks
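If you'd rather not rely on a plugin, rich snippet markup can also be added by hand as JSON-LD in the page head. A minimal sketch for a product with review ratings, where every value is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>
```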
On-Page / Site Optimization | | Alick3000 -
Google My Business - Switching from Local to National Presence
Good, Andrew. A rep named Mike told me they'd be on the lookout for your tweet to them. Hope you hear back, and you're very welcome.
Reviews and Ratings | | MiriamEllis1 -
Strange Ranking Results
OK, great, thanks, will do. PS: don't suppose there's any chance of getting this one answered, is there? https://moz.com/community/q/problem-toggling-between-ga-profiles-since-new-update
Local Listings | | Dan-Lawrence0 -
Robots.txt wildcards - the devs had a disagreement - which is correct?
Thanks Logan - much appreciated, as ever - that really helps. If I was to add another * to Allow: /?resultspage= (so Allow: /?*resultspage=), what would happen then?
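To see what a given wildcard would change, one option is a small sketch that translates Google-style robots.txt patterns (* matches any run of characters, a trailing $ anchors the end, otherwise matching is prefix-based) into regular expressions; the patterns and URLs below are illustrative assumptions, not the asker's actual rules:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Check a URL path against a Google-style robots.txt path pattern.

    '*' matches any sequence of characters; a trailing '$' anchors the
    match to the end of the URL; otherwise the match is prefix-based.
    """
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# Allow: /?resultspage= only matches when the query string starts the path
print(robots_pattern_matches("/?resultspage=", "/?resultspage=2"))            # True
print(robots_pattern_matches("/?resultspage=", "/search?resultspage=2"))      # False
# Allow: /?*resultspage= lets other parameters sit between the ? and resultspage=
print(robots_pattern_matches("/?*resultspage=", "/?sort=asc&resultspage=2"))  # True
```

This is only a model of the documented matching rules; actual crawler behavior is authoritative, so it's worth double-checking important rules with Search Console's robots.txt tester.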
Intermediate & Advanced SEO | | McTaggart0 -
What’s the best way to handle multiple website languages in terms of metatags that should be used and pages sent on our sitemap?
Allan, the Google resource is the base. However, always remember that: Google suggests always using rel="canonical", even if it's only self-referential. The href in the hreflang must always point to a canonical URL; if not, Google will consider it a mistake and ignore the hreflang. If you're targeting two different languages (e.g., English and Spanish), the use of hreflang is not strictly necessary; however, using it is another signal you give Google about how you want your URLs to target a specific audience based on language or language/country.
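As an illustrative sketch, the head of the English version of a page might carry a self-referential canonical plus reciprocal hreflang annotations like these (all URLs are placeholders, and the Spanish page would mirror them with its own canonical):

```html
<!-- On https://example.com/en/page/ -->
<link rel="canonical" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/en/page/" />
```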
Intermediate & Advanced SEO | | gfiorelli10 -
Http > https Switch Before Platform Migration?
"The concern is that, due to the http>https 301 redirects that will be in place, are we putting ourselves at unnecessary risk by effectively carrying out 2 migrations in the space of a year (in terms of loss of potential authority caused by redirects)?" In February 2016, Google's John Mueller announced that SEO equity or PageRank would no longer be lost when a 301 or 302 redirect is used in conjunction with an HTTP to HTTPS migration. While some of us doubted this statement, Gary Illyes tweeted the same thing in July 2016, and Barry Schwartz at Search Engine Land confirmed it. There is no loss of authority caused by redirects when you implement HTTPS. "Would we be better to wait, and implement https at point of platform migration instead?" I think the approach you're taking (convert to https first) is a good one. It affords you better control and is a good use of available resources.
Intermediate & Advanced SEO | | DonnaDuncan0 -
HTML and XML sitemaps for one website.
Hey there, there's nothing wrong with having more than one sitemap. In some cases it's even better, because it's easier for Google to crawl them. You can divide sitemaps by category, month, type of content, etc. However, as mentioned in the following thread, HTML sitemaps are primarily meant to help people navigate the website, whereas XML sitemaps serve to help the crawler. Therefore, I'd give priority to the XML sitemap in this case. For more information, read this thread: https://moz.com/community/q/sitemaps-html-and-or-xml Cheers, Martin
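Splitting XML sitemaps by category or month is typically done with a sitemap index file that points at the child sitemaps; a minimal example with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml</loc>
    <lastmod>2017-09-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```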
Web Design | | benesmartin0 -
Best practice for deindexing large quantities of pages
Unfortunately, I don't think there's any easy/fast way to do this. I just ran a test to see how long it takes Google to actually obey a noindex tag, and it took a little over 2 months for them all to be removed. I had 2 WP blogs on which I added the noindex tag to all category, tag, and author pages, and I monitored the index count 4 or 5 times per week by running site:example.com inurl:/category/ queries. There was a lot of fluctuation at the beginning, but it eventually took hold after about 2 months. On one of the sites, I also added an XML sitemap containing only the noindexed URLs and submitted it via Search Console, but that didn't seem to have an impact on how quickly they were dropped out. See the screenshot below of my plotting of indexed pages per subfolder.
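For completeness, the noindex tag described above renders as a meta robots tag in the head of each page to be dropped; the "noindex, follow" variant is commonly used so that links on the page are still crawled:

```html
<meta name="robots" content="noindex, follow" />
```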
Intermediate & Advanced SEO | | LoganRay0 -
Why doesn't Moz crawl whole pages of our website to report All On-Page issues?
Thanks Martijn for the reply. I didn't notice that my question was incomplete. My query: Moz crawled only 258 of our website's pages, while the website has more than 15,000 pages. Why did the Moz crawler fail to crawl the whole website and report on-page issues for every page? Or does our website have an issue that prevents Moz from crawling those pages? Rann
Moz Tools | | BigSlate0