Questions
-
How to get a large number of URLs out of Google's index when there are no pages to add a noindex tag to?
If no actual pages exist for the URLs you want to remove from Google's index, I'd probably use the HTTP header instead, specifically the X-Robots-Tag directive: https://yoast.com/x-robots-tag-play/ and https://www.searchenginejournal.com/x-robots-tag-simple-alternate-robots-txt-meta-tag/67138/

Even if you have no page with markup to edit, you can always send an HTTP header. A header simply lets a server (or client) attach additional information to requests and responses. This is the only thing I can think of which would really help. Some people might suggest robots.txt wildcards, but robots.txt handles crawling, not indexation, so those answers wouldn't really be worth anything to you.

The other thing you could do (maybe combined with the X-Robots-Tag header) is to get all of those URLs to serve status code 410 (gone permanently) instead of 404 (not found, which Google may treat as temporary).
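As a rough illustration of the two pieces of that advice combined, here is a minimal framework-agnostic sketch (the URL patterns are invented): retired URLs get a 410 status plus an X-Robots-Tag header, so crawlers both stop requesting them and drop them from the index.

```python
# Hypothetical URL prefixes for the retired content.
RETIRED_PREFIXES = ("/old-feed/", "/session/")

def response_for(path):
    """Return (status line, extra headers) for a request path."""
    if path.startswith(RETIRED_PREFIXES):
        # 410 signals the resource is gone for good; the X-Robots-Tag
        # header tells crawlers to de-index the URL even though no
        # HTML page exists to carry a meta robots tag.
        return "410 Gone", [("X-Robots-Tag", "noindex")]
    return "200 OK", []

print(response_for("/old-feed/12345"))   # ('410 Gone', [('X-Robots-Tag', 'noindex')])
print(response_for("/products/widget"))  # ('200 OK', [])
```

In a real deployment you would set this header in the web server config (Apache, nginx) or application middleware rather than per-route code.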
Intermediate & Advanced SEO | | effectdigital0 -
Click To Reveal vs Rollover Navigation Better For Organic?
Interesting UX question. Short answer: a click menu is best, but it's not black and white; naturally it's more subtle than that.

You mention regular content. Regular content being hidden by any mechanism is naturally not too user friendly. Accordions can often be overlooked, and text hidden in the hover state of images is a client favourite that is also terrible UX practice. The mechanism doesn't matter too much; it's the fact that content is hidden behind an un-signposted mechanism. The author knows it's there, but your visitor will not.

A menu isn't content, though; it's a different beast. A menu needs to exhibit good information hierarchy. We try to keep our main menu to 7 items or fewer, essentially for clarity of the first tier of offerings. This can often necessitate sub-menus. Sub-menus are hidden content too, so we're just arguing the toss about mechanism. First off, we'd suggest a nice little signpost, like a downward arrow, to show which main items have sub-menus.

Also note that touch devices have no hover states, so unless you're planning on a second type of menu for those, your choice is made for you: it will certainly need to be selection-based rather than hover-based. Select-to-get-something is more in keeping with how everything else on the web works: text links, buttons, etc. Hover feels more immediate, but if your site demographic is broad, bear in mind that the dexterity required will elude a percentage of your audience. Consider the accessibility implications of this and your site's audience.

For example, hover menus can be a real pain when the sub-menu content is wider than the trigger area. This will have happened to all of you: hover over the main menu item, see the sub-menu item you want, move the mouse to select it... oh dear, the sub-menu has disappeared on you. You left the hover area before reaching the sub-menu, and the hover state was lost. As well as accidental deactivation, it's quite possible to get annoying accidental activation with hover too.
As well as the audience, consider the sub-menu itself. If you have a couple of small items, consider hover; a massive mega-menu will nearly always be better toggled by selection. On that note, if you're using mega-menus, consider Nielsen Norman Group's excellent guide here: https://www.nngroup.com/articles/mega-menus-work-well/

PS: I'd encourage everyone to start thinking about "selection" rather than "clicks". I still slip up myself, but "click" is an outmoded, desktop-centric term that is dangerous to bandy about when making responsive websites. Much as your anchor text should never be "Click here", we should always be thinking about selection. Selection speaks to intent and action rather than physical methodology, because that methodology can be clicking, yes, but also tapping, voice command, keyboard input, etc.
Intermediate & Advanced SEO | | AndyMozster0 -
Optimizing A Homepage URL That Is Only Accessible To Logged In Users
Hey Mike, have you considered using a canonical pointing to the logged-out homepage?

www.domain.com - canonical is www.domain.com
www.domain.com/loggedin - canonical is www.domain.com

As long as the page can be loaded and bots can see the canonical in the head, that link equity should go to the homepage.
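A tiny sketch of that mapping (the domain and the /loggedin path are just the examples from the answer): both homepage variants resolve to the same canonical URL, so any equity flowing to the logged-in variant consolidates on the public homepage.

```python
# Hypothetical domain from the example above.
CANONICAL_HOME = "https://www.domain.com/"

def canonical_for(path):
    """Return the canonical URL to emit in <link rel="canonical"> for a path."""
    if path in ("/", "/loggedin"):
        # Both homepage variants canonicalise to the public homepage.
        return CANONICAL_HOME
    return "https://www.domain.com" + path

print(canonical_for("/loggedin"))  # https://www.domain.com/
```

The key requirement, as noted above, is that the logged-in page still returns its HTML (with the canonical in the head) to bots rather than bouncing them through a redirect they cannot follow.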
Intermediate & Advanced SEO | | katemorris0 -
Sanity Check: NoIndexing a Boatload of URLs
Hi Michael

The problem you have is the very low-value content that exists on all of those pages and the complete impossibility of writing unique titles, descriptions and content for them. There are just too many of them.

With a footwear client of mine I noindexed a huge slug of tag pages, taking the page count down by about 25%, and we saw an immediate 22% increase in organic traffic in the first month (March 18th 2017 - April 17th 2017). The duplicates were all size- and colour-related. Since canonicalising (I'm English lol) more content and taking the site from 25,000 pages to around 15,000, the site is now 76% ahead of last year for organics. This is real, measurable change.

Now the arguments:

Canonicalisation
How are you going to canonicalise 10,000+ pages? Unless you have some kind of magic bullet you are not going to be able to, but let's look at the logic. Say we have a page of Widgets (a brand) and they come in 7 sizes. When the range is fully in stock, all of the brand/size pages will be identical to the brand page apart from the title and description, so it would make sense to canonicalise back to the brand. Even when sizes start to run out, all of the sizes will still be on the brand page; size is a subset of the brand page. Similar, but not quite the same, for colour: if colour is a tag, then every colour-sorted page will be on the brand page, so really they are the same page, just a slimmer selection. Now, I accept that the brand page will contain all colours, as it did all sizes, but the similarity is so great (95% of the content being the same apart from the colour) that it makes sense to call them the same. So for me canonicalisation would be the way to go, but it's just not possible because there are too many of them.

Noindex
The upside of noindex is that it is generally easier to put the noindex tag on the page, as there is no target URL to specify in the tag.
The downside is that the page is then not indexed in Google, so you lose a little juice. I would argue, by the way, that the chances of being found in Google via a size page are extremely slim: less than 2% of visits came from size pages before we junked them, and most of those were from a newsletter, so the reality is <1% and not worth bothering about. You could leave off the nofollow so that Google still crawls through all of the links on the pages - the better option.

Considering your problem, and having experience of a number of sites with the same issue, noindex is your solution.

I hope that helps

Kind Regards

Nigel - Carousel Projects.
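A rough sketch of the decision Nigel describes, with invented URL patterns: facet (size/colour) URLs get a robots noindex directive, deliberately without nofollow so links on those pages are still crawled, while the brand page itself stays indexable.

```python
# Hypothetical facet parameter names.
FACET_PARAMS = {"size", "colour"}

def robots_meta(url):
    """Return the robots meta tag a URL should carry, or None for indexable pages."""
    query = url.partition("?")[2]
    params = {p.split("=")[0] for p in query.split("&") if p}
    if params & FACET_PARAMS:
        # noindex but NOT nofollow: let Google keep crawling
        # through the links on the page, as suggested above.
        return '<meta name="robots" content="noindex">'
    return None

print(robots_meta("/brand/widgets?size=9"))  # noindex tag
print(robots_meta("/brand/widgets"))         # None (indexable brand page)
```

Because this keys off the URL template rather than individual pages, it scales to the 10,000+ URLs that make per-page canonicalisation impractical.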
Intermediate & Advanced SEO | | Nigel_Carr0 -
What to do about endless size pages
Hi Nigel,

Thanks for the message. So, just to recap, I've got:

150 pages that got any kind of organic landing last month
115 of those pages got just 1 visit each, out of 13,000 total visits
525 pages in the sitemap
4,300 pages in Google's index

So if you were me, you would noindex/de-index all but the roughly 50 that either have traffic or that I'm working on, and then noindex/deindex/remove from the sitemap the other ~4,250 pages? How would I even get a list of everything in Google's index? Just wondering about the actual process for fixing this.

Thanks! Best... Mike
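One way to build that candidate list is a simple set difference, sketched below with made-up URLs: take the sitemap (or a full crawl) as the known page inventory, subtract the pages that earn organic traffic or are being worked on, and what remains is the noindex/removal list. For what Google actually has indexed, you would still cross-check Search Console's index coverage report, since no crawl of your own can see that directly.

```python
# Toy stand-ins for the real lists (525 sitemap URLs, ~50 keepers in reality).
sitemap_urls = {"/a", "/b", "/c", "/d", "/e"}
keep_urls = {"/a", "/e"}  # pages with traffic or in active work

noindex_candidates = sitemap_urls - keep_urls
print(sorted(noindex_candidates))  # ['/b', '/c', '/d']
```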
Intermediate & Advanced SEO | | 945010 -
Local Search Verified Location Ideas
You are very welcome, Michael. So glad to have you here, asking good questions!
Local Listings | | MiriamEllis0 -
URL has caps, but canonical does not. Now what?
I've had some run-ins with case-sensitive URLs in the past and it drives me crazy; I don't understand why CMSs still do that!! While canonical tags are a perfectly fine way to handle this, there's a better solution: Brian Love wrote a great blog post on how to do server-side URL lower-casing. I've used this approach on a few sites and it works great.
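The server-side idea boils down to something like this framework-agnostic sketch (the request shape is hypothetical): if the incoming path contains any uppercase letters, issue a 301 to the lowercase version; otherwise serve normally. That way every mixed-case variant collapses onto one canonical, lowercase URL before crawlers ever see a duplicate.

```python
def lowercase_redirect(path):
    """Return (status, location): a 301 to the lowered path when it has uppercase chars."""
    lowered = path.lower()
    if lowered != path:
        # Permanent redirect so crawlers consolidate signals on one URL.
        return 301, lowered
    return 200, path

print(lowercase_redirect("/Blog/My-Post"))  # (301, '/blog/my-post')
print(lowercase_redirect("/blog/my-post")) # (200, '/blog/my-post')
```

One caveat on the design: real implementations usually lower-case only the path and leave the query string untouched, since query parameter values (IDs, tokens) can legitimately be case-sensitive.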
Technical SEO Issues | | LoganRay0 -
Any downside to a whole bunch of 301s?
Hi Michael! I recommend checking out this blog post for more insight: http://searchengineland.com/how-many-301s-are-too-many-16960

The video in the post linked above answers:

- Is there a limit to how many 301 (permanent) redirects I can do on a site?
- How many redirects can I chain together?

Other things to watch out for with chained redirects:

- Avoid infinite loops.
- Browsers also have redirect limits, and these limits vary by browser, so multiple redirects may affect regular users in addition to Googlebot.
- Minimizing redirects can improve page speed.

Hope this helps!
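As a toy illustration of the chain and loop concerns (the URLs and the hop limit of 5 are invented; crawlers and browsers each have their own limits): walk a redirect map, stop after a few hops, and flag loops.

```python
# Hypothetical redirect map: each old URL 301s to the next.
REDIRECTS = {"/old": "/older", "/older": "/oldest", "/oldest": "/final"}

def resolve(url, max_hops=5):
    """Follow redirects in REDIRECTS; return (final_url, hops) or raise on loop/limit."""
    seen = set()
    hops = 0
    while url in REDIRECTS:
        if url in seen or hops >= max_hops:
            raise RuntimeError("redirect loop or too many hops at %s" % url)
        seen.add(url)
        url = REDIRECTS[url]
        hops += 1
    return url, hops

print(resolve("/old"))  # ('/final', 3)
```

The practical fix, of course, is to flatten the map so every old URL redirects straight to its final destination in a single hop.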
Intermediate & Advanced SEO | | BlueCorona1 -
Client Wants To Use A .io Domain Name - How Bad For Organic?
How much does he plan to spend on marketing to reinforce his brand? I believe .io will rank on Google just as easily as .net, .info, .us, etc., but how much will it take to rank in his customers' brains? I have a .us domain and 7.5 years later I still have to reinforce it when I am talking to a client. It's like trying to get a client to say chartreuse instead of lime green.
White Hat / Black Hat SEO | | julie-getonthemap2 -
M.ExampleSite vs mobile.ExampleSite vs ExampleSite.com
Hi Donna, Thanks for all the help. I really appreciate it. Best... Mike
Intermediate & Advanced SEO | | 945010 -
301ing Pages & Moving Content To Many Other Domains
This is a great metaphor: "Finally, will moving all of this content damage Site A? Yes. This is cutting out body parts similar to arms and organs. When this content leaves the traffic flow into Site A will drop. The number of linking domains and pages will drop. The offer of this content to entertain existing visitors will be gone. The size of that loss will determine the impact. Rankings of remaining content might fall if the loss is great. If arms and legs or heart or brain are extracted then expect Site A to suffer. But if lesser things are lost then the damage will be lower but some damage will happen. Search engines and visitors will all notice. Enthusiastic visitors will find the content in its new home and they might move with it." Will definitely be using this for future explanations!
Intermediate & Advanced SEO | | Joe.Robison1 -
ExampleSite.com vs ExampleSite.com.br
Hi Egol,

Thanks for the message, and I get your point. Neither is actually forwarded to the other: the .com has a for-sale sign on it and the .com.br is a live site. Of course, I'm not sure what went on in the past, beyond the fact that OSE says the .com has no link profile.
Intermediate & Advanced SEO | | 945010 -
Splitting One Site Into Two Sites Best Practices Needed
Hi David,

Wow, that is better than I would have imagined. If I may ask, what was the PA/DA of these pages and the approximate average Moz difficulty of the keywords? This site is at about 45 DA / 20 PA and usually does okay with keywords under 40 difficulty.

Best... Mike
Intermediate & Advanced SEO | | 945010 -
Is this organic search sketchiness worth unwinding?
I agree with the two methods that both you and Gaston have pointed out. The downside to reversing those links is that the domain authority could drop a bit, which could impact their rankings in the SERPs. If this happens, the client might think you are doing something wrong and causing their rankings to drop when, in reality, you were trying to help get rid of the sketchy links.

In my opinion, I'd keep them; they'll make your work perform better. Disavowing them could yield worse results than what their former SEO delivered, and if that happens, you're playing defense and taking the blame.

Hope this helps!
Intermediate & Advanced SEO | | BlueCorona0 -
New Adwords Account To Replace Old Low Account Quality Score
Yes, I agree with that. You should start a fresh campaign and pause the old one.
Conversion Rate Optimization | | Alick3000 -
About how much time should I budget for an adwords audit?
I did it myself and went with kind of a top-items approach. In the end, it took about 6 hours. Thanks... Mike
Conversion Rate Optimization | | 945010 -
The Consequences & Best Practices In Changing Domains
I believe it took them a few years. They still had not recovered all their organic traffic when I worked with them, but we did see a sharp increase in conversion rate; revenue actually improved. I think Moz has a case study out from when they changed from SEOmoz to just Moz.

I would just make sure the client understands that recovering organic traffic after the rebrand will be a gradual process. You could look at supplementing organic with a PPC campaign, or venture into social media advertising and blogging until you see improvements.

Hope that helps some.
Intermediate & Advanced SEO | | JordanLowry0 -
Can an "Event" in Structured Data For Google Be A Webinar?
It's possible! In mid-March 2020, Google announced new properties for Event structured data:

- Virtual
- Postponed
- Scheduled
- Rescheduled
- Moved Online
- Canceled

You can find a complete guide with code examples in my blog post.
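A hedged sketch of what the markup for an online event (such as a webinar) might look like, built as a Python dict and serialized to JSON-LD; the eventAttendanceMode and eventStatus values are schema.org vocabulary, while the event name, date and URL are invented for illustration.

```python
import json

# Hypothetical webinar marked up as an online Event.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Example SEO Webinar",
    "startDate": "2020-04-01T14:00:00+00:00",
    # Marks the event as virtual rather than in-person.
    "eventAttendanceMode": "https://schema.org/OnlineEventAttendanceMode",
    # Would become EventPostponed, EventRescheduled, etc. as plans change.
    "eventStatus": "https://schema.org/EventScheduled",
    "location": {
        "@type": "VirtualLocation",
        "url": "https://example.com/webinar",
    },
}

print(json.dumps(event, indent=2))
```

In a page this JSON would sit inside a `<script type="application/ld+json">` block; tools like Google's Rich Results Test can confirm the markup is eligible.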
Intermediate & Advanced SEO | | farukga130