Questions
Impact of Non-SEO Subdomains
It is _possible_ that a subdomain-based landing page could be cannibalizing rankings for specific terms from another page on your site. But if that landing page really is that good for the term, it's not necessarily a bad thing to have it ranking. If it's ranking better than the pages you optimized for organic search, then maybe you should look at why that is (i.e. is it getting good or better links, are people sharing it around, is it better targeted than the organic page, is it more intuitive, does it have a better call to action, etc.). Now, if you really don't want those pages to rank in organic search in place of the optimized pages you created, you can very easily add a noindex tag to the page or exclude it from being crawled in robots.txt.
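The two opt-out mechanisms mentioned above look like this in practice; the path and page are placeholders, not from the question:

```html
<!-- In the <head> of the landing page itself: the page stays live
     for visitors, but search engines are asked not to index it. -->
<meta name="robots" content="noindex" />
```

Or, to block crawling entirely via robots.txt at the site root:

```text
User-agent: *
Disallow: /landing-pages/
```

One caveat: a page blocked in robots.txt can't be crawled, so a noindex tag on that same page will never be seen. Pick one mechanism per page depending on whether you want the URL crawlable.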
Technical SEO Issues | MikeRoberts
Multiple Common Page Links
I've set up a redesign of our page that reduces the number of links per item from 4 to 1. We are also going to reduce the number of items from the current 30. The challenge is that in some cases there really are more than 30 relevant items (i.e., items that genuinely belong in the list, not just related ones). We've got some thinking to do about whether to use pagination or incremental loading.
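The pagination route described above amounts to slicing the item list into fixed-size pages. A minimal sketch — the page size of 30 comes from the answer; the function name and everything else is illustrative:

```python
def paginate(items, page, per_page=30):
    """Return the slice of items for a 1-indexed page, plus a flag
    saying whether a further page exists (useful for rendering a
    'next' link, or for deciding when incremental loading stops)."""
    start = (page - 1) * per_page
    chunk = items[start:start + per_page]
    has_next = start + per_page < len(items)
    return chunk, has_next

items = list(range(75))           # e.g. a list with 75 relevant items
page2, more = paginate(items, 2)  # the second page of 30, more remain
```

The same helper serves both designs: classic pagination renders each `chunk` on its own URL, while incremental load fetches successive chunks until `has_next` is false.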
Technical SEO Issues | APFM
Local SEO For Agents
Hey There! Good question, and brace for a long answer here. So, where we start with this is Google's own guidelines for multi-practitioners, which we excavate for clues to see how they feel about your scenario. These guidelines state:

Individual practitioners (e.g. doctors, lawyers, real estate agents)

An individual practitioner is a public-facing professional, typically with his or her own customer base. Doctors, dentists, lawyers, financial planners, and insurance or real estate agents are all individual practitioners. Pages for practitioners may include title or degree certification (e.g. Dr., MD, JD, Esq., CFA). An individual practitioner should create his or her own dedicated page if:

- He or she operates in a public-facing role. Support staff should not create their own pages.
- He or she is directly contactable at the verified location during stated hours.

A practitioner should not have multiple pages to cover all of his or her specializations.

Multiple practitioners at one location

If the practitioner is one of several public-facing practitioners at a location:

- The organization should create a page for the location, separate from that of the practitioner.
- The page for the practitioner should be titled with the name of the practitioner only, excluding that of the organization.

This is the sum total of what Google tells us, and while some of these guidelines apply to your scenario (like your agents operating in a public-facing role), there's a catch here. Though you've described having multiple offices across the country, you have further described that the agents you'd like to market don't work at these offices. They work from home and would need to keep their addresses hidden because of this. This is a common scenario and one of the more significant grey areas of the guidelines.
The problem here is, should Google discover that you've created 30 listings in Iowa for your agents, and then look at the street-level imagery of the addresses you've listed in your GMB dashboard, they will see that these are not offices - they are houses. And at that point, it's up to Google to decide whether you have a legitimate business model or whether you are trying to game the system by appearing to have offices in these 30 locations when you really don't.

The trouble is that many companies have spammed Google in exactly this way: the yard-cleaning company whose owner literally does work from home in San Francisco, but who has also set up listings in San Jose (his cousin's house), Oakland (his mom's house), and Walnut Creek (his sister's house). Google catches onto this and hammers down not only on the 3 spammy listings but may hammer down on the legitimate San Francisco listing as well. So, while this is not what you are trying to do, and your agents genuinely do work at home while representing your company, the grey area is whether Google will see it that way. There is no guarantee that they will, so the safe path here is to list only your physical offices in the cities where you have them, and list any agents who work in those offices and are "directly contactable at the verified location during stated hours". You could try it the other way, listing every possible agent, but you'd be doing so at your own risk.

Hope these are helpful thoughts! It's good you're considering all the options here. Very smart.

P.S. So sorry about the formatting on this. It is wacky.
Local Listings | MiriamEllis
Virtual Hub Page Impact
TL;DR - Yes and no. There are two cases, and I will be as descriptive as possible.

The 1st case is when subfolder URLs are linked from other pages, like /subfolder2/page2 -> /subfolder1/ or /subfolder1/page1 -> /subfolder2/. In this case you link directly to a 404 page, and this can frustrate users and bots alike. There is also some confirmation that an unhelpful 404 page is a low-quality signal: http://themoralconcept.net/pandalist.html (#10 in the low-quality signals). That's why you need to use crawlers and fix 404s - good for users and good for bots too.

The 2nd case is when users are curious. I'm one of them. Sometimes. So let's say we have these URLs:

- http://www.moz-team.com/randfishkin/article1
- http://www.moz-team.com/randfishkin/article2
- http://www.moz-team.com/randfishkin/article3
- http://www.moz-team.com/cyrussheppard/article1
- http://www.moz-team.com/cyrussheppard/article2
- http://www.moz-team.com/cyrussheppard/article3

As you can see, the URLs are very clean and very descriptive. Now add a curious user (pick me!) who wants to see more about Rand or Cyrus - say, a page with CVs, short bios, or a list of all their articles. They just edit the URL down to http://www.moz-team.com/randfishkin/ or http://www.moz-team.com/cyrussheppard/ using backspace. In a perfect world this would give them that information... but in your case it gives a 404. And that is not good for users. That's why it's much better to create a "category" page for each subfolder, even if it isn't linked from other pages. This has been explained many times as "silo structure":

- https://moz.com/blog/site-architecture-for-seo
- http://www.bruceclay.com/eu/seo/silo.htm
- http://www.stateofdigital.com/optimising-urls-seo-ux/

I hope this answer helps. You MUST optimize your site for users and bots alike.
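The backspace behavior described above can be sketched as a quick audit: derive each article's parent "hub" URL, which is exactly the address a curious user reaches by trimming the path. The helper name is made up for illustration; the example URLs come from the answer:

```python
from urllib.parse import urlsplit, urlunsplit

def parent_hub_url(article_url: str) -> str:
    """Drop the last path segment to get the 'hub' URL a curious
    user reaches by deleting the end of the address bar."""
    parts = urlsplit(article_url)
    # Remove the final segment; keep a trailing slash for the folder.
    folder = parts.path.rstrip("/").rsplit("/", 1)[0] + "/"
    return urlunsplit((parts.scheme, parts.netloc, folder, "", ""))

articles = [
    "http://www.moz-team.com/randfishkin/article1",
    "http://www.moz-team.com/cyrussheppard/article2",
]
hubs = sorted({parent_hub_url(u) for u in articles})
# Every hub derived this way should exist as a real category page;
# otherwise curious users land on a 404.
```

Feeding a full URL list from a crawl through this and requesting each resulting hub would surface exactly the 404s the answer warns about.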
On-Page / Site Optimization | Mobilio
Site Scraping and Canonical Tags
Hi, Content scraping is a very common thing and fortunately, with the rel=canonical tag still pointing to your domain, there is nothing to worry about from a content-duplication point of view; with this tag intact, their copy will not make it into Google's index. You don't need to worry about the other website outranking you, and you don't have to use disavow here, as that is for spammy backlinks, not for rel=canonical issues. However, you can approach those guys and ask them to take the content down immediately, and you can file a DMCA complaint against them if they don't agree to remove it. You can also report the matter to Google here: https://www.google.com/webmasters/tools/spamreport?hl=en Hope it helps. Best regards, Devanur Rafi
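A self-referencing canonical tag of the kind described above sits in the page's head; the domain and path here are placeholders, not from the question:

```html
<!-- In the <head> of https://www.example.com/blog/my-post/ -->
<!-- If a scraper copies the page markup verbatim, this tag still
     points search engines at the original URL. -->
<link rel="canonical" href="https://www.example.com/blog/my-post/" />
```

This is why verbatim scrapes are largely self-defeating: the copied markup carries the original's canonical with it.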
White Hat / Black Hat SEO | Devanur-Rafi