Hi Leowa,
As long as it comes before, you're fine.
Thanks for the clarification on the platform, Suarezventures.
I have worked with plenty of brands that have a similar setup on Shopify. They usually put the blog on a subdomain because Shopify's content management system - let's see, how do I say this nicely... sucks. These clients put WordPress on a subdomain. Some also add a landing page platform like HubSpot or Unbounce, to which they send paid traffic.
Your plan to put the eCommerce site on a subdomain has some benefits: the content side won't be affected by future platform migrations of the eCommerce site. However, the content side benefits most from being at the root domain with the homepage and most of the backlinks, so organic search traffic to the eCommerce site could be harmed by this move. I normally wouldn't recommend it for that reason (because the business is eCommerce, which is what pays for the content), but in your case it sounds like the eCommerce site doesn't bring in much traffic as it is.
Good luck. Let us know how it turns out. 
Have you reviewed the reasons Google might exclude a page from the search results when accessing these reports in Google Search Console? You can find them here: https://support.google.com/webmasters/answer/7440203?hl=en#information-status
"Crawled - currently not indexed: The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling."
"Discovered - currently not indexed: The page was found by Google, but not crawled yet. Typically, Google tried to crawl the URL but the site was overloaded; therefore Google had to reschedule the crawl. This is why the last crawl date is empty on the report."
To Do:
1. Check your server logs for errors (especially 500 response codes) or downtime reports for intermittent outages. Maybe your host is "throttling" traffic? (See the log-scan sketch after this list.)
2. Add more value to those pages. Google may have no trouble finding and rendering a page and still decide it isn't up to their standards for inclusion in the index. User-generated content is often very thin, and real estate listings are usually made up of the same data everyone else has access to. What makes yours better or different? Is that communicated to Google somehow?
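To make the first item concrete, here's a minimal sketch of the kind of log scan I mean, assuming a standard combined-format access log; the file path is a placeholder:

```python
# Minimal sketch: count 5xx responses per day in a combined-format
# access log. The log path is a placeholder -- adjust for your server.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder

# e.g. ... [10/Oct/2023:13:55:36 +0000] "GET /page HTTP/1.1" 500 1234
line_re = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]*\] "[^"]*" (\d{3}) ')

errors_by_day = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    for line in f:
        m = line_re.search(line)
        if m and m.group(2).startswith("5"):
            errors_by_day[m.group(1)] += 1

for day, count in sorted(errors_by_day.items()):
    print(day, count)
```

A spike of 5xx responses on days when Googlebot was crawling heavily would support the "host is throttling" theory.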
Hi Suarezventures,
I typically draw the subdomain vs. root domain line at whether the two sites' experiences and purposes are vastly different. For example, a site like Blogspot that hosts different websites on subdomains, or a brand that runs a forum community on a subdomain because it lives on a different server and has a much different purpose than the main domain.
Ideally, if you're moving to WordPress you'd have the content and the store on the same site (e.g. https://site.com). If this isn't possible for them, having one or the other on a subdomain would be better than having them on (Squarespace?).
What about putting the new site on a subdomain so you don't have to deal with migrating the existing site? Can't you leave it where it is and put up store.site.com on WP?
Hi Christian,
I don't see any evidence of the site being deindexed now. Here are some things I checked for you, along with a few observations:
Nothing in the robots.txt file, robots meta tag, or X-Robots-Tag HTTP response header would keep these pages from being indexed by Google (see the quick check sketch after this list)
The rel=canonical tags appear to be functioning properly
The home page is indexed and not duplicated by other indexed pages
Google has about 86 pages indexed from your domain
Hreflang tags appear to be implemented properly
There are only about 50 links going into the domain from other sites, and of the few that aren't just random scraper links (harmless, but annoying), the ones from Moz are the best.
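If anyone wants to repeat the first few checks themselves, here's a rough sketch using Python's requests library and simple regexes; the URL is a placeholder and the patterns assume typical attribute order:

```python
# Rough sketch of the directive checks above for a single URL; the URL
# is a placeholder and the regexes assume typical attribute order.
import re
import requests

url = "https://example.com/"  # placeholder
resp = requests.get(url, timeout=10)

# 1. X-Robots-Tag HTTP response header
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "(not set)"))

# 2. robots meta tag in the raw HTML
meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']+)',
                 resp.text, re.I)
print("robots meta:", meta.group(1) if meta else "(not set)")

# 3. rel=canonical link element
canon = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)',
                  resp.text, re.I)
print("canonical:", canon.group(1) if canon else "(not set)")
```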
Sometimes Google ranks a brand higher when it first comes out because it's a chicken-or-egg situation: how else can they collect data for their machines to chew on unless some traffic is sent to a new site? We used to call this phenomenon "the Google sandbox" a long time ago, and in its effect this is essentially the same thing. We do it ourselves with A/B testing and paid advertising: you have to spend some budget to gain enough data to know what's working and what isn't.
I don't think you have a technical SEO problem here. I think you need to continue building a brand and producing useful, rich content. Good luck!
Hello Ross,
The spam comments below have been reported.
To your questions:
I don't know of any way to "restore" data that you never exported or saved, but there are several documented processes for automating the export each month. For example, I recommend the Search Analytics for Sheets add-on: https://gsuite.google.com/marketplace/app/search_analytics_for_sheets/1035646374811
This should also take care of your other question about getting more rows of data than what's provided in the GSC interface.
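If you'd rather script the export than use the Sheets add-on, the Search Console API exposes the same data. A minimal sketch with google-api-python-client, assuming you've set up API credentials; the key file, property URL, and dates are placeholders:

```python
# Minimal sketch: pull Search Analytics rows via the Search Console API.
# The key file, property URL, and date range are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder key file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2023-01-01",  # placeholder date range
    "endDate": "2023-01-31",
    "dimensions": ["query", "page"],
    "rowLimit": 25000,  # far beyond the ~1,000 rows the GSC UI shows
}
response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # placeholder property
    body=body,
).execute()

for row in response.get("rows", []):
    print(row["keys"], row["clicks"], row["impressions"])
```

Run it on a schedule (cron, etc.) and append the rows to a sheet or database, and you'll never lose history again.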
Can you provide more information? What is the site? What is the TLD? What is the target country in GSC? Where are most of the links from? Do you have hreflang tags? Etc.
I have seen no evidence or documentation that local GMB listing reviews have any impact on non-localized search results for a brand's non-local homepage.
However, picking apart each little ranking factor is short-sighted. These things do not work in a vacuum. I have a client with a single location who does mostly eCommerce, but they allow people to come into the warehouse and buy directly if they happen to be in the area. This means we can get a GMB listing, and reviews for the location. I fully support this strategy, even if it doesn't help the homepage rank better.
Julian,
Will you be translating the content into other languages?
And/or customizing it for the location? For example, changing US English spellings to those used in the rest of the English-speaking world?
To answer your question: no, you don't have to worry as much about duplicate content if it only appears across different ccTLDs. The biggest issue with duplicated content is that Google has to choose one version to show. In this case, that decision becomes easy: show the one in the right country.
Hreflang tags, and setting the target country in GSC, are helpful hints as well. I recommend using this tool by Aleyda Solis and team to build out the tags: https://www.aleydasolis.com/english/international-seo-tools/hreflang-tags-generator/
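To illustrate what a reciprocal tag set looks like across ccTLDs, here's a toy sketch; the domains and locale codes are made up:

```python
# Toy sketch: build reciprocal hreflang link tags for ccTLD mirrors.
# The domains and locale codes are made-up examples.
versions = {
    "en-us": "https://example.com/",
    "en-gb": "https://example.co.uk/",
    "en-au": "https://example.com.au/",
}

# Every version lists every version (including itself), which is what
# makes the tag set reciprocal -- a common implementation mistake.
for url in versions.values():
    print(f"<!-- tags for {url} -->")
    for lang, href in versions.items():
        print(f'<link rel="alternate" hreflang="{lang}" href="{href}" />')
    print(f'<link rel="alternate" hreflang="x-default" href="{versions["en-us"]}" />')
```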
Hello Sam09,
Was your question answered by Jeroen? Were you asking about the "keywords" meta tag?
You lost me at "keyword density".
Alt text should describe the image as if explaining it, as succinctly as possible, to a blind person. If you can get your keyword in there without being weird about it, great. If not, describe the image and move on.
There is no "limit" to exceed. And Welderpro is... not correct that having a word in both the alt text and the body is a bad thing.
Write the content for the user and stop worrying about this stuff. You'll be a much better SEO in the end.
Sam09,
My guess is the links aren't very good if GSC isn't reporting on them. First step: is the page on which the links appear indexed by Google? Can you find it with a site: or inurl: search on Google?
If the page is indexed, can you see the last cache date? Is it before or after your links were put there? If after, are the links still there with JavaScript turned off? Are they still there for Google? Are they still followable?
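A quick way to answer the "still there with JavaScript turned off?" question is to fetch the raw HTML and inspect it, e.g. with requests and BeautifulSoup; both URLs below are placeholders:

```python
# Quick sketch: does the link exist in the raw HTML (no JavaScript
# executed), and is it followable? Both URLs are placeholders.
import requests
from bs4 import BeautifulSoup

page_url = "https://example.com/page-with-your-link/"  # placeholder
target = "https://yoursite.com/"                       # placeholder

html = requests.get(page_url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    if target in a["href"]:
        rel = a.get("rel") or []  # bs4 returns rel as a list of tokens
        print("found:", a["href"],
              "| rel:", " ".join(rel) or "(none)",
              "| followable:", "nofollow" not in rel)
```

If this script finds nothing but the link appears in your browser, the link is being injected with JavaScript, which is a likely reason GSC isn't reporting it.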
Questions like this don't get many answers because there really isn't a good answer. It depends on many, many factors.
Hello Ravim,
I agree with Bob that the best place to start is by finding out what your customers ask, and the best place for that is the customer service people in your company, or whoever answers the phone/chat and speaks with customers directly.
In terms of keyword research, there are a number of great tools out there. I like Answer The Public, Ubersuggest, Ahrefs, Moz, SEMrush... Just look for who/what/when/where/why/how questions related to that product/keyword.
Here's a great tool I use to help scrape and organize people also ask (PAA) results:
https://www.hannahrampton.co.uk/google-qa-people-also-ask-research/
And, of course, you can just go to Google and start checking for yourself. See what Google is suggesting as you type. See what questions show up in the PAA accordions. 
Hello Chris,
Do you mean how often a snippet gets re-evaluated by the algorithm? I think that's pretty much a continuous process. Some snippets will be more volatile than others based on factors like the news cycle, the intent implied by the keyword, seasonality, and user-feedback signals that tell Google whether people are happy with the result.
I don't think humans are generally going through the search results and reviewing snippets, as that just would not be scalable. Quality Raters may look at them, but their feedback is considered more at the macro level, as in "did this change to the algorithm improve the results, or make them worse?" To my knowledge, they're not out there individually zapping snippets they don't like.
If you consistently read the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are several options for this, like blocking the resources that handle it (e.g. the .js file associated with personalization based on history or geolocation), or what was suggested above. Blocking that piece of script in the robots.txt file is less likely to be seen as cloaking.
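For example, a robots.txt rule along these lines; the script path is hypothetical and would need to match wherever your personalization code actually lives:

```
# Hypothetical path -- adjust to your actual personalization script
User-agent: Googlebot
Disallow: /js/geo-personalization.js
```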
All of this raises a question, though: if you're checking the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place, because Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode in Chrome (and thus not accept cookies), would I see the same site and product assortment no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the visitor has a cookie, but the URL should probably never change - and the content shouldn't change by IP if they don't have a cookie.
1. Check the IP.
2. Embed their location in a cookie.
3. Set the cookie.
4. If the cookie is accepted, and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message saying they must accept cookies to get the best experience, but don't let it block any major portion of the content. A minimal sketch of this flow is below.
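Here is that flow as rough server-side pseudologic in Python; every function, helper, and cookie name is illustrative, not from any particular framework:

```python
# Rough sketch of the cookie-based flow; all names are illustrative.
def handle_request(request):
    location = request.cookies.get("store_location")

    if location is None:
        # No cookie yet (first visit, cookies declined, or a bot like
        # Googlebot): serve the default page at the canonical URL --
        # never redirect or swap content based on IP alone.
        response = render_default_page()           # hypothetical helper
        guess = lookup_location_by_ip(request.ip)  # hypothetical helper
        response.set_cookie("store_location", guess)
        return response

    # Cookie was accepted on a prior request: personalize content and
    # pricing, but keep the URL identical for every location.
    return render_personalized_page(location)      # hypothetical helper
```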
Hello Jeffvertus,
I think you've chosen a good URL structure. The only additional advice I would give is to consider putting all of those within another directory called /locations/ so they aren't all off the root. This has many advantages when it comes to traffic analysis, SEO analysis, and reporting, as you can easily limit whatever you're looking at to location-specific or non-location-specific data.
Subdomains would have been much harder to manage, although there isn't a whole lot of difference with regard to SEO, if we're to believe what Google tells us.
Did that answer your question?
Hello GhillC,
I think we need to agree on terminology first, but it sounds like you can safely limit some of Google's access. Some people call your "product listing pages" either "refinements", "facets", or "filters". When I read "product listing pages" I typically think of what is also called a "category" page, which is a page listing multiple products. A single product page is often referred to as a product detail page (PDP).
Now that we're on the same page (pun intended), let me know if this article answers your question. It is very dated (2011) but gets the point across, which is that you need to be strategic about which facets/refinements/filters you allow to be crawled and/or indexed: https://moz.com/blog/building-faceted-navigation-that-doesnt-suck
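As a simplified illustration of that strategy, robots.txt rules like these keep category pages crawlable while blocking low-value filter combinations; the parameter names are hypothetical:

```
# Hypothetical facet parameters -- keep core category pages crawlable,
# block crawl of low-value filter combinations
User-agent: *
Disallow: /*?*sort=
Disallow: /*?*color=
Disallow: /*?*size=
```

The point isn't these exact rules, it's auditing which facets actually earn search traffic and only letting those be crawled.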
One more thing: 5% - 9% of traffic going directly from organic search into a page type would be considered significant for most businesses. When I look at pruning out page types, they're typically responsible for less than 0.5% of traffic from organic search.
Planner Guy,
Martijn has direct experience in this area and gives solid advice. I would just add that I always approach this by asking what the possible intents are. In this case, the "intent" of someone looking for "training" could be in-house training, like having someone come out to teach a team. It could be on-site training, like a school or workshop they can attend. Someone searching for "online course" can only be looking for one thing. Thus, I think having two pages is appropriate here. Maybe the "training" page could at some point discuss the benefits of taking an "online course" (insert link) over in-house or on-site training.
Hello Steven,
It could be:
Bots poking around the site for vulnerabilities
Old links out there if the domain was previously owned by someone else
Parasite hosting (see MrWhippy's questions), in which someone hacks the site and publishes content on your domain, usually for the purpose of linking to theirs
All of the above or something else
Here are some things to try:
1. Make sure your WP version is up to date
2. Install some security measures, like a plugin that thwarts "brute force" attacks, such as Login LockDown
3. Update all account passwords, including the ones used to log into your hosting cPanel and FTP
4. Search Google for things like [site:yourdomain.com porn OR viagra OR cialis OR sex OR "credit cards" OR...]
5. Check Google Search Console for messages, including manual actions
6. Check Google Search Console for Queries that look out of the norm (see #4).
7. Use Moz or Ahrefs to check for backlinks to URLs like that