Posts made by MichaelC-15022
-
RE: Google Rich Snippets in E-commerce Category Pages
Google considers this to be spam. Sometimes pages get away with it, but generally you're eventually going to get a manual action reported in Search Console.
-
RE: Miriam's 7 Local SEO Predictions for 2019
I think most of the community is currently Christmas shopping online...and making decisions based on fake reviews :-p.
-
RE: Miriam's 7 Local SEO Predictions for 2019
Great predictions, Miriam!
I'll add one more...maybe it's more of a wish than a prediction...that Google will make some sort of serious strides towards cracking down on fake reviews (both positive and negative). Hopefully not as over-the-top as Yelp's approach (which throws a lot of babies out with the bathwater!) though.
-
RE: Client wants to rebrand but insists on keeping their old website live as well...
I'll second Miriam's points, above. There's substantial risk here if both sites are going to be visible to Google.
I'd block the old site in robots.txt permanently. I'd never redirect the old site to the new one, even if cleanup had been done. From the penalty recovery work I've done, it sure feels like Google keeps some sort of permanent flag on your site, even after you've done the cleanup. New, good links don't seem to have as much effect as you'd expect.
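If you go the robots.txt route, blocking the old site entirely is just two lines. A minimal sketch, placed at the root of the old domain:

```
# robots.txt on the OLD domain: block all crawlers from the entire site
User-agent: *
Disallow: /
```

Remember this blocks crawling, not indexing of URLs Google already knows about, so already-indexed pages may linger in the index for a while.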
For the new site, spend the $$ and do some PR/outreach to build some solid, strong links in addition to the core directory links you get via MozLocal. Do some community service work that gets a press mention; offer a scholarship to dentistry students from a specific school, so that the school will link to your scholarship page. A few really good links from newspaper stories will work wonders for getting the new site to rank, both in the 3-pack and in regular organic.
-
RE: Could a JS script that scrolls pages automatically make some content "hidden"?
Depending on how you trigger the scroll, Google might render the page unscrolled or scrolled. Usually, if it's done in the onload() function via JavaScript, Google will execute that script and render the page as it appears after the script runs. I've seen cases, though, where code in jQuery's document ready function is NOT executed by Google when rendering the page.
Test in Google Search Console, using Fetch and Render as Googlebot.
-
RE: Location pages for Two location business
Hi Justin,
Don't sweat having the NAP of both locations on multiple pages, as long as you don't mark those up with schema.org. FYI, having multiple schema.org objects on a page is perfectly normal, even of the same type.
Be sure you have dedicated pages for each location, and on THOSE pages, mark the NAP up with schema. Then, in your Google My Business pages, you want to link to the specific location page that corresponds to the GMB page, NOT to your home page.
You can link back to the GMB page from the location-specific page on your website, or from all pages (e.g. in the footer).
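As a sketch of the location-page markup described above (all business details here are hypothetical placeholders), JSON-LD for one location page might look like this:

```html
<!-- On the dedicated page for ONE location only; example values are made up -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co - Downtown",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-555-0100",
  "url": "https://www.example.com/locations/downtown"
}
</script>
```

Each location page gets its own block with that location's NAP, and that page's URL is what you'd point the matching GMB listing at.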
-
RE: Is Pagination & thin text issue affecting our traffic?
It looks fine to me. You're using rel next/prev correctly, you've got plenty of text on the page. You're correctly setting rel canonical to the numbered page. All looks good to me.
-
RE: Should I disavow local citation page links?
I wouldn't sweat it. There are a jillion 3rd-tier business listing directories out there that are pulling that sort of data from the major directories. Yes, it's an issue if ALL you have is super weak links, but you'll need to be doing outreach for link-building anyway so that should not be a big deal.
I'd only disavow links that are actual spam. Not weak but legitimate links.
-
RE: Query results being indexed and providing no value to real estate website - best course of action?
Ideally, you'd set the meta robots tag on those pages to noindex,follow. This will allow link juice to flow from all of those pages to the pages in your main navigation, as well as removing them from the index.
If you cannot modify the <head> section of those pages, then, at a minimum, you could tell Search Console to ignore the pre and start parameters (specify that the parameter merely sorts the data on the page). Then you'd end up with just 1 page indexed per city, which is probably a lot better than where you are now.
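For reference, the noindex,follow directive described above is a single tag in each query-result page's <head>. A minimal sketch:

```html
<!-- In the <head> of each query-result page you want out of the index -->
<meta name="robots" content="noindex,follow">
```

noindex drops the page from the index; follow still lets crawlers pass link equity through the links on it.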
-
RE: What is your opinion in the use of jquery for a continuous scroll type of page layout?
Google is NOT going to see the content that's rendered by scrolling. In general, more is better in terms of content on a single page (provided it's not crap of course). See this article from Search Engine Land.
For those same reasons, having it on separate pages isn't as good an idea. If you think about how RankBrain is supposed to work, Google is going to be looking for terms on the page that commonly co-occur with the page's primary target search term on other pages on the web about that topic. So, by farming subsections of content out to other pages, you're shooting yourself in the foot, as Google is only going to give you brownie points for covering the subtopics in the very first page.
A better way to do this:
- put all the content on one page
- in the onload() or the jQuery document ready function, hide all but the first page's worth of content
- now, you can react to a scroll by calling JavaScript functions to hide the currently shown content and show the next page's worth...all on the same URL
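The steps above can be sketched roughly like this (hypothetical markup: each "page" of content is a section.chunk element, and the 800px chunk height is an assumed value):

```javascript
// Pure helper: which chunk of content should be visible for a scroll offset?
function visibleChunk(scrollY, chunkHeight) {
  return Math.max(0, Math.floor(scrollY / chunkHeight));
}

// Browser wiring -- guarded so the helper above can run outside a browser too.
if (typeof document !== "undefined") {
  const chunks = document.querySelectorAll("section.chunk");
  const CHUNK_HEIGHT = 800; // assumed px height of one "page" of content

  function showOnly(i) {
    chunks.forEach((el, idx) => {
      el.style.display = idx === i ? "block" : "none";
    });
  }

  // All content is present in the HTML; after load, show only the first chunk.
  window.addEventListener("load", () => showOnly(0));

  // On scroll, swap which chunk is shown -- the URL never changes.
  window.addEventListener("scroll", () => {
    showOnly(visibleChunk(window.scrollY, CHUNK_HEIGHT));
  });
}
```

This is only a sketch of the idea; in practice you'd also need to manage the page height so the user can keep scrolling.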
-
RE: One landing page or many?
It seems that both people and Google like bigger pages better. See this study that found the average number of words on the page for pages in the top 10 results for something like 20,000 keywords was over 2000 words per page!
This article from SEL is also worth a read, and talks more about conversions etc.
And yes, I think the expand/contract approach is fine. Another good option is to divide the page into tabs (but have all the content present in the HTML), and then only show the content for the currently selected tab. Be sure, however, that all of the content is technically visible (i.e., not styled with display:none) when the page initially loads. You can then use something like jQuery's document ready function to walk the tabs and hide all but the first one once the page has finished loading.
-
RE: One landing page or many?
Really, there has been a fairly radical change in how Google measures relevance of a page against a given keyword. A year or more ago, you'd have been better off making separate landing pages for each of those terms, putting the target term in the page title, H1 heading, body text, ALT text on an image, etc. etc.
Whether it's the new RankBrain piece of the algo or something else--it seems that Google is no longer as laser-focused on the page title having the EXACT words in it that were in the search term. Google appears to be able to identify the topic that a page is about by looking at the words on the page and how those words co-occur on other pages on the web.
As an example, my travel site has a page on it that I very carefully tuned for the term "best time of year to visit tahiti". So that's the page title, H1 heading, etc. etc....all the usual stuff. That page now ranks #3 for "tahiti weather", which is SUPER competitive, despite not having "weather" in the page title. I think it's only on the page maybe once, in fact. But, the page content talks about storms, precipitation, temperature, seasons, etc. etc. So, even though I'm telling Google that the page is about "the best time of year to visit Tahiti", Google is able to look at all that content and understand that really, it's about weather in Tahiti.
Long-winded story, I know. But I am indeed going somewhere with this...
I'd recommend having a single page targeted at "metal doors", then work all of the other terms into the page content, using subsections and H2's as Attain Design has suggested above.
I'd go a step further, though. Do a search for "metal doors", and look at the top 20 or 30 pages in the results. Look at the subtopics those pages discuss. Are they talking about locking mechanisms? Corrosion resistance? Insulation R-values? You're looking for other aspects of the core topic that you can add to your page to make it a more thorough discussion of the topic.
The theory I've seen as to how Google is doing this relevance is this: they're looking at a set of pages (maybe the top 100?) that they currently rank well for a given topic, and looking at the fairly rare OTHER terms that are showing up on at least some of those 100 pages. As an example, let's say a given term occurs on 90 of those 100 pages--that's a clue that if a page is supposed to be about topic X, and it does NOT have that term on it, it's probably a pretty poor page for that topic. Now, let's say we're looking at a term that occurs on 15 out of those 100 pages--that's probably a subtopic term that only the best pages...the most thorough pages on that topic...will have. If the term occurs on just 1 or 2 of those pages--well, that's probably an anomaly.
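To make that theory concrete, here's a toy sketch of the idea. All term sets and thresholds below are invented for illustration; this just classifies a term by what fraction of the top-ranking pages it appears on:

```javascript
// Classify a term by how many of the top-ranked pages for a topic contain it.
// Thresholds are arbitrary illustrations of the 90/100 vs 15/100 vs 1/100 idea.
function classifyTerm(term, pages) {
  const share = pages.filter((p) => p.includes(term)).length / pages.length;
  if (share >= 0.8) return "core";      // nearly every good page has it
  if (share >= 0.1) return "subtopic";  // only the more thorough pages have it
  return "anomaly";                     // too rare to signal anything
}

// Hypothetical term sets for five top-ranking "metal doors" pages:
const pages = [
  ["steel", "frame", "hinge", "insulation"],
  ["steel", "frame", "lock"],
  ["steel", "frame", "corrosion"],
  ["steel", "frame", "hinge"],
  ["steel", "frame", "fire-rating"],
];
```

Under this toy model, "steel" comes out as a core term (on every page), "hinge" as a subtopic term (on a couple of the pages), and a term appearing on none of them as an anomaly.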
-
RE: Key Word in URL - To Include or Exclude?
Be careful that you don't end up with multiple URLs for the same page...if you do want to go that way, then be sure to set a rel=canonical from one to the other.
I don't know about a click-through advantage. You might say that the brand stands out more and is more readable at the end of the URL, actually.
-
RE: Key Word in URL - To Include or Exclude?
I'd agree with Aaron's comments on click through rate. I'd add that I'm still seeing a lot of boost in ranking from having the keywords in the URL itself, so I'd keep "shoes" in the page URLs.
-
RE: Google Rich Snippets in E-commerce Category Pages
I generally recommend putting basic Product markup (name, price, maybe image, URL pointing to the single product page) at that level. The idea here is to let Google understand that that page contains a big list of products that fit the category as seen in the page title.
DO NOT put reviews at this level--I saw something from Google recently that says they consider that to be a spammy attempt to get ratings snippets in the results for that page. Put the reviews only at the single product page level.
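A sketch of that basic category-level markup for one product in the list (all names, prices, and URLs here are hypothetical), with no review or rating properties included:

```html
<!-- One Product object per item in the category listing; example values made up -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://www.example.com/img/widget.jpg",
  "url": "https://www.example.com/products/example-widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```

The url property points at the single product page, which is also where any review/aggregateRating markup would live.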
-
RE: How can I optimize pages in an index stack
Hello Rod,
Can you explain what you mean by an "index stack"? I haven't seen that term used before.
-
RE: Site went down and traffic hasn't recovered
Is that site still down? Typically when I've seen sites go down, unless it's for a long time, Google doesn't seem to drop it from the index. I had a client site down all day Saturday and it continued to rank well.
And I don't see a reason why that would affect the other sites, unless a huge percentage of their inbound links were from the site that was down--but even then, it would have to be down weeks, at least.
I'm inclined to think that the site outage is a red herring, and that there's something else in common between the sites that's causing an issue. Have you done a fetch-and-render as Googlebot for each of the sites in Search Console? Maybe something is blocked by robots.txt in all the sites that's preventing rendering, and Google is seeing very little content above the fold? <-- bit of a wild guess there...but that's all I've got!
-
RE: Are there any negative side effects of having millions of URLs on your site?
I'll echo Robert's concern about duplicate content. If those facet combinations are creating many pages with very similar content, that could be an issue for you.
If, let's say, there are 100 facet combinations that create essentially the same basic page content, then consider taking facet elements that do NOT substantially change the page content, and use rel=canonical to tell Google that those are all really the same page. For instance, let's say one of the facets is packaging size, and product X comes in boxes of 1, 10, 100, or 500 units. Let's say another facet is color, and it comes in blue, green, or red. Let's say the URLs for these look like this:
www.mysite.com/product.php?pid=12345&color=blue&pkgsize=1
www.mysite.com/product.php?pid=12345&color=green&pkgsize=10
www.mysite.com/product.php?pid=12345&color=red&pkgsize=100
You would want to set the rel=canonical on all of these to:
www.mysite.com/product.php?pid=12345
Be sure that your XML sitemap, your on-page meta robots, and your rel=canonicals are all in agreement. In other words, if a page has meta robots "noindex,follow", it should NOT show up in your XML sitemap. If the pages above have their rel=canonicals set as described, then your sitemap should contain www.mysite.com/product.php?pid=12345 and NONE of the three example URLs with the color and pkgsize parameters above.
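As a sketch of that canonicalization logic (using the hypothetical color and pkgsize parameters above, with a scheme added since URL parsing requires one):

```javascript
// Facet parameters that do NOT substantially change page content (assumed list)
const NON_CANONICAL_PARAMS = ["color", "pkgsize"];

// Derive the canonical URL for a faceted product URL by stripping those params.
function canonicalUrl(pageUrl) {
  const u = new URL(pageUrl);
  NON_CANONICAL_PARAMS.forEach((p) => u.searchParams.delete(p));
  return u.toString();
}
```

The value this returns is what you'd emit in each variant page's rel=canonical link tag, and the only version of the URL that belongs in your XML sitemap.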
-
RE: Multiple Blogs with Google Blogger
Sure thing. Presuming your main site's blog is in WordPress, there's this handy-dandy importer:
https://wordpress.org/plugins/blogger-importer/
There are instructions in the Installation section on how to export your existing Blogspot posts into an XML format that the importer can then read.
-
RE: SSL providers? Any reviews?
I've been pretty happy with Comodo. Some of their interface is a bit confusing, but their support is good and their prices are fine. Enormous numbers of options (which leads to some of the confusion!) but with tech support help I've been able to navigate it all pretty well. I've bought a number of simple ones from them, as well as multi-domain certs. They've also been good at helping me move existing certs from one hosting company to another--with no extra charges.