Category: Search Engine Trends
Explore current search engine trends with fellow SEOs.
How much do branded organic search traffic and direct traffic impact rankings for non-branded topics/keywords?
Traffic is an important signal, along with on-site behavior once visitors arrive. In general, the more people visiting your site, the better it should be for your search engine rankings, as long as they are engaging with the site. Keep your bounce rate as low as possible and make the site as sticky as you can. Best Regards
| Dalessi
Log-in page ranking instead of homepage due to high traffic on the log-in page: how to avoid it?
Hello vtmoz, I'm not following you. You clearly stated below: noindexing the login page will force visitors from Google to take an extra step to log in.
| GastonRiera
What happens when most of a website's visitors end up at a "noindex" log-in page?
Hi Linda, So if we noindex the popular page of our website, what difference is it going to make at Google, besides that page not showing up in the SERPs? I have actually replied on the related thread below in response to your post: https://moz.com/community/q/log-in-page-ranking-instead-of-homepage-due-to-high-traffic-on-login-page-how-to-avoid Please advise. Thanks
| vtmoz
Does the number or percentage of new visitors impact Google rankings?
In general, new visitors are better than returning visitors for many reasons, but whether their number will improve your positions depends on the source of the traffic. Direct, paid, and referral traffic will not affect your rankings directly. There are a few cases where spikes in referral traffic improve positions, but the effect doesn't last. Organic traffic will affect them: it falls under the user-behaviour category. If your CTR is growing and your bounce rate is in check, your positions will improve.
| Igor.Go
What happens when we set a canonical pointing to a page that has been redirected to another page? How does Google respond?
**What happens if we canonical from A to B and set a redirect from B to C?** Keep in mind that rel=canonical is optionally obeyed by Google, and some people would say that whether Google recognizes it at all is a matter of chance. So, when you place it on a webpage, Google might honor it or they might not. How they honor it today might be different from how they honor it tomorrow, and Google often changes their mind about such things without telling anybody. And, so far, they are not telling anybody EXACTLY how they treat rel=canonical, and because of that, in my opinion, anyone outside of Google who gives firm answers about how rel=canonical works knows not that he knows not.

If you want these pages associated with one another, then you should use 301 redirects and not canonicals. 301 redirects live on your server, and they force visitors and Google crawlers to your intended page. Still, Google might not give full credit for them (though they say that they currently do); however, if you want to pass value from one page to another, a 301 redirect is the most assured way to make that happen, in my opinion.

I think that your **What happens if we canonical from A to B and set a redirect from B to C?** is looking for a far-reaching answer. Nobody can tell you with assurance what is really going to happen. And, again, what happens can be changed by Google at whim and without notice.
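As a rough illustration of that advice, here is a minimal PHP sketch with hypothetical URLs: if B already 301-redirects to C, avoid the canonical-to-redirect chain by pointing A straight at the final URL C, either with its own 301 or, failing that, with its canonical.

```php
<?php
// Minimal sketch with hypothetical URLs. B already 301s to C, so A
// should reference the final destination C directly rather than B.
$finalUrl = 'https://example.com/page-c'; // assumed final URL (C)

// Option 1 (preferred, per the answer above): 301 page A to C as well.
// Place at the very top of page A's script, before any output:
// http_response_code(301);
// header('Location: ' . $finalUrl);
// exit;
?>
<!-- Option 2: keep A live and canonical it straight to C. -->
<link rel="canonical" href="<?= htmlspecialchars($finalUrl) ?>">
```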
| EGOL
Rel=canonical pointing to another page instead of the duplicate page: how does Google respond?
Thanks for the answers and suggestions. A few more questions came to mind, and I have laid them out clearly in the separate thread below. Please reply there. https://moz.com/community/q/what-happens-when-we-canonical-and-point-to-a-page-which-has-been-redirected-to-another-page-google-response
| vtmoz
Syndication and canonical tags across domains
This is an old question; I found it because I was looking for the same information. In case anyone needs this answer, this is what I found. Also, this. In brief, yes, cross-domain canonicals are appropriate for syndication.
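In case it helps, here is roughly what that looks like on the syndication partner's page; the domains and path are hypothetical. The syndicated copy points back at the original publisher so the original gets the ranking credit.

```php
<?php
// Hypothetical sketch: the syndicated copy on the partner's domain
// declares the original publisher's URL as canonical.
$originalUrl = 'https://original-publisher.example/articles/my-article';
?>
<link rel="canonical" href="<?= htmlspecialchars($originalUrl) ?>">
```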
| Linda-Vassily
I am wondering if there is a right answer for keywords with alternate spellings.
Google probably isn't going to distinguish between "co op" and "co-op" for ranking purposes and, as Thomas said, case-sensitivity shouldn't matter. You might still see some issues with "coop" as one word vs. two. Actually, digging in, it looks like searches for "coop insurance" (one word) return "co op" and "co-op", but searches for "co-op" don't return "coop". You've also got the confusion that "coop" is a word in its own right (although low-volume -- there isn't a lot of chicken-coop content, relative to some other topics). I'd personally pick one version -- inconsistency can look weird to visitors and unprofessional. My gut feeling is to use "co op" or "co-op" (I'd lean toward the latter, but I have no solid data to back that up). If your logo is "CoOp" and you use "co op" or "co-op" in content, I think that's probably fine. Keep in mind, though, that I'm only speaking from an SEO standpoint. I don't know the brand or history from a business standpoint.
| Dr-Pete
How to get gold star reviews in the SERPs
Hmmm, depending on the product and the number of reviews, I would have a database holding your reviews, so you just add to the DB and it updates the dynamic values in your markup per the schema. If it is for a brand and the same aggregate review applies across the entire site, you could maybe have a config file that references a single dynamic field and updates the entire site; this method may not need a DB. You may have to do some legwork in the first instance to make the figures dynamic. I tend to use PHP and MySQL for this purpose. A rough sketch of the idea is below.
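This is only a minimal sketch of the approach, not a drop-in implementation: the table and column names (reviews, product_id, rating), the connection details, and the product variables are all assumptions for illustration.

```php
<?php
// Pull the aggregate rating from the DB and emit it as schema.org
// JSON-LD so review stars can appear in the SERPs. Each new review
// added to the DB updates the markup on the next page load.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

$productId   = 123;               // hypothetical product ID
$productName = 'Example Product'; // hypothetical product name

$stmt = $pdo->prepare(
    'SELECT COUNT(*) AS review_count, AVG(rating) AS avg_rating
     FROM reviews WHERE product_id = :id'
);
$stmt->execute(['id' => $productId]);
$agg = $stmt->fetch(PDO::FETCH_ASSOC);

$jsonLd = [
    '@context' => 'https://schema.org',
    '@type'    => 'Product',
    'name'     => $productName,
    'aggregateRating' => [
        '@type'       => 'AggregateRating',
        'ratingValue' => round((float) $agg['avg_rating'], 1),
        'reviewCount' => (int) $agg['review_count'],
    ],
];
echo '<script type="application/ld+json">'
   . json_encode($jsonLd, JSON_UNESCAPED_SLASHES)
   . '</script>';
```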
| TimHolmes
Can we add header tags followed by header tags without text in between? What's best practice?
The semantic purpose of a header tag is to mark a collection of words as a heading introducing a section of descriptive content. If what you have instead is a series of headers with no content in between, you're defeating that purpose. There's no reason to mark those list elements as headers. If you're using header tags because they're an easy way to create the styling for those terms, you should get out of that habit. There's no major damage done by having headers follow each other, but there's no real benefit either. And whatever small value those headers bring in helping search engine crawlers understand your page content, you've thrown it away. In other words: don't do it. A quick illustration is below. Hope that helps? Paul
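To make that concrete, here is a hypothetical product page in plain markup, showing the anti-pattern next to the intended use:

```html
<!-- Anti-pattern: headings stacked with nothing between them. -->
<h2>Shipping</h2>
<h2>Returns</h2>
<h2>Warranty</h2>

<!-- Intended use: each heading introduces the content it describes. -->
<h2>Shipping</h2>
<p>Orders ship within two business days.</p>
<h2>Returns</h2>
<p>Unworn items can be returned within 30 days.</p>
<h2>Warranty</h2>
<p>All products carry a one-year warranty.</p>
```

If all you want is the heading look on a run of short items, apply a CSS class to those elements instead of marking them up as headers.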
| ThompsonPaul
Linking from high-ranking subdomain pages to lower-ranking main-domain pages to benefit the latter
You have to decide how important the new pages are compared to the old ones. If you really want the new pages to beat the old pages, you should 301 the old pages to the new ones. (You could even make another new page, "old guides," that you could link to from the new page, if there is enough demand for the old guides.) If you are trying to move authority away from the old pages and onto the new pages, the old pages won't be as findable in search; you have to decide whether that matters. Yes, canonicals do pass PageRank, though (as with many things in SEO) there is debate about the details. Yes, you could do that, but it sends Google mixed messages: first you tell them to ignore the old page and look at the new page, but then the new page points to the page you said to ignore. (Take a look at Pete Meyers' Whiteboard Friday for more on this.) The other problem with this approach is that the old content would still be out there, potentially getting links or other recognition that you'd rather have for the new page. (And even though page authority can be passed in various ways, you always lose something in the transfer.)
| Linda-Vassily
Subdomain with spammy content and links: any impact on main website rankings?
Hi vtmoz, I see you have posted an updated question on this issue separately, regarding the spammy inbound links pointing to this subdomain. That is the more likely problem for your site. That said, the more spam that shows up on that subdomain, the more likely Google is to see your site as part of a "bad neighborhood" of the web, and this could risk the organic visibility of your root domain. There is no documentation on how much spammy content is too much; generally, some spammy content on a subdomain doesn't negatively impact the entire domain. However, I have seen cases where an entire domain was negatively impacted by a very large number of spammy pages on various subdomains. It's a matter of scale. If the rankings and traffic on your primary domain are important, my advice would be to mitigate the risk by either policing the content on your forum subdomains by bringing in admins, or shutting the forums down entirely. Best, Mike
| MikeTek
Do subdomain backlinks count for the main domain and increase its authority?
Nigel's suggestions are right on. We redirected subdomains into folders on the main site and the results were KICKASS. KICKASS. We started producing a lot more traffic and making a lot more money.
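For anyone wondering what the mechanics of that move look like, here is a minimal PHP sketch with hypothetical hostnames. In practice this kind of migration is usually handled at the web-server level, but the logic is the same: 301 every subdomain URL to the matching folder on the main domain, preserving the rest of the path.

```php
<?php
// Hypothetical sketch: blog.example.com/some-post is permanently
// redirected to example.com/blog/some-post.
$map = [
    'blog.example.com' => '/blog',
    'shop.example.com' => '/shop',
];
$host = $_SERVER['HTTP_HOST'] ?? '';
if (isset($map[$host])) {
    http_response_code(301);
    header('Location: https://example.com' . $map[$host] . $_SERVER['REQUEST_URI']);
    exit;
}
```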
| EGOL
Will a page be indexed if it's published without being linked from anywhere?
If a page has no links pointing to it and has not been submitted another way (for example, in an XML sitemap), Google won't see it.
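"Submitted another way" usually means an XML sitemap. As a minimal, hypothetical sketch, a sitemap listing the orphan URL could be generated like this and then submitted in Search Console or referenced from robots.txt:

```php
<?php
// Emit a minimal XML sitemap containing the otherwise-unlinked page.
// The URL is hypothetical.
$urls = ['https://example.com/unlinked-page'];

header('Content-Type: application/xml; charset=UTF-8');
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($urls as $url) {
    echo '  <url><loc>' . htmlspecialchars($url) . '</loc></url>' . "\n";
}
echo '</urlset>';
```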
| Linda-Vassily
HREFLANG for multiple country/language combinations
Hi Sam, Apologies for the slow response. Your question slipped through the net. This is an interesting case! In an ideal world, you'd specify the relationship between all of those pages, in each direction. That's 150+ tags per page, though, which is going to cause some headaches. Even if you shift the tagging to an XML sitemap, that's a _lot_ of weight and processing. Anecdotally, I know that hreflang tagging starts to break at those kinds of scales (even more so on large sites, where the resultant XML sitemaps can reach many gigabytes in size, or where Google is crawling faster than it's processing the hreflang directives), and so tagging everything isn't going to be a viable approach. I'd suggest picking out and implementing hreflang for _only_ the primary combinations, as you suggest, and reducing the site-wide mapping to the primary variant in each case. Bear in mind that there may be cases where the valuable/primary combinations aren't just the _/xx/xx/_ or _/yy/yy/_ versions, and that some atypical country/language combinations might be worth including. For the atypical variants, I think that you have a few options:

1. Use meta robots (or x-robots) tags to set noindex attributes. This will keep them out of the index, but doesn't guarantee that you're effectively managing/consolidating value across near-duplicates; you may be quietly harming performance without realising it, as those pages represent points of crawl and value wastage/leakage.
2. Use robots.txt to prevent Google from accessing the atypical variants. That won't necessarily stop them from showing up in search results, though, and isn't without problems: you risk creating crawl dead-ends, writing off the value of any inbound links to those pages, and other issues.
3. Use canonical URLs on all of the atypical variations, referencing the nearest primary version, to attempt to consolidate value/relevance. However, that risks the wrong language/content showing up in the wrong country, as you're explicitly _un_optimising the location component.

I think that #1 is the best approach, as per your thinking. That removes the requirement to do anything clever or manipulative with hreflang tagging, and fits neatly with the idea that the atypical combinations aren't useful/valuable enough to warrant their own identities; Google should be smart enough to fall back to the nearest 'generic' equivalent. I'd also take care to set up your Google Search Console country targeting for each country-level folder, to reduce the risk of people ending up in the wrong sections.
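To make the recommended shape concrete, here is a minimal sketch of the head output for the primary combinations only; the combinations, URL pattern, and domain are assumptions for illustration, not Sam's actual setup. Everything atypical is left to fall back to the x-default.

```php
<?php
// Emit hreflang tags for the primary language/country variants only,
// plus an x-default fallback that covers the atypical combinations.
$primaryVariants = [
    'en-gb' => 'https://example.com/gb/en/',
    'fr-fr' => 'https://example.com/fr/fr/',
    'de-de' => 'https://example.com/de/de/',
];
foreach ($primaryVariants as $hreflang => $url) {
    echo '<link rel="alternate" hreflang="' . $hreflang . '" href="' . $url . '">' . "\n";
}
echo '<link rel="alternate" hreflang="x-default" href="https://example.com/">' . "\n";
```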
| JonoAlderson
Should I document rankings from an Incognito Window?
In addition to IP address, you should probably read up on fingerprinting technology. When that technique is used, the differences between an incognito window and your regular window are quite minimal.
| Martijn_Scheijbeler
How to take down a subdomain that is receiving many spammy backlinks?
Hi vtmoz, OK. You can upload a file containing all the domains you want to disavow; you don't need to do it one by one. Checking thousands of links by hand is certainly not something anyone wants to do... Here's how you could do it: disavow them all (export them all to a file from Webmaster Tools), and then delete the couple of dozen you know are strong and valuable domains. Cheers, Cesare
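A minimal sketch of that workflow in PHP; the file names and keep-list are assumptions, while the `domain:` prefix is Google's disavow-file syntax for disavowing an entire domain. The resulting disavow.txt is what gets uploaded in the disavow tool.

```php
<?php
// Read the exported list of linking domains (one per line), drop the
// vetted domains you want to keep, and write the rest in disavow format.
$domains = file('exported_links.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$keep    = ['strongpartner.org', 'valuablesite.com']; // assumed keep-list

$lines = [];
foreach ($domains as $domain) {
    if (!in_array(trim($domain), $keep, true)) {
        $lines[] = 'domain:' . trim($domain);
    }
}
file_put_contents('disavow.txt', implode("\n", $lines) . "\n");
```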
| Cesare.Marchetti
What does it mean to build a 'good' website?
I completely agree with you, Kevin.
| Roman-Delcarmen
Is it time to go HTTPS sitewide?
Thanks for the input! The sitewide move to HTTPS will be added to our re-platforming plan.
| yacpro13