Questions
-
Does Google really use unlinked brand and related mentions as a ranking factor?
The simple answer is yes. However, the amount of page authority you receive from these mentions is up for debate. Personally, I use Moz Pro's opportunities section to find link-less mentions and then try to convert them into actual links. Remember, an unlinked mention is a secondary signal, not a primary one, so it is always better to try to convert it into an actual link. In my experience, I have had excellent luck contacting the person who wrote the article and asking them to link to my website. Best of luck! John
Branding / Brand Awareness | | AdvisGroup0 -
Duplicate website pages indexed: Ranking dropped. Does Google check for duplicate domain association?
Hi John, Thanks for the response. I agree with you about using "rel=canonical". But there are too many pages to manually add these tags. Is there any other way to implement this? Thanks
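When there are too many pages to tag by hand, canonical tags are usually generated by the templating layer or a small script rather than edited manually. A minimal sketch (hypothetical parameter list; adapt to whatever actually creates your duplicate URLs) that derives a canonical URL by stripping tracking parameters:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that create duplicate URLs but don't change the content.
# These names are examples; use the parameters your own site generates.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref", "sessionid"}

def canonical_tag(url: str) -> str:
    """Build a rel=canonical tag for a page, dropping tracking params."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))
    return f'<link rel="canonical" href="{clean}">'

print(canonical_tag("https://example.com/page?utm_source=mail&id=7"))
# -> <link rel="canonical" href="https://example.com/page?id=7">
```

Dropping this helper into the site-wide `<head>` template tags every page in one change instead of thousands.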
Search Engine Trends | | vtmoz0 -
Link reclamation: What happens when backlinks point to a page other than the most relevant page? Any risks?
Try to get links to the most relevant page for each search term you are targeting. The link text is important and will influence search engines as to the correct page for each search term. Try to target only one or two important keywords per page. It can be difficult to do, but it will make things clearer to both people and search engines. Best Regards
Search Engine Trends | | Dalessi0 -
Have you ever seen a page get indexed from a website that is blocked by robots.txt?
Hi vtmoz, The most reliable way to prevent a page from being indexed is a meta robots tag with a _noindex_ parameter. robots.txt, by contrast, helps conserve your server resources and prevents Google from crawling new pages that don't yet carry the meta robots tag. And yes, it's very common to see pages indexed even when the robots.txt file blocks the entire website. If what you want is to remove those pages from the index, follow these steps:
1. Allow the whole website (or at least the specific pages/sections) to be crawlable in robots.txt.
2. Add the robots meta tag with the "noindex,follow" parameters.
3. Wait several weeks; 6 to 8 weeks is a fairly good window. Or just follow up on those pages until you get the result you want (all the desired pages de-indexed).
4. Re-block those pages with robots.txt, but DO NOT remove the meta robots tag.
Hope it helps. Best luck. GR.
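A quick way to verify that the noindex directive is actually in place on a page is to parse its HTML for the meta robots tag. A minimal sketch using only Python's stdlib parser (example HTML; the class and function names are made up for illustration):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives of any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower() for d in a.get("content", "").split(",")]

def is_noindexed(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(is_noindexed(page))  # -> True
```

Running this over the HTML of each page you want de-indexed confirms the tag survived deployment before you start the waiting period.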
Search Engine Trends | | GastonRiera0 -
Domain has been redirected to our site, but many incoming links point to a subdomain. Will this hurt?
Hello vtmoz, Are that site's (seo.oldsite.com) backlinks suitable for the new site? Remember that if you've made a full redirection of the subdomain to the new root domain, there will be tons of links pointing to that root domain, and if those links don't share the topic or come from good sources, it's probable that Google will penalize you or just ignore them. Monitor your rankings closely, and watch the Search Console account for any manual penalty message. Hope it helps. Best luck. GR.
Search Engine Trends | | GastonRiera0 -
Do backlinks to internal pages help a website rank better, or vice versa?
William is right: your site structure helps Google and your visitors understand your website better, and it can therefore push your site up in the rankings. Structuring your website is crucial for both usability and findability. A lot of sites lack a decent structure to guide visitors to the product they're looking for. On top of that, a clear site structure leads to a better understanding of your site by Google, so it's very important for your SEO.

Relationships between content
Google crawls websites by following links, internal and external, using a bot called Googlebot. This bot arrives at the homepage of a website, starts to render the page and follows the first link. By following links, Google determines the relationship between the various pages, posts and other content. This way Google finds out which pages on your site cover similar subject matter. In the sidebar of this post, for example, you'll see links to the category 'Content SEO' and to the 'Internal linking' and 'Site structure' tags. By adding these links, we make sure Google understands that the content on those pages is related to the content of this post.

Setting up an internal linking strategy
It's crucial for your SEO to evaluate and improve your internal linking strategy on a regular basis. By adding the right internal links you make sure Google understands the relevance of pages, the relationship between pages and the value of pages.

Ideal site structure
The structure of your site should be like a pyramid. At the top of the pyramid is your homepage, and underneath it a number of category pages. For larger sites, you should create subcategories or custom taxonomies. Within the categories and subcategories you will have a number of blog posts, pages or product pages.

Internal link structure
Your linking structure is of great importance. Each page at the top of the pyramid should link to its subpages, and vice versa: all subpages should link back to the pages above them. Essential content (cornerstone articles) belongs at the top of your pyramid, and these should be the articles you link to from all of your blog posts. Because you're linking from pages that are closely related content-wise, you increase your site's chances to rank. Linking this way helps search engines by showing them what's related, and with all subpages linking to one main page at the very top of the pyramid, you make it easy for search engines to determine what your main pages per subject are.

Taxonomies and tags
Your site will also benefit from tags. Tags and taxonomies give your site more structure, or at least help Google understand it better. Don't create too many tags: if every post or article receives yet another new unique tag, you are not structuring anything. Make sure tags are used more than once or twice; they should group together articles that belong together.

Cornerstone content
Pages of essential importance are called cornerstone content. Cornerstone articles are the most important articles on your website. They should sit relatively high in your site structure and target the most 'head' and competitive keywords. Think of the four specific pages you would want someone to read in order to learn about your site or company: those should be your cornerstone articles, and in most cases the homepage would link to them.

Sources: The Ultimate Guide to Site Structure > Yoast; How to Create a Site Structure That Will Enhance SEO > Kissmetrics
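The pyramid structure described above can be sanity-checked programmatically from a map of internal links. A small sketch (all page names are made up) that flags orphan pages and subpages that fail to link back up to their parent:

```python
# Hypothetical site map: each page lists the pages it links to internally.
site = {
    "home":          ["cat-shoes", "cat-bags"],
    "cat-shoes":     ["home", "post-sneakers", "post-boots"],
    "cat-bags":      ["home", "post-totes"],
    "post-sneakers": ["cat-shoes", "home"],
    "post-boots":    ["cat-shoes", "home"],
    "post-totes":    ["cat-bags", "home"],
}

# Where each page sits in the pyramid (its parent one level up).
parent_of = {"cat-shoes": "home", "cat-bags": "home",
             "post-sneakers": "cat-shoes", "post-boots": "cat-shoes",
             "post-totes": "cat-bags"}

def orphans(site):
    """Pages nothing links to — unreachable for a crawler following links."""
    linked = {target for targets in site.values() for target in targets}
    return sorted(set(site) - linked - {"home"})

def missing_uplinks(site, parent_of):
    """Subpages that fail to link back up the pyramid to their parent."""
    return sorted(p for p, parent in parent_of.items() if parent not in site.get(p, []))

print(orphans(site))                     # -> []
print(missing_uplinks(site, parent_of))  # -> []
```

In practice the `site` dict would come from a crawl export; the two checks catch exactly the structural problems the pyramid model is meant to prevent.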
Search Engine Trends | | Roman-Delcarmen0 -
Sudden increase in backlinks with "Link Reclamation": Any risk from Google?
Hi Verónica, Thanks for the answer. I don't think what you suggested is technically difficult to implement. We have thousands of redirects to be done. In any case, all these redirects will not be noticed by Google overnight, nor will the backlinks increase all of a sudden. The various source links generating these backlinks will get indexed at different times, varying from days to months, so it may take anywhere from 1 day to 3 months for Google to notice the increase in these backlinks. But there could be a risk if Google sees all the redirects in one shot, which I am not sure about. This is my hypothesis. Please let me know if you have different ideas on this. Thanks
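Whatever the rollout schedule, thousands of redirects are best generated from a single mapping rather than written by hand. A minimal sketch (hypothetical URLs; Apache `Redirect 301` output assumed) that emits the rules in batches, which also makes a staggered rollout easy if that is the chosen approach:

```python
import itertools

# Hypothetical old-URL -> new-URL mapping; in practice read from a CSV export.
redirects = {
    "/old-page-1": "/new-page-1",
    "/old-page-2": "/new-page-2",
    "/old-page-3": "/new-page-3",
}

def htaccess_rules(mapping):
    """Emit one Apache 'Redirect 301' directive per mapping entry."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

def batches(rules, size):
    """Split the rules into fixed-size batches for gradual deployment."""
    it = iter(rules)
    while chunk := list(itertools.islice(it, size)):
        yield chunk

for i, batch in enumerate(batches(htaccess_rules(redirects), 2), start=1):
    print(f"# batch {i}")
    print("\n".join(batch))
```

The same mapping can just as easily emit nginx `return 301` lines; the point is one source of truth for all redirects.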
Search Engine Trends | | vtmoz1 -
Fresh backlinks vs old backlinks: A solid ranking factor?
Hello, It is important to always keep developing new relevant content. The value of a particular link can be affected by many factors, and the age of a link can be one of them. Personally, I believe many links are considered more relevant after they have aged for a while. I have noticed that links that are 3 to 6 months old and of high quality pass more ranking value than brand-new links. The quality of the traffic you get through those links is also an important factor. As links age and produce more and more traffic, I really think they gain more value with the search engines. Best Regards
Search Engine Trends | | Dalessi0 -
Meta robots on every page rather than robots.txt for blocking crawlers? How will pages get indexed if we block crawlers?
I agree with Gaston's approach right up to step 4. If you add the no-indexed pages back into a block in the robots.txt file, you'll end up back where you started: Google will still discover the no-indexed URLs elsewhere, the robots.txt block will stop it from seeing the noindex tag, and the URLs will likely start to get added to the index again. No-indexed URLs must not be blocked in robots.txt. Those two mechanisms are mutually exclusive.
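This conflict can be caught automatically before it costs you indexation. A small sketch using Python's stdlib robots.txt parser (made-up URLs and rules) that flags noindexed URLs which robots.txt would prevent Google from ever crawling:

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /login
"""

# URLs we have marked with <meta name="robots" content="noindex">.
noindexed_urls = ["https://example.com/login", "https://example.com/account"]

rfp = RobotFileParser()
rfp.parse(robots_txt.splitlines())

# A URL disallowed in robots.txt can never have its noindex tag seen,
# so any URL in both lists is a configuration conflict.
conflicts = [u for u in noindexed_urls if not rfp.can_fetch("*", u)]
print(conflicts)  # -> ['https://example.com/login']
```

Run against your real robots.txt and noindex list, an empty result confirms the two mechanisms are not fighting each other.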
Search Engine Trends | | ThompsonPaul0 -
Where does Google find "Soft 404" and "Not found" links?
Hi Andrew, I have sent you the example URL in a PM, as I would like to maintain privacy about our domain. Please check the URL and share your suggestions. Thanks
Search Engine Trends | | vtmoz0 -
What happens when we delete all the outgoing links from a forum at once?
I believe that if these links are "nofollow", changing them will have no direct impact on your status with Google or your rankings. Deleting them will make forum members who don't understand "nofollow" really angry with you. They might also decide not to spam your site, which will be good for your forum in many ways. But you might also lose some non-spammers who don't understand the change and leave. Good luck.
Search Engine Trends | | EGOL0 -
Are too many "nofollow" outgoing links okay?
Well, having too much spammy content is a really difficult battle to fight. Google will eventually push the bad, spammy, duplicate and keyword-stuffed content out of its rankings. As your website is a forum, I'd advise setting stricter rules for the community and sanctioning spammy content. Also educate the community so they don't violate Google's content guidelines. Hope it helps. Best luck. GR.
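Before tightening the rules, it helps to measure how many outgoing links each post actually carries and how many are nofollowed. A minimal sketch with Python's stdlib HTML parser (example post HTML; the class name is made up):

```python
from html.parser import HTMLParser

class OutlinkCounter(HTMLParser):
    """Counts outgoing <a> links, split into followed and rel=nofollow."""
    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        rel = dict(attrs).get("rel") or ""
        if "nofollow" in rel.lower().split():
            self.nofollow += 1
        else:
            self.followed += 1

post = ('<a href="https://spam.example" rel="nofollow">buy now</a>'
        '<a href="https://good.example">useful resource</a>')
counter = OutlinkCounter()
counter.feed(post)
print(counter.followed, counter.nofollow)  # -> 1 1
```

Running this over new posts lets moderation thresholds (e.g. flagging posts above some link count) be applied consistently instead of by eye.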
Search Engine Trends | | GastonRiera0 -
Do subdomains take away the PageRank/link juice of the main website?
Hi, My main domain and subdomain are treated as separate entities in terms of link equity and DA. But my subdomain is a blog, while my root domain is a much more concise service-industry site, so they are pretty different in most ways. How closely related in terms of content is your subdomain to your root domain? Since my blog attracts more links because of its articles and restraint-free content, I try to funnel some of that link equity to applicable pages on my root domain, though I'm still uncertain how well that works. -Ben
Search Engine Trends | | Davey_Tree0 -
How much do branded organic search traffic and direct traffic impact rankings for non-branded topics/keywords?
Traffic is an important indicator, along with onsite behavior once visitors are there. The more people visit your site, the better it should be for your search engine rankings, as long as they are engaging with the site. Generally, try to keep your bounce rate as low as possible and make the site as sticky as possible. Best Regards
Search Engine Trends | | Dalessi0 -
How is Spam Score calculated for a domain without any content or backlinks?
Hey there! Tawny from Moz's Help Team here. Spam Score doesn't update with the same frequency as the rest of the data in Open Site Explorer, so it's entirely possible that the score you're seeing for your old domain is outdated and no longer accurate. It sounds like it's been long enough that the backlink info for that site has fallen back out of our index — linking data is only stored for around 180 days — but the Spam Score hasn't been updated yet. I would expect to see the Spam Score for the old domain change after that metric updates again. Unfortunately, I don't know when that will be — I don't have an ETA for when Spam Score will be updated next. Sorry about that! If you still have questions, drop us a line at help@moz.com and we'll do our best to sort through everything with you.
Link Explorer | | tawnycase0 -
What happens when most website visitors end up at a "noindex" log-in page?
Hi Linda, So, if we noindex the popular page of our website, what difference will it make to Google besides that page not showing up in the SERPs? I have actually replied in the related thread to your post below: https://moz.com/community/q/log-in-page-ranking-instead-of-homepage-due-to-high-traffic-on-login-page-how-to-avoid Please suggest. Thanks
Search Engine Trends | | vtmoz0 -
Number or percentage of new visitors impact Google rankings?
In general, new visitors are better than returning ones for many reasons, but whether their number will improve your positions depends on the source of traffic. Direct, paid and referral traffic will not affect your rankings directly. There are a few cases where spikes in referral traffic improve positions, but the effect doesn't last. Organic traffic will: it falls under the user-behaviour category, and if your CTR is growing and your bounce rate is in check, your positions will improve.
Search Engine Trends | | Igor.Go0 -
Log-in page ranking instead of homepage due to high traffic on login page! How to avoid?
Hello Vtmoz, I'm not following you. You have clearly stated below: noindexing the login page will force visitors from Google to take an extra step to log in.
Search Engine Trends | | GastonRiera0