Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Search Engine Trends

Explore current search engine trends with fellow SEOs.


  • Hi Andrew, I have sent you the example URL by PM, as I would like to keep our domain private. Please check the URL and share your suggestions. Thanks

    | vtmoz
    0

  • Hi Verónica, Thanks for the answer. I don't think what you suggested is technically difficult to implement, but we have thousands of redirects to set up (a sketch of serving such a redirect map is below). In any case, Google won't notice all of these redirects overnight, and the backlinks won't all increase at once: the source pages generating these backlinks will get indexed at different times, varying from days to months. So it could take anywhere from one day to three months for Google to register the increase in backlinks. The risk would be if Google could see all the redirects in one shot, which I am not sure about. This is my hypothesis. Please let me know if you have different ideas on this. Thanks

    | vtmoz
    1
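
    For illustration, here is a minimal sketch of how a bulk redirect map like the one vtmoz describes might be served. The Flask app and the redirects.csv mapping file are assumptions, not details from the thread; the point is that each 301 only gets "seen" by Google when the old URL is recrawled, which spreads out over weeks or months.

```python
# Hypothetical bulk-redirect sketch: serve thousands of 301s from a CSV map.
# Assumes a Flask app and a "redirects.csv" with old_path,new_url rows.
import csv

from flask import Flask, abort, redirect

app = Flask(__name__)

# Load the old-path -> new-URL map once at startup.
REDIRECTS = {}
with open("redirects.csv", newline="") as f:
    for old_path, new_url in csv.reader(f):
        REDIRECTS[old_path] = new_url

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    # In a real app this catch-all route would sit behind your normal routes.
    target = REDIRECTS.get("/" + old_path)
    if target is None:
        abort(404)
    # 301 marks the move as permanent; Google registers each redirect only
    # when it recrawls the old URL, which happens over days to months.
    return redirect(target, code=301)

if __name__ == "__main__":
    app.run()
```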

  • Hi William and EGOL, Here is some additional info on our Wikipedia page, which answers your questions and adds context for similar scenarios: the page is pretty old - it was first created in 2005, with the website link pointing to our homepage. It was suddenly deleted this January for lacking reliable sources and for sounding a little spammy and promotional. We didn't create the page; if we had, it couldn't have survived that long. So, back to the actual discussion: even though the link from Wikipedia is technically "nofollow", we can see the importance Google gives to this page in boosting a website's ranking with a strong backlink. Thanks

    | vtmoz
    0

  • Hi Vtmoz, Google will not respond at all - I mean, Google has no way to know that it was a mistake, unless many websites hosted by the same provider showed the same behaviour. That said, I assume that you fixed the issue and that in Google Search Console you checked the robots.txt, updated the sitemap(s) and used "Fetch as Google" too. Good luck! Mª Verónica

    | VeroBrain
    1

  • No question - the page's authority is divided up among all links on the page, not just the internal ones. That's why I made the recommendation I did. To be clear - you're not "losing PageRank" for the page that contains the links. You're losing the ability of that page to pass some of its power to other pages on your own site, because that power is sent to external sites instead (see the toy split below). Paul

    | ThompsonPaul
    0
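
    To make the arithmetic behind Paul's answer concrete, here is a toy sketch of the classic even-split PageRank model. This is a simplification for illustration only; Google's actual link weighting is not public.

```python
# Toy illustration (not Google's actual algorithm) of Paul's point:
# a page's link equity is split across ALL outbound links, so every
# external link reduces what each internal link can pass on.
def equity_per_link(page_authority: float, internal: int, external: int) -> float:
    """Classic PageRank-style even split across all outbound links."""
    total_links = internal + external
    return page_authority / total_links if total_links else 0.0

# With 10 internal links only, each passes 1.0 unit of authority; add
# 10 external links and each internal link now passes only 0.5.
print(equity_per_link(10.0, internal=10, external=0))   # 1.0
print(equity_per_link(10.0, internal=10, external=10))  # 0.5
```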

  • Well, having too much spammy content is a really difficult battle to fight. Google will eventually push the bad, the spammy, the duplicate and the keyword-stuffed content out of their rankings. As your website is a forum, I'd advise setting stricter rules for the community and sanctioning spammy content. Also educate the community so they don't violate Google's content guidelines. Hope it helps. Best luck. GR.

    | GastonRiera
    0

  • Hi John, Thanks for the response. I agree with you about using "rel=canonical", but there are too many pages to add these tags to manually. Is there another way to implement this, for example at the template or server level (as in the sketch below)? Thanks

    | vtmoz
    0
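
    For illustration, one programmatic approach to vtmoz's question: emit the canonical URL once at the response layer instead of editing every page. The Flask app and CANONICAL_HOST below are assumptions; Google accepts rel=canonical as an HTTP Link header as well as a meta tag, which avoids touching thousands of HTML templates.

```python
# Hypothetical sketch: set rel=canonical site-wide via an HTTP Link header.
from flask import Flask, request

app = Flask(__name__)
CANONICAL_HOST = "https://www.example.com"  # assumed preferred origin

@app.after_request
def add_canonical_header(response):
    # Only annotate HTML responses; assets don't need a canonical.
    if response.content_type and response.content_type.startswith("text/html"):
        canonical = CANONICAL_HOST + request.path
        response.headers["Link"] = f'<{canonical}>; rel="canonical"'
    return response
```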

  • Hello Harry, I think you mistakenly asked this question twice! I already answered your other question. Best, Roberto

    | AgenciaSEO.eu
    0

  • Hi, You probably want to start by reading this: https://moz.com/blog/seo-split-testing-a-b-test-changes-google - it's a great post by Will Critchlow explaining what you should be thinking about if you really want to set up valid SEO experiments. I would always advise running your SEO and UX experiments separately: you analyze them differently, you bucket in a different way (see the sketch below), and the changes you make for SEO don't always have to influence UX. Hopefully this will get you going. Let me know if you have more questions. Martijn.

    | Martijn_Scheijbeler
    0
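
    As background on the bucketing Martijn mentions, here is a generic sketch (not Will Critchlow's or Martijn's exact method): SEO tests split pages, not users, and the assignment should be deterministic so each URL stays in its group for the whole experiment.

```python
# Sketch of page-level bucketing for an SEO split test: pages (not users)
# are assigned to control/variant, and every visitor to a given page sees
# the same version, so search engines measure a stable change.
import hashlib

def bucket(url: str, variant_share: float = 0.5) -> str:
    """Deterministically assign a URL to 'control' or 'variant'."""
    digest = hashlib.sha256(url.encode()).hexdigest()
    # Map the hash onto [0, 1] so the split is stable across runs.
    score = int(digest[:8], 16) / 0xFFFFFFFF
    return "variant" if score < variant_share else "control"

# Hypothetical URLs for demonstration.
for page in (f"/product/{i}" for i in range(6)):
    print(page, "->", bucket(page))
```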

  • Try to get links to the most relevant page for each search term you are targeting. The link text is important and will help search engines identify the correct page for each search term. Try to target only one or two important keywords per page. It can be difficult to do, but it will make things clearer to both people and the search engines. Best Regards

    | Dalessi
    0

  • I agree with Gaston's approach right up to step 4. If you add the no-indexed pages back into a block in the robots.txt file, you'll end up back where you started: Google will still discover the no-indexed URLs elsewhere, the robots.txt block will stop it from ever recrawling them to see the noindex, and the URLs will likely start to get added to the index again. No-indexed URLs must not be blocked in robots.txt - those two mechanisms are mutually exclusive (a quick audit for this conflict is sketched below).

    | ThompsonPaul
    0
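
    A minimal sketch of auditing for the conflict Paul describes, using Python's standard robotparser. The domain and URL list are hypothetical.

```python
# Flag URLs that carry a noindex tag but are ALSO blocked in robots.txt,
# meaning Googlebot can never crawl them to see the noindex at all.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Pages you have tagged <meta name="robots" content="noindex"> (hypothetical).
noindexed_urls = [
    "https://www.example.com/private/page-1",
    "https://www.example.com/private/page-2",
]

for url in noindexed_urls:
    if not rp.can_fetch("Googlebot", url):
        print(f"CONFLICT: {url} is noindexed but blocked - the noindex "
              "tag can never be seen.")
```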

  • Hi vtmoz, The most reliable way to prevent a page from being indexed is a meta robots tag with a _noindex_ parameter. Using robots.txt helps to conserve your server resources by preventing Google from crawling pages, but a blocked page never shows Google its meta robots tag. And yes, it's very common to have indexed pages even when the robots.txt file blocks the entire website. If what you are looking for is to remove the pages from the index, follow these steps (sketched in code below):
    1. Allow the whole website (or at least those specific pages/sections) to be crawlable in robots.txt.
    2. Add the robots meta tag with the "noindex,follow" parameters.
    3. Wait several weeks - 6 to 8 weeks is a fairly good window - or just follow up on those pages until you get the result (all your desired pages de-indexed).
    4. Re-block those pages with robots.txt, but DO NOT erase the meta robots tag.
    Hope it helps. Best luck. GR.

    | GastonRiera
    0
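
    Here is Gaston's two-phase sequence expressed as a small sketch. The phase flag and the /private/ path are hypothetical; the point is that pages must stay crawlable while they carry noindex, and only get re-blocked afterwards.

```python
# Sketch of the staged de-indexing sequence described above.
DEINDEX_PHASE = True  # flip to False after ~6-8 weeks, once pages are gone

def robots_meta_tag() -> str:
    # Step 2: noindex drops the page from the index; follow lets link
    # equity keep flowing through it while it is being recrawled.
    return '<meta name="robots" content="noindex,follow">'

def robots_txt() -> str:
    if DEINDEX_PHASE:
        # Step 1: leave the section crawlable so Googlebot can SEE the tag.
        return "User-agent: *\nAllow: /private/"
    # Step 4: re-block only after de-indexing - and keep the meta tag.
    return "User-agent: *\nDisallow: /private/"

print(robots_meta_tag())
print(robots_txt())
```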

  • William is right: your site structure helps Google and your visitors understand your website better, and that can push your site up in the rankings. Structuring your website is crucial for both usability and findability. A lot of sites lack a decent structure to guide visitors to the product they're looking for. Apart from that, a clear site structure leads to a better understanding of your site by Google, so it's very important for your SEO. (A small sketch of the pyramid structure described here follows this answer.)

    Relationships between content: Google crawls websites by following links, internal and external, using a bot called Googlebot. This bot arrives at the homepage of a website, starts to render the page and follows the first link. By following links, Google determines the relationship between the various pages, posts and other content, and finds out which pages on your site cover similar subject matter. In the sidebar of this post, for example, you'll see links to the category 'Content SEO' and to the 'Internal linking' and 'Site structure' tags. By adding these links, we make sure Google understands that the content on those pages is related to the content of this post.

    Setting up an internal linking strategy: It's crucial for your SEO to evaluate and improve your internal linking strategy on a regular basis. By adding the right internal links, you make sure Google understands the relevance of pages, the relationship between pages and the value of pages.

    Ideal site structure: The structure of your site should be like a pyramid. At the top of the pyramid is your homepage, and underneath it a number of category pages. For larger sites, you should create subcategories or custom taxonomies (more on that later). Within the categories and subcategories you will have a number of blog posts, pages or product pages.

    Internal link structure: Your linking structure is of great importance. Each page at the top of the pyramid should link to its subpages, and vice versa: all the subpages should link back to the pages above them. Essential content (cornerstone articles) should sit at the top of your pyramid, and these should be the articles you link to from all of your blog posts. Because you're linking from pages that are closely related content-wise, you're increasing your site's chances to rank, and you're showing search engines what's related. In addition, with all subpages linking to one main page at the very top of the pyramid, you make it easy for search engines to determine what your main page per subject is.

    Taxonomies and tags: Your site will also benefit from adding tags. Tags and taxonomies give your site more structure (or at least Google will understand it better). Don't create too many tags: if every post or article receives yet another unique tag, you are not structuring anything. Make sure tags are used more than once or twice, and that they group articles that belong together.

    Cornerstone content: Pages of essential importance are called cornerstone content. Cornerstone articles are the most important articles on your website. They should sit relatively high in your site structure and focus on your most competitive 'head' keywords. Think of the four specific pages you would want someone to read to learn about your site or company: those should be your cornerstone articles, and in most cases the homepage would link to them.

    If this answer was useful, please mark it as a good answer. Sources: The Ultimate Guide to Site Structure (Yoast); How to Create a Site Structure That Will Enhance SEO (Kissmetrics)

    | Roman-Delcarmen
    0
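
    A toy sketch of the pyramid and cornerstone linking described above; all page names are invented.

```python
# Pyramid site structure: home at the top, categories beneath, posts below,
# with every post linking back up to its parent and to a cornerstone page.
site = {
    "home": ["category-a", "category-b"],           # top of the pyramid
    "category-a": ["post-1", "post-2"],
    "category-b": ["post-3", "cornerstone-guide"],  # cornerstone sits high
}
CORNERSTONE = "cornerstone-guide"

def internal_links(site: dict) -> dict:
    """Build each page's outgoing internal links from the hierarchy."""
    links = {page: set(children) for page, children in site.items()}
    for parent, children in site.items():
        for child in children:
            # Subpages link back up the pyramid...
            links.setdefault(child, set()).add(parent)
            # ...and every other page also links to the cornerstone article.
            if child != CORNERSTONE:
                links[child].add(CORNERSTONE)
    return links

for page, outgoing in internal_links(site).items():
    print(f"{page} -> {sorted(outgoing)}")
```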

  • Hello vtmoz, Are that site's (seo.oldsite.com) backlinks suitable for the new site? Remember that if you've made a full redirection of the subdomain site to the new root domain, there will be tons of links pointing to that root domain, and if those links don't share the topic or aren't from good sources, Google may penalize you or simply ignore them. Monitor your rankings closely and watch the Search Console account for any manual-penalty message. Hope it helps. Best luck. GR.

    | GastonRiera
    0

  • Hello, It is important to always keep developing new, relevant content. The value of a particular link can be affected by many factors, and the age of a link can be one of them. Personally, I believe many links are considered more relevant after they have aged for a while. I have noticed that high-quality links that are 3 to 6 months old pass more ranking value than brand-new links. The quality of the traffic you are getting through those links is also an important factor: as links age and produce more and more traffic, I really think they get more value from the search engines. Best Regards

    | Dalessi
    0

  • I believe that if these links are "nofollow", changing them will have no direct impact on your status with Google or your rankings. Deleting them will make forum members who don't understand "nofollow" really angry with you. They might also decide not to spam your site, which will be good for your forum in many ways. You might also lose some non-spammers who don't understand and leave. Good luck.

    | EGOL
    0

  • Hi, My main domain and subdomain are treated as separate entities in terms of link equity and DA. But my subdomain is a blog, while my root domain is a much more focused service-industry site, so they are pretty different in most ways. How closely related is your subdomain's content to your root domain's? Since my blog attracts more links because of its articles and less constrained content, I try to funnel some of that link equity to applicable pages on my root domain; how well it works, I am still uncertain. -Ben

    | Davey_Tree
    0

  • Hi, It's not really a SERP destination feature; it's basically an extension of the knowledge data that Google has. If you google, for example, "seo software", you'll likely get a similar setup with a carousel of the top software for SEO. It's incredibly hard to get into, as it seems to be information that Google curates itself rather than pulling directly from another site. Martijn.

    | Martijn_Scheijbeler
    0