Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: White Hat / Black Hat SEO

Dig into white hat and black hat SEO trends.


  • No worries, you often get those weekly dips when viewing global data, no idea why. However, when you add a country it all flattens out. If the question is answered or you're happy with the response, please mark it accordingly. Hope that helps.

    | ClaytonJ
    0

  • 2, because it's easiest to remember. In 2019 exact-match domains have less impact on SEO; it's more about 10x content and demonstrating a solid value proposition (watch up to the point where issue #1 is fully outlined). SEO is a pretty vast field these days. Coding tweaks and URL slugs are still somewhat important, but they provide only slight bonuses on top of your core value proposition (the value your site adds to the internet). I don't think engagementrings.com is too bad, but without a solid 'idea' and value proposition behind it, the URL alone won't magically make it rank.

    | effectdigital
    1

  • It wouldn't be possible to draw any conclusions from such top-line data. From this post, we don't know what the keywords are or which websites were involved in these movements. We'd want to be looking at the actual keywords, checking the Wayback Machine to see how content on both sites changed, and looking in Ahrefs for any matching link trends for either site. Alexa score is ancient; I wouldn't look at it any more, to be honest. Regardless, it's not possible to check for black-hat attacks on a "purple line" or a "blue line"; we need domains here! If you want a comprehensive audit of exactly what happened, no one can supply it for 'mystery' websites based on a couple of charts. You're also looking at things in a very binary way. How do you know they didn't do something good **when** you screwed something up? Why does it have to be one or the other? In SEO there is usually a convergence of factors surrounding such large movements!

    | effectdigital
    1

  • If they contain anchor text or are obviously trying to game the system by boosting rankings, you definitely want to remove them, but you need a good eye for these things. Just because Spam Score is high doesn't always mean a link is spammy; look into why Moz thinks it's spammy. Give it the smell and look test: if it looks fishy, then it's fishy. If you only have your citation listed with a www link, you don't have much to worry about.

    | waqid
    0

  • Google "recognises" the original source & will rank it higher, this has been the case for some time

    | jasongmcmahon
    0

  • We decided to test it, so we noindexed/nofollowed all of those sites; they can still use the tool, but it won't show up in the index.

    | bwb
    0

  • In addition to the responses already posted to your question, there is another, frequently used way to go about this. It's not necessarily "better", but it might be a more feasible alternative for you.
    1. As long as you still own your old domain, you can go into the DNS settings of the old domain (with your registrar/nameserver account) and set a simple pattern redirect to point all traffic from old.com/* to new.com/$1. Note that different domain services have different ways for you to specify this, but you are looking for whatever method preserves the file path from the request and appends it to the new domain.
    2. If you've done #1 as above, you would then set up your 1:1 redirects for every old page on the new server. This has the disadvantage of creating a small "redirect chain", meaning every redirect will have 2 hops instead of just one, which makes it slightly less optimal than one of the other solutions mentioned, which would have only 1 hop. But at the same time it has a feasibility advantage and lets you maintain all of your redirects in one place. And yes, you would want to redirect all of your old pages. You might not need to do them all 1:1 if you have some patterns, because you can also use pattern redirects; if you do, you might want to put those pattern rules back in the DNS settings for the old domain so they don't have 2 hops.
    3. Technically, #1 and #2 should take care of #3. However, for the most valuable links you might also want to reach out to the sites and request an update, simply because fewer hops is more optimal. But technically a small chain of redirects should work.
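    For illustration only, here is a rough sketch of that two-step logic in Python; old.com, new.com, the paths, and the mapping table are hypothetical placeholders, not real configuration (your registrar and your server each have their own way of expressing these rules):

```python
# Rough sketch of the two-step redirect logic described above.
# old.com / new.com and the mapping table are hypothetical placeholders.

OLD_HOST = "old.com"
NEW_HOST = "new.com"

# Step 2: the 1:1 redirect map maintained on the new server.
ONE_TO_ONE = {
    "/old-category/widget.html": "/products/widget",
    "/about-us.html": "/about",
}

def pattern_redirect(path: str) -> str:
    """Step 1: old.com/* -> new.com/$1, keeping the request path intact."""
    return f"https://{NEW_HOST}{path}"

def one_to_one_redirect(path: str) -> str:
    """Step 2: on new.com, map the legacy path to its final URL (or keep it)."""
    return f"https://{NEW_HOST}{ONE_TO_ONE.get(path, path)}"

request = "/old-category/widget.html"
hop_1 = pattern_redirect(request)      # hop 1: pattern redirect on the old domain
hop_2 = one_to_one_redirect(request)   # hop 2: 1:1 rule on the new server
print(f"https://{OLD_HOST}{request} -> {hop_1} -> {hop_2}")
# Two hops in total, versus one hop if the 1:1 rules lived on the old domain itself.
```

    In practice you would express step 1 in your registrar's forwarding settings and step 2 in your server or CMS redirect rules; the sketch is only meant to show where the second hop comes from.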

    | seoelevated
    0

  • This is a great answer. From the sounds of it, the OP has:
    - A scraper site gone wrong, creating malformed links to your site
    - Some kind of shady negative SEO attack trying to create garbage URLs on your site
    - A data pollution attack trying to mess up your analytics
    - A hack attack
    - Any mixture of the above
    If a site has been hacked, it can sometimes take some proper dev work to pull it out at the roots. A hacked site is a liability to Google, and Google doesn't like to rank hacked sites / content. I would suggest checking whether the site has been hacked with some urgency, and I back everything Gaston has said.

    | effectdigital
    1

  • Google's John Mueller has stated that exit-intent pop-ups do not attract a Google penalty: https://www.youtube.com/watch?time_continue=746&v=gS4_JH-QqSg "What we’re looking for is really interstitials that show up on the interaction between the search click and going through the page and seeing the content. So that’s kind of the place we’re looking for those interstitials. What you do afterwards like if someone clicks on stuff within your website or closes the tab or something like that then that’s kind of between you and the user." Google won't be monitoring the user's full behaviour on your site beyond the initial bounce/non-bounce (at least as far as the search arm is concerned; Analytics is, of course, different!), e.g. they won't see that X happened and caused (or appeared to cause) the user to leave.

    | Xiano
    1

  • Hey friend, I'm from NUSRAT BLOG ACADEMY, a blog-learning site, and I want to suggest posting your articles on different sites with the same link or author name, because it will help you get unique backlinks. On the other hand, if the same link and post only sit on one site, Google finds less value. Thanks a lot. Md Alauddin, Administrator and Author, Nusrat Blog Academy

    | Sssogggh
    0

  • I can't make any judgement without looking at Analytics and much more data, but there's a possibility that you might have to consider making a version specifically for the US audience in order to tailor it for readers of that country. You can learn more about Internationalization and Hreflang here: https://moz.com/learn/seo/hreflang-tag
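    Purely as a rough sketch (example.com, the paths, and the region codes below are placeholders, not your actual URLs), the hreflang alternates for a US-specific variant could look along these lines, generated here with a few lines of Python:

```python
# Rough sketch only: hreflang alternates for UK and US variants of one page.
# example.com, the paths, and the region codes are hypothetical placeholders.
variants = {
    "en-gb": "https://example.com/uk/guide/",
    "en-us": "https://example.com/us/guide/",
}

# Every regional version should carry the full set of alternates, itself included.
for lang, url in variants.items():
    print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
```

    Each regional page would carry the same set of tags (including a reference to itself), which is what signals that the versions are deliberate alternates for different audiences rather than duplicates.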

    | NickSamuel
    0

  • This is essentially what I was thinking too. I was hoping to get a few more responses from the community, but I value your input. I have already started moving in this direction. Thank you.

    | donsilvernail
    0

  • Thanks for taking the time to answer. No, we do not consistently change on-page elements or even URLs, as you describe. I don't understand the comment "From your snippet I see issues right away"; can you explain what issues you see in the snippet?

    | CommercePundit
    1

  • Thanks for the info! It's good to get a bigger picture of the nefarious 'globe' network, which seems to link to every site on the entire internet with absolutely zero value-add for end users. It's interesting to see that you got hit by some variants of that pure-spam domain which didn't seem to hit us; clearly the problem is far more widespread than we had first anticipated. We also disavowed a whole load of non-globe-related domains; those weren't in our export. What I'm talking about in terms of the 'targeted' methodology is not the deployment of the disavow, but the decision-making process before the disavow file was compiled. We really made sure that we got a very granular view of each and every link before deciding whether to disavow or not; we had rows of metrics against each link before we decided whether to keep or disavow any particular link. In almost all situations, once we reached deployment, we used domain-level disavow directives. There were only 1-2 exceptions, where the client had good editorial pieces on a site yet also spammy banner / sidebar links from paid advertising. In such situations we used a mixture of disavow directives, to try (as hard as we could) to let the good links through the net. That being said, very few people will be in that same situation. In the majority of cases, if you don't want one link from a domain, you don't want any!
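    For anyone unfamiliar with the mechanics, here is a rough sketch of how domain-level and URL-level directives can be mixed in a disavow file; the domains and URL below are hypothetical placeholders, not entries from our actual export:

```python
# Rough sketch: assembling a disavow file that mixes domain-level and
# URL-level directives. All domains and URLs below are placeholders.
entries = [
    "# pure-spam directory network: disavow the whole domain",
    "domain:spammy-directory.example",
    "domain:link-globe.example",
    "# site with good editorial links but paid sidebar spam:",
    "# disavow only the specific bad URL and keep the rest",
    "https://mixed-quality-site.example/paid-sidebar-page/",
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(entries) + "\n")
```

    The file itself is the easy part; as described above, the real work is deciding which rows end up in it.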

    | effectdigital
    0

  • 1. Good!
    2. You are confused for good reason; there has not been clear direction here for some time. If you use HREFLANG between the two, it seems that for the last several years the content would not be seen as duplicative: you are inherently telling Google that the content is the same but in different languages.
    3. There is so much that goes into this, but I can tell you, with years of experience under my belt, that the numbers never tell the whole story.

    | katemorris
    0

  • Hello. My thoughts:
    Question 1. I really don't think you're duplicating content by summarizing what someone else says. I would make sure the article is primarily your content and not just rehashed content, because as soon as you add links you're giving some of your PageRank away. It is not bad SEO practice to list sources and links at the bottom of the blog post; in this case I think it is a must. You must give credit to the original writer and ensure that your content writer isn't plagiarizing anything. Not preaching, just words of caution.
    Question 2. Custom, relevant content is most beneficial for SEO. Appropriate links to other credible sites are good for SEO. Rehashing someone else's blog post probably isn't beneficial if that's the meat of the article.
    Question 3. I try not to use nofollow links, because there's someone on the other side of that link doing SEO. When that someone sees I've given them a followed link, they come check me out, and that creates an opportunity for a link in return.

    | WDaubenmire
    1