Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: White Hat / Black Hat SEO

Dig into white hat and black hat SEO trends.


  • If that company does not have your legal permission, you can take action; I had the same issue in the past. First I tried to contact the source, without any response; in the end, a court gave us permission to get those articles taken down through the hosting companies. For example, if a site hosted on HostGator used our brand without our permission, I could request that HostGator delete that content; usually HostGator deactivates the site until the content is deleted by the owner.

    | Roman-Delcarmen
    1

  • Thanks for replying, Will. You have mentioned a few ways to deal with this, and they all seem to point to the fact that this should not really be a high-priority issue for us at the moment, especially if sub-domains do not have a major effect on the main site. (To be honest, I don't even think it's worth deindexing them; they may be relevant to some people, and we can just let Google continue indexing them as they are.) All considerations point to the same conclusion: we won't be doing any SEO-related work on these pages. So, how do I set up Moz to ignore these two sub-domains and only show crawl errors related to the main site? We just don't want Moz to crawl these pages at all, given that we won't do any work on them. Thanks

    | e.wel
    0

  • Sorry to say that without an in-depth look, there isn't going to be a way to tell. Unless you changed something, it is most likely testing or an update from Google. I've seen rankings go up and down over the years on multiple sites for a variety of reasons, and sometimes for what seems like no reason. It could be backlinks being devalued, a new entrant to the market, or Google trying out other sites to see if they perform better in the eyes of the searcher. In this industry, there is only so much you can control. Focus on creating top-quality content, focus on the searcher, and you'll come out on top.

    | katemorris
    0

  • Did you keep the old URL in Webmaster Tools, with a sitemap of the old domain that redirects? You should make sure that was submitted for crawl so Google sees the 301s. Also make sure you redirected all versions (http, https, www, etc.) of the old domain to the new https:// version. Are the pages in fact 404 pages? Are they pages in your sitemap? Be careful, too, that they are not bad internal links. Did you crawl the site with Moz?

    | david-johns-sheetlabels
    0
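
    To sanity-check redirects like the ones described above, it can help to compute, for every old-domain variant, the exact URL it should 301 to, and then compare that against the Location header each variant actually returns. A minimal sketch in Python using only the standard library (the domains here are placeholders, not from this thread):

```python
from urllib.parse import urlsplit, urlunsplit

def expected_target(old_url: str, new_host: str) -> str:
    """Map any protocol/www variant of the old domain to the new
    https:// canonical URL, preserving path and query string."""
    parts = urlsplit(old_url)
    return urlunsplit(("https", new_host, parts.path, parts.query, ""))

# Hypothetical old-domain variants that should all 301 to the new site
variants = [
    "http://old-domain.example/page?x=1",
    "http://www.old-domain.example/page?x=1",
    "https://old-domain.example/page?x=1",
]
for url in variants:
    print(url, "->", expected_target(url, "new-domain.example"))
```

    You could then fetch each variant (for example with `curl -I`) and confirm a single 301 hop whose Location matches the computed target.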

  • Hi Tim, Thanks for your reply. I also disavowed the links; hope this will keep us safe. It looks like something automated, like XRumer forum spam.

    | remkoallertz
    0

  • Hi There! I want to be sure I'm correctly understanding your scenario. Is your company a digital-only or a local business? In order to appear in Google's local results, you must have a physical location in the city of search and make face-to-face contact with your customers. If either of these factors is absent, the business is not meant to appear in the local results. Please feel free to provide further details so the community can envision your specific scenario.

    | MiriamEllis
    1

  • Annoyingly, the disavow tool does not support complex matching. If you're after wildcard or regex matching for your disavow uploads, that's something you won't be able to get your hands on. It is a shame because coordinated network link-bombardment really has no simple one-click solution for webmasters right now (which is pretty poor!). You'd have to build something more complex which connects to the API of the tool that detects all of these links. It would have to have its own database and be programmatically capable of updating that database. You'd need it to filter out all of the domains which don't match your pattern (and come up with regex / SQL queries for matching that exact pattern in a robust, reliable manner). It would have to de-dupe existing / new entries and then generate a text file for you. It would also have to be capable of comparing the file it generates against your existing file, so it doesn't lose your manual-mode disavows. To me, it sounds like a lot of trouble to go to. I'd make a post about it on Google's forum here: https://productforums.google.com/forum/#!forum/webmasters - try to attract the attention of someone from Google and let them know that these kinds of attacks do happen and you want the Disavow Tool (as a Google product) to properly allow people to defend themselves.

    | effectdigital
    0
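
    The regex-filter, merge, and de-dupe steps described above can be sketched in a few lines of Python. This is a hypothetical illustration, not a real tool: the pattern, domains, and file contents are made up, and a production version would read and write actual disavow files.

```python
import re

def merge_disavow(existing_lines, candidate_domains, pattern):
    """Add regex-matched candidate domains to the existing disavow
    entries, de-duplicating. Existing domain:/url entries are kept;
    comment lines (#) are dropped in this simplified sketch."""
    rx = re.compile(pattern)
    current = {ln.strip() for ln in existing_lines
               if ln.strip() and not ln.startswith("#")}
    for domain in candidate_domains:
        if rx.search(domain):           # keep only pattern matches
            current.add(f"domain:{domain}")
    return sorted(current)              # sorted lines, ready to write out

# Hypothetical data for illustration
existing = ["# my manual disavows", "domain:spam-one.example"]
candidates = ["linkfarm-001.example", "linkfarm-002.example",
              "legit-site.example"]
merged = merge_disavow(existing, candidates, r"^linkfarm-\d+\.")
print("\n".join(merged))
```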

  • Hello Rachel, It could be negative link building; it could also be that some automated pages are creating those links without restrictions. I wouldn't worry that much. On one hand, Google (through their spokesperson) said that the algorithm is pretty good at finding and diminishing the strength of spammy links. So, if you talked to someone from Google right now, they might say: leave them; the algorithm will probably consider those as spammy and there will be no harm. On the other hand, if you are absolutely sure that those links are spammy and malicious, then go ahead and disavow them. Remember that it's possible to disavow an entire domain. More info here: Disavow backlinks - Search Console Help. Hope it helps. Best of luck, GR

    | GastonRiera
    0

  • As Jordan says, getting the Schema wrong is just unfortunate, but it shouldn't hurt you.

    | Martijn_Scheijbeler
    1

  • If you didn't build the links or earn them through good content, and you don't feel they are good, it's always better to just disavow them. Even more so if they are .ru sites, as I am sure your site is not in Russia.

    | aarongray
    0

  • I'm of another point of view. If done wisely, link building does good. I've been using this pattern for a long time and it has worked very well; none of my websites were penalized. That's why I'm convinced that such a distribution does not hurt rankings, but I was wondering whether someone is using another scheme that performs better.

    | Lynn12
    0

  • Do we lose backlinks and domain authority when we change domain name? Assuming your site structure, file and folder names, and content remain the same and you do 301 redirects, then no, you would not lose backlinks and domain authority. You should also:
    - benchmark key metrics (for example, site speed and rankings);
    - update internal links (don't forget about your non-HTML pages as well);
    - check for and fix redirect chains;
    - tell Google (via Search Console) that your domain name has changed;
    - set up new properties in GSC (www, non-www, http, https), set your preferred profile and geography, and eventually merge profiles into a set;
    - transfer your disavow file in GSC;
    - track 404 errors (which could indicate you've missed some files) and customize your 404 page to help visitors who encounter them as a result of your domain name change;
    - update any external URLs with UTM codes that point to your site (UTM codes get stripped with redirects);
    - update your domain name in Google Analytics;
    - update your sitemap and add it to GSC, as well as your robots.txt file;
    - update major sites that point to you, things like Google Maps, local data aggregators, and social media sites; and
    - check your traffic, rankings, and GSC daily for a while so you can react quickly if something's amiss.
    Unfortunately, it is rarely the case that only your domain name changes, in which case you'll need to do a page-by-page mapping. Don't forget about images and PDF files that may be indexed separately from HTML pages. Check out this slide presentation (https://www.slideshare.net/bastiangrimm/migration-best-practices-smx-london-2018) from SMX London 2018. It was recommended by John Mu at Google and is really useful.

    | DonnaDuncan
    0
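
    One item in the checklist above, fixing redirect chains, is easy to automate once you have a page-by-page mapping. A minimal Python sketch (the URLs are made up for illustration) that collapses chains so every old URL points straight at its final destination:

```python
def flatten_chains(redirects: dict) -> dict:
    """Collapse redirect chains so every source URL points directly
    at its final destination (old -> a -> b becomes old -> b)."""
    flat = {}
    for src in redirects:
        dst, seen = redirects[src], {src}
        # Follow the chain until the target is no longer itself redirected;
        # the `seen` set guards against accidental redirect loops.
        while dst in redirects and dst not in seen:
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat

# Hypothetical page-by-page map with an accidental chain in it
mapping = {
    "https://old.example/a": "https://new.example/a",
    "https://old.example/b": "https://old.example/a",  # chain!
}
print(flatten_chains(mapping))
```

    Serving the flattened map means every backlink resolves in a single 301 hop instead of two or more.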

  • Re: pipe or hyphen in browser title, does anyone know which is the more accessible option? GOV.UK, a leader in accessibility, uses hyphens...

    | ScopeUK
    0

  • Hello there, Actually, if used correctly, content syndication can be an effective way to help your site's ranking. I'm assuming you have a main blog and you want to syndicate its content to other blogs with different groups of visitors, so that ultimately they might visit your main blog from the syndicated post. There are some things you can do on syndicated posts, such as canonical tags, noindex, etc. You can read these two posts, which explain each method in detail: https://searchengineland.com/syndicated-content-189097 https://neilpatel.com/blog/the-step-by-step-guide-to-syndicating-content-without-screwing-up-your-seo/ Hope this helps, Joseph Yap

    | Seenlyst
    0

  • Hi Robin, Nigel has offered some good advice here - the one thing I would also add is that you may want to set up mobile switchboard tags to make it clear to Google that the desktop version is the canonical version for PCs and the mobile version is canonical for mobile. See more info here: https://developers.google.com/search/mobile-sites/mobile-seo/separate-urls#annotations-for-desktop-and-mobile-urls

    | bridget.randolph
    0
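
    For reference, the switchboard annotations mentioned above boil down to a pair of link tags, one in each version's head. A minimal Python sketch that generates them (the URLs are placeholders, and the 640px media query is the common example from Google's separate-URLs documentation):

```python
def switchboard_tags(desktop_url: str, mobile_url: str):
    """Return the (desktop-page, mobile-page) <link> annotations that
    pair a separate-URL desktop/mobile setup for Google."""
    on_desktop = (
        f'<link rel="alternate" '
        f'media="only screen and (max-width: 640px)" '
        f'href="{mobile_url}">'
    )
    on_mobile = f'<link rel="canonical" href="{desktop_url}">'
    return on_desktop, on_mobile

d, m = switchboard_tags("https://www.example.com/page",
                        "https://m.example.com/page")
print(d)  # goes in the <head> of the desktop page
print(m)  # goes in the <head> of the mobile page
```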

  • Hi Gaston Riera, Thank you for replying. I want to noindex my old domain and then move the content to a new domain (the loss of authority and traffic doesn't matter). Thanks

    | HuptechWebseo
    0

  • I had a similar issue on a couple of WordPress freebie subdomains I made while conducting reputation management for clients. What ended up happening was the site would index immediately and then, 24 hours later, be ghosted completely. It turns out I was submitting the news sitemap that it automatically generated, and since I wasn't on their list of approved news sources, I guess it just ripped everything out, as I'm sure the news sitemap and the regular one had the same pages listed, just with more detail in the news one. I doubt it's the exact same occurrence, but if you recently submitted a sitemap, I'd check it closely, as it has been known to trigger a similar problem, at least for me!

    | TucsonAZWebDesign
    0

  • So weird. I've checked on multiple friends' and family members' devices, and we all see "HYPR Biometrics". The site name is NOT "HYPR Biometrics" and should not be; I don't know where this is being pulled from. Would forcing breadcrumbs help overwrite that title?

    | gray_jedi
    0

  • Ah - sorry for my misunderstanding. So you are leaning towards combining the pages. So unit-conversion.info has a combined page: http://www.unit-conversion.info/metric.html When I search for "convert from micro to deci", they appear as number 8. If I click on their page, it defaults to base and mega, so I then have to change the dropdowns. The number 1 result for that search is this page: https://www.unitconverters.net/prefixes/micro-to-deci.htm - it has Micro and Deci preselected. unit-conversion.info only has 460 pages indexed by Google, but Unitconverters.net has 50,000. Despite the "thin content", they still appear at number 1 (admittedly, this may be due to other factors). As far as user experience goes, I would prefer to land on unitconverters.net because I have fewer things to click. I guess the art is in finding the sweet spot: being able to give a search result with context without spinning out too much thin content. Thanks again for your detailed response!

    | ConvertTown
    1