Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • I actually wrote something relatively recently which might be of interest to you: https://searchnewscentral.com/blog/2019/07/12/what-might-ai-machine-learning-image-experiments-teach-us-about-search/

    The conclusion I basically came to was: "If I were working on an eCommerce store selling rolls of fabric, I'd say that an image of a rolled-up bit of fabric would be good for a mechanical mind to interpret. A zoomed-in image of just the fabric's texture would also be pretty good! A lady standing by a fireplace with a wine glass in one hand and a fabric roll in the other? That would be very difficult for a mechanical mind to interpret."

    Play with Google Images. Type in your product (or competing products) and see which types of image gain the most prominent positions. That will give you an idea of how advanced Google is at interpreting certain objects. Do the images need to be super obvious, with cut-outs against a blank background? Can you be more adventurous? Also look at the image thumbnails for your products (or competing ones) on Google Shopping, and see what's doing well there.

    IMO, obvious is better for search algorithms, but then again it may not have such good conversion rates as more adventurous creative.

    Branding / Brand Awareness | | effectdigital
    0

  • Let's talk! You can contact me here: https://moz.com/community/users/383525 (on the right side, under "Additional Contact Info").

    Intermediate & Advanced SEO | | BlueprintMarketing
    1

  • True, Google said this, but a lot of tests out there prove that what Google says isn't 100% true. Like I said, though: not a showstopper.

    Technical SEO Issues | | paints-n-design
    0

  • Thanks for that. You certainly know your stuff. I'm beginning to wonder whether it's worth keeping the old site alive: it might require a lot of work to clean up before it's useful for linking to my new website when that's built. I'm not sure its rankings and links will help my new website that much, or whether they'll hinder it. Big decision.

    Branding / Brand Awareness | | brizc
    0

  • It's really unfortunate this has happened to you! One thing you might want to check for is compromised / hacked content as well, since here in the Moz community (and in fact the wider SEO community) we are seeing increased instances of negative SEO combined with hack attacks.

    Another thing: you have to be really careful when updating your disavow file. If you don't download the existing disavow file first and add your entries onto it, then all of your prior disavow work will be undone. When you upload the disavow TXT file, that is your COMPLETE disavow, not simply some additions to it. So if you handled that wrong, you may have disavowed the new stuff and un-disavowed loads of legacy work (which could land you right back at square one).

    In general, disavow work does not result in increased SEO performance. If a load of sites were giving you boosted SEO metrics, and then Google decides they are bad and nullifies them, doing a disavow doesn't magically make Google give back SEO authority that you should never have had in the first place. Those pipelines are severed.

    Have you considered that in its early stages, a negative SEO attack often actually boosts rankings? Then, when the network is uncovered, that juice is all cut off. Maybe the past positive performance wasn't all down to you. The negative SEO attack (before it reached critical velocity) may have been boosting your rankings. Now that it's flipped negative, most of that SEO authority will be cancelled out. That could be a bitter pill to swallow, but I have seen it a lot of times. Maybe this drop in performance is actually where the site should be ranking under your own efforts. If the negative SEO attack initially went undetected and boosted certain rankings, then was discovered and spun around, then actually you are where you should be. Could be complicated to explain to an enraged client, though.

    The reason you process a disavow file is to list all the spammy links and disavow them, so that Google will not (in the future) give you a manual penalty which nullifies all (or most) of your rankings. Since you don't know which links Google currently does or does not consider to be spam, this inevitably results in a few non-spammy links (which were contributing SEO authority) getting nullified, which makes results go down, NOT up. What a disavow does is trade a very small amount of your current performance away, in return for mid-term insulation against manual actions (which, believe me, are truly horrible).

    It never ceases to amaze me how people don't understand what disavows are truly for, and how many people say "hey, I did a disavow and I saw slight drops and no gains". It's only under very exceptional circumstances (like manual actions) that disavows can potentially result in raised rankings, and then only if they are accompanied by linked Google Docs explaining to Google in detail what has happened, via a reconsideration request. 99% of the time they just insulate you from things getting worse and give you a very slight results dip.

    One thing you may want to look at is whether your disavow was too aggressive. Spam can mean different things to different people. Did you download all links from all tools (Majestic, Ahrefs, Moz, Google Search Console), then re-crawl them in Screaming Frog to see if they are all still live? Did you then fetch metrics (Ahrefs URL Rating, number of linking domains to each URL, PA and DA from Moz, CF and TF from Majestic, sessions from each domain in GA) for all URLs using service APIs and URL Profiler / Netpeak Checker? Did you put all the metrics against each URL in a massive spreadsheet? Did you normalise and boil the metrics down to a final score to see which links have good SEO authority, and which ones don't?

    If you just looked at them with your human eyes, or if you only looked at links from one backlink source without doing a master de-dupe across all backlink data suppliers, then you probably didn't do anything anywhere near extensive enough to clean up properly. Cleaning up a negative SEO attack properly, with minimal risk, is a huge undertaking that could take an expert around five days to compile and update the disavow file.

    Remember, you're dealing with an algorithmic devaluation. Algorithms see in numbers, not with your actual eyes. If you didn't have a data-led approach to your work, you are extremely likely to continue suffering in the mid to long term. I would go over the work again much more forensically.

    Finally: remember that Google and Moz aren't connected data sources. DA (Domain Authority) is NOT used in Google's ranking algorithm (at all). It's an estimate only (built to replace Toolbar PageRank, which Google took away and stopped SEOs from seeing). Even TBPR wasn't great, but it was at least aware of disavows, Google penalties and algorithmic devaluations. Moz's DA is 100% unaware of these things and doesn't factor them in, so if you are looking to DA to save you - or for things to 'just get better' over time - think again!
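    The audit workflow described above (export from every tool, de-dupe across suppliers, normalise the metrics into one final score) can be sketched in a few lines. This is a minimal illustration with made-up data: the metric names (`da`, `tf`, `ref_domains`) and the equal weighting are assumptions for the sketch, not fields from any real Moz/Majestic/Ahrefs export.

```python
# Hypothetical sketch of the de-dupe + scoring step. Column names and
# weightings are assumptions -- substitute your real tool exports.

def dedupe_links(*exports):
    """Merge backlink exports from several tools into one unique URL set."""
    merged = {}
    for export in exports:
        for row in export:
            url = row["url"].rstrip("/").lower()
            merged.setdefault(url, {}).update(row)  # later tools fill gaps
    return merged

def normalise(values):
    """Scale a metric to 0..1 so different tools' scales are comparable."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def score_links(merged, metrics=("da", "tf", "ref_domains")):
    """Boil all metrics down to one averaged score per URL."""
    urls = list(merged)
    cols = {m: normalise([merged[u].get(m, 0) for u in urls]) for m in metrics}
    return {u: sum(cols[m][i] for m in metrics) / len(metrics)
            for i, u in enumerate(urls)}

moz    = [{"url": "https://a.example/", "da": 40},
          {"url": "https://spam.example/", "da": 2}]
ahrefs = [{"url": "https://a.example", "ref_domains": 120, "tf": 30},
          {"url": "https://spam.example", "ref_domains": 1, "tf": 0}]

scores = score_links(dedupe_links(moz, ahrefs))
# Low-scoring URLs are disavow candidates; high scorers stay untouched.
```

    In a real audit you would feed in thousands of rows per tool and eyeball the bottom of the scored list before disavowing anything.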

    Intermediate & Advanced SEO | | effectdigital
    0

  • You can't do this from a rankings POV, but you may be able to do it from a CTR POV, as you specify in your question. I reckon that if there is a way to do this, it's probably this: https://marketingplatform.google.com/intl/en_uk/about/optimize/

    What this will do is serve traffic from Google equally to two different versions of a web page. The question is, when it does this magic, will it alter the SERP titles on Google's front end? I'm not certain whether it's quite that advanced, but I'd at least give it a go.

    If that fails, just run some PPC ads as a testbed and use the page title text you have written as the ad title (I think there is such a thing). Then you can see which title gets the better CTR from ads. Since PPC ads and SERP snippets have a broadly similar format (at least the text ones do, for the most part), it could be a good testbed.
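    If you do run the PPC testbed suggested above, it's worth checking that a CTR difference between the two titles isn't just noise before declaring a winner. A minimal sketch using a standard two-proportion z-test; the click and impression counts are invented:

```python
# Hypothetical sketch: comparing the CTRs of two ad/title variants.
from math import sqrt, erf

def ctr_z_test(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test: is variant B's CTR significantly different?"""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed
    return p_a, p_b, p_value

ctr_a, ctr_b, p = ctr_z_test(clicks_a=120, imps_a=4000,
                             clicks_b=170, imps_b=4000)
# Only trust the "winning" title if p is small (e.g. < 0.05).
```

    With small impression counts almost any CTR gap will fail this test, which is exactly the point: run the ads until the result is significant.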

    Online Marketing Tools | | effectdigital
    0

  • Yes, the same thing happened with my new domain. After that, I was unable to take it back. Bank Branches made me out.

    Intermediate & Advanced SEO | | Jgfbjvvnnvvvb
    0

  • Hi, I hope this helps.

    Do NOT point desktop pages to m. pages via rel="canonical" tags; use rel="alternate" for that, and make sure the rel="canonical" tag on the m. URL points to the corresponding desktop URL.

    Annotations for desktop and mobile URLs: on the desktop page, add a rel="alternate" tag pointing to the corresponding mobile URL. This helps Googlebot discover the location of your site's mobile pages. On the mobile page, add a rel="canonical" tag pointing to the corresponding desktop URL. Google supports two methods for this annotation: in the HTML of the pages themselves, and in sitemaps.

    For example, suppose that the desktop URL is https://www.example.com/page-1 and the corresponding mobile URL is https://m.example.com/page-1. The annotations in this example would be as follows.

    Annotations in the HTML: on the desktop page (https://www.example.com/page-1), add the following annotation:

        <link rel="alternate" media="only screen and (max-width: 640px)"
              href="https://m.example.com/page-1">

    On the mobile page (https://m.example.com/page-1), the required annotation should be:

        <link rel="canonical" href="https://www.example.com/page-1">

    This rel="canonical" tag on the mobile URL pointing to the desktop page is required.

    Should a page have a self-referencing canonical URL? In the example above, we link the non-canonical page to the canonical version. But should a page set a rel="canonical" for itself? I strongly recommend having a canonical link element on every page, and Google has confirmed that's best. That's because most sites and CMSs will allow URL parameters without changing the content. So all of these URLs would show the same content:

        https://www.example.com/page-1
        https://www.example.com/page-1/?isnt=it-awesome
        https://www.example.com/page-1/?cmpgn=twitter
        https://www.example.com/page-1/?cmpgn=facebook

    If you're running a mobile version of your desktop site, you need to implement a canonical tag on the mobile page pointing at the URL of the desktop version. For example:

    Your main domain: iamexample.com
    Your mobile version: m.iamexample.com

    Then have this tag in the <head> section of your main domain's page:

        <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.iamexample.com/">

    And have this tag in the <head> section of your mobile version's page:

        <link rel="canonical" href="https://iamexample.com/">

    Mobile-specific URLs, such as AMP pages or a mobile-specific subdomain: creating content with mobile in mind is a marketing must -- just be sure to remember to set your canonical URLs when you have pages that are specific to mobile but have the same content as a page on the desktop version of your website. For AMP pages specifically, Google also provides detailed guidelines on how to correctly differentiate your Accelerated Mobile Page from your standard webpage.

    SEE:
    https://developers.google.com/search/mobile-sites/mobile-seo/separate-urls
    https://yoast.com/rel-canonical/
    https://moz.com/blog/cross-domain-rel-canonical-seo-value-cross-posted-content
    https://moz.com/learn/seo/canonicalization
    https://moz.com/blog/rel-canonical

    Hope this helps,
    Tom
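    For the sitemap method mentioned above, the rel="alternate" half of the annotation can live in the desktop sitemap instead of the page HTML (the rel="canonical" on the mobile page still has to be in the HTML). A hedged sketch of generating such a sitemap entry; the URL pair mirrors the example above, and this is an illustration, not a production sitemap generator:

```python
# Hypothetical sketch: a sitemap where each desktop <url> declares its
# mobile alternate via an xhtml:link element.
import xml.etree.ElementTree as ET

XHTML = "http://www.w3.org/1999/xhtml"
SM = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_with_mobile_alternate(pairs):
    """Build a sitemap declaring an m. alternate for each desktop URL."""
    ET.register_namespace("", SM)
    ET.register_namespace("xhtml", XHTML)
    urlset = ET.Element(f"{{{SM}}}urlset")
    for desktop, mobile in pairs:
        url = ET.SubElement(urlset, f"{{{SM}}}url")
        ET.SubElement(url, f"{{{SM}}}loc").text = desktop
        ET.SubElement(url, f"{{{XHTML}}}link", {
            "rel": "alternate",
            "media": "only screen and (max-width: 640px)",
            "href": mobile,
        })
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = sitemap_with_mobile_alternate(
    [("https://www.example.com/page-1", "https://m.example.com/page-1")])
```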

    Technical SEO Issues | | BlueprintMarketing
    0

  • Hey Aspirant, if you are using WordPress, then use the Redirection plugin, or a plugin that redirects all 404 pages to the homepage. One of these two plugins may help you. Thank you.

    Moz Local | | Trymybest
    0

  • If a sitemap existed, you could pull it from cache or from the server. If not, you can recreate one via Google Analytics / log data.
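    As a rough illustration of the log-based approach mentioned above, here is a minimal sketch that recovers the unique, successfully served paths from access-log lines. The sample lines and the host name are invented; a real log would be read from a file, and the regex assumes Apache-style "combined" format:

```python
# Hypothetical sketch: rebuilding a URL list from server logs.
import re

LOG_LINES = [
    '1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /about HTTP/1.1" 200 512 "-" "UA"',
    '1.2.3.4 - - [01/Jan/2024:00:00:02 +0000] "GET /about HTTP/1.1" 200 512 "-" "UA"',
    '5.6.7.8 - - [01/Jan/2024:00:00:03 +0000] "GET /gone HTTP/1.1" 404 0 "-" "UA"',
    '5.6.7.8 - - [01/Jan/2024:00:00:04 +0000] "POST /api HTTP/1.1" 200 10 "-" "UA"',
]

REQ = re.compile(r'"GET ([^ ]+) HTTP/[^"]+" (\d{3})')

def recover_urls(lines, host="https://www.example.com"):
    """Collect unique paths that served a 200 response to GET requests."""
    paths = set()
    for line in lines:
        m = REQ.search(line)
        if m and m.group(2) == "200":
            paths.add(m.group(1))
    return sorted(host + p for p in paths)

urls = recover_urls(LOG_LINES)
# 404s and non-GET requests are skipped; duplicates collapse to one entry.
```

    The recovered list can then be dropped into a standard sitemap template or a sitemap plugin.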

    Web Design | | KevinBudzynski
    0

  • Not a problem. The UK is just in a weird situation with its listings where in the end, the input outweighs the outcome (IMO)

    Local Listings | | effectdigital
    0

  • Hey dude, it all (ranking) depends on your ON PAGE SEO optimization and the few healthy backlinks pointing back to your blog. Since you are asking about caching and indexing, you need to check first whether your site is visible to Googlebot: have you submitted your website to Google Search Console? Do all the needful things to make your site more visible to search engines, and try to get a few healthy but natural backlinks from around the web. Also make sure your sitemap is in XML format, and is visible and submitted to Google Search Console (formerly Google Webmaster Tools). Hope this will help you sort out this matter.

    Sr. SEO Executive at Pick Camera

    Search Engine Trends | | tranhainam
    0