Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!

  • No, not Google Search Console. The Paid/Organic report is in Google AdWords.

    Paid Search Marketing | | Lei_Zhang
    0

  • Thank you for your response. Yes, our URL stays the same, so it is dynamically served content. The problem is that the content displayed on mobile is purely a listing of our product catalog; the on-page SEO content that would display on desktop is removed on mobile. Since mobile-first indexing wants to index our mobile version, is it better to try to update our mobile content to match the desktop content, or to remove the mobile content altogether and have the desktop version display on phones until our new site is launched?

    Behavior & Demographics | | Technical_Contact
    0

  • There are a few things you need to marry up if you want to do this. You need the referring page or domain / hostname (to validate that the session came from a backlink you know about). Once you filter the data down like that, you just need to filter by user-agent ("googlebot", or any user-agent string which contains "googlebot"). Then you just look at the IP address field in the tabular data and you have your answer!

    Here's the problem: most IP-level data is contained within basic server-side analysis packages (like AWStats, which is installed on most sites within cPanel), or alternatively you can go to the log files for much of the same data. Most referrer-level data (stuff that deals with attribution) is contained within analytics suites like Adobe Omniture or Google Analytics. In GA, you can't usually get to individual IP-level data. There used to be a URL hack to force it to render, but it was killed off (and many people who used it were banned by Google). The reason is that Google doesn't want too much PII (personally identifiable information) harvested via its tool; it creates too many legal issues for Google (and also for whoever is leveraging that data for potentially nefarious marketing purposes).

    Since you won't get enough IP-level data from GA, you're going to have to go to log files and log analysis tools instead. Hopefully they will contain at least some referrer-level data. The issue is getting all the pieces you want to align in a legally compliant way. Obviously you have your reasons for looking. I'd check whether you can find anything in AWStats on your cPanel (if that's installed), or get the log files and analyse them with something like Screaming Frog Log File Analyser; there's a rough sketch of that filtering below. I can't promise this will return the data you want, but it's probably your only hope.

    Intermediate & Advanced SEO | | effectdigital
    0
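
    A minimal sketch of that filtering, assuming a standard Apache/Nginx "combined" log format; the log path and the referring domain are placeholders, not from the original answer:

        import re
        from collections import Counter

        LOG_PATH = "access.log"                     # placeholder path
        KNOWN_REFERRER = "known-backlink-site.com"  # placeholder backlink domain

        # combined format: IP - - [date] "request" status bytes "referrer" "user-agent"
        LINE_RE = re.compile(
            r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} \S+ '
            r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
        )

        ips = Counter()
        with open(LOG_PATH) as log:
            for line in log:
                m = LINE_RE.match(line)
                if not m:
                    continue
                # keep only hits referred by the known backlink...
                if KNOWN_REFERRER not in m.group("referrer"):
                    continue
                # ...from user-agents claiming to be Googlebot
                if "googlebot" not in m.group("agent").lower():
                    continue
                ips[m.group("ip")] += 1

        for ip, hits in ips.most_common():
            print(ip, hits)

    Bear in mind that anything can claim to be Googlebot in its user-agent string, so the IPs you collect this way still need verifying (e.g. with a reverse DNS lookup) before you trust them.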

  • Thank you so much for your answer! The home page in the subdomain is redirected, but none of the actual pages in the subdomain are, and because there are so many of them it would be easier to block them in robots.txt, even if there is a small chance that Google will still index them. But because the home page is redirected, I don't want to confuse Google with a blanket Disallow: /. Could I do Disallow: / and then Allow: /homepage.html? (A quick way to test that combination is sketched below.)

    Technical SEO Issues | | RaquelSaiz
    0
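
    A minimal way to sanity-check that rule combination with Python's standard-library parser; the subdomain and file name below are placeholders. Note that Google resolves conflicting rules by the most specific (longest) match, so the Allow wins for the home page regardless of order, whereas Python's parser applies rules in file order:

        from urllib.robotparser import RobotFileParser

        rules = [
            "User-agent: *",
            "Allow: /homepage.html",   # listed first so the stdlib parser matches it first
            "Disallow: /",
        ]

        rp = RobotFileParser()
        rp.parse(rules)

        print(rp.can_fetch("*", "https://sub.example.com/homepage.html"))   # True
        print(rp.can_fetch("*", "https://sub.example.com/any-other-page"))  # False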

  • Wish we were able to give instant 24/7 support, but alas we're just volunteers ¯\_(ツ)_/¯. If you have time, would you mind explaining how this issue was resolved?

    Technical SEO Issues | | zeehj
    0

  • Hi Ben, You're welcome! Good follow-up question. If your company (XYZ) places patients in facilities you don't own (Sunshine), then you aren't authorized to create GMB listings for Sunshine. Nor would any other service that places patients in those facilities. Sunshine would need to market themselves as the owner of that location. Hope that makes sense.

    Local Listings | | MiriamEllis
    1

  • Unfortunately it's going to be difficult to dig deeper into this without knowing the site - are you able to share the details? I'm with Martijn in that there should be no connection between these features. The only thing I have come up with that could plausibly cause anything like what you are seeing is something related to JavaScript execution (and this would not be a feature working as intended). We know that there is a delay between initial indexing and JavaScript indexing. It seems plausible to me that a serious enough issue with JS execution / indexing - whether that step failed outright, or made the site look spammy enough to get penalised - could produce the behaviour you describe, where the site ranks until Google executes the JS. I guess my first step in investigating this would be to look at the JS requirements on your site and consider the differences between the page with and without JS rendering (and whether there is any issue with the Chrome version that we know executes the JS render on Google's side); a rough first check is sketched below. Interested to hear if you discover anything more.

    Technical SEO Issues | | willcritchlow
    1
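
    A rough first check along those lines, assuming a page URL and a couple of phrases that should be ranking - both are placeholders here. If the phrases are missing from the raw (unrendered) HTML, that content only exists after JS execution and is exposed to the rendering delay described above:

        import urllib.request

        URL = "https://www.example.com/affected-page"                 # placeholder URL
        KEY_PHRASES = ["primary keyword", "important on-page copy"]   # placeholder phrases

        # fetch the HTML exactly as a non-rendering crawler would see it
        html = urllib.request.urlopen(URL).read().decode("utf-8", errors="replace")

        for phrase in KEY_PHRASES:
            where = "present in raw HTML" if phrase in html else "JS-rendered only"
            print(f"{phrase!r}: {where}")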

  • Since I see you are using the Google Translate API, keep the on-page SEO in mind for every single language.

    Technical SEO Issues | | Roman-Delcarmen
    0

  • Thanks Kate, I will do the best I can in light of your answers - though, as you've probably understood by now, with quite limited resources.

    Technical SEO Issues | | GhillC
    0

  • Thank you for the links and detailed reply. Looking at this template, do you think it could have iframe issues? https://goo.gl/7DhYsv

    Intermediate & Advanced SEO | | seoanalytics
    0

  • Hi Scott, As I've said before, having a lot of noindexed pages is not an issue. What you should focus your efforts on is finding any mistakenly noindexed pages. And yes, Yoast has a setting that makes archive and tag pages noindex. Hope it helps. Best of luck. GR

    Moz Tools | | GastonRiera
    0

  • If it's text-based stuff that's being hidden, or items which should be visible are marked up with schema - then yes, Google could potentially consider it to be cloaked content for the purposes of SEO manipulation. If your site has lots of SEO authority and is well trusted, something like this shouldn't be a problem. If your site is new and has a lot of convincing to do, it's just another needless negative signal being sent to Google for no reason. In SEO, it's rare for one thing to make or break a site. People say that SEO has no 'magic-bullet' solutions, so by the same token most optimisation methods (positive or negative) must carry relatively equal weighting. That means you should never ignore the small things! If you start picking and choosing which best practices to obey (or not), you'll quickly sink yourself.

    On-Page / Site Optimization | | effectdigital
    1

  • Usually this is accurate, but it could mean taking measures to insulate your SEO authority further. A 301 redirect won't transfer 100% of the link equity from one URL to another. If the pages are highly related and share much of the same content, almost all of the link equity flows through; if the pages contain significantly different content or are not related thematically, as little as 0% of the equity can flow through the 301 redirect (it's not a simple input / output equation).

    The SEO authority of a given URL is still partially (maybe mostly) defined by Google's PageRank equation. Whilst 'toolbar' PageRank is dead, 'real' PageRank (which SEOs have never seen) is still an integral ranking factor. Google still (for the most part) considers the web to be an amalgam of interlinked 'pages' (rather than websites, or domains). That's not to say that domain-level checks don't happen - they do. For the most part though, since Google lists individual web-pages in its results (not entire sites), page-level metrics remain extremely important.

    If you combine both of these pieces of knowledge, you'll see why Moz's Link Explorer may state that some of your URLs which now result in 301s are worth more (or more attention) in terms of your SEO. Other tools like Ahrefs or Majestic will do exactly the same thing; it's not accidental. The fact is that a page with loads of great backlinks will usually outperform another URL receiving similar-calibre link equity which is then diluted (a little or a lot) through redirects (even including the mighty 301!).

    Due to all of this, whilst the 301 redirect is a great measure to transfer as much equity to the new URL as possible, it's not 'as good' as having all of those links altered to point to your new resultant page. Link amends almost always outperform 301s if they are managed in their totality, though the viability of getting every link switched over (as the coding for those sites is not under your direct control) is minimal. The suggestion is always to put the 301 layer underneath, but to get as many of your links as possible actually shifted to point to your new address. Certainly your very best backlinks should be moved over. In this way, even if the new URL is a little different and Google's page comparison algorithm kicks off, you've partially circumvented some of the issue. (A quick way to audit the redirects themselves is sketched below.)

    Due to all these factors, migrations of any kind (even internal ones) often result in slight traffic dips and dents. That said, moving to new architectures which are better unlocks the long-term 'space' to achieve more than you ever did before. Without growing room, you stagnate (and in the competitive world of internet marketing, that's a big no-no).

    Other Research Tools | | effectdigital
    1
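
    A minimal sketch of that redirect audit, using the third-party requests library; the URLs are placeholders. The idea is to confirm each legacy URL answers with a single permanent (301) hop rather than a chain or a temporary redirect:

        import requests  # third-party: pip install requests

        OLD_URLS = [
            "https://www.example.com/old-page",  # placeholder legacy URLs
            "https://www.example.com/old-post",
        ]

        for url in OLD_URLS:
            resp = requests.get(url, allow_redirects=False, timeout=10)
            location = resp.headers.get("Location", "(none)")
            # expect 301 (permanent); flag chains and 302/307 responses for review
            print(f"{url} -> {resp.status_code} -> {location}")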

  • Thanks for the question! Just to clarify, when you say "references", do you mean that someone has mentioned your blog, or that someone has linked to you? Actual links to your blog will be more valuable for ranking purposes than just mentions. You can learn more about the benefits of link building here: https://moz.com/learn/seo/links-link-building For link data, our index updates daily. Moz crawls and indexes billions of pages, adding fresh link data every day. When discovered or lost links are found, we'll update our database to reflect those changes in your scores and link counts. We prioritize the links we crawl based on a machine-learning algorithm that mimics Google's index. Unfortunately, because it is run by an algorithm, we can't say for certain when a particular link will appear in our index. You can read more about our index here. Hope that helps!

    Link Explorer | | moz_support
    0

  • Tim Holmes gave a good answer, but it does assume your redirects are being applied via a .htaccess file, which is the usual method if your website is hosted on a Linux / Apache server. If your website runs on a Windows / IIS server, then instead of implementing your redirect rules via .htaccess, you'd be using web.config. Obviously most plugins (especially on common platforms like WordPress) are coded to interact with a .htaccess file; if you're running on IIS instead, they could break things or simply fail to function at all.

    On Google you can find many posts complete with web.config instructions: https://www.google.com/search?num=100&q=https+redirect+for+web.config This is the one which Google gives the knowledge-graph entry to: https://www.ssl2buy.com/wiki/http-to-https-redirect-using-htaccess-or-web-config The second part of that content deals with Windows; a hedged example is also sketched below.

    Checking that your SSL certificate is correctly installed, valid and provided by a supplier which Google accepts is highly advisable. If browsing to an HTTPS URL in Chrome yields warnings or 'not secure' messages, it's safe to say that Google has not accepted your SSL certificate. If you can't even browse to HTTPS URLs, something is likely wrong with the install! Hope that helps

    Technical SEO Issues | | effectdigital
    0
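
    For reference, a minimal sketch of an HTTP-to-HTTPS rule in web.config; this assumes the IIS URL Rewrite module is installed, and the rule name is arbitrary:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <!-- redirect any non-HTTPS request to the same path over HTTPS -->
                <rule name="HTTP to HTTPS" stopProcessing="true">
                  <match url="(.*)" />
                  <conditions>
                    <add input="{HTTPS}" pattern="off" ignoreCase="true" />
                  </conditions>
                  <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>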