Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Yep, as far as I am aware it is only Mainland UK. Sorry about that!

    Moz Local | | eli.myers
    0

  • Hi GR, Thank you very much for answering my question and taking the time to send an image of the hierarchy. We will surely take your suggestion to remove .html and include H1 tags. You are very helpful. Thanks again. Regards, Abdulw

    On-Page / Site Optimization | | abdulw
    0

  • Hi Dieumerci, Gaston provided a great response to your question! Did it help answer your question? If so, please mark it as a "Good Answer." Regardless, we'd love to have an update on your duplicate content issue. Christy

    Moz Tools | | Christy-Correll
    0

  • Hmmm, it depends on the product and the number of reviews. I would have a database holding your reviews, so you just add to the DB and it updates your dynamic variables per schema. If it is for a brand and the same aggregate review is shown across the entire site, you could maybe have a config file that references a single dynamic field, which would update the entire site; this method may not need a DB. You may have to do some legwork in the first instance to make the figures dynamic. I tend to use PHP and MySQL for this purpose.

    Search Engine Trends | | TimHolmes
    1
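The database-backed approach in the answer above could be sketched roughly as follows. This is only a minimal illustration: the answer mentions PHP and MySQL in general terms, but the same idea is shown here in Python with SQLite so the sketch is self-contained. The table layout, product ID, and ratings are all hypothetical.

```python
import sqlite3
import json

# Hypothetical schema: one row per customer review.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reviews (product_id TEXT, rating INTEGER)")
conn.executemany(
    "INSERT INTO reviews VALUES (?, ?)",
    [("widget-1", 5), ("widget-1", 4), ("widget-1", 4)],
)

# Aggregate the stored ratings, then emit schema.org AggregateRating
# JSON-LD so the on-page markup always reflects the current DB contents.
count, avg = conn.execute(
    "SELECT COUNT(*), AVG(rating) FROM reviews WHERE product_id = ?",
    ("widget-1",),
).fetchone()

jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Widget 1",
    "aggregateRating": {
        "@type": "AggregateRating",
        "reviewCount": count,
        "ratingValue": round(avg, 2),
    },
}
print(json.dumps(jsonld))
```

New reviews only need an INSERT; the markup picks up the new aggregate on the next render, which is the "add to the DB and it updates your dynamic variables" idea.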

  • Hi xdunnings, I would strongly advise you to use relevant keywords as anchor text in your content instead of 'click here'. When you use 'click here' as your link text, you just let search engines know that your content contains a link. By using keywords in your link text instead, you let search engines know what content visitors may expect when clicking the link. Search engines' ranking algorithms are partly based on inbound links to a website, so the text used for these links matters to the eventual ranking. Another reason is that people don't always read your whole page or article. It is therefore important for visitors and search bots that you use clear heading tags and anchor text on your page or article, so they can get a quick and clear overview. Hope this helps you!

    On-Page / Site Optimization | | WeAreDigital_BE
    0
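The advice above amounts to a one-line change in the markup (URL and keyword here are hypothetical):

```html
<!-- Weak: the anchor text tells search engines nothing about the target -->
<a href="/guides/keyword-research">click here</a>

<!-- Better: descriptive anchor text signals what the linked page is about -->
<a href="/guides/keyword-research">keyword research guide</a>
```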

  • Hi xdunningx, The focus keyword is an arbitrary concept made up by Yoast - it forces you to think about what the piece should rank for in terms of a single target keyword or phrase. It should ideally be one to three words. As long as the target keyword you want to rank for is on the page, preferably in the H1 and used as the featured image ALT text, you are good to go. Yoast is great, but it does have an irritating habit of failing to recognise the focus keyword on the page. Regards Nigel

    Keyword Research | | Nigel_Carr
    0

  • "How long does it take before it removes most of the duplicated search result pages from the index?" Every site is different, but I have seen it take 6-9 months for pages to drop out. "Is it still crawling those pages but has not fully decided to remove most of them?" It's possible. As Gaston has already pointed out, search engines will need to access those pages again to see that you want them noindexed. "How bad is this for SEO?" It temporarily dilutes the amount of SEO equity available to flow to the pages you DO want indexed.

    On-Page / Site Optimization | | DonnaDuncan
    0
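For reference, the noindex directive this thread discusses is normally applied per page with a standard robots meta tag:

```html
<!-- Placed in the <head> of each duplicate search-result page -->
<meta name="robots" content="noindex, follow">
```

As the answer notes, crawlers must still be able to fetch the page to see this tag, so the same URLs should not simultaneously be blocked in robots.txt.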

  • I suggest this free proofreading tool. It works great.

    On-Page / Site Optimization | | mp3converter
    0

  • Hi Gaston, thanks for the response. I didn't even think of the disavow! Great reminder. Fingers crossed it registers soon. I'd like my stats to be a bit more reliable.

    Link Building | | ACEmina
    0

  • That would be a good outcome! Thanks for starting an interesting discussion.

    Local Listings | | MiriamEllis
    0

  • The semantic purpose of a header tag is to mark a collection of words as a header introducing a section of descriptive content. If what you have instead is a series of headers with no content in between, you're defeating that purpose. There's no "reason" to mark those list elements as headers. If you're using them because it's an easy way to create the styling for those terms, you should get out of that habit. There's no major damage done by having headers follow each other, but there's no real benefit either. And whatever tiny value those headers bring in helping search engine crawlers better understand your page content, you've thrown it away. In other words - don't do it. Hope that helps? Paul

    Search Engine Trends | | ThompsonPaul
    0
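Paul's point can be shown as a small before/after sketch (the class name is hypothetical):

```html
<!-- Avoid: consecutive headers with nothing between them, used only for styling -->
<h3>Fast shipping</h3>
<h3>Free returns</h3>
<h3>24/7 support</h3>

<!-- Better: a plain list, with CSS handling the visual appearance -->
<ul class="feature-list">
  <li>Fast shipping</li>
  <li>Free returns</li>
  <li>24/7 support</li>
</ul>
```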

  • Hi Coppell, Google puts the burden on business owners to become aware of and read Google's guidelines. They won't stop you from creating most listings that violate the guidelines, but these listings can then be taken down for violations if a) Google catches them, or b) a consumer or competitor reports them. The only time you should create a Google My Business listing is when it is guideline-compliant. Otherwise, you are putting effort into something that's a liability to your brand and reputation instead of an asset. So, unfortunately, you were unaware of Google's guidelines here and have built rankings and reviews around an ineligible address. Regarding the new address, I had asked some questions. You mention it has no postal delivery. Why is that? Is it on a brand-new street? How do the business owners occupying your street receive mail? Also, you mention you are sharing this office with a friend. Are they in the same business category as you?

    Local Listings | | MiriamEllis
    0

  • Sorry - I missed the part about you looking specifically at the Moz crawler. While useful, it's a stand-in for what will actually be used for rankings - namely the actual crawls by the search engine crawlers themselves. I'd go straight to the source for that info if you're concerned there's an issue, rather than trusting just Mozbot. You can find the SE crawlers' data in Google Search Console and Bing Webmaster Tools. Look for trends and patterns there, especially around the sitemap report. The challenge with a Screaming Frog-generated sitemap is that it can only find what's linked. If the site has orphaned pages or an ineffective internal linking scheme, a crawl could easily miss pages. It's certainly better than no sitemap, but a map generated by the site's own technology (usually the database) is safer. P.

    API | | ThompsonPaul
    0

  • Thanks Eli! I guess I was wondering if the Moz Bot only followed pages that were in the sitemap. It was generated by Screaming Frog, and I have trusted it to include all relevant pages. I have put a more detailed description in the response below. Overall I need to investigate further, but I'm satisfied that the sitemap has not caused the drop!

    API | | Slumberjac
    0

  • Hi Arun, It's certainly best practice to move to the root domain. As you say, visitors then come to one domain, not a subdomain. All you need to do is redirect page by page with a 301. When you say they are being 'redirected to 301 code', this is perfectly OK. The 301 code just tells Google that the page has moved permanently. It takes Google a short while to recognise the new pages as replacing the old ones, and for that period you can see both old and new in Google, causing a short spell of duplication which could affect rankings. You just need to sit it out - by all means do a Fetch in Search Console to help speed up the process: Search Console > Crawl > Fetch as Google. Regards Nigel

    Technical SEO Issues | | Nigel_Carr
    0
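The page-by-page 301 from a subdomain to the root domain can be expressed, for example, in an Apache .htaccess file. The hostnames below are hypothetical, and the thread does not say which web server is in use, so treat this as a sketch:

```apache
# Redirect every path on the subdomain to the same path on the root domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/blog/$1 [R=301,L]
```

Because the path is captured and reused, each old URL maps to its exact counterpart, which is what "page by page" means here (as opposed to redirecting everything to the homepage).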

  • Simone, I meant setting the configuration in Google Search Console: go to Search Traffic -> International Targeting. There, go to the Country tab and select your country. Best of luck. GR.

    Intermediate & Advanced SEO | | GastonRiera
    0

  • Hi, I would advise against this for a couple of reasons. First of all, dropping and re-adding pages to the index isn't quick; it often takes weeks before Google will obey a noindex tag. Second, even if it were quick, it's going to cause problems at the authority level - is your site a trustworthy source of information? This week yes, next week no. It's best to have long-term content that can build trust and authority over time, rather than ephemeral pages. An alternative approach you might take is writing content about the particular trials offered; this will help prevent thin content. You might also consider adding a call to action on empty pages that prompts users to provide an email address so you can notify them when trials of type XYZ have opened back up.

    Intermediate & Advanced SEO | | LoganRay
    0

  • I know this is an old thread, but we are still having the same problem. I finally got around to sending a note to Flywheel about this problem, and it came back that everything is fine. I am not sure what to do here. It's on a shared host, so I don't have console/audit log access; however, Flywheel is one of the best WordPress hosting companies out there (hosting is the only thing they do). As far as accessing the robots.txt file goes, I can go directly to it without any problems: https://southernil.com/robots.txt

    Getting Started | | DougDeVore
    0