Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Doing this will give the 10 active blog posts (the ones currently rendered in the homepage listings) a better chance of ranking for long-tail terms, but yes, **some PageRank will be lost from your homepage** (or rather, redistributed). If you have lots of SEO authority, you're doing the right thing! There's a reason that farmers use complex irrigation systems; those same systems would be poorly placed to water the daffodils hanging from your balcony. Although the question I answered just a few moments ago is slightly different to your own, you will find (if you read my full answer) that there are lots of relevant parallels to what you are asking: https://moz.com/community/q/best-structure-for-a-news-website-including-main-menu-nav More links, a more expansive nav and increased granularity of URL architecture are all great things. The only problem is that, like a complex irrigation system, they need a large water supply (or rather, a large pool of SEO authority to draw from). If you don't have lots of 'juice' to spread, you could bleed out instead. If you only have a bucket of water to begin with, then a simpler, more streamlined approach is usually better. You don't want the system to spray one tiny droplet of authority (which makes no difference) to thousands of pages. For every tactic there is a proper time and place. I'm not trying to push you in either direction, just to let you make a more informed decision for yourself.

    Behavior & Demographics | | effectdigital
    0

  • Not a problem, I always try to give a solid answer. Maybe you could compromise and have a "categories" entry that breaks down into the main categories (but not all of them), or something like that. Always remember, you can hedge your bets and test a bit! Having every news category in the top-line nav, breaking down into sub-categories, could be excessive. It's a shame that triple-expansion nav never caught on (then you could have categories -> category -> sub-category on hover). Whilst that is technically feasible, it's not really very user-friendly (at all), as it makes menus really jumpy and dysfunctional (in most instances, anyway). On lots of eCommerce sites now I notice that they have auto-completing product (or category) based pseudo-search bars: you type a bit of a word, and all the relevant products come back, and you click on one (instead of 'entering' the search and seeing a page of results). Maybe you could innovate and create something similar for news. Have people type a bit of a category (or tag / topic) and then pre-fill with a couple of categories and a load of articles (or something like that). Just throwing out ideas, not that yours is bad! I'm just always trying to think, "how could this be more?"

    On-Page / Site Optimization | | effectdigital
    0

  • Well, in that case, you need to focus on two things. First, internal linking: use this search operator, site:yourwebsite.com "keyword". That way, you can find pages on which to add strong internal links using your main keyword. You should also try schemas, adding your brand info, and you should check all the do-follow links pointing to your competition and compare them with your own backlink profile. There are several ways to do it, but without any other information, there is not much more advice that comes to mind. Regards and good luck
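    As a hypothetical illustration of the operator above, you could generate one `site:` query per target keyword and paste each into Google to find internal-link opportunities (the domain and keywords below are placeholders, not real data):

    ```python
    # Build Google "site:" search queries to find internal-link opportunities.
    # The domain and keywords are placeholder examples.
    def build_site_queries(domain, keywords):
        """Return one 'site:' query per keyword, quoted for exact match."""
        return [f'site:{domain} "{kw}"' for kw in keywords]

    queries = build_site_queries("example.com", ["blue widgets", "widget repair"])
    for q in queries:
        print(q)  # e.g. site:example.com "blue widgets"
    ```

    Each result page from such a query is a candidate location for an internal link using that keyword as (or near) the anchor text.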

    Intermediate & Advanced SEO | | Roman-Delcarmen
    0

  • We don't know why these links have been counted. For example, on our main page all the "href"s, including the menu, footer and the other hrefs inside the page, add up to about 51, but the figure we see in our webmaster tools is about 400 (please take a look at this screenshot). This unusual number has hurt our site and ranking, and we don't know how to solve the problem
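    One way to sanity-check a discrepancy like this is to count the `href` attributes in the rendered HTML yourself and compare that against what the webmaster tools report. A minimal sketch using only Python's standard library (the sample HTML is a stand-in for your real page source):

    ```python
    from html.parser import HTMLParser

    class HrefCounter(HTMLParser):
        """Counts every href attribute in an HTML document."""
        def __init__(self):
            super().__init__()
            self.count = 0

        def handle_starttag(self, tag, attrs):
            if any(name == "href" for name, _ in attrs):
                self.count += 1

    def count_hrefs(html):
        parser = HrefCounter()
        parser.feed(html)
        return parser.count

    # Placeholder HTML standing in for the real page source.
    sample = '<nav><a href="/">Home</a><a href="/blog">Blog</a></nav><link href="style.css">'
    print(count_hrefs(sample))  # 3
    ```

    If the count on the raw source is far below what the tools report, the extra links are likely being injected by JavaScript, a plugin, or a template included on other pages.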

    Technical SEO Issues | | jacelyn_wiren
    0

  • The add-on only appears to import MozRank and link targeting information. Can you confirm that rank can be imported? Thanks!

    Other Research Tools | | Warren.Wittman
    0

  • Thanks for your reply! I'm wondering if Google would rank the same page for keywords in both American English and Canadian English. Let's say the page has the words 'color' and 'colour'; could it rank that page for both?

    Content & Blogging | | marktheshark10
    0

  • “Keyword density, in general, is something I wouldn’t focus on. Search engines have kind of moved on from there.” (John Mueller, Google, 2014) Check this video from Google: https://www.youtube.com/watch?v=Rk4qgQdp2UA and see https://www.hobo-web.co.uk/keyword-density-seo-myth/ If you want to check the optimization level for a keyword, use **On-Page Grader**. Despite what many SEO tools would indicate, the short answer is that, in my experience, there is no IDEAL %. There is no one-size-fits-all optimal 'keyword density' percentage that anybody has ever demonstrated had a direct positive ranking effect in a public arena. I certainly do not believe there is a particular percentage of keywords in a body of text that will get a page to number 1 in Google. While the key to success in many niches is often simple SEO, search engines are not that easy to fool in 2018.
I write natural page copy which is always focused on the key phrases and related key phrases. I never calculate density in order to identify the best %; there are way too many other things to work on (I did look at this, a long time ago). IN SUMMARY: it's an outdated concept from the paleolithic era of search engines. Add your keyword to your title, headline and meta tags, and that's all; please forget about it. Focus on relevant tasks such as schemas, internal linking, site performance, link building and AMP, and most importantly focus on creating good content/copy rather than on density. Even if you are not a blogger or publisher but a small business owner, hire a good writer who can create homepage copy that really converts (1,500 words) and forget about keyword density.
That is how your content should look:
**Main Keyword**
**Keyword Related -- Service 1**
**Keyword Related -- Service 2**
**Keyword Related -- Service Areas**
**Keyword Related -- FAQ**
Hope this info will help you. Regards
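    For what it's worth, keyword density is trivial to compute, which is partly why tools over-emphasise it. A quick sketch, where the copy and keyword are made-up examples:

    ```python
    import re

    def keyword_density(text, keyword):
        """Percentage of words in `text` that exactly match `keyword` (case-insensitive)."""
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return 0.0
        hits = sum(1 for w in words if w == keyword.lower())
        return 100.0 * hits / len(words)

    # Placeholder copy, not a real page.
    copy = "Great widgets. Our widgets ship fast, and every widget is tested."
    print(round(keyword_density(copy, "widgets"), 1))  # 18.2
    ```

    Note that the singular "widget" doesn't count as a match here, which illustrates how arbitrary the metric is: a search engine treats those as closely related, while a naive density formula does not.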

    Content & Blogging | | Roman-Delcarmen
    0

  • Either way, switching canonicals from one to another will do no good in terms of equity, nor in terms of spamminess. If you have hreflangs done, then you should be good.

    Intermediate & Advanced SEO | | DmitriiK
    0

  • Yeah, that could make sense! I haven't been able to replicate it so definitely a weird example. Thanks!

    Local Listings | | adlev
    0

  • Can anyone think of any other reasons why the images wouldn't get indexed? The only thing I can think of is that your subdomain might be serving the wrong status codes for your images; unlikely, but possible.
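    A quick way to rule out the status-code theory is to pull the image URLs and their response codes from a crawler export and flag anything not returning 200. A minimal sketch; the URLs and codes below are invented sample data:

    ```python
    def flag_unindexable_images(results):
        """Given (url, status_code) pairs, return the URLs that don't return 200 OK."""
        return [url for url, status in results if status != 200]

    # Invented sample data standing in for a crawler or log export.
    crawl = [
        ("https://img.example.com/a.jpg", 200),
        ("https://img.example.com/b.jpg", 404),
        ("https://img.example.com/c.jpg", 301),
    ]
    print(flag_unindexable_images(crawl))
    ```

    Anything in that flagged list (404s, or images stuck behind redirect chains) is a plausible reason for non-indexation.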

    Technical SEO Issues | | ThomasHarvey
    0

  • If that company does not have your legal permission, you can take action; I had the same issue in the past. First I tried to contact the source, without any response; in the end, a court gave us permission to get those articles taken down via the hosting companies. For example, if a site hosted on HostGator uses our brand without our permission, I can request that HostGator delete that content; usually HostGator deactivates the site until the content is deleted by the owner

    White Hat / Black Hat SEO | | Roman-Delcarmen
    1

  • In my opinion, it would be a business decision that's then implemented by the SEO. So ask your bosses these questions, and ask what experience they want for their customers.

    Technical SEO Issues | | ThomasHarvey
    0

  • Hi there. It's a bit difficult to understand your actual question, but as far as I can see, you are asking whether you should remove any unrelated (and therefore unapproved) images from the backend of the gallery, and whether that would have any SEO effect. Am I correct? If so, then I see a couple of potential benefits of removing those photos from the backend. First, just for the mental health of the administrator: I'd go nuts if I had to scroll through a bunch of old junk every time I had to moderate new photos. Second, depending on how your frontend and backend are connected and process things, it might in fact speed up rendering of the gallery on the frontend. Faster website = happier users = more conversions. Cheers.

    Intermediate & Advanced SEO | | DmitriiK
    0

  • Pages that I like to call 'core' site URLs should go in your sitemap: basically, unique (canonical) pages which are not highly duplicate and which Google would wish to rank. I would include core addresses. I wouldn't include uploaded documents, installers, archives, resources (images, JS modules, CSS sheets, SWF objects), pagination URLs or parameter-based children of canonical pages (e.g. example.com/some-page is OK to rank, but not example.com/some-page?tab=tab3). Parameters are the additional funky stuff added to URLs following "?" or "&". There are exceptions to these rules; some sites use parameters to render their on-page content, even for canonical addresses, but those old architecture types are fast dying out. If you're on WordPress I would index categories, but not tags, which are non-hierarchical and messy (they really clutter up your SERPs).
Try crawling your site using Screaming Frog. Export all the URLs (or a large sample of them) into an Excel file. Filter the file and see which types of addresses exist on your site and which technologies are being used. Feed Google the unique, high-value pages that you know it should be ranking.
I have said not to feed pagination URLs to Google; that doesn't mean they should be completely de-indexed. I just think that XML sitemaps should be pretty lean and streamlined. You can allow things which aren't in your XML sitemap to have a chance of indexation, but if you have used something like a Meta no-index tag or a robots.txt edit to block access to a page, **do not** then feed it to Google in your XML. Try to keep **all** of your indexation modules in line with each other! No page which points to another, separate address via a canonical tag (thus calling itself 'non-canonical') should be in your XML sitemap. No page that is blocked via Meta no-index or robots.txt should be in your XML sitemap either. If you end up with too many pages, think about creating a sitemap XML index instead, which links through to other, separate sitemap files. Hope that helps!
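    The filtering rules above can be sketched as a simple predicate over crawled URLs. Everything here (the resource extensions, the pagination pattern, the sample URLs) is illustrative; a real site would need its own rules and exceptions:

    ```python
    from urllib.parse import urlparse

    # Illustrative list of static-resource extensions to exclude.
    RESOURCE_EXTENSIONS = (".jpg", ".png", ".css", ".js", ".swf", ".pdf", ".zip")

    def belongs_in_sitemap(url):
        """Rough filter: keep canonical content pages; drop parameterised URLs,
        pagination and static resources."""
        parsed = urlparse(url)
        if parsed.query:                                   # e.g. ?tab=tab3
            return False
        if parsed.path.lower().endswith(RESOURCE_EXTENSIONS):
            return False
        if "/page/" in parsed.path:                        # common pagination pattern
            return False
        return True

    urls = [
        "https://example.com/some-page",
        "https://example.com/some-page?tab=tab3",
        "https://example.com/assets/app.js",
        "https://example.com/blog/page/2",
    ]
    print([u for u in urls if belongs_in_sitemap(u)])
    ```

    Run something like this over a Screaming Frog export and you get a lean candidate list for the XML sitemap; you'd still want to remove non-canonical and no-indexed pages by hand (or from crawl data) before generating the file.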

    Intermediate & Advanced SEO | | effectdigital
    0

  • There's nothing technically wrong with the link / page: it has a self-referencing canonical which doesn't point elsewhere; the link does not utilise a no-follow tag; the page is not blocked from indexation via Meta no-index or robots.txt; and the domain and sub-domain level metrics for this site are strong in Majestic, Moz and Ahrefs!
The Moz link explorer will only handle domains, sub-domains or exact URLs. On Ahrefs and Majestic I also have the option to check SEO authority by 'path', meaning the cumulative SEO authority and inbound metrics for all pages contained within the "/employabilitypoints/" folder. Although the sub-domain has nearly 100,000 backlinks connecting with it, I can only find evidence of 7 backlinks to this specific section of the site (and not all of those are still live). It may be that although the site is pretty good, the area of the site where you gained the link isn't worth that much. As such, all crawlers (including Moz, Googlebot, Ahrefs etc.) are unlikely to prioritise updating their databases for this specific area of the web. Although all crawlers work differently and follow different paths when indexing web content, most of them will prioritise 'more important' pages (or site sections) for updates; different crawl bots determine what they feel is important in different ways. Although neither Moz nor Ahrefs seems to have picked up on the link, Majestic SEO has, and using Ctrl+F you can find it recorded here: https://majestic.com/reports/site-explorer/top-backlinks?folder=&q=jftwines.com&oq=jftwines.com&IndexDataSource=F

    Link Building | | effectdigital
    0

  • Always avoid knee-jerk reactions to these kinds of things. It's possible that rather than the new location being the issue, the method of migration was more of a problem; if that's the case, then moving back wouldn't help anyway (as you'd have to go through the migration again, presumably)! I'm not saying that is or is not the case, just trying to highlight that you don't yet know what the problem is (and thus any hasty action is extremely unwise).
You need to understand what an enquiry is, or what it used to mean on your old site. Does that mean a phone call, an email, a contact form submission (maybe all of these things)? Was one of these far and away the most popular method of contact? If so, which one? That gives you a place to start your detective work.
It may be that your contact forms are still coded to send mail to an inbox on your old domain which can no longer receive the comms, and thus no enquiries are being processed. Maybe, due to the change of domain, the event-tracking code needs to be adapted (this could be the case if you changed your UA property / number in GA when you moved).
If it's calls that are down, what call-tracking solution are you using? There are loads. Usually they run a script on your site to swap out a phone number and thus attribute calls from visitors who were acquired via different channels (PPC, SEO, email, display, affiliates etc.). If you left all that in place, maybe you need to log in to the back end of the call-tracking system and update it there to listen for calls from the new domain! Maybe it's just something small and silly like that.
Because you didn't give loads of detail I can't help huge amounts (unless you share more), but I hope that what I have written here will help you track down the culprit. To me it sounds like a tracking issue rather than a performance issue (but I could be wrong). Always remember that the web has thousands of ways to trip you up if you aren't paying attention. Often developers will push for a 'logical', one-size-fits-all solution; it's only afterwards that people realise they needed an expert!

    Intermediate & Advanced SEO | | effectdigital
    0

  • Thanks for the response. Conversions and revenue are down by a significant amount. I know that the update is supposed to be about the relevance of content to users' intent etc., and it would all be fine if conversions stayed at the same level, but they have not; that's why I am raising this question here. "Due to that, I'm not really seeing the same correlation which you are seeing": sorry, the screenshots were not very representative; here are ones where you can see what I'm talking about clearly (I included higher-volume keyphrases only, and that's where we are having the problem): https://dmitrii-regexseo.tinytake.com/sf/MzAyNjgzN185MDczNDY1 https://dmitrii-regexseo.tinytake.com/sf/MzAyNjg0MV85MDczNDkz "Are you sure you're going down - and that others aren't investing more and going up?": well, I don't know how much others are investing, but looking at competitive research, I'm not seeing anything extra the competition is doing; across all available metrics (content freshness, amount of content, backlinks etc.) we are outperforming them. "As you said you had done nothing on your site": not really; we are always improving, I'm just saying that there weren't any major overhauls or anything drastic.

    Search Engine Trends | | DmitriiK
    0

  • Hey Roman! This is possible with the use of our API and the Google Sheets API, allowing you to write results to a Google sheet. That would require a developer to build the functionality for you, but it's definitely possible. Here are some resources if you wanted to try it out yourself. Google API documentation: https://developers.google.com/sheets/api/ Moz API documentation: https://moz.com/help/links-api Hope that helps; let me know if you need anything else.
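    As a rough sketch of what the glue layer between the two APIs might look like: you would fetch link metrics from the Moz API and shape them into the `{"values": [[...]]}` body that the Sheets `values.append` endpoint expects. The field names below are placeholders for illustration, not actual Moz API response fields:

    ```python
    def to_sheets_payload(link_rows):
        """Convert a list of link-metric dicts into the {'values': [[...]]} body
        expected by the Google Sheets API values.append call.
        The keys below are placeholders, not guaranteed Moz API field names."""
        header = ["source_url", "target_url", "domain_authority"]
        values = [header]
        for row in link_rows:
            values.append([row.get(k, "") for k in header])
        return {"values": values}

    # Invented sample row standing in for a parsed Moz API response.
    rows = [{"source_url": "https://a.example",
             "target_url": "https://b.example",
             "domain_authority": 42}]
    print(to_sheets_payload(rows))
    ```

    The actual HTTP calls (authenticating against both APIs, paging through Moz results, posting to the Sheets endpoint) are the part a developer would need to build around a shaping function like this.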

    API | | dave.kudera
    1