Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Well, since you're already a technical SEO, you'll easily be able to apply what you learn about JS to it. Part of good onsite/technical SEO is ensuring that websites are easily crawled, have minimal load times and are structurally sound. We had three people in my agency who did nothing but JavaScript SEO, so it's definitely something worth learning. I think if you have a passion for coding, you should continually learn new languages, and you can apply your SEO background to helping a company decide which platform and processes to use when building sites. Google Developers has an entire training course on JS SEO; check it out: https://developers.google.com/search/docs/guides/javascript-seo-basics

    Educational Resources | | DarinPirkey
    0

  • Hello Darin. Maybe I didn't express myself correctly: I'm talking about the dot (".") in WordPress permalinks (slugs). When I enter "Domain.de Erfahrungen" as the post title, WordPress rewrites the permalink as "myproject.de/artikel/domain-de-erfahrungen", replacing the dot with a dash. I have already found a solution to change that, so the permalink now reads "myproject.de/artikel/domain.de-erfahrungen". What I want to know is the best strategy for Google. WordPress produces, by design, the permalink /domain-de-erfahrungen/; the Invasion forum just deletes the dot (/domainde-erfahrungen/); Trustpilot uses permalinks like /review/www.domain.de/. As my keyword is "domain.de Erfahrungen", which permalink should I use to get the best results in Google?

    Technical SEO Issues | | cwltd
    0

  • A couple of questions: 1) Are the aggregated reviews going to be for the entire site or just that single product? 2) Do you plan on having multiple people review each individual item, and are those reviews coming from users of the site or a third party? 3) How many products/pages do you have?

    Intermediate & Advanced SEO | | DarinPirkey
    0

  • There's actually no such thing as 'exactly similar'. If something was 'exactly similar' to something else, you would instead describe it as 'the same'. Similar means close to the same, but different; some degree of difference is implied! I'd try to make the pages as similar as possible, but if the URLs weren't exactly 'the same' I wouldn't be super worried. Another thing: are you confusing AMP with having a mobile site, or a mobile-responsive site? AMP is usually separate from your mobile site:
    1. Desktop site: loads for desktop/laptop users.
    2. Mobile site: may use the same URLs as your desktop site and load the same content, with responsive styling making the content suitable for a mobile display. Some older sites have separate mobile URLs, but this is generally frowned upon in modern SEO (e.g. ancient "m." subdomain sites).
    3. AMP pages (not a site): versions of your normal web pages which are stripped down and load really fast. Most mobile users in the first world will not see these very often, but if they're in an area with terrible signal/speeds, the AMP page can load instead of the regular mobile page: situational URLs for users who have permanent or temporary bandwidth issues.
    So it's the mobile site's pages and the AMP pages (#2 and #3) which should be similar. You should not be making #1 and #3 similar; it wouldn't make any sense to do that. I guess some desktop users might see AMP URLs from time to time, but I'd expect AMP to take over from the mobile page far more often.

    Search Engine Trends | | effectdigital
    0

  • In my experience, Google quite often sees this (rightly or wrongly) as an illegitimate attempt to increase your search footprint. Quite often some listings get banned, denied or removed, or increased scrutiny is put on your original listing. Google really wants separate listings for separate businesses, not separate listings for different services. That's why it's called Google My Business and not Google My Service: the core remit of the platform is to list businesses, not fragmented services. For those who try to get around this, sometimes it can work pretty well, but usually (in the end) it causes more problems than it solves. You could label your services as different businesses and give them different internal addresses (e.g. Unit 1, Unit 2, etc.). If that seems really inaccurate, then it probably means that your attempt to diversify online is jumping the gun a little (and Google may not embrace it).

    Local Listings | | effectdigital
    1

  • Yes, that was the question, thank you. It is correct that, for example, one sub-category with the exact same list of products appears in two different categories (but with a unique URL). So, as you say in the second part of your first paragraph, I should merge them or 301 redirect one of the categories to the other. If I understood your answer correctly, would it be better to choose one category (let's call it category1) as the "parent" for this sub-category, and then create a 301 redirect for the copy of the sub-category that sits in the second category (category2)? Thank you, Max

    Link Building | | Sodimaccl
    1

  • I'm glad I helped. As these are sites you work closely with, you can notify them about the issue so their webmaster can look into it; excessively generated links hurt their site even more than yours. As for the links being do-follow, that would be beneficial to your website if they weren't so excessive, so no need to mention that. On the pages where you can't find the link, try opening "inspect element", or right-click on the page, click "view page source" and then find (Ctrl+F) your site name. If your URL shows up, it means the link is in a hidden section of the site. If it doesn't show up, it means the link has been removed but Moz hasn't crawled the site yet to update it. If the links seem too spammy to you and the webmaster doesn't fix them, you can disavow the domains until a later time when the issues are resolved. Daniel Rika - Dalerio Consulting https://dalerioconsulting.com/ info@dalerioconsulting.com

    Link Building | | Dalerio-Consulting
    1
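The "view page source" check described above can also be scripted. A minimal sketch, using a sample HTML string in place of a live fetch (the domain names are hypothetical placeholders; in practice you would pipe in the real page, e.g. `curl -s https://linking-site.example/page`):

```shell
# Sample page source standing in for a live fetch
# (replace with: curl -s https://linking-site.example/page)
html='<footer><a href="https://yoursite.example/" rel="nofollow">Your Site</a></footer>'

# Count occurrences of your domain in the raw source;
# 0 means the link is gone (or Moz simply hasn't re-crawled yet)
echo "$html" | grep -c 'yoursite.example'
# prints 1
```

This finds links even when they sit in hidden sections of the page that don't render visibly.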

  • You need to clarify whether you mean images on their own page, or images on their own URL (two different things). This is an image on its own page: https://www.bloodstock.uk.com/events/boa-2019/gallery
    Depending upon the nature of the page, you may or may not want to de-index URLs like this. In the example of Bloodstock festival, it would be crazy to de-index their gallery images, which many people are explicitly looking for. In other circumstances you get 'weird' pages which end users are never meant to see: fragments with minimal styling and just the image. Those can usually be de-indexed. Sometimes an image on a single page is very useful for users (imagine if Pinterest banned all actual pins from the SERPs), but other times they're just back-end fragments which have escaped. Know the difference. This is an image on its own URL: https://assets-bloodstock.s3.amazonaws.com/uploads/captioned_photo/image/8557/display_desktop_Ross_the_Boss_Bloodstock2109_KatjaOgrin-94.jpg
    When you load this up, there's no container: no HTML, no site at all, JUST the image on its own. Don't de-index these, or Google can't see your images, even when they're embedded on a web page! Hope that helps.

    Intermediate & Advanced SEO | | effectdigital
    1

  • This may be a little bit outside the scope of the original question, but another use for hreflang tags is when you only have a single language, but localized content in different countries. So, for example, let's say you have web sites/pages in the US and Australia. Both are in English, but you differ the content for each market. In that case, you would use hreflang tags to relate the similar page/site between the two countries: "en-us" and "en-au". I realize that's not exactly what the original post asked, but I'm adding this info to what has already been answered.

    On-Page / Site Optimization | | seoelevated
    0
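As a sketch, the en-us/en-au relationship described above is usually expressed with reciprocal hreflang annotations in the `<head>` of both pages (the URLs here are hypothetical placeholders):

```html
<!-- On the US page -->
<link rel="alternate" hreflang="en-us" href="https://example.com/us/page/">
<link rel="alternate" hreflang="en-au" href="https://example.com/au/page/">

<!-- The Australian page carries the same pair of annotations,
     so the relationship is declared on both URLs -->
```

Each page references itself and every alternate; the annotations must be reciprocal or search engines may ignore them.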

  • From my perspective, this may be a very good strategy, and not a problem. It depends why you have 4 landing pages though. I see you linked your site, but I'm instead going to answer more generically, at least in part since I don't read or speak the language of your site. Let's hypothetically say you sell a type of day planner. And you've optimized your product page for the query "day planner". But you know that your day planner is highly relevant to teachers, personal trainers, doctors, and lawyers. You might want 4 very specific landing pages, targeting phrases like "day planners for teachers", with content on those pages which resonates with and helps teachers to understand how your one day planner would be great for their needs. And a separate page for the personal trainers, and for the doctors, and for the lawyers. Your product page might rank best for "day planner", but one of your landing pages might rank best for "day planners for teachers". And I think that's a valid strategy. As opposed to trying to get one page to rank well for all 4 of those audiences, which may also be a valid strategy. I've seen each of those strategies work, in different situations. It very much depends on the competition around your listings, and how they are targeting the audiences (or not), in terms of which is a better strategy (one page with multiple targeted queries, vs 4 pages with individual targeted queries).

    On-Page / Site Optimization | | seoelevated
    1

  • If the domain has already been redirected somewhere else, and if the redirects were accepted by Google, much of the authority for that domain may now have moved to a new location. In modern times, the practice of buying domains and 301 redirecting them for extra link juice is ineffective unless you are operating under very specific circumstances (and even then it's usually considered black-hat). Nowadays Google often checks whether the new content and pages are similar to the old ones; if they're not, then quite often the redirect doesn't work for SEO purposes.

    Inbound Marketing Industry | | effectdigital
    0

  • Check which architecture receives your highest-quality backlinks, and then force that structure using 301 redirects.

    Technical SEO Issues | | effectdigital
    1
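As an illustration, forcing one URL structure with 301 redirects might look like this in an Apache .htaccess file (a hedged sketch; the path patterns are hypothetical and would need adapting to the site's real architecture):

```apacheconf
# Permanently (301) redirect the legacy structure to the
# structure that holds the strongest backlinks
RewriteEngine On
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]
```

The 301 status tells search engines the move is permanent, so link equity is consolidated onto the surviving structure.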

  • I think you tried to attach two charts there, but for me at least they look like exactly the same chart. Entrances isn't the metric you usually use to evaluate what people call 'traffic'; that's more commonly associated with the sessions metric. Let's look at what you said you did:
    "1. Tried contacting website owner which we think spam and add all such domains to our disavow list" - this can, and usually will, only make your results go down further. Your view of what spam is will not be perfectly aligned with Google's view. As such, some of the links which you think are spam may have been giving you ranking power, which you have killed off using the disavow tool. If Google thinks a link is spammy, they nullify it themselves without a disavow. If you mark a link as disavowed, Google does not give you back the ranking power. Why would they? You are agreeing with them that the link is bad, so why would they give you ranking power for it? Disavows can (under normal circumstances) only make results dip or drop further. The use case for a disavow is if you believe your backlink profile is SO bad that Google is about to give you a manual link penalty (which will kill ALL of your results). If you think that is about to happen, you can stop it from happening with a disavow submission. A disavow trades away (loses) some of your current performance in exchange for future insulation against manual actions (which are WAY worse than algorithmic devaluation).
    "2. We found little duplicate content on sites like Quora, we made those answers down by reporting to Quora" - this probably won't do anything for your SEO. Quora probably still won't delete the posts, and if they're still there, it's still duplicate content (down-vote or not).
    "3. Reported to DMCA on 3 articles (partial) from our website." - this is a good move; keep doing this. Actually, focus more on this if you can find more stolen content! Use CopyScape to track it down.
    "4. We are trying improving user experience" - this is always good and you should always be doing this.
    "5. Removed one of our pages that was shared by many people but our page was not indexed by Google." - it won't really help to do this as a one-off. If you have loads of pages that Google refuses to index, then action might be needed. But if it's just one page and people liked to share it, it seems kind of crazy to me to erase it.
    "6. Checked and modified content if any of our articles are having more keywords than what SEO experts recommend." - just so you know, keyword density is not an accepted measurement in modern SEO. If your articles previously read as really spammy, you did the right thing. If they read fine anyway AND got the keywords in, you may have hurt yourself by doing this.
    "7. We are working on researching more and figuring out what else might have gone wrong with our traffic." - a focus on R&I is healthy.
    "8. Working on improving EAT" - a lot of sites got stung, are still getting stung and will continue to be hurt by this. If you're writing for finance, this massively concerns you and should probably be your #1 thing, not your #8 thing. Really focus on this a lot.
    "I attached our traffic drop graph. I believe this drop is not natural it happened because of some issue at our end and we are not able to figure out the exact reasons." - it could be many things, but poor EAT on a finance site will kill your site in 2019.
    "Surprisingly another site with not so high quality content started ranking now in the top." - that's your opinion and you are welcome to it, but the quality of your site and other sites is being determined by mathematical algorithms and not by human minds. What you think is quality content may be very far removed from what Google's mechanical mind perceives as high quality. Another thing: older, more established sites can rank above yours with lower-quality content than you have (as their SEO authority and trust are higher). You need to think about winning trust and links. Maybe some crappy sites do rank well, but when they were first made they filled a hole (in the query-verse), or they were good for their time. You have to be good in YOUR time. What they did to earn their success (which they ride along on) may have been drastically less than what you have to do in 2019. Never forget that. Comparative analysis-paralysis doesn't get you ahead; it holds you back. Vision is what's needed now.
    "I am here to get community members/experts help on this. I could provide you if you need any further details. Thanks a lot for your time. We really appreciate any tips that you can share with us." - not a problem; hopefully some of my comments have proven useful to you guys.

    Intermediate & Advanced SEO | | effectdigital
    0

  • I don't work for Moz, but Moz Pro refunds are considered on a case-by-case basis. Email help@moz.com from your account email and provide the last four digits and the name on the card that was charged; the team will be able to review your account and respond within 24 hours.

    Technical Support | | DarinPirkey
    0

  • I would gravitate towards marking everything up and letting Google decide what they want to show. When you try to 'sculpt' what Google can see in terms of structured data, it usually results in a structured-data spam action. Sometimes it can take weeks, months or years for that to happen, but Google always wants to be given the full picture and doesn't take too kindly to being funneled in a certain direction. Schema and rich-snippet spam have been a big headache for Google since they started utilising structured data more; some things (like author avatars for posts in SERPs) have been entirely taken away in the past (though someone told me recently that they have been seeing these again, for Google's mobile layout only). Google does have some official guidance here: https://developers.google.com/search/docs/data-types/faqpage They give an example of the implementation: https://search.google.com/test/rich-results?utm_campaign=devsite&utm_medium=microdata&utm_source=faq-page In their example, nothing is missing or has been left out. Since that's how Google has illustrated their example, that's what I'd aim for myself.

    Intermediate & Advanced SEO | | effectdigital
    0
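For reference, a minimal FAQPage markup following the pattern in Google's documentation looks like this (JSON-LD shown here; the question and answer text are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Example question?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Example answer."
    }
  }]
}
</script>
```

In keeping with the advice above, every question/answer pair visible on the page would be included in mainEntity, with nothing held back.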

  • Thank you for the kind response; always a pleasure helping other marketers and business owners. Once you disavow backlinks, Google (or any other search engine where you disavow them) will no longer count the links when ranking your site, but they will still be present where posted and will still show up when researched on tools like Moz, SemRush, Ahrefs, etc. The backlinks are not removed from Moz because the list of disavowed links is stored in the search engine's own database for its use. For future disavowing, you can check the newly added links, or filter the site's full backlink list against the disavowed links list. You can find the disavow links tool in the search console/webmaster tools of the two main search engines: Google and Bing. If you need help with the disavow process for your site, feel free to let us know. Daniel Rika - Dalerio Consulting https://dalerioconsulting.com/ info@dalerioconsulting.com

    White Hat / Black Hat SEO | | Dalerio-Consulting
    1
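For context, the disavow file itself is a plain-text upload; a minimal sketch (the domains are hypothetical placeholders):

```text
# Disavow file uploaded via Google Search Console's disavow tool.
# Lines starting with # are comments.

# Disavow every link from an entire domain:
domain:spammy-site.example

# Or disavow a single linking page:
http://another-site.example/bad-page.html
```

Because the file lives with the search engine rather than with Moz or the linking sites, the links keep appearing in third-party tools after a disavow, exactly as described above.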