Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Well, to prevent information/data leakage you should certainly disable directory browsing. For example, on your homepage I can right-click your logo image and copy the image URL: https://www.myqurantutor.com/wp-content/uploads/2019/07/MY-QURAN-TUTOR-LOGO-400x56.png - but I can edit the link down to the directory level, for example: https://www.myqurantutor.com/wp-content/uploads/ Now I can see all your uploads, ever: https://d.pr/i/C7DTY4.png (screenshot). I can browse all your folders, even some backup files. There's also some info I could use to fingerprint your site build if I wanted to. To patch this, usually all you have to do is add "Options -Indexes" to your .htaccess file. I didn't detect a firewall shielding your site, which would make it way easier to DDoS if someone wanted to do that; some kind of firewall or traffic-offloading facility might be useful. Your site isn't sending an HSTS header ("Strict-Transport-Security"), so browsers can attempt to connect via HTTP without being intercepted (you may handle that via redirects instead, but an HSTS policy helps). You also don't seem to be sending "X-Frame-Options", which tells browsers whether content from your site can be rendered inside frames on other domains. If you allow frame embeds, that can lead to clickjacking and the like (though for some webmasters there's no real way around it, as allowing their site's content to be embedded may be a requirement). I can't really find any fields which seem as if they would be vulnerable to SQL injection, but I'm not really an expert at scanning for that kind of thing, so I'd still lock the site down from an SQL-injection perspective if you haven't done so already.
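A minimal .htaccess sketch combining the fixes above (assuming Apache with mod_headers enabled; the header values are illustrative, so tune them before deploying):

```apache
# Disable directory listings (fixes the /wp-content/uploads/ browsing issue)
Options -Indexes

<IfModule mod_headers.c>
    # Tell browsers to use HTTPS only, for the next year
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
    # Refuse to be rendered inside frames on other domains
    Header always set X-Frame-Options "SAMEORIGIN"
</IfModule>
```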

    Educational Resources | | effectdigital
    1

  • Doing a "site:" search for your website shows that there are over 90k pages sitting in the index. I'm not sure how accurate that is, but at least the site is properly indexed. I'm not sure what you mean by "how can i change my content" - this is a WordPress site, so I'm assuming adding/changing content isn't a difficult task since you have what appears to be a well-built site.

    Intermediate & Advanced SEO | | WebQuest
    0

  • The galleries are image-only, so I am sure you will not see them in SERPs for any query without the brand name. There's no penalty for using the same alt and title; it's actually better to always use the same alt text for the same image. What these galleries are is thin content, and maybe duplicate content, because they contain no content but images. Additional information for the images (maybe on hover, next to the best or most important ones) and some text about the gallery could be a solution.

    Keyword Research | | paints-n-design
    0

  • It's because the search engines actually treat every parameter-based version of your URL as a separate page, so it really does look like duplicate content to the search engine, and Moz is crawling them in the same way. This is informative for you. In your case, you have canonical tags back to the version with the query string, which would be the fix for you anyway.
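For reference, a canonical tag is a single line in each variant's <head>, pointing at whichever URL you treat as the master copy (the URL below is just a placeholder):

```html
<!-- In the <head> of every parameterised variant of the page -->
<link rel="canonical" href="https://www.example.com/category/widgets/" />
```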

    Local Strategy | | Kenn_Gold
    0

  • Hi there, If you haven't done anything wrong or spammy in an SEO sense, then you shouldn't worry that much. The first place I'd advise you to look for answers is the previous history of your domain; it's common for old domains to have a history of being used in spammy tactics. The second thing to consider is that this is just a private metric: it DOES NOT MEAN that you will have problems or are going to be penalized in Google search. Then I'd take a deep read and try to understand what other factors could be impacting your really high spam score. Here you can find the 27 most important factors for Moz's spam score. Hope it helps, best of luck.

    White Hat / Black Hat SEO | | GastonRiera
    0

  • Hi! This might answer your question: https://webmasters.stackexchange.com/questions/67654/seo-if-another-website-uses-a-iframe-from-our-website-would-the-links-in-the-i It was a little ambiguous which method you were going to use for the embed code; if you were planning to have others embed via iframe, then the links would point back to you. In that case, it may hurt your site if a more popular site is treated as the canonical and is indexed instead of yours.

    Intermediate & Advanced SEO | | gennarojewelers
    0

  • No - translations don't count as duplicate content, but you should ensure that your site has a proper multiregional build-out (e.g. site.com/press-releases/article (EN) vs site.com/fr/press-releases/article (FR)). You should properly 'build out' the site in an international way; don't use low-quality auto-translate plugins or live-translation features. You will need all your hreflang tags set up properly, so Google knows they are alternate-language page variants (see: https://yoast.com/hreflang-ultimate-guide/). See this from 2011: https://www.youtube.com/watch?time_continue=2&v=UDg2AGRGjLQ&feature=emb_logo where Matt talks about whether translations are duplicate content; AFAIK Google's stance hasn't changed much. The translation must add value and you must use human-translated content (written by someone competent enough that it doesn't read as if it was written by a machine). More recently John Mu (from Google) has said that auto-translated content won't attract penalties but the rankings will suck, so basically still get humans to write stuff: https://www.seroundtable.com/google-auto-translating-content-penalty-28413.html Interestingly, Google recently said that they think there may come a time in the future when auto/machine-translated content is acceptable: https://www.seroundtable.com/machine-written-content-google-guidelines-28338.html ... but as of now, it's still considered poor and against guidelines!
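For a two-language build-out like the example paths above, the hreflang annotations would look something like this (URLs are placeholders; note that both language versions must carry the full set, including a self-reference):

```html
<!-- In the <head> of BOTH language versions of the page -->
<link rel="alternate" hreflang="en" href="https://site.com/press-releases/article" />
<link rel="alternate" hreflang="fr" href="https://site.com/fr/press-releases/article" />
<link rel="alternate" hreflang="x-default" href="https://site.com/press-releases/article" />
```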

    Content & Blogging | | effectdigital
    0

  • As Effect Digital mentioned, this happens when Google is trying to figure out which page best satisfies the intent behind the search. I would advise checking the rest of the pages that are ranking for the term to see what kind of content these pages have and what intent they are satisfying. You can then see which of the pages on your website are in line with that intent and adjust your on-page SEO elements accordingly. I would also recommend looking into the backlinks pointing to these pages, seeing which one has the most links in place, and taking that into consideration when you decide to make any changes.

    Intermediate & Advanced SEO | | WebQuest
    0

  • You need to pay for Moz to bulk-fetch Moz-created metrics. If you have an active Moz subscription with API access, I believe you just have to visit this URL: https://moz.com/products/mozscape/access to get your "Access ID" and "Secret Key". Without these, most tools will fail all Moz-related bulk metric fetches. Such tools just provide automation; they don't provide 'free' access to Moz's data.
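If you end up scripting against the API yourself rather than using a tool, the legacy Mozscape endpoints sign requests with those two credentials. A minimal sketch, assuming the legacy scheme (Access ID and an expiry timestamp joined by a newline, HMAC-SHA1 signed with the Secret Key, then base64 encoded); the credentials and endpoint path here are placeholders, so check them against your own access page:

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote_plus

def mozscape_signature(access_id: str, secret_key: str, expires: int) -> str:
    """Build the request signature the legacy Mozscape API expects."""
    string_to_sign = f"{access_id}\n{expires}"
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Placeholder credentials -- substitute your own from the Moz access page
access_id, secret_key = "member-123", "s3cret"
expires = int(time.time()) + 300  # signature valid for 5 minutes
sig = mozscape_signature(access_id, secret_key, expires)
url = (
    "https://lsapi.seomoz.com/linkscape/url-metrics/example.com"
    f"?AccessID={access_id}&Expires={expires}&Signature={quote_plus(sig)}"
)
```

Fetching that URL (with a real ID and key) is all the "automation" most bulk tools are doing under the hood.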

    API | | effectdigital
    0

  • That very much sounds like, for some reason, Google has gone from viewing that particular page template as "decent, worthy of rankings" to "OK, will rank if I can't find something better". One thing I am wondering is whether you have been hit by this: https://moz.com/blog/google-review-stars-drop-by-14-percent ... which is also related to this: https://webmasters.googleblog.com/2019/09/making-review-rich-results-more-helpful.html - specifically where they say "Self-serving reviews aren't allowed for LocalBusiness and Organization. Reviews that can be perceived as "self-serving" aren't in the best interest of users. We call reviews "self-serving" when a review about entity A is placed on the website of entity A - either directly in their markup or via an embedded 3rd party widget. That's why, with this change, we're not going to display review rich results anymore for the schema types LocalBusiness and Organization". It seems as if 'something' existed on your product detail pages which Google was valuing highly, which they no longer value at all. Thus you aren't seeing a complete drop-off, but a high correlation between declining (or removed) results and pages utilising that feature. Basically, self-hosted reviews and some embedded reviews 'no longer count' towards Google rankings (at all). The news broke in September 2019, but I wouldn't be surprised if the roll-out was more recent. Moz posted that they noticed movements on Sept 24. As we know, these types of updates tend to crawl slowly across Google's query-spaces; it's not often true that everyone gets hit at once. Maybe your site is just in the late batch.
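To illustrate, structured data of roughly this shape (a hypothetical example, not taken from your site) is what stopped producing review stars - the review data sits on the reviewed entity's own website, so Google now treats it as self-serving:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Widgets Ltd",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "127"
  }
}
```

If your product detail pages carry markup like this for your own business, it would explain the correlated decline.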

    On-Page / Site Optimization | | effectdigital
    0

  • Hi Lee! Thanks for clarifying this. Yes, then: if your offices are staffed during stated business hours by your own staff, have unique phone numbers for each office, and are meeting face-to-face with customers, you are eligible to have 2 listings, and free to link those listings to whatever page on your website you want to. But now, let's get into the nuances of this. Which page/pages is it best to link to from GMB listings? This is a subject of long debate in the local SEO industry. The two schools of thought go something like this: 1) You should link all of your listings for a multi-location business to your homepage, because homepages typically have the strongest DA and this can then boost the local pack rankings of all locations. -or- 2) You should link each location's listing to that location's landing page on the website, because it's better usability. UX would suggest that the local customer clicking on the GMB listing wants to be taken to the page for that listing's location - not taken to a homepage and then having to click around to find the address/phone number/other details for the location they want to visit. Option 1 might work best for you, because if you've only got 2 locations, you can likely display their complete NAP on your homepage without it being burdensome. But it's a decision you'll need to make based on whether you believe rankings or UX need greater emphasis for your customers and your business. Hope this helps!

    Local Listings | | MiriamEllis
    0

  • Hi effectdigital Yep I won't be doing that! I don't think I'll be using the robots.txt file at all for this. Thanks Frankie

    Intermediate & Advanced SEO | | Frankie-BTDublin
    0

  • Ah, I get what you are saying! You are wondering whether to write about those pass-through points in your current article (under H2s) or whether to have a completely new article (e.g. "Strasbourg bike tour" as a sub-heading under "Alsace bike tour" vs making it a separate article/post). IMO you could do both, but only once you get enough experience and content. Let's say you had done one bike tour 'from' Alsace and discovered Strasbourg, and then later you had done another, separate bike tour 'from' Strasbourg. In that instance, both locations should have a post. In the Alsace post you would mention Strasbourg, but link it over to the 'main' Strasbourg bike tour article. If, on the other hand, you have done a bike tour from Alsace (where you discovered Strasbourg) but have not yet completed a separate bike tour 'from' Strasbourg, in that case I would H2 it as a sub-topic under your Alsace article. Hope that helps

    Intermediate & Advanced SEO | | effectdigital
    0

  • You're not competitive enough. For very difficult keywords, it sometimes takes whole teams of digital professionals (SEO people, digital PR people, creatives and devs to execute the design concept of 10x content, an R&I dept) to secure such keywords. Trying to solo such a thing sounds like madness to me. Your best bet is to be honest with yourself. Google the keyword and see what comes up in positions 1, 2 and 3. What do they have that you don't? For example, if the keyword was "car insurance" you'd see sites like Money Supermarket, Go Compare and Confused.com at the top. If you were a single car insurance company, you might say "but my page looks so much nicer! It has better design and SEO!" - newsflash: Google isn't looking to rank the pages with the best SEO. Sites like Money Supermarket and Confused.com are bound to rank at the top. By using them, a user can compare deals from hundreds or thousands of insurance companies and brokers. Sometimes the technology is the content, and no matter what words you write or what images you use, you'll never beat it without significant investment. If you're struggling away alone, you might have to rethink what keywords you are going for. In 2020, SEO isn't just about links and bits of writing. It's about coming up with a unique product that no one else has, which users would prefer to the current industry norms. If you can't do that, you can't win.

    Intermediate & Advanced SEO | | effectdigital
    0

  • PageSpeed Insights is cool, but I find GTmetrix often gives more useful info. If you sign up for a free account you get access to a cool waterfall tab. Here's the complete waterfall for your homepage: https://d.pr/i/a6OUZb.png (screenshot). In blue I highlighted some images which are over 50KB and could be compressed a little more. In red I highlighted all the calls which your site/page makes to YouTube.com. As Greg Painter specifically said, all of these calls to YT do seem to be slowing things down and causing you problems! Any chance you can host those resources (which your page is calling for) natively on your site? It would be faster than repeatedly calling out to YT, which has to satisfy enormous amounts of traffic as it is.
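If the YouTube embeds do have to stay, one low-effort mitigation (an illustrative snippet; the video ID is a placeholder) is to defer them with native lazy loading so they stop competing with the initial render:

```html
<!-- loading="lazy" defers the iframe until it nears the viewport -->
<iframe src="https://www.youtube.com/embed/VIDEO_ID"
        loading="lazy" width="560" height="315"
        title="Embedded video" allowfullscreen></iframe>
```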

    Web Design | | effectdigital
    0

  • Hi, I'd simply delete all the ones that get no traffic. Do the rest provide value? If so, consolidate them into logical, useful tags. We have done this quite a few times and removing the bloat has had positive impacts. You can crawl the site afterwards with something like Screaming Frog to find any 404s, then create redirects for the broken pages to make sure no 404s exist. I hope that helps! PS: If you don't need tags at all, you can remove them from the index to test the water. If you have Yoast you can set tags to noindex on this page: (yourdomain.com)/wp-admin/admin.php?page=wpseo_titles#top#taxonomies Then, if you don't need them and none are providing value, delete them and apply redirects for broken pages if necessary.
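If the deleted tag archives all live under a common path, a single pattern redirect can catch them rather than one rule per tag (a hypothetical Apache sketch; adjust the path and target to your own permalink structure):

```apache
# 301 any deleted /tag/... archive to the blog index
RedirectMatch 301 ^/tag/.*$ https://www.example.com/blog/
```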

    Technical SEO Issues | | SolveWebMedia
    0

  • That's highly dependent on the query itself. For example, for research-based searches, you'll see authoritative websites like Wikipedia dominate the search results, as those kinds of searches are not relevant to any location. Think of a medical term, the name of a movie, or a type of flower, for example. However, for navigation-based searches and local-services searches, you'll see very localized results. If you search for "plumber in CityName", you'll see different results every time you use a different city name. It comes down to the search intent, as Google always aims to serve that intent to the best of its ability & understanding.

    Local Listings | | WebQuest
    0

  • Thanks Paul. It is just this page that the issue is with. It is also a brand new site that isn't ranking, so hopefully it resolves itself eventually. Thanks, Ryan

    Web Design | | RyanMeighan
    0