Posts made by effectdigital
-
RE: Google Adding Incorrect Location to the end of Title Tags in SERPs
It might just be Google being Google, using too many off-page signals (e.g. links, local citations / accepted directory listings). I wonder if there are inbound signals contradicting the on-page factors.
-
RE: Need help understanding this Moz Chart comparing link metrics against competitors...
From the looks of it, that just means that 60% of your external backlinks are 'followed', meaning they contribute to Google's rankings. External links are the links which your site has gained from other domains (sites which link to you), so they are probably not under your control anyway.
-
RE: URL Parameters
Just so you know, if a URL results in a 5XX server error then it usually won't render your canonical tag to begin with! You might also want to review your XML sitemap, to be sure it's not 'undoing' your canonical tags by feeding these URLs to Google. Indexation tags must be perfectly aligned with your sitemap XML, or you are sending Google mixed messages (e.g. a URL is in the sitemap XML, so Google should index it, but when crawled it contains a canonical tag pointing to a different URL, marking itself as non-canonical - the opposite signal).
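As a purely illustrative sketch (URLs are placeholders), this is the kind of contradiction to look for - a URL listed in your sitemap XML whose own HTML declares another page as canonical:

<!-- sitemap.xml says: index https://www.example.com/page-a -->
<!-- but page-a's own HTML says the canonical version lives elsewhere -->
<link rel="canonical" href="https://www.example.com/page-b" />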
Everything which Gaston said is right on the money
-
RE: No index for http version of website
No, you don't need to change anything. In fact, you actively DON'T want the HTTP sitemap feeding Google a list of HTTP URLs, which I am sure you are trying to steer Google away from. Only feed Google the HTTPS URLs, and delete the HTTP sitemap from Search Console if you can, so that it doesn't keep flagging false positives and feeding Google bad (insecure) URLs.
-
RE: How can I optimise my key pages for new (related) key phrases as they arise, without compromising the original optimised keywords?
Expand the content with new sections thematically bound to the new keywords. If you dilute your page by making it about too many things, your original rankings could suffer, so be careful how you proceed! As long as the new keywords are highly related to the old ones, it shouldn't cause too many problems. Another concern would be expanding the content to the point where the page's performance (loading speed) suffers. If you embed too much non-optimised rich media, that could be a problem. Check your before and after URLs using Google PageSpeed Insights, and learn how to properly optimise images / videos (and other media) before you upload them into your content. Don't 100% rely on CMS plugins to do this for you.
-
RE: Google Adding Incorrect Location to the end of Title Tags in SERPs
Are there references to the London address on the non-London URLs? For example, on the Scotland store page, look for references to the London address (e.g. in the footer of the page). You may need to alter some on-page instances to correct this, but the structured data should help.
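For instance, marking each store page up with LocalBusiness structured data for its own address gives Google a strong machine-readable location signal (everything below is a hypothetical placeholder):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Store (Scotland)",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Edinburgh",
    "addressCountry": "GB"
  }
}
</script>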
-
RE: Does &pws=0 still work?
This post from Jan 2020 seems to assume that you can still use &pws=0
... but I don't know how reliable it is!
-
RE: Moz bot not discovering important links (high DA sites link)
Rogerbot will take time to crawl links on the web. Lots of those big sites, where the links are very valuable, have thousands or hundreds of thousands of pages. One thing that I do think would be nice is if Moz Pro users could have some crawl allowance devoted to them, and could ask Rogerbot to ping URLs (and update Moz's index). It would obviously have to be limited so as not to skew Moz's regular crawling operations, but it could be really helpful for some users.
-
RE: Inbound Links - Redirect, Leave Alone, etc
If you want to disavow and redirect at the same time, you probably wouldn't want to use a 301, which passes SEO authority (and negative equity) along to the resultant page. I'd probably use a 302 or a 307, and then disavow the linking domain (or page) in Google's disavow tool. I might also try to no-index the redirecting URL, though with a redirect in place this can't be done within the HTML / source code. You'd have to deploy the no-index directive via the HTTP header instead, using X-Robots-Tag.
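A minimal sketch of that header-based approach in Apache .htaccess (the URLs are placeholders, and this assumes mod_rewrite and mod_headers are available):

<IfModule mod_rewrite.c>
RewriteEngine On
# 302 the old URL, and flag the request so we can target it below
RewriteRule ^old-page$ https://www.example.com/new-page [R=302,L,E=NOINDEX:1]
</IfModule>
# "always" is needed so the header is attached to the 3xx redirect response
Header always set X-Robots-Tag "noindex" env=NOINDEX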
-
RE: Entities and SEO
You're thinking along the right lines, especially focusing on corpus-based co-occurrences between words. The corpus isn't the keyword though; it's the body of text which you check the keyword against (and it has to be substantial to be accurate, in the gigabyte range). I don't know of any pre-built software to assist you. You would have to develop scripting knowledge.
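Just to show the basic idea, here's a toy Python sketch of windowed co-occurrence counting (the file name and keyword are placeholders; a corpus in the gigabyte range would need streaming and a proper index rather than this naive approach):

from collections import Counter

def cooccurrences(text, window=10):
    """Count how often two words appear within `window` words of each other."""
    words = text.lower().split()
    counts = Counter()
    for i, word in enumerate(words):
        for other in words[i + 1 : i + 1 + window]:
            counts[tuple(sorted((word, other)))] += 1
    return counts

counts = cooccurrences(open("corpus.txt").read())  # your reference corpus
keyword = "seo"                                    # the term you're analysing
related = [(pair, n) for pair, n in counts.items() if keyword in pair]
print(sorted(related, key=lambda kv: -kv[1])[:20])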
-
RE: Is there a way to view the top 100 highest priority keywords?
If you gain access to the site's Search Console (Google product) then you can view all the search queries connecting with the site over time. The ones driving the most clicks would be the priority ones to 'look after' in a takeover scenario. Use the search-query data from Search Console; do not use the "keyword" data from Google Analytics (at this point, years after Not Provided, it provides a virtually worthless sample).
-
RE: Keyword rich domain names -> Point to sales funnel sites or to landing pages on primary domain?
To me this depends upon the traffic profile of your old domains. If they mostly receive direct and referral traffic, then the redirect idea could work very well. If they gain most of their traffic from Google, redirecting them will eventually make them stop ranking, as Google doesn't like to rank redirecting URLs in the long term.
Once that occurs, your main site may gain ranking benefits from your old sites, but even with perfect redirects (using the mighty 301) you would still stand to lose rankings. Google will basically check how similar the last active cache of the redirecting URL is to your new page (the redirect destination). Even with a 301, if the content (in machine / Boolean terms) is highly 'dissimilar', then your new page will only receive a fraction of the SEO / ranking authority of the old (redirecting) URL. This is to stop webmasters buying up authoritative expired domains, redirecting them to themselves and gaining free ranking power.
From Google's POV, a lot of ranking 'power' (authority) still comes from links. Which sites have the best links? Do other sites in the same area of the web (same theme) have more quality links? How fresh are those links? Are there any positive / negative trust signals passing along that same axis?
When a page ranks well on Google, it is because it has recently (or historically) 'impressed the web' (thus gaining backlinks and unlinked citations). If you replace a page which has 'earned' links with another page (like a sales funnel), or redirect it to a completely different page, why should that new page benefit from the same links? The webmasters who linked to the old URL may not have chosen to link to the new page (be it a replacement or a redirect destination), so it shouldn't see loads of SEO authority coming from a past legacy.
Obviously if you just change domain and the pages are essentially the same, then it's fair that those pages retain their former Google rankings. This is why Google has to validate the 'similarity' of old vs new pages (whether they replace the current content or exist at the end of a redirect).
Be careful with your path forward. You could have a 'great idea' only to lose most of the traffic which those domains were supplying.
Obviously if the old domains which you are sweeping up don't see much traffic from Ads or Google (SEO / organic), then you can do pretty much whatever you want with them. But if the traffic came mostly from Google (organic), then it may be tricky. It may also be tricky to redirect the domains if paid ads are served to them, as ads will often be 'disapproved' if they point to a redirecting URL (true of Facebook and Google Ads). So at the very least, a major overhaul of your ads campaign(s) would be required.
-
RE: Why not just use an alias if the only change is a different domain Name?
It depends what you mean by 'alias'. If you mean configuring the old domain to properly 301 redirect all URLs from the old site to the new site (so the old site becomes inaccessible, due to serving as a redirect platform), then yes. If you mean doing something else, like pointing the old domain at your new site other than via 301 redirects, it's probably not a good idea for SEO!
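The 301 approach usually looks something like this in Apache .htaccess on the old domain (a rough sketch - the domain names are placeholders):

<IfModule mod_rewrite.c>
RewriteEngine On
# Send every URL on the old domain to the same path on the new domain
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
</IfModule>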
-
RE: What is the alternative of Quora?
Unfortunately that area of the web is in decline. Most Q&A systems now function via apps or closed communities like Online Geniuses (via Slack) or the Buffer Community. This is sad, because Q&A forums (like this one) add great value to the web
Although Quora is one of the best Q&A sites, it's way worse than it used to be. They made loads of changes to make it too simple (when people sometimes have complex questions). They really angered their own community, to the point that loads of people left. Lots of people started saying that Quora was bad, even its own users (openly).
It used to be a place where an average person could go to ask a question and get an intelligent answer from a smart person. Now it's barely better than Yahoo Answers (in my opinion).
I used to use it all the time; back in the 2010s it was worthwhile. Now it's not.
-
RE: Google says Geolocation Redirects Are Okay - is this really ok ?
Yes, it will cause a problem if you just do it in such a basic way. Google crawls from numerous data centres, based in different countries. As such, Googlebot will crawl from different places and will keep thinking different areas of your site are going up and down all the time. The remedy, of course, is to exempt Googlebot (by user-agent) from your redirects.
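A rough .htaccess sketch of that exemption (this assumes a GeoIP module is installed and sets GEOIP_COUNTRY_CODE; the country and URLs are placeholders):

<IfModule mod_rewrite.c>
RewriteEngine On
# Skip the geo-redirect entirely for Googlebot
RewriteCond %{HTTP_USER_AGENT} !Googlebot [NC]
# Only redirect visitors the GeoIP module identifies as French
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^FR$
RewriteRule ^$ https://www.example.com/fr/ [R=302,L]
</IfModule>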

-
RE: How can I secure my website?
Well, to prevent information / data leakage you should certainly disable directory browsing
For example, on your homepage I can right-click your logo image and copy the image URL: https://www.myqurantutor.com/wp-content/uploads/2019/07/MY-QURAN-TUTOR-LOGO-400x56.png
But I can edit the link to the directory level, for example:
https://www.myqurantutor.com/wp-content/uploads/
Now I can see all your uploads, ever:
- https://d.pr/i/C7DTY4.png (screenshot)
I can browse all your folders, even some backup files. There's also some info I can use to fingerprint your site build if I want to. To patch this, usually all you have to do is add "Options -Indexes" to your .htaccess file.
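That is, one line in the site's root .htaccess:

# Disable directory listings (e.g. for /wp-content/uploads/)
Options -Indexes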
I didn't detect a firewall shielding your site, which makes it way easier to DDoS if someone wanted to do that. Some kind of firewall or traffic-offloading facility might be useful.
Your site isn't sending an HSTS header ("Strict-Transport-Security"), so browsers may still attempt to connect via plain HTTP and be intercepted (you may handle that via redirects instead, but an HSTS policy helps). You don't seem to be using "X-Frame-Options" in your header, which tells browsers whether content from your site can be rendered inside frames on other domains. Allowing frame embeds can lead to clickjacking and the like (though for some webmasters there's no real way around it, as allowing their site's content to be embedded may be a requirement).
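Both headers can be set in .htaccess along these lines (a sketch, assuming mod_headers is enabled - test with a short max-age first):

<IfModule mod_headers.c>
# Tell browsers to use HTTPS for all future visits
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
# Only allow this site's own pages to frame its content
Header always set X-Frame-Options "SAMEORIGIN"
</IfModule>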
I can't really find any fields which seem as if they would be vulnerable to SQL injection, but I'm not really an expert at scanning for that kind of thing. I'd assuredly lock down the site from an SQL-I perspective, if you haven't done so already
-
RE: Which SEO tactic should I concentrate on first?
In SEO, most of the ranking factors are weighted relatively equally. It's less about cherry-picking the easy wins, and more about putting in volumes and volumes of effort (overall).
Moz does have some notes on top ranking factors: https://moz.com/learn/seo/on-page-factors - but they're not numbered, or in a particular order (as far as I know)
You can search Google for "top Google ranking factors" but you'll just get loads of different answers, ordered differently.
This list isn't awful: https://optinmonster.com/seo-ranking-factors/ - but I'm sure many people here will take issue with it, as they would with any list that was posted
I actually think a number of Moz's Whiteboard Friday videos are pretty spot-on:
- https://moz.com/blog/how-to-create-10x-content-whiteboard-friday
- https://moz.com/blog/does-seo-boil-down-to-site-crawlability-and-content-quality-whiteboard-friday
- https://moz.com/blog/why-good-unique-content-needs-to-die-whiteboard-friday
^ These are my three go-to videos in terms of beginning to learn about (or teach) SEO
-
RE: Does google penalize you if you post content in french and english on a website
No - translations don't count as duplicate content, but you should ensure that your site has a proper multiregional build-out (e.g. site.com/press-releases/article (EN) vs site.com/fr/press-releases/article (FR)).
You should properly 'build out' the site in an international way; don't use low-quality auto-translate plugins or live-translation features. You will need all your hreflang tags set up properly, so Google knows the pages are alternate-language variants of each other (see: https://yoast.com/hreflang-ultimate-guide/).
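Using the example URLs above, the hreflang annotations (in the <head> of both language versions) would look roughly like this:

<link rel="alternate" hreflang="en" href="https://site.com/press-releases/article" />
<link rel="alternate" hreflang="fr" href="https://site.com/fr/press-releases/article" />
<link rel="alternate" hreflang="x-default" href="https://site.com/press-releases/article" />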
See this from 2011: https://www.youtube.com/watch?time_continue=2&v=UDg2AGRGjLQ&feature=emb_logo where Matt Cutts talks about whether translations are duplicate content. AFAIK Google's stance hasn't changed loads. The translation must add value, and you must use human-translated content (written by someone competent enough that it doesn't read as if it was written by a machine).
More recently, John Mueller (from Google) has said that auto-translated content won't gain penalties, but the rankings will suck - so basically, still get humans to write stuff: https://www.seroundtable.com/google-auto-translating-content-penalty-28413.html
Interestingly Google recently said that they think there may come a time in the future where auto / machine-translated content is acceptable: https://www.seroundtable.com/machine-written-content-google-guidelines-28338.html
... but as of now, it's still considered poor and against guidelines!
-
RE: How many topics per H2
Ah, I get what you are saying! You are wondering whether to write about those pass-through points in your current article (under H2s) or whether to have a completely new article (e.g. "Strasbourg bike tour" as a sub-heading under "Alsace bike tour" vs making it a separate article / post).
IMO you could do both, but only once you get enough experience and content. Let's say you had done one bike tour 'from' Alsace and discovered Strasbourg, and then later you had done another separate bike tour 'from' Strasbourg. In that instance, both locations should have a post. In the Alsace post you would mention Strasbourg, but link it over to the 'main' Strasbourg bike tour article
If on the other hand you have done a bike tour from Alsace (where you discovered Strasbourg) but you have not yet completed a separate bike tour 'from' Strasbourg, in that case I would H2 it as a sub-topic under your Alsace article
Hope that helps
-
RE: Links Not Detected by MOZ, AHREFS, GSC-ARE THESE QUALITY LINKS?
Hi there
I am guessing you are the owner of Metro Manhattan, judging by the pattern of the web pages, in terms of where they are linking to! I think we may have chatted before, actually (correct me if I am wrong).
https://patch.com/new-york/bayside/bayside-queens-priciest-area-retail-office-space-study
This post is indexable on Google. It has really good authority (90/100 domain rating on Ahrefs) and would be a great link, but the link has been no-followed, so it doesn't influence Google's rankings at all. Your link also shares the page with other links pointing to other external sources.
SEO rating: 0/10, as it's no-followed
This post is indexable on Google. Solid domain rating on Ahrefs (73/100). You don't share the post with too many other link destinations. The link is followed, so it should count for Google
SEO rating: 7.5/10
https://patch.com/new-york/brooklyn/flatbush-ave-priciest-retail-spot-outside-manhattan-study
You only posted five links; you'd think your supplier would manage to get five links on five different domains, as the more links you have from one domain (in a short space of time), the less they matter. Regardless, this link won't count for SEO at all as it's no-followed.
SEO rating: 0/10, as it's no-followed
An average domain rating on Ahrefs of 53 - not bad at all. Sadly, your link supplier (not the site owner) 'lied' to you when they said this was one of your backlinks: there are no links from this page to your page. If there's no link at all, how good could the link be?
SEO rating: -5/10, as they (your link suppliers, not the owners of Thejewishvoice.com) fibbed to you. Naughty naughty!
Evidence screenshot of failing to find link: https://d.pr/i/WfUUgV.png
https://atalyst.com/investment-banking-interview-metro-manhattan/
The Ahrefs domain rating for "atalyst.com" is only 16/100, so you couldn't consider it a 'quality' link in my opinion. The link to Metro Manhattan on this URL is followed, so it does count towards rankings - but load the page and look at it! It's very thin content; I'd go so far as to say it's 'not content' at all. It's an image, a strapline and a link. If you think Google cares about this, you are wrong.
SEO rating: 2/10, might help a little, but barely at all
So now, I have looked at each link and graded it out of 10. The scores were 0, 7.5, 0, -5 and 2. Each link could have scored 10/10, which means our total max possible score is 50 (10 x 5 links)
When we add the numbers up, we get 4.5/50, or a 9% rating
For something to be considered 'quality', I think it should have to be "50% decent or higher". 9% isn't good enough (IMO)