Category: On-Page / Site Optimization
Explore on-page optimization and its role in a larger SEO strategy.
-
URL Indexed But Not Submitted to Sitemap
If you care about the Search Console error only: it just means Google has decided to index some URLs which do not appear here: http://www.servicesarab.com/sitemap_index.xml. Add them or don't; it really doesn't matter that much and it is extremely unlikely to change your SEO performance much at all.

If you are really concerned about SEO performance: personally I think this is quite telling: https://analytics.moz.com/pro/link-explorer/spam-score?site=servicesarab.com&target=domain. According to Moz, 45% of the domains in their index which share similar features to yours have apparently been banned or penalised by Google (not good). Your site has, according to Ahrefs (https://ahrefs.com/site-explorer/overview/v2/subdomains/recent?target=servicesarab.com), 18,840 backlinks from 242 referring domains. That's a pretty lopsided balance and not very diverse at all: https://d.pr/i/0ROVCA.png (screenshot). It seems that at the end of 2017 / beginning of 2018, tons of links for the site were produced in a very artificial and obvious way.

In the end, old SEO tactics that try to trick Google into thinking your site is great are not very useful in 2018/2019. It's doubtful that there is a strong technical-SEO reason for the site's problems; it is more likely that Google is just restoring your site to 'where it should be' in the rankings. It seems as if the usage of older SEO techniques tricked Google for a little while (https://d.pr/i/tsiXrh.png), but in the end the unsustainable approach was not good enough.
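If you want to see exactly which indexed URLs are missing from the sitemap, you can compare the two lists programmatically. A minimal sketch (the sample URLs are illustrative, and for a real sitemap index you would also fetch and parse each child sitemap the same way):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap or sitemap-index document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def unsubmitted(indexed_urls, sitemap_xml):
    """URLs Google has indexed that are missing from the sitemap."""
    return sorted(set(indexed_urls) - sitemap_urls(sitemap_xml))
```

Feeding it the "Indexed, not submitted in sitemap" list exported from Search Console tells you exactly which URLs to add (or deliberately leave out).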
| effectdigital0 -
After 301 redirection non-English keyword points to English language pages
Hi, the situation looks like this: domain1.co.uk is not a regional site (it targets the UK only) but it had pages in multiple languages. The Polish-language pages on domain1.co.uk were redirected (301) to domain2.com with a /pl prefix, so: domain1.co.uk/page1 > domain2.com/pl/page1 and domain1.co.uk/page2 > domain2.com/pl/page2. When a user in the UK searches for Polish keywords, for some keywords the result is correct (it shows domain2.com/pl/page2), but for other Polish keywords the UK user gets served domain2.com/page2. Note that in both cases the location is the same; only the Polish keywords differ. Pages on both sites were mostly "mirrored", meaning each article in Polish had an English equivalent, so even when a Polish user lands on the English page it is still topically relevant; however, bounce rate increases because the user won't necessarily understand it. Would you need more info to get a clearer picture of the situation? Thank you.
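Since the mapping described above is mechanical, it can be sanity-checked in a few lines. A sketch (the domain names come from the question; the helper itself is hypothetical):

```python
NEW_HOST = "domain2.com"

def redirect_target(path, lang="pl"):
    """Where an old Polish page on domain1.co.uk should 301 to:
    the same path on domain2.com, with the /pl language prefix added."""
    return f"https://{NEW_HOST}/{lang}{path}"
```

Comparing a sample of live redirects against this expected mapping would quickly show which old URLs are 301ing to the unprefixed English versions instead of the /pl ones.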
| Optimal_Strategies0 -
Ensuring that Google Display my Meta Descriptions
Delete all the other content on the page! Nah, just kidding. There's really no way to guarantee that, other than writing a meta description which Google would prefer to render over the rest of the page's content.
| effectdigital0 -
What to do to index all my links of my website?
At least, how fast Google indexes your domain depends on your structure (and more). A few weeks ago I read something about an "indexing race" to see which way is the fastest to get pages indexed (https://searchengineland.com/ready-set-go-googlebot-race-314894); funny, because my preferred solution didn't win. Whether Google indexes every page of a domain is up to Google, and for your example I think it will not: several products in different colours (and that's the only difference) on different URLs. I'm not sure whether the difference (the colour) is mentioned anywhere, or if it is just a different image (it's not my language), but that's not best practice. With 1-2 million pages you have to manage your crawl budget, but at the moment I think all you can do is wait. I also saw a lot more pages indexed (460,000), but at some point in the SERPs Google stops showing your pages because they are too similar to others. From what I saw, there are a lot of products without a single description, so there is plenty you can improve while you wait.
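If the colour variants really do live on separate URLs, one common fix is a canonical tag pointing every variant at a single main product URL. A rough sketch, assuming (hypothetically) that the colour is carried in a ?color= query parameter:

```python
import re

COLOR_PARAM = re.compile(r"[?&]color=[^&]+")

def canonical_url(url):
    """Strip the colour parameter so every variant canonicalises
    to the same parameter-free product URL."""
    cleaned = COLOR_PARAM.sub("", url)
    # If ?color= was the first parameter, a following "&size=m" would be
    # left dangling, so promote it back to "?size=m".
    if "?" not in cleaned and "&" in cleaned:
        cleaned = cleaned.replace("&", "?", 1)
    return cleaned
```

The same idea works if the colour is a path segment instead; only the pattern changes.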
| paints-n-design0 -
Duplicate content in sidebar
Hello, I'd say not, in the same sense that having a NAP in the website's footer doesn't have any negative effect on a site's SEO ranking; in this specific context it's the opposite.
| jasongmcmahon1 -
Best practice to have gated white paper indexed by Google
I think I have the same question. I was just going to ask it less intelligently. Our gated content always takes the shape of a PDF form on Hubspot. What are the best practices for getting some SEO benefits from the Case Studies and white papers that we gate? (In case anyone is interested, here's a landing page where you can see what I mean: https://www.cockroachlabs.com/case-studies/mux/) Thanks everyone!
| DanKellyCockroach0 -
Will page be marked as 404 if you replace country specific letters from url?
Not a problem. Just remember that web users' browsers, Google's crawlers and SEO-tool crawlers all tend to react to certain factors a little differently. Never lose sight of pleasing Google first, and never lose sight of the fact that Google wants to please users most of all (without them, all of its ad revenue disappears).
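On the original question (replacing country-specific letters in a URL): changing the slug does create a new URL, so the old one must 301 to the new one or it will indeed 404. A sketch of the transliteration step (note that letters with no Unicode decomposition, e.g. the Polish ł, would need an explicit mapping added on top of this):

```python
import unicodedata

def ascii_slug(slug):
    """Strip diacritics (e.g. é -> e, ü -> u) so the slug is plain ASCII.
    The old URL should then 301 to the new slug, never 404."""
    decomposed = unicodedata.normalize("NFKD", slug)
    return decomposed.encode("ascii", "ignore").decode("ascii")
```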
| effectdigital0 -
How to get rid of Google's manual action penalty on spammy schema markup?
Yes, I have submitted the home page to Google for indexing and it is indexed. I have also sent a reconsideration request twice after removing all the schema code from the website, but it was rejected both times.
| Ananya_Ramje0 -
What are good tests to propose different SEO agencies when you're trying to vet them?
Thank you!!! These questions are golden! I agree. It's important to see if they have a good understanding of how to measure and track our goals. We ultimately want to make sure that our investment in an SEO agency has a good ROI. Thanks again! I appreciate all of the help!
| NBJ_SM0 -
Canonical Homepage Multi-language Website
You should only need a canonical on example.com pointing to example.com/nl.
| jasongmcmahon0 -
Is an Info Directory best on a Sub-Domain or Sub-directory
Don't use a sub-domain unless you have to, which is generally when you want to run distinct services under one domain, such as a shop alongside your main website, or when you offer two completely different things under one domain (selling cars and boats, etc.).
| jasongmcmahon0 -
Creating Tables with Multiple Links
Hi there, it should be OK as long as you don't repeat the same schema on the same page: one schema type per page.
| jasongmcmahon0 -
How do I redirect SEO from pages on old website to new website with new domain name?
Use 301 redirects, and also point the root domain; your hosting provider will help with this.
| jasongmcmahon0 -
Should I be disallowing my forum in the robots.txt file?
I have a problem: my site's robots.txt is blocking Moz. Can anyone help me? Regards, OMOD
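Moz crawls with the rogerbot user-agent, so one way to unblock it while still keeping other bots out of a section is to give rogerbot its own group in robots.txt. A sketch using Python's standard robots.txt parser to verify the rules behave as intended (the /forum/ path is illustrative):

```python
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: rogerbot
Disallow:

User-agent: *
Disallow: /forum/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# rogerbot matches its own group (empty Disallow = allow everything),
# while other bots fall through to the * group and stay out of /forum/.
assert parser.can_fetch("rogerbot", "https://example.com/forum/thread-1")
assert not parser.can_fetch("Googlebot", "https://example.com/forum/thread-1")
```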
| waleedkhalid0 -
My organic search dropped over 25% since March Update....Any insights?
I will. I have a team starting tomorrow!
| Purecbdvapors0 -
301 Redirects - Large .htaccess file question
Sorry, I meant to add some links that might be useful. Here they are:
https://moz.com/community/q/will-301-redirects-slow-page-speed
https://moz.com/blog/heres-how-to-keep-301-redirects-from-ruining-your-seo
https://moz.com/community/q/massive-301-permanant-redirects
This next one is an older question but may have some useful insights:
https://webmasters.stackexchange.com/questions/113246/does-too-many-301-redirects-harm-serp-rankings
http://www.thesempost.com/best-practices-for-301s-in-large-htaccess-file/
https://www.impactbnd.com/blog/301-redirect-lessons
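One thing worth doing with a large .htaccess file is making sure none of the old rules chain into each other, so every legacy URL reaches its final destination in a single 301. A language-agnostic sketch of that check, using a plain dict to stand in for the rule file:

```python
def resolve(url, redirects, max_hops=10):
    """Follow a redirect map until a final URL is reached, counting hops."""
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if hops > max_hops:
            raise RuntimeError("redirect loop or very long chain")
    return url, hops

def flatten(redirects):
    """Rewrite every rule to point straight at its final destination,
    so each old URL needs only one 301."""
    return {src: resolve(src, redirects)[0] for src in redirects}
```

Any source URL whose hop count comes back greater than 1 is a chain worth collapsing before the file goes live.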
| MrWhippy0 -
Internal Website Linking from Syndicated Blog Posts
Hi, it depends: if the links are topical and flow naturally in the content, then they should help; if they are spammy and not related to the content at all, then I would expect them to be ignored at best. Internal links help show Google etc. which pages you believe to be the most important on your site, but that doesn't mean you should include lots of spammy links on your pages. More internal links pointing to a page that is topical and relevant to the content is the best approach. Steve
| MrWhippy0 -
Keyword Cannibalization vs. Optimizing Site
.... each of your pages/articles would still be focussing on a single keyword (or keyword cluster) while bearing in mind the overall goals. You are 100% correct. They are all about the same long tail keyword with even longer long tail keywords as variants. You wouldn't have two pages competing for "Stihl Chainsaw MS170 Maintenance" for example, but you might have multiple pages that are talking about "Stihl Chainsaw MS170" in various ways, all probably linking to the page where the customer could buy the actual product or parts, etc. You are right. And, those pages linking to one another will support the attack on "Stihl Chainsaw MS170"... and all of them plus all of the pages for other models will support the attack on "Stihl Chainsaws".
| EGOL2 -
Help recover lost traffic (70%) from robots.txt error.
Firstly, I would definitely take the opportunity to switch to SSL. A migration to SSL shouldn't be something to worry about if you set up your redirects properly, and given that most of your pages aren't indexed at all, it is even less risky.

You will eventually get the traffic back; as for how long that takes, it's very difficult to say. I would concentrate on crawlability: make sure your structure makes sense and that you aren't linking to any 404s or worse. Given the size of your site, that wouldn't be a bad thing anyway.

From your description of your pages, I'm not sure there is any importance hierarchy, so my suggestion may not help, but you could make use of Google's API to submit pages for crawling. Unfortunately, you can only submit in batches of 100 and you are limited to 200 a day. You could, of course, prioritise or cherry-pick some important pages and "hub" pages, if such things exist within your site, and then start working through those. Following the recent Google blunder where huge swathes of the web were deindexed and, in the short term, the only way to get pages back in the index was to resubmit them, someone provided a tool to interact with the API, which you can find here: https://github.com/steve-journey-further/google-indexing-api-bulk
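The batch and quota limits mentioned above make the submission plan easy to precompute. A sketch (the 100-per-batch and 200-per-day figures are taken from the answer; check the current API quotas before relying on them):

```python
def daily_batches(urls, batch_size=100, daily_quota=200):
    """Split a URL backlog into a per-day submission plan: each day holds
    at most `daily_quota` URLs, sent in batches of at most `batch_size`."""
    days = []
    for start in range(0, len(urls), daily_quota):
        day = urls[start:start + daily_quota]
        days.append([day[i:i + batch_size] for i in range(0, len(day), batch_size)])
    return days
```

For a 450-URL backlog this yields three days of work: two full days of two 100-URL batches each, then one 50-URL batch.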
| Xiano1 -
Canonical: Same content but different countries
In response to your second question: it's fine to have /usa/, although /us/ or /en/ would be a more typical deployment (many sites use /en-us/ and /en-gb/, as that structure allows for really granular international deployment!). As long as the hreflangs are accurate and tell Google what language and region the URLs are for, and as long as they are deployed symmetrically with no conflicts or missing parts, it should be OK.

Note that Google will expect to see different content on different regional URLs, sometimes even if they're in the same language but targeted at different countries (tailor your content to your audience; don't just copy and paste sites, change a few tags and expect extra footprint). Things like shipping info and prices (the currency shown) should also be different (otherwise don't even bother!). If you are doing the USA as your EN country, your hreflangs should not use 'en-gb' (they should use 'en-us').

If you're worried that the HTML implementation will make the code bloated and messy, read this: https://support.google.com/webmasters/answer/189077?hl=en. There are also HTTP header and XML sitemap deployment options (though IMO, HTML is always best and is the strongest signal).
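A "symmetric" hreflang set simply means every regional variant lists every variant, itself included, so there are no missing return tags. A sketch that generates the HTML link-element form for one page (example.com and the paths are hypothetical):

```python
def hreflang_tags(page_variants):
    """Emit the full, symmetric hreflang set for one page: the same list
    of <link> tags goes in the <head> of every variant, itself included."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(page_variants.items())
    ]

variants = {
    "en-us": "https://example.com/en-us/widgets",
    "en-gb": "https://example.com/en-gb/widgets",
    "x-default": "https://example.com/widgets",
}
```

Because every variant emits the identical set, each URL's return tag is guaranteed to exist, which avoids the "no return tags" conflicts mentioned above.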
| effectdigital0