Thanks, valuable advice! I will put it to good use.
Posts made by Bob_van_Biezen
-
RE: Duplicated content multi language / regional websites
-
RE: Duplicated content multi language / regional websites
Thanks for your comment, Dirk!
Rewriting the content would be the best-case scenario. Do you think it's an absolute must to rewrite those words (say, because Google would otherwise filter out the .be domain if it's an exact copy), or would it be an extra that makes the website convert even better and adds an extra trust signal for Google?
It would probably be a pain in the ass for this webshop to check all their product descriptions for possible words to change. They would probably not launch the .be website if it took them a week or two to go through all the pages.
-
RE: Duplicated content multi language / regional websites
Thanks for both of your opinions! Since this client is looking for the quickest fix possible, what is your opinion on these optional points:
- Crosslinking every page through a language flag or similar navigation in the header.
- Investing in gaining local .be backlinks
Do you think they are necessary, or do they add enough extra value to justify the extra costs (especially for the extra backlinks)?
-
RE: Can We Publish Duplicate Content on Multi Regional Website / Blogs?
Hi Gianluca,
Your comment made me doubt my research, so I started a new question about it. Do you have a minute to share your view on my situation? I would really appreciate it.
https://moz.com/community/q/duplicated-content-multi-language-regional-websites
Best regards,
Bob
-
Duplicated content multi language / regional websites
Hi Guys,
I know this question has been asked a lot, but I wanted to double-check this since I just read a comment by Gianluca Fiorelli (https://moz.com/community/q/can-we-publish-duplicate-content-on-multi-regional-website-blogs) about this topic which made me doubt my research.
The case:
A Dutch website (.nl) wants a .be version for conversion reasons. They want to duplicate the Dutch website, since Dutch is spoken in large parts of both countries.
They are willing to implement the following changes:
- Hreflang tags
- Possibly a local phone number
- Possibly a local translation of the menu
- Language meta tag (for Bing)
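As a reference for whoever implements this, a minimal sketch of what those tags could look like in the `<head>` of the homepage (example.nl / example.be are placeholder domains, not the client's real ones):

```html
<!-- On every page of BOTH the .nl and .be version, pointing at each other: -->
<link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/" />
<link rel="alternate" hreflang="nl-be" href="https://www.example.be/" />
<!-- Fallback for visitors that match neither region: -->
<link rel="alternate" hreflang="x-default" href="https://www.example.nl/" />
<!-- Language meta tag (Bing reads this; Google relies on hreflang): -->
<meta http-equiv="content-language" content="nl-nl" />
```

Both domains need the full set of tags, since hreflang annotations only count when they are reciprocal.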
Optionally, they are willing to take the following steps:
- Crosslinking every page through a language flag or similar navigation in the header.
- Investing in gaining local .be backlinks
- Changing the server locations so they match their countries (not necessary in my opinion, since the ccTLD should make this irrelevant).
The content on the website will be at least 95% duplicated. They would like to rank with their .be in Belgium and with their .nl in The Netherlands. Are these steps enough to make sure the .be version gets shown for queries from Belgium and the .nl version for queries from The Netherlands?
Or would this cause a duplicate content issue, resulting in one version being filtered out? If that's the case, we should use the canonical tag, and then we can't rank the .be version of the website.
Note: this company is looking for a quick conversion rate win. They won't invest in rewriting every page and/or blog post. The less effort they have to put into this, the better (I know that's cursing in church when talking about SEO). Gaining local backlinks, for example, would bring a lot of costs with it.
I would love to hear from you guys.
Best regards,
Bob van Biezen
-
RE: Do You Know What's Triggering Your Local Packs?
Thanks for the link, didn't know about that. Could indeed be game-changing!
-
RE: Crawled page count in Search console
Hi Don,
Thanks for the clear explanation. I always thought the disallow rules in robots.txt gave Google a sort of map (at the start of a site crawl) of the pages on the site that shouldn't be crawled, so it wouldn't have to "check the locked cars".
If I understand you correctly, Google checks the robots.txt with every single page load?
That could definitely explain the high number of crawled pages per day.
Thanks a lot!
-
RE: Crawled page count in Search console
Hi Don,
You're right about the sitemap; I've noted it on the to-do list!
Your point about nofollow is interesting. Doesn't excluding pages in robots.txt give the same result?
Before we went on with the robots.txt, we didn't implement nofollow because we didn't want any link juice to drain away. Since we use robots.txt now, I assume this doesn't matter anymore because Google won't crawl those pages anyway.
Best regards,
Bob
-
RE: Crawled page count in Search console
Hi Don,
Just wanted to add a quick note: your input made me go through the indexation state of the website again, which was worse than I thought it was. I will take some steps to get this resolved, thanks!
Would love to hear your input about the number of crawled pages.
Best regards,
Bob
-
RE: Crawled page count in Search console
Hello Don,
Thanks for your advice. What would your advice be if the main goal were reducing the number of crawled pages per day? I think we have the right pages in the index and the old duplicates are mostly deindexed. At this point I'm mostly worried about Google spending its crawl budget on the right pages. Somehow it still crawls 40,000 pages per day, while we only have around 1,000 pages that should be crawled. Looking at the current setup (with almost everything excluded through robots.txt), I can't think of pages it does crawl to reach the 40k. And crawling the site 40 times over per day sounds like way too many crawled pages for a normal webshop.
Hope to hear from you!
-
RE: Do You Know What's Triggering Your Local Packs?
Hi Miriam,
I added keywords to the business titles of Google My Business listings and I didn't see any big ranking improvements in categories the business already appeared in.
I added keywords to the business titles of Google My Business listings and I saw a big improvement in categories the business didn't already appear in. So adding "SEO" to the business title of a web development firm could result in them being shown for keywords like "SEO The Hague".
I removed keywords from (competitors') business titles and didn't see a drop in their ranking visibility.
At this point I believe keywords in the business title can help Google associate your business with a keyword or service type, but they aren't a ranking factor (or only a small one). It's the difference between being shown or not being shown at all, not the difference between position 2 and 3.
Note: I only tracked this with a few Google My Business pages over a short period of time.
As for the advice from the Google support staff: I last heard it on 10 December 2015.
-
RE: Do You Know What's Triggering Your Local Packs?
Hi Miriam and Kristen,
I didn't do a big case study or anything like that on this topic, but I reported these kinds of keywords in business titles through Google Maps and I didn't see any noticeable changes after the edits were applied, at least not within the top 3 of the local pack. I can however confirm that adding this information causes fluctuations, since I did some testing with a few websites. Although it didn't cause a big boost, it did help a tremendous amount with being shown for certain keywords. Since there isn't a category for every niche, this helped some businesses get shown for the right keywords in the first place.
This is a subject where a bigger case study will be needed, I think.
By the way, another interesting aspect I found is the local business support desk recommending Google Plus activity on the account: views, followers, posts, etc. I got this tip twice over a few calls. Not sure it works, but we're trying it out for our own account at the moment.
-
Crawled page count in Search console
Hi Guys,
I'm working on a project (premium-hookahs.nl) where I stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console.
History:
Due to technical difficulties, this webshop didn't always noindex filter pages, resulting in thousands of duplicate pages. In reality this webshop has fewer than 1,000 individual pages. At this point we took the following steps to resolve this:
- Noindex the filter pages.
- Exclude those filter pages in Search Console and robots.txt.
- Canonicalize the filter pages to the relevant category pages.
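One caveat worth double-checking here: as far as I know, combining a noindex tag with a robots.txt disallow can work against itself, because Googlebot can't see the noindex (or the canonical) on a page it isn't allowed to crawl. A minimal sketch of the two pieces, with `/filter/` as a placeholder for the real filter-URL pattern:

```
# robots.txt — stops crawling, but NOT indexing, of matching URLs
User-agent: *
Disallow: /filter/
```

```html
<!-- On each filter page; only seen by Google if the URL is crawlable -->
<meta name="robots" content="noindex, follow" />
<link rel="canonical" href="https://www.example.nl/category/" />
```

If the goal is deindexing first, it may be safer to let Google keep crawling the pages until the noindex has been processed, and only then block them in robots.txt.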
This however didn’t result in Google crawling less pages. Although the implementation wasn’t always sound (technical problems during updates) I’m sure this setup has been the same for the last two weeks. Personally I expected a drop of crawled pages but they are still sky high. Can’t imagine Google visits this site 40 times a day.
To complicate the situation:
We're running an experiment to gain positions on around 250 long-tail searches. A few filters will be indexed (size, color, number of hoses and flavors) and three of them can be combined. This results in around 250 extra pages. Meta titles, descriptions, h1s and texts are unique as well.
Questions:
- Excluding pages in robots.txt should result in Google not crawling those pages, right?
- Is this number of crawled pages normal for a website with around 1,000 unique pages?
- What am I missing?
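On the first question: a quick way to sanity-check that the disallow rules actually match the filter URLs is Python's built-in robots.txt parser (the rules and paths below are placeholders, not the shop's real robots.txt):

```python
# Check which URLs a set of robots.txt rules blocks from crawling.
from urllib.robotparser import RobotFileParser

# Placeholder rules; substitute the shop's real robots.txt content.
robots_txt = """\
User-agent: *
Disallow: /filter/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A blocked filter page vs. a normal category page:
print(parser.can_fetch("Googlebot", "https://example.nl/filter/color-red"))  # False
print(parser.can_fetch("Googlebot", "https://example.nl/category/hookahs"))  # True
```

Keep in mind that a disallow only stops crawling; URLs that are already indexed can stay in the index until Google gets to process a noindex or the canonical.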
-
RE: Keyword stuffing?
Hi Dmitrii,
Thanks a lot for your feedback. This really helps with making some content-based decisions.
Best regards,
Bob
-
RE: Keyword stuffing?
Hi Dirk,
Thanks for your response! This gives me just the bit of extra assurance to push for rewriting the pages!
Best regards,
Bob
-
Keyword stuffing?
Hi Guys,
I'm working on a site that faces some ranking problems. Although some of the problems have been mapped and will get fixed in the future, I'm wondering if you could give me a second opinion on the number of keywords used on the website. Although the texts read "OK", I'm wondering if the site could be negatively influenced by the amount of keywords used.
Website: http://premium-hookahs.nl/
Main keyword: waterpijp / shisha
Besides the general keyword, the secondary keywords are used a lot on category and product pages.
I would love to hear your opinion!
-
RE: Outreach with your own name
Hi Andy,
Thanks for sharing. I stumbled upon it right after I posted this discussion in the Q&A. It's a great video and it does indeed give a lot of helpful information!
-
RE: Outreach with your own name
Hi Josh, thanks, and you're right. I should just deal with that fact. Maybe take some more time to get to know the partners I'm focusing on before throwing in a request or whatever, and see if that could set me a bit further apart from what most people consider spam.
-
RE: Outreach with your own name
Thanks for taking the time to respond, Dmitrii! This really helps.
-
RE: Outreach with your own name
When I e-mail them as Bob, or Bob from client company X, they Google my name (my last name is in the e-mail address and in my sender name) and find out pretty quickly that I'm an SEO. Today this was even 4 out of 10 people I e-mailed.
They reply with stuff like "I can see you're an SEO professional" and other related messages.
Did this clarify the explanation for you?