I fully agree with everything you say on this. The difference here, I feel, is that the client and their previous SEO agency were the ones putting their own content on tons of websites, which makes it even worse. Hopefully a change of content will see the site back to where it belongs.
Posts made by GrumpyCarl
-
RE: Self inflicted duplicate content penalty?
-
Self inflicted duplicate content penalty?
Wondering if I could pick the brains of fellow Mozzers. I have been working with a client for about three months now to get their site up in the engines. In that time the DA has gone from about 11 to 34 and the PA is 40 (up from about 15), so that's all good. However, we don't seem to be moving up the rankings much. The average DA of the top-ten competitors in the niche is 25, and we have 9.2 times the average number of backlinks too.
During a call with the client today, they told me that they noticed a major drop in their rankings a few months back. They didn't mention this when we started the project.
I just searched for the first paragraph of their homepage and it returns 16,000 hits in Google; the second returns 9,600 and the third 1,400. Searching for the first paragraph of their 'about us' page gives me 13,000 results!!
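As an aside, these manual phrase searches are easy to script if you want to check a whole page of copy. A minimal sketch (the function name and phrase length are my own choices, not from any particular tool):

```python
def phrase_queries(page_text, words_per_phrase=12):
    """Split page copy into quoted, exact-match phrases to paste into Google."""
    words = page_text.split()
    return [
        '"' + " ".join(words[i:i + words_per_phrase]) + '"'
        for i in range(0, len(words), words_per_phrase)
    ]

copy = ("We sell hand made widgets crafted in "
        "our own workshop from locally sourced materials")
for query in phrase_queries(copy, words_per_phrase=7):
    print(query)  # each line is one quoted phrase to search for
```

Each quoted phrase searched in Google should, for genuinely unique content, return only the client's own page.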
Clearly something is not right here. Looking into this, it seems that someone has used their content, word for word, as the description on thousands of blogs and social sites.
I am thinking that this, tied in with the slow movement in the listings, has caused a duplicate content penalty in the search engines. The client hasn't copied anyone's content (it is very specific to their site), yet it appears all over the web.
I have advised them to change their site content ASAP, and I hope a Panda refresh comes in to pick up the new, unique content. Once the penalty is off, I expect the site to shoot up the rankings.
From an SEO company's point of view, should I have seen this before? Maybe. If they had said they suffered a major drop in rankings a few months back, when they dropped their SEO agency, I would have looked into it; but one doesn't naturally assume that a client's copy will be posted all over the web. It's not something I would have searched for without a reason to.
Any thoughts on this, saying either yes or no to my theory, would be most welcome.
Thanks
Carl
-
RE: Big rise in "Keyword not defined"
I would agree, in part. However, if anything, this makes ranking reports more important. If we see traffic going up but cannot directly see which keyword is sending it, then one could draw a link (however tenuous) between the rise in rankings and the rise in traffic.
-
RE: Big rise in "Keyword not defined"
Scary how the 100% date in the chart has become this December. It was scary enough when it was 2017!!!
-
Big rise in "Keyword not defined"
Hi, all.
Has anyone else seen a massive increase in 'not provided' keywords in their analytics in the past couple of weeks? Probably related to this (source: http://searchengineland.com/post-prism-google-secure-searches-172487): _In the past month, Google quietly made a change aimed at encrypting all search activity — except for clicks on ads. Google says this has been done to provide “extra protection” for searchers, and the company may be aiming to block NSA spying activity._
Other than the unreliable stats from WMT, there don't seem to be many ways left to find out what is sending traffic to our sites!
-
RE: Ok to ignore Overly-Dynamic URL from Moz crawl?
Thanks, Mike
Will check out the link you posted. I have added a lot of the filtering options (price ascending, descending, and all the related page 2s onwards) to a no-crawl file for Google (and RogerBot), so hopefully that helps. I have also replaced a lot of the HTML filtering options with menus which cannot be crawled, so I will wait for the next Moz crawl and see if that has helped matters.
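In case it is useful to anyone else, the no-crawl entries look roughly like this (the parameter names are from my own cart; yours will differ):

```
# block the sort-order and paginated variants of the category pages
User-agent: *
Disallow: /*pto_sort=
Disallow: /*pto_page=

# Moz's crawler honours its own user-agent token
User-agent: rogerbot
Disallow: /*pto_sort=
Disallow: /*pto_page=
```

Google supports the `*` wildcard in robots.txt paths, so one line covers every URL carrying that parameter.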
Regards,
Carl
-
Ok to ignore Overly-Dynamic URL from Moz crawl?
I am developing an ecommerce site. I just ran it through the Moz crawl to see what's what, and it has come back with a lot of issues. Most of these are around duplicate page titles (it is not happy with paginated titles, i.e. Shoes, Shoes Page 2, Shoes Page 3, etc.), and it has also found a lot of overly-dynamic URLs. Again, these seem to come from some of the search functions and filters used (e.g. Accessories&pto_sort=priceAsc&pto_page=6); other than spending a lot of time and effort rewriting these URLs, there is little I can do about them.
Should I just ignore this? I wouldn't imagine it having a massive impact on the rankings of the pages.
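For what it's worth, if the platform allows edits to the page templates, one common alternative to ignoring them is a canonical tag on the filtered variants pointing back at the base category (the URL below is a placeholder, not my actual site):

```html
<!-- placed in the <head> of e.g. /accessories?pto_sort=priceAsc&pto_page=6 -->
<link rel="canonical" href="http://www.example.com/accessories" />
```

That consolidates the sort/filter duplicates onto one URL without hiding them from the crawler entirely.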
Thanks,
Carl
-
RE: Number of reviews in PPC advert
Sorry for the delayed reply. Thanks, I will pass the information on to the client.
-
Number of reviews in PPC advert
Hi all
Got an email from a client asking about this; I've not come across this one before. The client has a Google+ account with about 2,500 reviews on it. They have linked this into their AdWords account so the reviews show on their PPC ads. However, the ad itself says only 650 reviews. Quite a difference!!
Anyone know why this would be the case?
Thanks
-
How to add an affiliate store
Hi all
Just wondering what other Mozzers would do about this... I want to add a revenue stream to a blog of mine, and I have decided that an affiliate store is the way to go. I can create a store with merchant datafeeds and pull in products related to my site and, all being well, make some pennies from it.
Obviously all the datafeeds are published on many other sites, so it will be very much duplicate content. Would blocking Googlebot from the store be enough to ensure that the site doesn't receive a penalty for duplicate content? I would be keen on getting the product category pages indexed, but I'm not too worried about the actual products themselves.
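If it helps anyone weighing up the same options: rather than blocking Googlebot from the whole store, a middle ground (assuming you can edit the product page template) is a robots meta tag on just the product pages, so the category pages stay indexable:

```html
<!-- placed in the <head> of the product page template only -->
<meta name="robots" content="noindex, follow" />
```

noindex keeps the duplicated datafeed copy out of the results, while follow still lets crawlers pass through to anything the product pages link to.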
I would like to make some revenue from the site but not at the risk of killing the blog.
Thanks
-
RE: Keyword Rankings Compare to X not working
Is anyone from tech support online yet? I need to generate reports for clients, and it doesn't look good if most of the data is missing.
-
Keyword Rankings Compare to X not working
Is anyone else having trouble with the 'compare your keyword positions with competitors' section of the analytics? The rankings for my own site(s) are fine, but whichever competitor I click on to compare rankings with just returns 'Not in Top 50' for keywords the competitor is ranking on. I have just checked manually, and they are very much listed in the engine.
Is this tool broken?
-
RE: 14,000 blog comment spam links placed on one domain!!
John,
Sorry, I think I may have worded the question poorly. The 14,000 spam links are on an external site, pointing to my client's. The client used an SEO company a while back whose strategy seemed to involve running SEnuke and XRumer. The SEnuke articles are fairly easy to get rid of; emailing a site owner and asking them to remove the link from one article isn't too much trouble.
Emailing a website and asking them to remove blog comment spam placed on 14,000 pages of just one site could be more trouble. I cannot expect them to do this, so hopefully Google will allow me to disavow the whole domain and not expect it to be cleaned up manually.
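If it helps, the disavow file format does accept whole-domain entries, so all 14,000 pages can be covered with a single line (the domain below is a placeholder, not the real site):

```
# lines starting with # are ignored by Google
# one domain: entry covers every URL on that site
domain:spammy-blog.example.com
```

That is far more practical than listing individual URLs when one site carries thousands of spam pages.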
-
RE: 14,000 blog comment spam links placed on one domain!!
Thanks, I will check the site out. The links are very clearly blog comment spam. While I cannot give the name of my client here, one sample URL from the site should show what I mean. I have broken the link on purpose so it's not active:
http://ryantennismusic. com/blog/?p=19&rnment_moveForm(com,_666,/www_paydayadvanceadventeplytocom=556&replytocom=504
The problem I have is that this is clearly a junk link, and the penalty we got from Google says:
Google has detected a pattern of artificial or unnatural links pointing to your site. Buying links or participating in link schemes in order to manipulate PageRank are violations of Google's Webmaster Guidelines.
As a result, Google has applied a manual spam action to myclient.co.uk/. There may be other actions on your site or parts of your site.
With this in mind, I am trying to manually remove as many links as possible. I tried a previous reconsideration request and a disavow, but no joy with that. I hope they will accept my explanation on this domain.
-
14,000 blog comment spam links placed on one domain!!
Trying to clear up a manual link penalty a client has received, I have found that the client's site has blog comments on 14,000 pages of just one domain!! I need to submit a reconsideration request and show Google that I have cleaned up the mess (the client has 50k backlinks, of which I am not happy with about 75% so far!!), but I cannot expect the website in question to go through 14k pages.
What would people advise here? Would you state the sheer number of links on this site in the reconsideration request and use the disavow tool? Doesn't this suggest that I have been lazy and not put the effort in to clearing things up?
-
RE: Building "keyword" backlinks
Thanks for the reply. You make a good point about not overdoing the anchor text and using generic anchors to increase page authority; that is a good strategy in the ideal world where you can optimise one landing page for one keyword. The trouble occurs when a client wants to rank one page for several different keywords. You may be able to rank for one, or maybe two, on the strength of good page authority and good on-page SEO, but what about the other keywords they want to rank for? Too few clients think about SEO when building their site, in my opinion.
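To make "not overdoing the anchor text" concrete, here is a rough sketch of how one might sanity-check a profile. The threshold and helper names are mine and purely illustrative:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Share of the backlink profile held by each anchor text."""
    counts = Counter(anchors)
    total = len(anchors)
    return {anchor: count / total for anchor, count in counts.items()}

def overoptimised(anchors, threshold=0.4):
    """Anchors whose share of the profile exceeds the (arbitrary) threshold."""
    return [a for a, share in anchor_distribution(anchors).items()
            if share > threshold]

# toy profile: brand anchors, generic anchors, and one exact-match keyword
profile = ["Acme Shoes"] * 5 + ["click here"] * 2 + ["cheap running shoes"] * 8
print(overoptimised(profile))  # ['cheap running shoes']
```

A real audit would pull the anchors from an export (OSE, WMT, etc.), but the idea is the same: any single exact-match anchor dominating the profile is the sort of pattern worth diluting.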
-
Building "keyword" backlinks
Looking for some opinions here, please. I have been involved in SEO for a couple of years, mainly working on my own websites and picking up the odd client here and there through word of mouth. I must admit that up until a few months back I was guilty of using some grey methods of link building (Linkvana, Unique Article Wizard and the like). While no penalties were handed out to my domains, and some decent rankings were gained, I got tired of always being on the lookout for what the next Google update would do to my results and which networks were being hit, so I moved a lot more towards the 'proper' way of doing SEO.
These days my primary sources for backlinks are much more respectable...
myblogguest
bloggerlinkup
postjoint
Guest Blog Finder http://ultramarketer.com/guest-blogger-finder/ - not sure where I came across this resource, but it's very handy
I use these sources alongside industry-specific directories and general word of mouth.
Ironically, I have found that doing the work by hand not only leads to results I can happily show people (content-wise) but is also much quicker and cheaper. The increased authority of the sites means far fewer links are needed.
The one area where I am still having a little trouble is building keyword-based backlinks. I now find it fairly easy to get my content on a reasonable-quality site (DA of 40 and above); however, the vast majority of these sites will only allow the backlink as the company name or as a generic 'read more' type link. This is fine, and it is improving my website's performance and authority.
The trouble I am finding is that while I am ranking for the title tag and some keywords on the page, I am struggling to get backlinks for other keywords. In an ideal world, every page on the site would be optimised for a different keyword, and you could then just use the site name as anchor text to build the authority of that page and make it rank for its content. But what about when you (or the client) want to rank the homepage for a number of different keywords, some not featured on the page? The keywords are too similar to be worth making unique pages for, and doing so would add no value to the site.
My question really, then, after a very long-winded way of getting there, is: are others finding it much more difficult to gain keyword-based backlinks these days? The great thing about the grey SEO tools mentioned above was that it was super easy to get backlinks with whatever anchor text you wanted, even if you needed hundreds of them to compensate for the low value of each!!
Thanks
Carl
-
RE: 5XX (Server Error) on all urls
Thanks, I will check out that plugin. So, in other words, the pages are loading fine for the user but are sending an error to the bots instead of a 200 OK. That doesn't sound good!!
On the plus side, at least it has stopped Roger noticing that some of the pages have up to 600 links on them because of all the retailer and manufacturer filtering options!!
Many thanks, Carl
-
5XX (Server Error) on all urls
Hi
I created a couple of new campaigns a few days back and waited for the initial crawl to be completed. I have just checked, and both are reporting 5XX (Server Error) on all the pages they tried to look at (on one site I have 110 of these; on the other it only crawled the homepage). This is very odd. I have checked both sites on my local PC, an alternative PC, and via my Windows VPS browser, which is located in the US (I am in the UK), and it all works fine.
Any idea what could be the cause of this failure to crawl? I have pasted a few examples from the report:
| URL | Status | Error |
| --- | --- | --- |
| http://everythingforthegirl.co.uk/index.php/accessories.html | 500 | TimeoutError |
| http://everythingforthegirl.co.uk/index.php/accessories/bags.html | 500 | Error |
| http://everythingforthegirl.co.uk/index.php/accessories/gloves.html | 500 | Error |
| http://everythingforthegirl.co.uk/index.php/accessories/purses.html | 500 | Error |
| http://everythingforthegirl.co.uk/index.php/accessories/sunglasses.html | 500 | TimeoutError |
I am extra puzzled as to why the messages say timeout. The server is a dedicated 8-core machine with 32 GB of RAM, and the pages load for me in about 1.2 seconds. What is the RogerBot crawler timeout?
Many thanks
Carl
-
RE: A client/Spam penalty issue
Thanks for the replies, everyone. Now comes the fun part where I have to crack on and work my way through 48,000 backlinks!