Questions
-
Thin Content pages
Could you possibly add a line or two by each client? It could serve the dual purpose of getting some content in and enticing the user to click through to the case studies. I'm not a fan of adding content just to have it, but that recent Whiteboard Friday on "cruft" comes to mind, and I wonder whether it applies here. I'd be interested to see what others think. https://moz.com/blog/clean-site-cruft-before-it-causes-ranking-problems-whiteboard-friday
On-Page / Site Optimization | | seosnyder0 -
Next Steps: Following Fixed On-Page Efforts
Hello Steve, It is difficult to say how long it will take, because Google handles different sites at different speeds depending on things like domain-level trust metrics, how often the pages get updated, how easy the site is to crawl, etc. Generally speaking, bouncing back after a complete site migration takes a couple of months, which can be shortened by following best practices, most of which it sounds like you've already done. I would submit a new XML sitemap (replacing the old one) and then fetch the new pages as Googlebot. If you then want to write a useful blog post that links to those pages, it would help get them crawled and indexed, while also building some internal PageRank for them. Good luck.
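If it helps, here's a minimal sketch of generating that replacement XML sitemap with Python's standard library. The URLs below are hypothetical placeholders, so swap in your real migrated pages:

```python
# Rough sketch: build a replacement sitemap.xml for the migrated pages.
# The URLs here are made-up examples, not anyone's actual site.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # Root element per the sitemaps.org protocol namespace.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

new_pages = [
    "https://www.example.com/services/",
    "https://www.example.com/about/",
]
print(build_sitemap(new_pages))
```

You would save that output as your new sitemap.xml (replacing the old one) and resubmit it in Search Console before fetching the pages as Googlebot.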
On-Page / Site Optimization | | Everett0 -
Rel=canonical on pre-migration website
Thanks - I'm not terribly worried about the test site, as we use a password-protected and IP-blocked development domain that is completely different from the root domain. It's not even a subdomain, e.g. www.realsite.com vs. www.testdomain.com. My dev team is trying to get me to wait and just do a massive 301 redirect, moving the URLs with the query strings (old site) to the new pages (i.e. many:1 mappings), instead of doing the canonical. The new site won't create the query-string issue. The challenge I see is that the 150,000+ indexed URLs really should be around 7,000, so the organic value of the real 7,000 pages (other than possibly the root domain) is probably being punished, even though the site is doing decently well.
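For what it's worth, a many:1 redirect map like the one your dev team is proposing can be sketched roughly like this. The paths and the "pid" parameter are made-up assumptions for illustration, not your actual URL scheme:

```python
# Sketch of a many:1 301 redirect map: many old query-string URLs
# collapse to one clean destination. The "pid" parameter and the
# /products/ path are hypothetical examples.
from urllib.parse import urlparse, parse_qs

def redirect_target(old_url):
    parsed = urlparse(old_url)
    params = parse_qs(parsed.query)
    # Old URLs often differ only by tracking/sort parameters, so all
    # variants keyed to the same id should 301 to the same clean page.
    product = params.get("pid", [None])[0]
    if product:
        return "https://www.realsite.com/products/{}/".format(product)
    return "https://www.realsite.com/"  # fallback for unmappable URLs

# Two old variants collapsing to one destination:
print(redirect_target("http://www.realsite.com/view?pid=42&sort=price"))
print(redirect_target("http://www.realsite.com/view?pid=42&ref=email"))
```

The actual rules would live in your server config (mod_rewrite, nginx, etc.), but a lookup function like this is a handy way to generate and sanity-check the 150,000-to-7,000 mapping before it goes live.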
Intermediate & Advanced SEO | | ExploreConsulting0 -
Proper Use and Interpretation of new Query/Page report
Hi Steve, Yes, it sounds like if you're looking at pages showing in search for only one query, they may be ranking interchangeably, or simultaneously. Have you manually performed searches in Google (de-personalized, etc.) to see if you have multiple pages showing, or if they are switching? Also - do you track daily rankings for these keywords? That's another way to see what's ranking there - rank trackers like Authority Labs will show you all URLs ranking for a query. Would there be any other reason two different users might get a different page? Location? Language? How specific is the search query vs. how targeted/focused are your pages? Or is there overlap in content? It really depends on the types of pages and content as far as determining your next step. Normally, we do see secondary pages occasionally ranking for one query, but your percentages sound high (unless you are getting two results per page).
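If you have a rank-tracker export handy, a quick sketch like this can flag which queries have had more than one URL ranking. The (date, query, URL) row format and the sample data are hypothetical:

```python
# Sketch: from a rank-tracker export of (date, query, ranking_url) rows,
# find queries where more than one URL has ranked over time.
# The data format and rows are made-up examples.
from collections import defaultdict

def interchanging_queries(rows):
    urls_per_query = defaultdict(set)
    for _date, query, url in rows:
        urls_per_query[query].add(url)
    # Keep only queries where Google has shown more than one URL.
    return {q: sorted(urls) for q, urls in urls_per_query.items() if len(urls) > 1}

rows = [
    ("2015-06-01", "blue widgets", "/widgets/blue/"),
    ("2015-06-02", "blue widgets", "/widgets/"),
    ("2015-06-03", "red widgets", "/widgets/red/"),
]
print(interchanging_queries(rows))
```

Queries that come back from this check are the ones worth investigating for overlapping content or unclear page targeting.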
On-Page / Site Optimization | | evolvingSEO0 -
It's not link buying, but...
As long as the link looks like it could naturally be there, I'd go with it. It's not like you are running a nationwide campaign of asking for positive reviews in exchange for flowers or gifts.
White Hat / Black Hat SEO | | DennisSeymour1 -
Sure, but what about non-keyword rich anchor text links?
While we can never really quantify how much benefit a given link affects rank, we do know that a link can convey equity both for the link itself and for the anchor text. Google has indicated that a "safe" method of protecting ourselves against an impression of over-optimization is to use anchor text like "read more", "more info" or "click here", as well as using the target page's title for anchor text or a simple raw URL (as I recall, it was John Mueller that told us that in a Hangout some time ago).

Personally, I see the question of link quality as dealing with the quality of the source page and the relevance between the source and destination, and I evaluate links first by those criteria. I see anchor text as a separate issue, considering relevance and diversity. We have successfully cleaned up trashy profiles where the same anchor text had been overused, by mixing up KW anchors with generic anchors, page titles and raw URLs, and have gotten penalties lifted.

That said, your last point: "While spammy links on non-keyword rich anchor text is certainly not a good practice, is it nonetheless effective?" makes me wonder exactly what you mean by "spammy links". That, to me, sounds as though the source page is either low quality or not sufficiently relevant to the destination, in which case, I'd say it's an ill-advised practice.
White Hat / Black Hat SEO | | Doc_Sheldon0 -
Difference Between Google Preaching vs. Reality - Crap still rules?
Back when Google treated links less strictly, I'd guess a decent site with good content, a readership, and happy customers would have had to fight fire with fire in terms of backlinks to get the same exposure as its competitors (I say "competitors for the same exposure" rather than just "competitors" because at one time in the past even relevance wasn't much of a factor). A decent site that pursued only future-proof links, which were harder to get (and therefore fewer), would perhaps not have appeared on Google at all. Given the previous lack of discrimination between links, a decent site would have had to work those links hard just to make Google's search results decent, and for that same reason it gets the boot when Google updates its algorithms years later. If that's not ironic, I don't know what is. I'm looking for places to guest post and keep seeing good, relevant sites whose rankings and traffic have gone downhill. It's giving me a headache.
Intermediate & Advanced SEO | | Zoolander0 -
How does a Responsive Site kill SEO?
Hello Jason, This is one of the best explanations of responsive design I have seen to date (OK, IMO the best), especially in laying out the difference between server-side, dynamic design and responsive design. The thought you put into this whole answer to make it concise and cogent is excellent. I like the point you brought up about the lazy designer and page speed as something to be aware of. As an agency, we are not often dealing with a client bringing mobile and desktop forward; it is more likely we are dealing with someone whose site has lost appeal, power, relevance, etc., and we typically build responsive for all. So the thoughts around URL changes are helpful as well. Thank you, Robert
Web Design | | RobertFisher1 -
Yet another Negative SEO attack question.
If it can be proven that the intention was to cause harm to another company's profits, I would think you could be held liable. There is enough documentation on the web to show that Google penalizes for bad links and that negative SEO exists. If there is proof that you were doing what Google tells you not to do against your competition, and it results in the penalty that Google says will happen, it seems like bad intentions can be proven, and in that case you could be found guilty in a court of law. I am not aware of any precedents, though.
White Hat / Black Hat SEO | | irvingw0 -
Any Benefit to Artificially Boosting the CTR for rank?
Let's say for a second that you are able to manipulate this one ranking signal. Keep in mind there are well over 200 ranking signals. What would one do about all the other factors? There are too many of them to manipulate. And if you have naturally been able to satisfy 140 of those signals, would you really want to risk all your good, hard work by trying to manipulate this one? I wouldn't. I hope this helps.
White Hat / Black Hat SEO | | NakulGoyal0 -
Sculpting anchor text percentage through disavow?
Was the hit from Penguin, or a manual penalty? If it was not a manual penalty, then in theory you might be safe enough to keep some of those links to maintain some diversity. I would caution you, though, that there's no way to know what threshold exists for how many need to be cleaned up to address the penalty vs. how many can remain while you work on obtaining higher-quality links.

This is further complicated by the notion that, if it was not a manual penalty, some of the losses could be due to other on-site failings that were caught up in other algorithm changes before, around the same time as, or immediately after Penguin. For example, what if there were five problems with the on-site SEO, and the Penguin update caused a "trigger" due to link anchors? And what if it turns out that you need to do not only some link clean-up but simultaneously some on-site work as well? There's just no way to know in advance, especially without a full evaluation across the board. Very interesting concept, though.

And for the record, there truly is no secret percentage formula regarding brand instances in anchors. With hundreds of factors in SEO, one site could have only 20% of anchors containing the brand and still have higher trust than a competitor site that has more brand anchors but weakness in other signals.
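As a starting point for that evaluation, here's a rough sketch of auditing an anchor text distribution before deciding what to disavow. The anchor list, brand name, and category rules are all made-up assumptions, not a formula:

```python
# Sketch: bucket backlink anchors into rough categories and report the
# percentage split. The brand name, generic phrases, and sample profile
# are hypothetical examples.
from collections import Counter

def anchor_distribution(anchors, brand="acme"):
    counts = Counter()
    for anchor in anchors:
        text = anchor.lower()
        if text.startswith(("http://", "https://", "www.")):
            counts["raw URL"] += 1
        elif brand in text:
            counts["brand"] += 1
        elif text in {"click here", "read more", "more info", "here"}:
            counts["generic"] += 1
        else:
            counts["keyword"] += 1
    total = sum(counts.values())
    return {label: round(100.0 * n / total, 1) for label, n in counts.items()}

profile = ["Acme Widgets", "cheap widgets", "click here",
           "buy widgets online", "https://www.acme.com/"]
print(anchor_distribution(profile))
```

A report like this won't tell you the "right" percentages (as noted above, there aren't any), but it does make overused keyword anchors easy to spot when you're triaging a profile.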
Intermediate & Advanced SEO | | AlanBleiweiss0