Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Advice on Link Removal Services
Hi Luken, I would never completely trust a third-party service; you still have to check each link yourself. Blindly removing or disavowing is not the way to go. That said, I've used Link Detox and feel comfortable recommending them, though I still add a layer of manual review on top of their process. Removeem and Rmoov are both fine services, IMO, but may not be right for everyone.
| Cyrus-Shepard0 -
Unnatural Links Removal Strategy
It happens. A lot of links are hard to remove. If you're sure you have a penalty from links, I'd build a list of the links you've tried to remove, submit a disavow file to Google, and explain what happened in a reconsideration request. In that request, include where the links came from, what you've done, and how you'll continue to clean things up and play by the rules in the future.
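For reference, a disavow file is just a plain-text list of URLs and `domain:` entries; lines starting with `#` are comments that Google ignores, which makes them a handy place to record your cleanup history. A minimal sketch, with placeholder domains:

```text
# Removal requested twice; no response from webmaster
http://spammy-directory.example.com/our-listing.html
# Entire domain disavowed after removal requests were ignored
domain:link-farm.example.net
```

The file is uploaded through the disavow tool in Webmaster Tools, separately from the reconsideration request itself.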
| Carson-Ward0 -
Does rel=canonical fix duplicate page titles?
Hello, I also have a similar problem. Can you tell me whether duplicate page titles flagged in Moz are still an issue when I also have rel=canonical in place? Thanks
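For anyone landing on this thread: rel=canonical consolidates indexing signals on the preferred URL, but crawlers (including Moz's) may still report the duplicate titles they encounter while crawling. A minimal sketch, with a placeholder URL, placed in the `<head>` of the duplicate page:

```html
<!-- On the duplicate URL, pointing at the preferred version -->
<link rel="canonical" href="http://www.example.com/preferred-page/" />
```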
| AL123al0 -
Domain forward to landing page - good or bad for SEO?
Thanks, Dave. We're going to 301 it, since this page ranks well in the SERPs for certain keywords.
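For readers wondering what that looks like in practice, here is one common way to 301 an entire domain to a landing page on Apache via .htaccess. The domain names and path are placeholders; adjust for your own setup:

```apache
# .htaccess: permanently redirect every URL on the old domain to the landing page
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
RewriteRule ^ http://www.new-domain.example/landing-page/ [R=301,L]
```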
| NHA_DistanceLearning0 -
Can changing G+ authorship on a well-ranking article drop its search ranking?
Maybe not in ranking per se. But if you are signed in as you search, and your Google+ profile happens to have interacted with or be connected to the author (especially if he or she is well known and well connected, i.e. a lot of people have circled him/her on G+), then for those signed-in users the article might show up on the first page, while with no authorship, or a different author, it might not show up at all. With this in mind, I would keep the author who has the most influence, the most people who have circled him or her, or the most activity on his or her Google+ network. Again, Google is saying one thing, but given the move toward making search more social, this is not unlikely. In Google's eyes this may not be a pure "ranking" process but more a matter of "displaying relevant info" from your friends on the G+ network. Just some thoughts.
| vmialik0 -
Should I disallow via robots.txt for my subfolder country TLDs?
Joomla will do that to you. To answer your questions: yes, the use of robots.txt in this case makes sense. You will save some crawl budget that Google's bot can spend somewhere else. I wouldn't worry about the WMT errors, though - nothing bad can happen if you have them there, and once you solve the issue they will go away. No need to spend time on those; they don't affect your performance in any way. Hope it helps. Cheers.
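As a sketch, assuming the country versions live in subfolders such as /de/ and /fr/ (hypothetical paths for illustration), the robots.txt entries would look like this:

```text
User-agent: *
Disallow: /de/
Disallow: /fr/
```

Keep in mind that robots.txt blocks crawling, not indexing, so URLs that are already indexed may linger in the index for a while after you add the rules.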
| eyepaq0 -
Disavowing a sitewide link that has thousands of subdomains. What do we tell Google?
Google does allow root domains in disavow, but I'm honestly not sure how they would handle this for a mega-site with unique sub-domains like Blogspot. Typically, Google treats these sub-domains as stand-alone sites (isolating their PageRank, penalties, etc.). I tend to agree with the consensus that the best bet is to disavow the individual blogs and not the entire root domain. If you're really in bad shape and have much more to lose from Blogspot links than to gain, you could disavow the root domain, but I'm not sure anyone has good data on the potential impact.
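In disavow-file terms, the two options look like this (blog names are placeholders). A `domain:` line covers every URL on that host:

```text
# Safer option: disavow the individual blogs
domain:spam-blog-one.blogspot.com
domain:spam-blog-two.blogspot.com

# Nuclear option - the entire root domain
# (impact on a host like Blogspot is unclear, per the discussion above)
# domain:blogspot.com
```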
| Dr-Pete0 -
Differences between "casas rusticas" and "casas rústicas"
Ibenzo, As you have pointed out, Google does rank them differently, thus it does see them as different. Typically, it's considered best to optimize for the term without the accent mark because there is more traffic for those terms. Here's a Moz post on the topic that you'll find interesting. So, You Want to Know About Foreign Language SEO? Mozinar Q&A
| Chris.Menke0 -
Effect of Removing Footer Links In all Pages Except Home Page
Footer links don't provide much SEO benefit anymore. It can even hurt your SEO if you overdo it. If the links are not useful for the user, then I would suggest dropping them altogether. I do like the suggestion provided by Eric Rubin, though: a JavaScript-based toggle for the footer links would keep the UI pretty and clean while still giving your users the ability to open the menu and visit any links there. It all depends on the links, though. If they are only there for SEO reasons and not for helping users find good content, then I would recommend just dropping them altogether.
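A minimal sketch of that toggle idea: the links stay in the HTML (so crawlers and users who want them can still reach them) but start collapsed. The link names and URLs here are placeholders:

```html
<button id="footer-toggle">More links</button>
<ul id="footer-links" hidden>
  <li><a href="/about/">About</a></li>
  <li><a href="/sitemap/">Sitemap</a></li>
</ul>
<script>
  // Flip the hidden attribute on each click
  document.getElementById('footer-toggle').addEventListener('click', function () {
    var links = document.getElementById('footer-links');
    links.hidden = !links.hidden;
  });
</script>
```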
| WesleySmits0 -
"No Index" Extensions
Hi Michael, I strongly recommend you watch this video: http://moz.com/blog/handling-duplicate-content-across-large-numbers-of-urls - some great ideas. Thanks
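If the question is about keeping certain file extensions out of the index, one common approach (assuming an Apache server with mod_headers enabled) is an X-Robots-Tag response header, which applies noindex to file types that can't carry a meta robots tag:

```apache
# Send a noindex header for PDF and Word files
<FilesMatch "\.(pdf|docx?)$">
  Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
```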
| Asjad0 -
Will Google View Using Google Translate As Duplicate?
Response from Google: http://www.seroundtable.com/google-translate-auto-content-spam-17524.html And yes, it's not okay; manual translation all the way.
| vmialik0 -
Why should your title and H1 tag be different?
What I'm not seeing addressed specifically in the thread is whether the keyword term within your H1 and title can differ. I get that if the whole title and whole header differ slightly, like: Title: Best Plumber Local | 100% Satisfaction Guaranteed! H1: Best Plumber Local offers 100% Satisfaction Guarantees! (Please note: not my best copywriting effort at play here.) Then it's not worth sweating over, so long as the searcher doesn't experience a disconnect between the SERP result and the landing page messaging. My concern is over the need to target "ugly" keywords - terms that don't fit well into the UX equation but have exponentially more search volume than their "prettier" version. Let's say "plumber local" has 1,000 monthly searches vs. "local plumber," which has 300 monthly searches, but "local plumber" is much better for copy/user readability. Can you use: Title for the SERP: Best Plumber Local | 100% Satisfaction Guaranteed! H1 for UX: Best Local Plumber offers 100% Satisfaction Guarantees! ...and still be nicely optimized for "plumber local," assuming you can find a smooth way to work it into copy (easier than doing it in an H1) and alt tags, and the site otherwise has good authority/reputation? (The ugly keyword, "plumber local," would also be used in the URL.) Thanks in advance!
| vernonmack5 -
Should I have as few internal links as possible?
That sounds like a good compromise solution: just show the main pages, reducing the links substantially whilst maintaining an effective customer experience. Thanks, Oli Ash
| AshShep10 -
Technical Site Questions
My mistake. I must have been up too late and started seeing double. I can't find any parity between what I saw then and what I see now.
| Travis_Bailey0 -
Incoming links which don't exist...
If these are all coming from one site, and you're worried about them, this is actually a good case for the disavow tool. You can disavow an entire domain in a single line: https://support.google.com/webmasters/answer/2648487?hl=en As Michael said, getting Google to actually recrawl/recache all of those pages can take quite a while. With the ad gone, it's probably a non-issue and they'll eventually clear out, but disavow would remove any lingering doubt. Unfortunately, there's no way to tell if you've been penalized without knowing more about the site, traffic, etc. I'd say it's unlikely for a paid link from a single site, especially if that link was subsequently removed. Google isn't usually that aggressive about it, especially if your site generally has solid authority/reputation.
| Dr-Pete1 -
Algorithmic penalty due to pharma hack that created drug links to home page. What to do?
No problem; it looks like you have a pretty involved project ahead of you. I would do a page-level audit in GWT, Majestic, and Ahrefs. Be very careful which pages you 301. If you want to be surgical, you can try getting good links changed to the new domain, but that will be incredibly time-consuming.

Regarding GWT reporting: I had a client that bought a link farm and did a blanket redirect... right smack into their root domain. Ouch. Luckily they had administrative control of said link farm; when the sitewide footer links were removed, it took about a month for GWT to remove them from link reports. Mileage may vary. Part of it depends on Google knowing the links are no longer there, at least for the purposes of GWT reporting.

They had about 2,000,000+ backlinks. A lot of them were sitewides, another portion was article spam and blog spam, and some of it was just scraped. I wanted to remove at least 500,000 (just to be cautious); they only allowed 30K. But I got the link farm out of there. The site stabilized, but it was clear they needed to remove more. Such is life. It would have been a great recovery story.
| Travis_Bailey0 -
Hreflang="x-default"
Hi Avi, thanks for your question! Did any of these responses answer it? If so, please mark one or more as a "good answer". If not, please let us know how we can help. Christy
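For anyone finding this thread later, here is a minimal hreflang sketch with placeholder URLs. The x-default line tells Google which version to show users who match none of the listed languages/regions; the same block goes in the `<head>` of every variant:

```html
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```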
| Christy-Correll0 -
Going after GWMT queries - Smart or Risky???
Yes, this is a solid strategy. Optimize your titles and descriptions for keywords that have good positions but low CTR. You can also try adding Google authorship (if it's a content page) and semantic markup for reviews (for product pages) to get rich snippets displaying in the SERPs. Targeting keywords that are ranking in the 5-15 range is also a good idea: organize those by search volume and do some on-page optimization. A few tweaks could boost their rankings and give you more traffic for a little bit of work.
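As a sketch of the review markup mentioned above, using schema.org microdata with placeholder product and rating values:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">27</span> reviews
  </div>
</div>
```

You can check markup like this with Google's structured data testing tool before waiting on the SERPs to reflect it.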
| TakeshiYoung0