Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Disavowing Links for Subcategory of Site
It won't hurt to disavow known spammy backlinks, and your ranking drops may in fact be due to Penguin, but don't be 100% sure yet that you were the one hit with the penalty. Like I said, the way it looks now, it appears more like one or more upstream links got whacked and you're feeling the effects of less link juice. Time will tell. In the meantime, focus on creating content that your audience can use and share, and work your social media channels.
| Chris.Menke0 -
Need to shorten and change site-wide meta titles (50,000 pages). OK to do all at once?
Thanks, Tim. Started rolling out the new shortened, optimized titles today for the most important pages.
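For a bulk change on that many pages, the shortening itself is usually scripted rather than done by hand. A hypothetical sketch of that step (the brand suffix, the 60-character budget, and the function name are all illustrative assumptions, not anything from this thread):

```python
# Illustrative sketch: trim a meta title to a character budget at a word
# boundary, keeping a brand suffix. The suffix and 60-char limit are assumptions.

def shorten_title(title, brand=" | Example Shop", max_len=60):
    """Return title + brand, no longer than max_len, cut at a word boundary."""
    full = title + brand
    if len(full) <= max_len:
        return full
    budget = max_len - len(brand)
    # Cut inside the budget, then drop the trailing partial word and stray punctuation.
    trimmed = title[:budget].rsplit(" ", 1)[0].rstrip(" -|,")
    return trimmed + brand

print(shorten_title("Red Widgets With Extra Long Descriptive Keyword Stuffing Here"))
# -> "Red Widgets With Extra Long Descriptive | Example Shop"
```

Run over an export of URL/title pairs, this also gives you a diff to review before anything goes live.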
| lcourse0 -
Google disavow tool
Since you have already cleaned things up and disavowed the ones you couldn't, I would not recommend sending a reconsideration request. Instead, I would start doing marketing as usual. Start getting some high quality, optimized, fresh content on your site and proceed with your solid white hat SEO method of operation. Be patient.
| RepLoc_Tim0 -
Help with htaccess
Hi there, To be able to give you an answer, could you please confirm: do you want to apply specific rules to the pages inside the /development/ subdirectory that override the ones already included in the root .htaccess file, or something else? Thanks for the confirmation!
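For what it's worth, if the answer is "override the root rules inside /development/", the usual shape is a second per-directory .htaccess file. A purely illustrative sketch (the rules and directory purpose are assumptions about your setup, not your actual files):

```apache
# Root .htaccess (illustrative)
RewriteEngine On
RewriteRule ^old-page$ /new-page [R=301,L]

# /development/.htaccess (illustrative)
# Per-directory .htaccess files are processed after the parent's, and
# directives here override inherited ones where they conflict.
RewriteEngine On
RewriteOptions Inherit                        # only if you DO want the parent's rewrite rules too
Header set X-Robots-Tag "noindex, nofollow"   # e.g. keep a dev area out of the index (needs mod_headers)
```

Note that mod_rewrite rules are not inherited into a subdirectory that enables its own RewriteEngine unless you ask for them with RewriteOptions Inherit.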
| Aleyda0 -
Not sure if I should disavow these links or not
No, don't nofollow internal links, especially for pages you want to rank. That doesn't work well anymore, and will ultimately hurt you. I see about 190 links in the primary nav. Can those all really be equally important to users and search engines? I'd be really surprised if scrolling through a list of brands hunting for the right one is the best user experience. Don't take my word for it: run some tests and see how people respond to a simplified menu along the top or side. Generally we want to pick a few terms (e.g. brands or candies) to prioritize, and otherwise work on a hierarchy/nav that users intuitively understand.
| Carson-Ward0 -
Can anyone help me clean up my link profile?
Hi Jeff, Some interesting points I found instantly while checking out the link profile. I'll give my initial findings now and then summarize afterwards.

You have 569 links (out of a total of 2,399 links I checked to your site, a huge portion of your link profile) using the anchor text "twin prams". Every single one comes from the same domain (babyandco.com). You do not rank for this term in Google.

You have a fair few links using the anchor text "quinny prams". This time they are from various domains, not one domain. You do not rank for this term in Google.

You have a fair few links using the anchor text "pushchair", again from various domains. You do not rank for this term. The above also applies to "prams", "pram systems", "baby car seats" and "baby buggies". I would hazard a guess that your Google Analytics shows drops in visitors coming to your site from searches for these terms. How do you feel about letting me have a look?

I think your link profile, while not terribly horrific, certainly isn't anywhere near natural or clean. I think there could also be other issues playing a part. If I can access your analytics I can check out visitor drop dates and cross-reference them against Penguin and Panda algo updates to see if there could also be a content issue.

Here are a few links pointing to your site which are pretty toxic (either deindexed from Google or with horrific trust signals):

EDIT: Removed the backlinks. I'm sure Moz doesn't want to be linking to these sites!

http://abdulmohideen.fastpage.name/litesun
http://bargainbabygear.info/otherresources/browse/73/quinnypushchair.html
http://pram-stroller.com/choosing-baby-car-seats-bytrevor-hobbs/
http://www.stufftobuyforbaby.co.uk/cheap-baby-stuff/cheap-prams-and-pushchairs-uk/
http://nursing-australia.com/baby-lo-nursing-chair/

I can give you a full list if you like. I would not submit these to the Google disavow tool unless you have received a warning, though. Instead, you should contact the owners of the sites to get them to remove the links. In total, Link Research Tools shows that 7% of your incoming links are toxic.

A few questions/notes:

Has the site got any sister sites, old owned sites, micro sites, niche sites etc. 301-redirected to it that you know of (other than the .com version I mention below)?
Did you receive an unnatural link message from Google?
Can you give me a list of maybe five competitors so I can check out their backlink profiles, please?

Perhaps most worrying is that babys-mart.com redirects to the .co.uk version. The .com version has been deindexed by Google. This could be causing problems and is a big issue. It will pass on some very bad trust signals.
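As an aside, the pattern flagged above (one anchor text dominating the profile while coming from a single domain) is easy to check programmatically once you export your backlinks. A rough sketch with made-up data, a made-up 10% threshold, and hypothetical function names; this is not Link Research Tools output:

```python
# Hypothetical sketch: flag anchor texts that are over-represented in the
# profile AND concentrated on a single referring domain.
from collections import Counter, defaultdict

def suspicious_anchors(rows, share_threshold=0.1):
    """rows: iterable of (anchor_text, source_domain) pairs."""
    total = 0
    anchors = Counter()
    domains = defaultdict(set)
    for anchor, domain in rows:
        total += 1
        anchors[anchor] += 1
        domains[anchor].add(domain)
    flagged = []
    for anchor, count in anchors.items():
        share = count / total
        if share >= share_threshold and len(domains[anchor]) == 1:
            flagged.append((anchor, count, share))
    return flagged

# Toy data shaped like the case above: one anchor, one domain, 569 times.
links = [("twin prams", "babyandco.com")] * 569 \
      + [("brand name", "blog%d.com" % i) for i in range(100)]
print(suspicious_anchors(links))
```

The "brand name" links don't get flagged even though they're frequent, because they come from many domains; that's roughly the distinction between a natural pattern and a manufactured one.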
| MattJanaway0 -
DMOZ how long?
Thank you to everyone who helped clarify this issue. I thought about becoming an editor, although, from what I read, the SEO value of having a DMOZ link may not be as strong as it once was... Great info!
| bjs20100 -
How about a discussion on Penguin 2.0?
Amazing... this is a huge update (from what I've read on other sites and from listening to Matt), and there's almost nothing here on SEOmoz. What is up with that?
| freestone1 -
URL Question and Advice on Site Architecture
Assuming I understand your specific questions correctly...

Having a /computer-repair/ directory with a /laptop-repair/ subdirectory should not cause you any problems in itself. However, if all of the content in /laptop-repair/* also appears in /computer-repair/* you could have a duplicate content problem to deal with. From your response to Thomas' extremely detailed and helpful (though somewhat off-topic) comment, I assume you already know that.

I would not put the location in the directory, such as /virus-removal-wilmington-nc/. Whether or not you are moving makes no difference. I might make an exception if you have multiple locations, but even then I think it would be a bad idea to have, for example:

/virus-removal-wilmington-nc/laptop-repair/
/virus-removal-charlotte-nc/laptop-repair/

...unless you had some very specific things to say about each location in regard to laptop repair that could not be expressed on the same page. These would be dangerously close to "doorway pages", which are common in any multi-location industry and the bane of any SEO's existence when trying to clean up the penalties and filters that so often follow such a taxonomy. Keep it simple. Having the location in the URL like that is a tiny, wee little factor that gets an inordinate amount of attention from SEOs, in my opinion. You are of course free to disagree with this, as you did when someone had a similar opinion in the other thread. In that case, let me answer your other question from that thread:

"If I include my current location (Wilmington NC) in the destination (new domain) URL string for any given 301 redirect from my existing website to the new website, and then physically move to another city 3 months later, is this setting myself up for a BIG failure? I have no idea how this technically works with Google as far as how long this migration process takes to fully complete, where the old domain completely drops off and everything is fully passed over to the new domain, in terms of leaving the 301 redirects in place on the old domain's server. How long does this process usually take with Google?"

In my experience, having done several dozen site migrations on sites of many different types and sizes, it should take Google no more than a few weeks to have their index completely updated for a site the size of yours. This is assuming that they are given the appropriate 301 response codes and are crawling the site regularly (i.e. no existing penalties or major algorithmic filters are in place that would limit crawling). Switching your location like this is just one of several reasons I wouldn't put the location in the URL as proposed, so yes, it is shooting yourself in the foot a little bit, but nothing you won't be able to overcome. They will index the new URL, complete with the new location, and ranking calculations will be updated accordingly.

Regarding your other question toward the end of that thread: yes, you can change the old .htaccess file on the other server later, but there is no guarantee that Google will visit those old URLs again unless you have some external links pointing to them. If they came once, saw the redirect to the new URL, and there is no access to the old URL, they may not see it if you update the old URL's header response to point to a "new new" URL, if that makes sense. You could initiate the crawl by linking to the old URL from somewhere, such as a sitemap, but I'd advise making a "permanent" redirect permanent. In other words, I'd try to get it right the first time. Keeping the location out of that directory name would allow you to do that, unless I'm missing something.
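Leaving the 301s in place on the old domain's server typically amounts to a few lines of mod_rewrite. A minimal sketch; the domain names are placeholders, not the actual sites involved here:

```apache
# Illustrative only: redirect every URL on the old domain to the same
# path on the new domain with a permanent (301) response.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```

Keeping this in place for as long as you control the old domain is the "permanent redirect stays permanent" part of the advice above.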
Lastly, I disagree that domain.com/topic-in-charlotte-nc/sub-topic/ is "less" spammy than domain.com/topic/sub-topic/ but we all have a right to our opinions. Good luck!
| Everett0 -
Why Do Keyword Rankings Fluctuate?
Just following-up to make sure you have your personalization turned off if you are checking your rankings manually. See Takeshi's answer below.
| Everett0 -
What may cause a page not to be indexed (be de-indexed)?
"Toshiba MODEL Laptop". I would like to them with the most descriptive & searched terms. I don't think its due to your internal linking structure unless its something really sketchy. You can PM me the link of the landing page and I can check it out if you like.
| OlegKorneitchouk0 -
Are This Site's Backlinks Hurting Us?
As Oleg says, Panda is a content update; the situation you are describing with all those links (which I definitely think could be a problem) is a Penguin issue. I constantly confuse the two; perhaps you did too? Do you still have the notifications you got from GWT? If this ends up needing some hot disavow-tool action, this recent comment from Mr. Cutts might save you a ton of time: http://www.mattcutts.com/blog/what-to-expect-in-seo-in-the-coming-months/#comment-4405076
| reallygoodstuff0 -
National, not international, domain name: does it matter?
Hi Luke, If the website is targeting a specific country, then having a ccTLD is an additional signal of relevance for it (except in the US, where the de facto domain is typically .com). Nonetheless, if the website is targeting a worldwide audience (say, a website in English targeting any user who speaks that language, for whom the country doesn't matter), then it would be more relevant to have a generic domain (like .com), since the website is not country-targeted. Thanks, Aleyda
| Aleyda0 -
Google Places Listing Active In Two Separate Google Places Accounts?
Hi Tom, Rather than re-invent the wheel on this, let me just link to Mike Blumenthal's excellent guide on how to handle the scenario you are describing. Be sure to read the article, in full, including looking at the previous pieces linked to in Mike's post: http://blumenthals.com/blog/2012/05/17/basic-places-practices-update-how-do-you-claim-an-owner-verified-listing/ I'm 99% sure this will be exactly what you need, but if you don't find the answer you were hoping for, just let me know.
| MiriamEllis0 -
Panda'd - and I think I know how to fix it...
I would be interested to hear of sites that have resolved their problems with Panda. I can hardly find any examples of Panda recovery.
| julianhearn0 -
API to power all websites
Mat's right. Your content does need to be readable, indexable, and crawlable. It sounds like the resulting content will still be HTML output of some sort; however, you are stating that you won't have access to any actual static HTML files for SEO purposes. If that is the case, the backend needs to be extensible enough to allow you to still do your job as an SEO. If you can't provide unique page titles, descriptions, canonical link elements, etc., your organic search results will suffer. Here's another issue: if your pages are built with JavaScript, you need to make sure the output is readable by search engines. I've seen issues with JavaScript-built pages before where search engines are indexing "blank" pages, indexing the wrong pages, or seeing duplicate content where AJAX is used to inject the unique content after the page is ready and has already been crawled.
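One way to sanity-check the "blank page" problem is to strip a raw HTML response down to the text a non-rendering crawler would actually see. A rough, hypothetical sketch using only the standard library (the class and function names are made up for illustration):

```python
# Sketch: extract the visible text from raw HTML, ignoring script/style
# content. An empty result for a page that looks full in a browser is a
# hint that JavaScript is injecting the content after load.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skip = 0          # depth inside <script>/<style>
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

def crawlable_text(raw_html):
    parser = TextExtractor()
    parser.feed(raw_html)
    return " ".join(parser.chunks)

shell = '<html><body><div id="app"></div><script>render()</script></body></html>'
server_rendered = "<html><body><h1>Unique product copy</h1></body></html>"
print(repr(crawlable_text(shell)))            # the JS shell yields no text
print(repr(crawlable_text(server_rendered)))  # the rendered page yields its copy
```

In practice you'd feed this the response body fetched with a plain HTTP client (no JavaScript execution) and compare it against what you see in the browser.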
| GeorgeAndrews0 -
How to find all indexed pages in Google?
You are absolutely right. But if you think you have duplicate content issues, then Screaming Frog can help you tease that out. That is also why I suggested the SEOmoz tool: since it is supposed to mimic a search engine spider, it can give you a pretty good idea of any issues you might have. Using the advanced operator site:domain makes sense, but if there are issues there, like eyepaq said, it is going to be tough sledding. My suggestion would be to take a closer look at what GWT is telling you. Are there duplicates there? Is your CMS auto-generating URLs? That is probably going to be your best bet, IMO. Best of luck!
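If the CMS is auto-generating URLs, you can get a quick read on the damage by normalizing a crawl's URLs and grouping the ones that collapse to the same page. A hypothetical sketch; the tracking-parameter list and the sample URLs are assumptions for illustration:

```python
# Sketch: group crawled URLs that differ only by tracking parameters,
# trailing slashes, or host casing, i.e. likely duplicate-content URLs.
from collections import defaultdict
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

TRACKING = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sort"}

def normalize(url):
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING]
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc.lower(), path,
                       urlencode(sorted(query)), ""))

def duplicate_groups(urls):
    groups = defaultdict(list)
    for u in urls:
        groups[normalize(u)].append(u)
    return {k: v for k, v in groups.items() if len(v) > 1}

urls = [
    "http://example.com/widgets/",
    "http://example.com/widgets?utm_source=news",
    "http://EXAMPLE.com/widgets?sort=price",
    "http://example.com/about",
]
print(duplicate_groups(urls))
```

Each group is a candidate for a canonical tag or a redirect; the parameter list should come from what you actually see in GWT's URL parameters report, not this made-up set.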
| ZephSnapp0