Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Noindex a meta refresh site
You are so welcome, and believe me, I understand how hard it can be to convince a client either not to do something or to do it differently, because from their perspective all they see is whether the page works or doesn't. It cost us $1,000 to get a developer to fix our meta refresh, and I would never have been able to convince the company to spend the money if SEOmoz hadn't provided some support and validation for the plan. Good luck!
| danatanseo0 -
Site Indexed by Google but not Bing or Yahoo
Any chance we can see the site? Matt's completely correct regarding Bing Webmaster Central, I'd start there. Also: Submit an XML sitemap. Any errors that pop up during the processing of the sitemap will help you diagnose the problem. Check your log files. See if BingBot or MSNBot show up in the logs. If they don't, then your site's not getting crawled. Check robots.txt - any chance the site's outright blocking them?
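To illustrate that last check: a robots.txt like this hypothetical one would block Bing's crawlers while leaving Google untouched, and it's an easy accident to make:

```text
# Hypothetical robots.txt that shuts out Bing/MSN but not Google
User-agent: bingbot
Disallow: /

User-agent: msnbot
Disallow: /

User-agent: *
Disallow:
```

If you see anything like the first two blocks, that alone would explain Google indexing the site while Bing and Yahoo ignore it.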
| wrttnwrd0 -
How is it possible to 301 specific pages to a new domain?
Glad it helped and the explanation below is nicely done.
| Matt-Williamson0 -
PR calculation
Hi, Do you mean the patent that Google published years ago? They've done a bit of updating to the algo since then. What makes sense for your users for linking categories? Don't forget that they are an important part of your site.
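If it's the formula from the original paper/patent you have in mind, for reference it was (with d the damping factor, usually set to 0.85, and C(T_i) the number of outbound links on page T_i):

```latex
PR(A) = (1-d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)
```

But again, Google has layered a great deal on top of this since it was published, so it's a conceptual model at best.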
| KeriMorgret0 -
Poor Link Profiles Out Ranking Whitehat
I agree with Zakaria and George. I too am struggling with the same issue. I even private messaged SEOmoz to get their take on my specific situation, and their answer agreed with Zakaria's. But believe me, I understand how hard it is to go to a CEO who sees his competitors outrank his site and convince him that you just need to keep doing exactly what you are doing. I think you are definitely on the right track by going after your Toyota niche. Sometimes the problem when trying to explain to a CEO or Marketing Director that you are doing the right thing is that they are hung up on "vanity" keywords. What I mean by that is keywords they want to rank for but that the site probably shouldn't rank highly for, based on its content and the competition. As you mentioned, your competitor's site has a broader scope, meaning they will have a better chance at ranking for broader, and perhaps more competitive, keywords. However, by going after "Toyota" you probably have a better shot at actually increasing revenue than you would if you spent a lot of time, money and energy chasing those harder keywords. Good luck Jon. I feel your pain! Dana
| danatanseo1 -
Submitting sitemaps every 7 days
Thanks Maurizio. What I am really most concerned about is submitting hundreds of sitemaps to Google and giving them reason to suspect we might be spamming them. This is why I am considering the second approach, where we would submit 6 sitemaps at a time, totaling no more than 300,000 links, rather than giving them 200-plus sitemaps with 10 million links. I should have been clearer in my reason for this question. The main goal here is to not have Google freak out because we just gave them 10,000,000 links at one time.
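One way to stage this without hundreds of separate submissions is a sitemap index file: you submit the index once in Webmaster Tools and then add child sitemaps (each capped at 50,000 URLs) to it over time. A minimal sketch, with example.com and the file names as placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-001.xml.gz</loc>
    <lastmod>2012-10-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-002.xml.gz</loc>
    <lastmod>2012-10-08</lastmod>
  </sitemap>
</sitemapindex>
```

Googlebot re-fetches the index on its own schedule, so you control the pace simply by when you add each child sitemap.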
| zAutos0 -
Real Estate Local SEO
Hi Joel, Without actually being able to see your specific business, the cause of your lack of first-page rankings is hard to judge. Some things a Local SEO would look at in reviewing your situation might include:
- Your proximity to the centroid of business (are you located within the main cluster of businesses on Google Maps for your key search phrase, or far away from that cluster?)
- Potential existence of duplicates. This comes up frequently in industries like real estate, law and the medical field, because multi-partner practices took Google up on their guidelines that stated you could have 1 Place Page for the main practice and a unique Place Page for each partner as well. This has all gone to heck now, and no one I know is recommending doing this anymore, because of Google's tendency to merge similar listings, sapping rankings, and also their official refusal to remove duplicates. See: http://localsearchforum.catalystemarketing.com/hot-topics-catalyst-blog-archived/8-google-places-duplicates-�-have-doctor-dentist.html and http://localsearchforum.catalystemarketing.com/local-search-general-discussions/861-dr-dupes-google-local-user-edits.html So that may be an issue affecting your ability to rank.
- The consistency of your citations across the web. If you work for a multi-partner firm, there may be mixed-up details out there about you or your partners. Lack of consistency can definitely hamper rankings.
- Your domain age. If other firms are older than yours (web-wise), they may have a slight-to-moderate edge over you, even though you are making greater efforts than they are.
Also, real estate is one of the toughest verticals, particularly if you are in a populous area (a big city). It can be tough for newer competitors to break into established SERPs that already contain a ton of strong businesses. Those are a few things to consider. It sounds like you are doing a lot of things right, Joel.
I recommend you check out 51 Blocks' local competitive analysis tool to be certain you are actually doing more than your competitors appear to be doing: http://www.51blocks.com/online-marketing-tools/free-local-analysis/ It's free and a very cool tool. This is as far as I can go without seeing your actual business in action. I hope these ideas are good food for thought.
| MiriamEllis0 -
Missing Suite Number on Google
Hi Greenhornet77, Yes, the suite number matters hugely to NAP. You should be sure it is included on the website and everywhere the business is listed, and you must correct all business citations to match. Consistency in citations is a major local search ranking factor. Thanks for asking your question.
| MiriamEllis0 -
Google Disavow Tool - Waste of Time
I have to apologize to Google for my earlier response. After submitting a disavow request, I did see my largest site recover. I do have to say I submitted another low-quality website and got a quick response that it was still penalized. I only disavowed links and did not have any removed by third-party sites. So it looks like you can get a penalty removed with disavow and hard work, but don't think disavow alone will be good enough for a manual action.
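For anyone trying this, the disavow file itself is just a plain .txt upload: lines starting with # are comments, a `domain:` prefix disavows every link from that domain, and a bare URL disavows a single page. A hypothetical example (the domains here are placeholders):

```text
# Paid directory links we could not get removed
domain:spammy-directory.example.com
# A single bad page rather than a whole domain
http://article-farm.example.net/our-guest-post.html
```

As noted above, pair this with genuine removal outreach if you're dealing with a manual action.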
| Sbmarketing20000 -
Rankings drop
Did your drop happen around October 5? That was when Penguin refreshed. If it is Penguin, then we have not had a lot of stories of recovery yet, even with the disavow tool. Most likely Penguin needs to refresh again in order for a site to recover.
But I have found that most webmasters really don't know which links to disavow. I'm guessing that there are very few actual quality links to your site. I didn't look at all of your links, but I'm concerned. The page that you mentioned as a potential "natural" link looks like a site that publishes articles. You may argue that it's a guest post, but it's not black and white. Matt Cutts mentioned in a recent video that guest posting is OK if you are doing it occasionally and are a great writer who has been sought out by a quality blog. But if you're doing it for the purpose of building links at scale, it's not going to work and could be penalized. Unfortunately, the only links that are truly natural are ones that are earned because someone else decided your site was worthy of a link. You've got a pile of backlinks all with the anchor text "anglian windows". That type of pattern doesn't happen naturally.
Now, with that being said, before you go disavowing more links you need to be sure that your problem really is Penguin. Panda refreshed on September 27 as well, and it affected a lot of sites. Cutting and disavowing links will harm your site if this is the case. Panda is about on-page quality. If you've got information on your site that is duplicated across the web, then Panda could be your issue and links have nothing to do with your drop.
| MarieHaynes0 -
Dynamic pages - ecommerce product pages
Hi Cylo, I'm not sure that saying Google caches dynamic pages automatically is accurate. I would say it like this: Google is more apt to cache and index a dynamic page if it is given an SEO-friendly URL. Perhaps a Mozzer who is more technically adept than I am can comment on the accuracy of that statement. That being said, I definitely wouldn't recommend using dynamic URLs (which it sounds like you are not). Here is how you can set up URL rewrites in your .htaccess file if you are on a Linux server: http://www.webconfs.com/dynamic-urls-vs-static-urls-article-3.php Not sure if that's helpful at all; I hope it is somewhat. Dana
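As a sketch of the kind of rewrite that article describes (the URL pattern and query parameter here are hypothetical, not from your site), a mod_rewrite rule in .htaccess might look like:

```apache
RewriteEngine On
# Serve the clean URL /product/123/ from the real dynamic URL /product.php?id=123
RewriteRule ^product/([0-9]+)/?$ /product.php?id=$1 [L]
```

The visitor and Googlebot only ever see the static-looking /product/123/ address, while the server quietly runs the dynamic script underneath.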
| danatanseo0