Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
Facebook likes for website OR FB profile page?
Gianluca said it very well, so all I'd like to add is: if you're asking in terms of RANKINGS (which will help you rank better?), use Likes on your website, and be sure the proper Open Graph meta tags are installed on your page. The number of Facebook Likes and Shares of your website's pages can improve rankings. You can generate the meta code here. -Dan
| evolvingSEO
How can someone not call B.S. on this site ranking 4th?
I have only been doing this for a few years, but I have seen this many, many times: a relatively new site launches and suddenly it's ranking high on the first page for a certain keyword. Like you said, the quality signals for this site are very low across the board. My guess is that it will disappear just as quickly as it appeared and find its real home down on page 3 or 4. Google bounce.
| consumers
Does having a high number of reciprocal links hurt you?
We have vendors from whom I was hoping to get links. To make it more enticing to them, I was planning to select a small number of vendors (with good site metrics) and create a "Featured Vendor" module on some pages that would link back to each vendor's site, thereby creating reciprocal links. I was planning to have only one vendor per page, and only on the pages we are locally targeting. Is this a bad strategy? Thanks!
| AC_Pro
Migrating an online store to a subdomain using Shopify, and the effects on SEO down the road
Yes, all good insight. We are the developers/agency for the client, and we are only considering this option for budget reasons. There is a lot of work involved in the move to a new CMS: we need stronger content as part of the move, and on top of it all we have to migrate all existing content into the new CMS. This equates to a tremendous amount of work. Not the best argument, but posting this on SEOmoz gives us a good sounding board.

The current CMS is so poorly geared for today's SEO strategies that migration to another CMS is 100% required. Moving hosts isn't the problem; the CMS is. To illustrate how poor the current CMS is, we can't even touch the root .htaccess file to redirect www. Doing so breaks the entire CMS, which is insane. Thus we have www.mysite.com and mysite.com directly competing against each other, duplicate content, etc. That's just one issue of many.

Moving to a new CMS is required, but with this move we have budget constraints wrapped in a very big site, wrapped around SEO migration issues, wrapped in making things painless for the client, wrapped in making this all work. We're looking for any insights, knowing best practices well but having to deal with the reality of budgets. This could end up being "save a penny today, pay big bucks later". We understand this is our unique situation and we may have to bite the bullet a bit: go with something like Cart66 and work through the bugs, knowing the light at the end of the tunnel will be a brilliant SEO/business solution for the client. It may take some painful hours getting there, hours we may have to absorb to keep a happy client and a relationship we've nurtured for some years.
| MAGNUMCreative
Magic keywords in Google Webmaster Tools
Hi Joe, My guess (and purely a guess) would be that it is coming from text that is commented out in your WordPress template. You could check the source code and see if you find it there. I ran the SEOmoz Term Extractor tool on a couple of individual pages and it doesn't appear in the list of targeted keywords. If you can identify exactly where it is coming from (or establish that it is definitely not repeated in your code), you could ask at http://www.google.com/support/webmasters whether this is a known issue or something to be concerned about. Hope that helps, Sha
| ShaMenz
Negative impact on crawling after uploading a robots.txt file on HTTPS pages
Hi CP, If you wish to use robots.txt to block crawlers, then your two robots.txt files should be as follows.

For the http protocol (http://vistastores.com/robots.txt):

```
User-agent: *
Allow: /
```

For the https protocol (https://vistastores.com/robots.txt):

```
User-agent: *
Disallow: /
```

Personally, I prefer to use the noindex meta tag for page blocking because it is a more reliable way of ensuring that the pages are not indexed. (Never try to use both at once.) This link explains the difference between the two: [Google Webmaster Tools Help](http://www.google.com/support/webmasters/bin/answer.py?answer=35302 "Robots blocking crawlers"). From that page:

> You can use a robots.txt file to request that search engines remove your site and prevent robots from crawling it in the future. (It's important to note that if a robot discovers your site by other means - for example, by following a link to your URL from another site - your content may still appear in our index and our search results. To entirely prevent a page from being added to the Google index even if other sites link to it, use a [noindex meta tag](http://www.google.com/support/webmasters/bin/answer.py?answer=61050).)

Hope that helps, Sha
| ShaMenz
SEO Strategy for URL Change
Hi, This video from Matt Cutts talks about how Google will handle the 301s and how long it might take. If the site is large and database driven, then using database lookups to generate your 301s would probably be the best solution. Long lists of 301s have the potential to create processing issues, not to mention the pain of having to create them! I would also do the work to at least attempt to get external sites to change any existing links. Given the legal ramifications, I would not want to leave any anchor text out there for the other party to take issue with if I could avoid it. (I would also keep a detailed record of all the work done in this area, just in case there is any kind of legal challenge.) Hope that helps, Sha
| ShaMenz
Has Panda 2.5 Hit?
The trick is, Google isn't going to issue a release saying that they DIDN'T do an update, unless everyone seems convinced they did. Even then, they aren't consistent. It used to be that we didn't get official notifications of algo updates at all; that's a pretty recent development. The last official roll-out was the global release on August 12th, which some called Panda 2.4 (although I think of it as just 2.3 for a broader audience). Given the timing of Panda data updates, it is possible to see the impact of a Panda release a couple of weeks after it happens, but a 30% traffic increase doesn't sound like Panda, unless your competitors got hit. Has your ranking changed? Which keywords have more volume? Are there seasonal trends going on? You need to dig deep into the data, but it's not pointing to Panda from what I can tell.
| Dr-Pete
Why has traffic to my link dropped suddenly?
Are there specific keywords you were ranking on that drove traffic to this page? I don't see that this page has many direct links, and it's not optimized for any popular terms, so I'm surprised that login form was getting much direct traffic. Is it possible your internal link architecture changed? Maybe you're pulling a ton of search traffic to the home page, but it's not being pulled through to the deeper page now? I'm also a bit confused, because Rediff.com is an outside site/service. If you can explain what you were ranking for and dig into the specifics, that would certainly help. It also appears that you had some pretty strong social signals to that page. It's possible that those gave you a strong short-term rankings boost, but that it faded over time. Social signals aren't as long-lasting as permanent links.
| Dr-Pete
Does having many 302 redirects on a site hurt rankings?
A 302 is designed as a TEMPORARY redirect. How temporary? I like to think of it in terms of hours; that makes it a lot clearer when deciding "how long is too long". For the most part, 302s should not be used. There are some corner cases where a 302 is the best solution, but in my experience 302s are most often implemented inappropriately. A 302 is harmful to your rankings because it prevents your PageRank from flowing naturally throughout your site. If you are using a 302 for internal linking, you are definitely damaging your SEO. For external links, a 301 is used when the content has been permanently moved to a new location. I am a bit confused by your implementation of a 301 for a "buy now" button; that sounds more like a direct link, and I am not clear why a redirect is being used. A final point: affiliates pointing to their master site should use nofollow links to be compliant with Google's policies.
| RyanKent
What is the best approach to a keyword that has multiple abbreviations?
Personally, I would make the root domain a catch-all for every term. You don't need separate pages for each phrase you want to rank for; if the entire website is about the same thing, there's no need to create landing pages for different phrases (Google also frowns upon this). I would limit usage of the `<abbr>` tag to once or twice per abbreviation. Also realize this is on-page SEO, which is great and does help, but the majority of your ability to rank for your phrases will come from external SEO.
| deltasystems
Google Places in the UK, No Postcard
Shelly, this page might help. Best, Anthony
| Anthony_Trollope
Should I use the main keyword in the title tag for the site on all category pages?
Thank you! And should the same strategy be used in the title tag? The URL strategy is already known to me and implemented on my site.
| ikomorin
Sitemap - % of URLs in the Google Index?
If all the pages in your sitemap are worthy of the Google index, then you should expect close to a 100% indexation rate. On the flip side, if you reference low-quality pages in your sitemap file, you will not get them indexed and may even hurt the trust of your sitemap file. Case in point: Bing recently announced that if they see an error rate greater than 1% in a sitemap, they will simply ignore the sitemap file.
| irvingw
Duplicated Pages (Travel Industry)
Six duplicate page titles is a relatively small number. Duplicates are an issue in that search engines have to decide which page to rank. Applying the canonical link element can sometimes help with duplicate page titles, but if you are optimizing for different keywords on each page, I do not think Google penalises heavily for this (I could be wrong); it just makes it harder for them to know which page to rank. If you are certain of the keywords you wish to rank for, and signal them clearly on each page individually, I do not think you will have a problem with your page titles.
| CMoore85