Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Pay on Organic Search Results
"I don't think it will take much effort." I think that all of the time, and then I find out that those guys are really smart and very hard to beat. Probably RyanKent is the SEO for the guys I am trying to beat... I think it would be pretty expensive to beat him; it might take a very long time and a lot of work, and it may be impossible. Now imagine that you are talking about the same keywords that RyanKent and I are fighting over. Do you think it will be cheap to beat both of us while we are working really hard to beat each other? This is why smart SEOs don't work for somebody who says "I'll pay when you get me #1."

Now imagine a second situation... You say "I'll pay when you get me #1," I take the job and hit the site with a big ton of links, and you pay me... then I yank the links. This is why smart website owners should not say "I'll pay when you get me #1" either.
| EGOL -
How are pages ranked when using Google's "site:" operator?
Your answer in under 2 minutes from Matt Cutts: http://www.youtube.com/watch?v=Qigo05nAqKw
| RyanKent -
Is Google just taking a long time to re-index, or did I make a boo boo?...
I found out that my robots.txt file was blocking this page. The odd part is that I had set up robots.txt to block the folder mortgage-calc/, and even though my page was not in that folder, the spider must have treated them as connected. I don't know why, but my page was indexed again.
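One plausible explanation, assuming the rule was written without a trailing slash: Disallow rules in robots.txt are prefix matches, so a rule like the sketch below blocks any URL whose path merely starts with that string - pages under /mortgage-calculator/ included, not just the folder itself:

```
User-agent: *
# Blocks /mortgage-calc/ but ALSO /mortgage-calculator/... -
# any URL whose path begins with this string
Disallow: /mortgage-calc
```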
| CliffAuerswald -
Do we have to do different SEO work for an affiliate site than for a normal blog?
Hi raybiswa,

Matt Cutts gave some insight on this. I cannot, for the life of me, dig it up; I believe it was a video. The topic was either affiliate links, the ratio of advertising to content, or maybe reviews. Sorry! The basis of the advice was the same as always: great content, more content, typical stuff. But then he gave a little tidbit that I have found vitally important when dealing with affiliate marketing. He said that as long as you're "giving visitors choices, you'll be fine."

I tested this. I had a pretty established page that was ranking very well, and by chance it had 3 affiliate links at the bottom to 3 different partner stores. For fun, I deleted two of them. The page dropped SIGNIFICANTLY. I wasn't using SEOmoz tools at the time, but traffic dipped from 60-80 visits per day to less than 10. I left it for about a month and it never regained the ranking until I added the two options back to the page. It came back to previous levels within 2 weeks. So give visitors options, use real reviews, and watch your ratio of content to aff links; that works fine for me.

Also, I have a mini site that deals with the sale of tickets. There are only a couple of ticket providers available, so this method works for me: at the bottom of each event description I use a link to a single page that has links to the 3 main providers. I noindex and nofollow that page with the aff links. That seems to work very well, and only a small percentage of click-throughs seem to be lost; not having the aff links on every page is well worth the slight loss in that click-through transition.

I hope this tidbit helps you out. I also hope that you weren't asking about YOU being the sales page and having affiliates pointing links at you! If so, I completely misunderstood, and that is a totally different topic!
| dogflog -
Can you see the 'indexing rules' that are in place for your own site?
Unfortunately, that would be specific to your own platform and server-side code. When you look at the SEOmoz source code, you're either going to see a nofollow or you're not. The code that drives that is on our servers and is unique to our build (PHP/Cake, I think).

You'd have to dig into the source code generating the Robots.txt file. I don't think you can have a fully dynamic Robots.txt (it has to have a .txt extension), so there must be a piece of code that generates a new Robots.txt file, probably on a timer. It could be called something similar, like Robots.php, Robots.aspx, etc. Just a guess.

FYI, dynamic Robots.txt could be a little dicey - it might be better to do this with a META NOINDEX in the header of the user profile pages. That would also avoid the timer approach. The pages would dynamically NOINDEX themselves as they're created.
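That META NOINDEX approach is just the standard robots meta tag, emitted in the head of each profile page as the page is generated, e.g.:

```html
<!-- emitted in the <head> of each dynamically generated user profile page -->
<meta name="robots" content="noindex">
```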
| Dr-Pete -
How to Target Keyword Variations?
Hi Corey!

Since they are so similar and basically just variations of the same keyword, I would say to stick to one page. What you can do is research which variation has the highest search volume and use that in the title tag. Then make sure to use that same variation in the H1 (on-page heading), and use a couple of the other variations throughout the actual on-page text (don't overdo it...). Also, if you have an image, make sure to use one of the variations as the alt text. Lastly, share it socially and work on some inbound links using the variations above as anchor text.

Best,
Margarita
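Put together, that on-page setup might look something like this minimal sketch (the keyword, file name, and site name here are made up):

```html
<head>
  <!-- highest-volume variation in the title tag -->
  <title>Running Shoes for Women | Example Store</title>
</head>
<body>
  <!-- same variation in the H1 -->
  <h1>Running Shoes for Women</h1>
  <!-- a different variation as the image alt text -->
  <img src="shoes.jpg" alt="women's lightweight running shoes">
  <p>... a couple of other variations used naturally in the body copy ...</p>
</body>
```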
| MargaritaS -
NOINDEX or NOINDEX,FOLLOW
Hello again,

They are the same, and I assume the interpretation is the same because the default state is <META NAME="ROBOTS" CONTENT="ALL">, which means index, follow. You can decompose every tag into parts: <META NAME="ROBOTS" CONTENT="NOINDEX"> tells the robot not to index, and <META NAME="ROBOTS" CONTENT="FOLLOW"> tells it to follow - but follow is the default, so if you leave that part out, the robot will follow anyway.

Marek :-)

http://googlewebmastercentral.blogspot.com/2007/03/using-robots-meta-tag.html
| mad2k -
How to use my time: Make my site bigger or make link wheels?
Great response. I love the "I would give these texts to my competitor and hope that he puts the dead weight on his site." You make a very valid point. Content needs to be engaging, and if the content is just that, you stand a good chance of users sharing it. Shares = exposure = more visits = more sales.
| dan1el -
Track video in GA
Thanks. You almost always manage to get a Good Answer rating from me. One final question... Is the code to be added in the button link?
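A minimal sketch of what that looks like, assuming the classic async Google Analytics (_gaq) snippet is already on the page - the event call sits right in the link's onclick attribute, and the 'Videos'/'Play'/'Homepage promo' values are invented examples:

```html
<!-- assumes the standard classic GA async snippet (_gaq) is already loaded -->
<a href="/video" onclick="_gaq.push(['_trackEvent', 'Videos', 'Play', 'Homepage promo']);">
  Play video
</a>
```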
| seoug_2005 -
Ranking Ranking Factors!
As a predictive metric, both Domain Authority and the number of linking root domains are among the best at predicting a site's ability to rank for a given term, all other things being equal.

But there's a better way to answer the question. If you want to know what it takes to crack the top 5 for a given keyword, run the query through the SEOmoz Keyword Analysis tool. If you're a PRO member, be sure to run an Advanced Report. This will give you a good idea why the top 10 are ranking the way they are, and what you have to do to beat them.

Each keyword is different. The reasons one set of results is returned in Google will inevitably vary from the reasons it returns a different set for a different keyword. Sometimes you need social shares, sometimes Domain Authority, sometimes raw links. Most often, it's a subtle combination of dozens of factors.
| Cyrus-Shepard -
Site structure question
I like your plan from a relevance standpoint. Speaking from my gut, I think it would work; purchasing a new vehicle is rarely a "hurry up, let me get to the buy button already" situation. That's a lot of money to most people, so I think the conventional wisdom of "get them to the funnel right away" might not be appropriate here.

People are going to hit the site for two reasons, from what I can tell: 1) feature research ("do I want to buy this?"), which your rich category approach above supports, and 2) price check/inventory check. I'm going to assume you can't buy these online and that a person has to walk into the dealership to buy something. In the second case, it's possible that price seekers might resent the extra click, but I'm guessing that if you made it very obvious where they would have to go, that would be offset by the better relevance and information given by the expanded category pages.
| icecarats -
Submitting URLs multiple times in different sitemaps
Hi msquare,

According to Identity in a blog post about multiple sitemaps: "It is still just one URL, but the engines prefer to not have URLs duplicated across sitemaps. In this way, you are sending a lower-quality signal about your sitemaps. Certainly this can happen, like any duplication, but if it is rampant - hundreds to thousands of URLs duplicated across multiple sitemaps - it might lead the engines to believe that you are trying to manipulate them in some way, and I wouldn't be surprised if they trusted the sitemaps less, which may defeat the purpose of having them in the first place."

I would argue the same. If you have two sitemaps with lots of similar content, it can be seen as inaccurate, which is contrary to the point of the sitemap in the first place. Have a read of the article I linked above, too, because it's a good introduction to multiple sitemaps and indexes, so you can avoid as many problems as you can!
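The usual way to keep this clean is a sitemap index file in which every URL lives in exactly one child sitemap - a minimal sketch, where example.com and the file names are made up:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- each URL on the site appears in exactly one of these child sitemaps -->
  <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-blog.xml</loc></sitemap>
</sitemapindex>
```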
| JoshPugh -
Were small sites hit by Panda?
All sites were potentially affected by Panda, large and small. Panda is a group of algorithm changes rolled out over time. Each change had the potential to affect a percentage of websites directly. Even if a site was "Panda-proof," it was still affected, because it would have moved up in rankings as other sites were penalized.

As an example, take one of the questions asked of Google's Panda testers: "Would you trust this site with your credit card information?" Google would present various sites to a group of people, then ask the question. Users would say yes or no. Then Google would algorithmically review the sites to locate commonalities between the sites that received a yes or a no. If Google determined users trusted sites with a visible SSL badge more, then it would adjust its algorithm to boost sites with a visible SSL badge. This change would affect all sites, large and small: either a site has an SSL badge or it does not. Panda is a group of many such changes.
| RyanKent -
Annzoseo - Keeps Calling, funny SEO Phone Conversation!
This has been an ongoing topic all over the web, and I agree: realistically, there are no real qualified SEO certifications yet, other than PPC and such from the engines. However, even with a certification, that is no reason to choose one provider over another; it comes down to the work. What did you do, and what can you do?

I only point out Bruce Clay's because of the difficulty of the test. I have taken others and passed, but this was like taking my MCSE test with a twist. Plus, you are to uphold the standards Bruce has put forth with his certification: if any of the certified are caught spamming, they will lose their certification, along with many other criteria, whereas others just give you the cert if you pass. Whoopee. I like his morals and love the refresher courses to keep up to date with technology.

With any certification, you can be a bookworm and pass the test, but can you really do the work? That is the question!
| Ben-HPB -
Does the home page have to get the most internal links?
I, too, work with a site that has many (thousands of) product pages and dozens of categories. Over time, the homepage did get the most links, and it ranks for the brand - which is fine. What's better for the user is that the category or product pages rank for specific keywords. I mean, for the best user experience, I don't want someone to have to go to the homepage and navigate down a few layers to get to what they want. Links to those deeper pages have helped make them more visible. More pages ranking for the right keywords = more online real estate and more ways for the audience to find you.

So it depends on your strategy and preference - and on whether it's better for the user to land on the homepage or on the product page. And what's the reason those two would compete against each other for keywords, if that's the case? All the links into the site will eventually help the domain authority.
| josh-riley