Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
How should I best structure my internal links?
Hi Alice, many thanks for your response, very helpful indeed. Thanks again. Craig
| Towelsrus0 -
No reply from Google despite reconsideration and hard work!
Steve, sorry, I completely misread that original question. If it is a manual penalty, things are somewhat worse, and recovery, in my experience, is a whole lot harder. You will have to be absolutely fastidious in your approach and get rid of everything manipulative, or start over. Sorry buddy, I was skim reading whilst stuffing my face at lunch time!
| Marcus_Miller0 -
Optimized page titles post penguin
Peter,

Are you saying that every product is an FSX Addon? If so, then you are going to need to think this through and really get clear on keyword research. Other keywords that get a ton of queries are FSX as a standalone term, FSX Simulator, etc. FSX Download might also be a clever one for products with a direct download, as many who buy software do not wish to wait.

So, my thought is this: you could make the title start with the FSX piece (FSX Download, FSX Missions, FSX Aircraft, FSX Addon), pipe the specific aircraft, then pipe your brand name. So: FSX Addon | Boeing 747 British Airways | Your Brand.

I think this will help you, as long as your content is unique (so don't have 90% of the page about FSX Addons and only 10% about the plane). There are ways to avoid duplicating the content, potentially by using categories and then making each product the individual aircraft download. I would need to see more of the site and think it through to be clear on a resolution.

Hope this bit helps,
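The pipe-separated pattern above can be sketched as a tiny template helper (a minimal sketch; the function name and example values are hypothetical, not from any real site):

```python
def build_title(category, product, brand):
    """Build a product page title in the pattern:
    "FSX Addon | Boeing 747 British Airways | Your Brand"."""
    return " | ".join(part.strip() for part in (category, product, brand))

print(build_title("FSX Addon", "Boeing 747 British Airways", "Your Brand"))
```

Keeping the category keyword first and the brand last matches the advice above: the query-bearing terms lead, and the brand stays consistent across every product page.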
| RobertFisher0 -
Is SEOmoz.org creating duplicate content with their CDN subdomain?
Hey Irving! Good catch! Turns out that my Web Team knows about it too and is working on it! Only because you said you'd wear it proudly, send your info to Help@seomoz.org with ATTN: JOEL in the subject and I'll see if I can't scrounge up a shirt for you. :) Cheers, Joel.
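The usual fix for CDN-subdomain duplication (a generic sketch with placeholder URLs, not necessarily what the team implemented) is to have pages served from the CDN host declare the main host as canonical:

```html
<!-- On the CDN copy of the page (cdn.example.com/page/),
     point search engines back at the primary URL: -->
<link rel="canonical" href="https://www.example.com/page/" />
```

For non-HTML assets, the same can be done with an HTTP `Link: <…>; rel="canonical"` response header.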
| JoelDay1 -
Social Media and SEO
By using them properly: get people in similar niches interacting with your profile (retweeting and commenting). Follow industry leaders and interact with them; add value to their social presence and they will add value to yours. We are working on a reputation management case, and we have learned that the way to give a social media profile more authority is to use it properly.
| SEODinosaur0 -
Optimal URLs for SEO and UX
Hi Peter, Given that the site is 10 years old and that the URLs were already updated once fairly recently, I would leave them as they are, except for those that have more than 3-5 keywords, or those that contain "stop" words like "and," "the," "of," etc. This would be pretty easy to do if you dumped all your URLs into Excel and sorted them accordingly. If you feel very strongly that your search traffic would improve if you changed them, I would suggest picking one section or category of the site and doing those first. Monitor what happens. If you get good results, then go ahead and change the rest. Hope that helps! Dana
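The same audit Dana describes in Excel can be scripted: split each URL's final slug into words, count them, and flag stop words. A minimal sketch (function names, the stop-word list, and the example URLs are all illustrative assumptions):

```python
STOP_WORDS = {"and", "the", "of", "a", "in", "to"}

def audit_url(url):
    """Return (keyword_count, stop_words_found) for a URL's final path segment."""
    slug = url.rstrip("/").rsplit("/", 1)[-1]
    words = [w for w in slug.replace("_", "-").split("-") if w]
    return len(words), sorted(set(words) & STOP_WORDS)

def flag_urls(urls, max_keywords=5):
    """Keep only URLs worth revisiting: too many keywords, or any stop words."""
    flagged = []
    for url in urls:
        count, stops = audit_url(url)
        if count > max_keywords or stops:
            flagged.append((url, count, stops))
    return flagged
```

Dumping `flag_urls(...)` into a spreadsheet gives the same sortable shortlist, section by section, that the answer recommends changing first.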
| danatanseo0 -
Merging Domains... Sub-domains, Directories or Separate Sites?
Thanks for the reply. I think at the end of the day, yes it does help the user to have it all in one place. And thanks for reiterating what I already thought... directories are better than sub-domains. To be honest, this article is what had me questioning sub-directories in the first place: http://www.seobook.com/subdomains-google-panda
| essdee0 -
Manipulate Googlebot
Hi, This is a valid concern. As Mat correctly stated, Googlebot is not easily manipulated. Having said that, Googlebot impersonation is a sad fact. Recently we released a fake-Googlebot study in which we found that 21% of all Googlebot visits are made by impersonators: fairly "innocent" SEO tools used for competition check-ups, various spammers, and even malicious scanners that use the Googlebot user-agent to try to slip in between the cracks and lay a path for a more serious attack to come (DDoS, IRA, etc.). To identify your visitors, you can use Botopedia's "IP check tool"; it will cross-verify the IP and help reveal most fake bots. (I've already searched for 66.249.71.25 and it's legit; see the attached image.) Still, IPs can be spoofed. So, if in doubt, I would promote a "better safe than sorry" approach and advise you to look into free bad-bot protection services (there are several good ones). GL TBHPK
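Since IPs can be spoofed but DNS is harder to fake, the standard verification Google itself documents is a reverse lookup followed by a confirming forward lookup. A minimal sketch of that check (not the Botopedia tool; the function names are hypothetical):

```python
import socket

# Google's documented crawler hostnames end in one of these suffixes.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(hostname):
    """Check a reverse-DNS hostname against Google's documented suffixes."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip):
    """Reverse lookup the IP, check the suffix, then confirm the
    forward lookup of that hostname resolves back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname_is_google(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        return False
```

The suffix check alone is not enough (anyone can register `googlebot.com.evil.net`-style lookalikes in a user-agent string), which is why the round-trip DNS confirmation matters.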
| Igal_Zeifman1 -
Sitemaps index file and internal pages
Hi, Here is the answer to your first question: the following page will help you create a video sitemap. http://support.google.com/webmasters/bin/answer.py?hl=en&answer=80472 For your second question: you said that you have a single page and URL for each piece of feedback. If you want each of those pages crawled, then you have to add those pages to your sitemap.
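For reference, a video sitemap entry of the kind that Google help page describes looks roughly like this (all URLs and titles below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/some_video_page.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Example video title</video:title>
      <video:description>A short description of the clip.</video:description>
      <video:content_loc>http://www.example.com/video123.flv</video:content_loc>
    </video:video>
  </url>
</urlset>
```

Each feedback page you want crawled would get its own `<url>` entry in the regular sitemap the same way.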
| SanketPatel0 -
SEO - Why variation in Google positioning? Are we penalized?
Hi Celine, How are you checking your rankings?
| Unity0 -
Is text that is burned into the code stronger than text that is retrieved from the DB?
Moosa and Marie are both correct, there is no difference between hard-coded text and text rendered from data retrieved from a database.
| GeorgeAndrews0 -
Googlebot Can't Access My Sites After I Repair My Robots File
Oleg gave a great answer. Still, I would add two things here: 1. Go to GWMT and under "Health" do a "Fetch as Googlebot" test. This will tell you which pages are reachable. 2. I've seen some occasions of server-level Googlebot blockage. If your robots.txt is fine and your page contains no "noindex" tags, and yet you are still getting an error message while fetching, you should get hold of your access logs and check them for Googlebot user-agents to see if (and when) you were last visited. This will help you pinpoint the issue when talking to your hosting provider (or third-party security vendor). If unsure, you can find Googlebot information (user agents and IPs) at Botopedia.org.
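The access-log check in step 2 can be done with a one-line grep. A sketch using a tiny sample log in common/combined format (the log path, format, and entries are assumptions; point `LOG` at your real access log):

```shell
LOG=/tmp/sample_access.log
cat > "$LOG" <<'EOF'
66.249.71.25 - - [12/Oct/2012:06:25:17 +0000] "GET / HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.7 - - [12/Oct/2012:06:26:02 +0000] "GET / HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (Windows NT 6.1)"
EOF

# Last recorded Googlebot hit (by user-agent match), with its source IP and timestamp:
grep -i "googlebot" "$LOG" | tail -n 1 | awk '{print $1, $4}'
```

If the last Googlebot line predates your robots.txt fix, the blockage is likely upstream of your site (server or security layer), which is exactly the evidence to hand your hosting provider.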
| Igal_Zeifman1 -
Domain name match and SEO
I read somewhere (on here, I think) that hyphens in domains are OK so long as there are not too many. It appears the hyphenated variants of your first unavailable domain are taken for the good TLDs, but you might have some more scope with hyphens otherwise. Also, shouldn't "Personal" have only one letter "n"?
| Zoolander0 -
Thousands of 404 Pages Indexed - Recommendations?
Yeah, all of the 301s are done, but I am trying to get around submitting tens of thousands of URLs to the URL removal tool.
| BeTheBoss0 -
News sites & Duplicate content
Ditto on what Donnie said. Purple Cow, if you want that site to be an authority, it needs to be authoritative. Why would anyone buy the Washington Post if it just copied all its articles from the New York Times? Get a few staff writers to combine and tweak articles, as Donnie mentioned, or to write original content. Good luck!
| CleverPhD0