Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Search Engine Trends

Explore current search engine trends with fellow SEOs.


  • Hi Antony, I haven't seen a heatmap study or click-through-type report covering this specifically. However, my gut feeling is that you are better off being #1 with Places attached, as the listing is so large and visible. I think you have cause to jump for joy with that top local ranking. I bet you are getting the lion's share of interest on that page of results. Sorry I don't have anything concrete to quote you; it's possible someone has done a heatmap study, and that might be something you could research.

    | MiriamEllis
    0

  • Backlinks are the foundation of the original algorithm. Check out the first paragraph of the "Design Goals" section (1.3.1) in Brin and Page's original paper on Google: http://infolab.stanford.edu/~backrub/google.html The last couple of lines point to how backlinks are what guide Google to figure out relevance, to a significant degree. However, as people have both witnessed and experienced, Google is punishing specific types of links, like blog networks, right now, and is also probably taking into consideration notions like what Geoff Kenyon wrote back at the end of January, before a lot of the sh*t hit the fan: http://www.seomoz.org/blog/anchor-text-distribution-avoiding-over-optimization I think the key is to keep the ratio weighted more heavily towards the brand and the URLs of the original website, and definitely not to have too much of one specific anchor text. However, some companies still seem to be "under Google's radar", like this one I noticed recently: http://www.opensiteexplorer.org/links?site=www.autson.com. I guess if you "make it appear natural", you can still stay under their radar. At the same time, I would HIGHLY encourage a larger proportion of naturally built links, despite the pain and time it takes. Google is going to constantly disrupt the business though... how else can they keep their stock price going up but by pumping it with all the AdWords spend?

    | SeattleOrganicSEO
    1

  • Mmm... if I search "Vallnord" on Google.com, I see http://www.vallnord.com/en as the first result, and the description snippet is actually its meta description tag. The second result is the .com (the Catalan version). In this case the meta description is generated by Google itself, as the original one is anything but a description: <meta name="description" content="Arcalís | Arinsal | Pal. Andorra" /> As for the Spanish version, it doesn't appear in the google.es SERPs when I search for either Vallnord or Vallnord castellano; you have to click on "more results from Vallnord" to see it. I suppose this is because Catalan is a language you cannot geotarget, as you can with Spanish or English, for instance (I'm not going to get into "nationalism" issues :D). That means the Catalan .com can appear in any Google, including the Spanish one, and since the .com has a stronger link profile than the Spanish version, it is the one that appears in the Spanish SERPs. The reason why your meta description doesn't appear when you search for "Vallnord" is quite simple, imho: the word Vallnord is not present at all in the meta description, so Google composes the snippet by assembling phrases from your home page where Vallnord is present (see the example below).
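
    Purely for illustration (the wording here is invented), a meta description that actually contains the query term gives Google something it can reuse as the snippet:

      <meta name="description" content="Vallnord: the ski resorts of Arcalís, Arinsal and Pal in Andorra. Pistes, snow reports and lift passes." />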

    | gfiorelli1
    0

  • Ok, for some search terms the site name is appearing in the title tag and for some keywords it's not… I would suggest you check whether you have recently changed the title tags of your website. If so, this might be a crawling issue: some pages have been recrawled by Google and others have not, so the pages that have not been recrawled and updated are still showing the site name while the others don't… One more thing: sometimes when Google thinks the current title tag is not relevant to the query, it changes it to the brand name (at least in Google US). The chances of this are rare, but I would also suggest making solid, relevant title tags for each target query. If you see that pages are not being crawled, I would recommend you update the XML sitemap and submit it again to Google Webmaster Tools (a minimal sitemap sketch is below).
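
    For reference, a minimal XML sitemap looks something like this (the URL and date are placeholders only); updating the <lastmod> values for changed pages before resubmitting tells Google which pages need recrawling:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://www.example.com/some-page</loc>
          <lastmod>2012-04-01</lastmod>
        </url>
      </urlset>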

    | MoosaHemani
    0

  • Bing powers Yahoo's search results, so make sure you are using Bing's Webmaster Tools and have XML sitemaps submitted there. But the truth is that their rankings still differ, so it's really hard to optimize for them both at once.

    | Copstead
    0

  • Thank you Tommy, your answer is very detailed! I'm really grateful. Francesco

    | seomoznicchia
    0

  • In my view it still has a large influence (although Google has stated this is diminishing). Short term strategy - go for the second domain. Long term strategy - go for the first. You'll create a better/more memorable 'brand name' and eventually Google's algorithm changes will reduce the importance of keywords in the domain name. You'll be able to get good SEO results, it will just take a little longer.

    | bradkrussell
    0

  • Hi Claudio, the idea is that if you have multiple sections, each covering a completely different topic, then you can use multiple h1s; but if the topics fit under a single heading, I would use an h1 once for the overriding heading, and then an h2 for each section (see the sketch below). Having completely different topics on the one page makes it hard to optimize and rank. Also, Bing so far doesn't allow it: http://thatsit.com.au/seo/reports/violation/the-page-contains-multiple-h1-tags What would be good is if search engines would see each section as a different page if it has its own h1, and maybe that's the future, but at the moment I would try to use only one.
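
    As a quick sketch of the structure I mean (the headings are invented for illustration):

      <h1>Overriding page heading</h1>
      <h2>First section</h2>
      <p>...</p>
      <h2>Second section</h2>
      <p>...</p>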

    | AlanMosley
    0

  • Nope; work harder on being a good company and build links to the stuff below. I assume you have a lot of exact-match social media accounts? Build links to those and get them to rank. S

    | firstconversion
    0

  • Thanks.  I'm sure there will be more empirical studies related to rich snippets.

    | southernresearch
    0

  • From an SE point of view, XML sitemaps are enough; if you have a large site you may want to consider having more than one sitemap for different categories. As Kieron suggested, HTML sitemaps are useful for people navigating your site, so it might be worthwhile writing some PHP to convert the XML into HTML and make your HTML sitemap a little more dynamic (a rough sketch is below).
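
    A rough sketch of that idea, assuming a standard sitemap.xml sitting in the site root (adjust the path and markup to taste):

      <?php
      // Read the XML sitemap and render each URL as an HTML list item.
      $ns = 'http://www.sitemaps.org/schemas/sitemap/0.9';
      $sitemap = simplexml_load_file('sitemap.xml');
      echo "<ul>\n";
      foreach ($sitemap->children($ns)->url as $url) {
          $loc = htmlspecialchars((string) $url->children($ns)->loc);
          echo '  <li><a href="' . $loc . '">' . $loc . '</a></li>' . "\n";
      }
      echo "</ul>\n";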

    | ChrisDyson
    0

  • Google Analytics has been slow, but it listens to user requests often. I remember early last year I did a comprehensive comparison between Google Analytics and Piwik, which at the time offered real-time data. A lot of people in the analytics community, including Avinash Kaushik, mentioned Piwik as one product offering real-time data. This feature is quite nifty, but at the same time it does not yield instant insights. I think over the course of time this will get much better, offering real-time info on user patterns and on what people are doing and clicking on your site. This is, of course, only after a user permits being tracked.

    | weboptimizers
    0

  • That's a great question! The thing is, there are websites that naturally have high bounce rates and low avg. visit duration. One example is Wikipedia: people google "what year did Titanic sink", get their answer, and leave without visiting other pages. Can we say that a high bounce rate or low avg. visit duration negatively affects Wikipedia's rankings? No way! I think the algorithm Google applies is more complex. The search engine might stereotype the web and assume that, in the case of insurance websites for example, a visit duration above some threshold and a bounce rate below some threshold are what "normal" engagement looks like, with different expectations for different kinds of sites.

    | OlgaG
    0

  • I just had a look at one of their embed codes. Except for a 300-character snippet/excerpt, everything else is served via JavaScript. Links are shortened using their own URL-shortening service, and the images also look like they are hosted by their API. I don't see any duplicate content issues. From an advertiser/content provider perspective: big brands have a decent possibility of getting brand exposure for their content without worrying too much about duplicate content issues. From a publisher perspective: if you have the audience for a specific kind of content, you sure can pull some content, but it will neither hurt nor help your SEO.

    | NakulGoyal
    0

  • I think it is unlikely that some effective negative SEO has been performed and had an effect on your site rankings so quickly. The movements you are experiencing are more likely related to two things: your site was linked to from a network which increased PageRank, but those links are no longer of any value, so the incoming link value to your site has been reduced; and the significant reduction in backlink quantity in a short period has caused a ripple in your SERP positions, which will eventually even out as more time passes. So there are two things affecting your short-term SERP positions, and both will become less significant as these events become more distant. No doubt it is upsetting to find such a move, however with the rollout of Panda 3.4 (and other algo changes) everyone can expect movement in some way or another. While you wait for your appropriate SERP positions to return, keep focusing your efforts on building good unique content, knowing that visitor activity on your site is being evaluated; by providing a great experience to your visitors you will be doing exactly what Google is looking for.

    | ebmocwen
    0

  • I own a company and usually write my own blog posts, but not every time. When I don't, I pay to have them written and thus own the copy. Can an author be a company, with the authorship link pointing to the company's About Us page?

    | JAARON
    3

  • Hi Peter, Like Pashmina, I'm wondering if what you are seeing relates to the Venice update. Did you catch Mike Ramsey's YouMoz post on this? http://www.seomoz.org/blog/understand-and-rock-the-google-venice-update Some changes have definitely happened. Miriam

    | MiriamEllis
    0