Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Search Engine Trends

Explore current search engine trends with fellow SEOs.


  • Not sure why it gets downloaded. It should be a perfectly normal .txt file. I double-checked in Google Webmaster Tools and it says the robots.txt file can be read. Any idea how to stop the download?

    | Resultify
    0
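A file that downloads instead of rendering in the browser usually comes down to the response headers. A minimal sketch of the check, assuming the culprit is the `Content-Type` or `Content-Disposition` header (`forces_download` is an illustrative helper, not part of any tool mentioned above):

```python
def forces_download(content_type, content_disposition=None):
    """Return True if these response headers would make a browser download
    the file instead of rendering it inline. A robots.txt served as
    text/plain displays normally; application/octet-stream (or any
    'attachment' Content-Disposition) triggers a download prompt."""
    if content_disposition and "attachment" in content_disposition.lower():
        return True
    return not content_type.lower().startswith("text/")

print(forces_download("application/octet-stream"))  # True: forces a download
print(forces_download("text/plain"))                # False: renders inline
```

You can read the live headers with `curl -I https://yoursite.com/robots.txt` and compare them against a check like this.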

  • Thanks again for the feedback. I see what you're saying, but to be honest I can't fathom why they would be of any significant importance, so losing them shouldn't count a jot really. Our content is very popular (just check out facebook.com/wpmudev, where it's reposted, plus other stuff); we've got stunning social metrics as well as a vast array of quality, genuine, interested organic links. And we never went about it in any serious way, and stopped in 2009 or so, when we were picking up 300 visits per day rather than the 20k we had prior to the 24th.

    | WPMUDEV
    0

  • Thanks Carlos. I figured as much, but thought I'd take advantage of the Q&A here at SEOmoz.

    | MikePatch
    0

  • I guess the only real way is to use Site Explorer and then contact the sites directly. We have been spammed badly by competitors and Google is penalizing us. I have contacted the sites and am waiting for a response.

    | HelpingHandNetwork
    0

  • You have done the obvious Site Change of Address in GWT, correct? Just checking to make sure what you have done. Also, as far as I can tell, you have not set up the 301 redirect correctly.

    | CarlosFernandes
    0
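One way to sanity-check a domain move like the one discussed above: request the old URL without following redirects and confirm you get a 301 whose Location header points at the new domain. A minimal sketch (`redirect_ok` is a hypothetical helper; the status and header values would come from a tool such as `curl -I`):

```python
def redirect_ok(status, location, new_domain):
    """Return True if a response is a permanent (301) redirect whose
    Location header points at the new domain. A 302, or a Location on
    the wrong host, means the move won't pass signals as intended."""
    return status == 301 and location is not None and new_domain in location

print(redirect_ok(301, "https://newdomain.com/page", "newdomain.com"))  # True
print(redirect_ok(302, "https://newdomain.com/page", "newdomain.com"))  # False
```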

  • Not really. There are some ways to get at those numbers, but none of them will be exact. If you had fairly steady rankings for that term over the last two years, you could extrapolate search volumes by looking at your analytics and estimating CTR. If you run AdWords, you can use impression numbers.

    | HiveDigitalInc
    0
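The extrapolation described above is simple arithmetic: observed clicks divided by an assumed CTR for your ranking position. A minimal sketch (the function name and the example CTR figure are illustrative assumptions, not data from the thread):

```python
def estimate_search_volume(monthly_clicks, assumed_ctr):
    """Back out total monthly searches from your own analytics clicks and
    an assumed click-through rate for your average ranking position."""
    if not 0 < assumed_ctr <= 1:
        raise ValueError("CTR must be a fraction between 0 and 1")
    return monthly_clicks / assumed_ctr

# 600 clicks/month at an assumed 25% CTR implies about 2,400 searches/month
print(estimate_search_volume(600, 0.25))  # 2400.0
```

AdWords impression counts, where available, give an independent cross-check on the estimate.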

  • My site was also hit by the update. We had been quickly ranking for a lot of the keywords we were targeting despite being only a month or so old. Right now we are trying to strategize how we're going to bounce back. We're strictly white hat in terms of our SEO, so it's pretty frustrating that Google's attempt to weed out spam is affecting us and so many others. We have a lot of pages in the index, but I think Google is giving less weight to title tags and on-page content and much more to domain authority and quality backlinks.

    | knowyourbank
    0

  • Given the recent warnings in Google Webmaster Tools for some website owners, I would say they are looking at your backlink profile for unnatural patterns: heavy exact-match anchor text and links from untrusted sources. On-site, they're looking at exact-match footer nav links, on-page keyword stuffing, and duplicate content. None of this is news, but they are trying to enforce it and be more aggressive in how they treat websites on these factors.

    | Sean_Dawes
    3

  • Hi Michael, I think that in your case there should be no big difference between the two implementations. It all depends on how it's structured and linked together because, as you said, Google now tends to treat subdomains much the same as internal pages. (Regarding this statement, I must say I've seen websites react in different ways and not always follow what Google is telling us.) Of course, in both cases you would definitely improve the domain authority, while you would have to work on linking the blog well with the website sections in order to pass link juice to the right pages.

    | MauroMazzocchini
    0

  • Thanks, Peter. I appreciate your feedback. We are going to move slowly until we get a response (or not) to our reconsideration request.

    | rdreich49
    0

  • Jobintourism, it's an interesting little video but insanely brief. I've been thinking each site has to pass certain factors (a simple go/no-go). Keywords, relevant content, and clean code are things that matter, but there is no real way to assign a numeric value to them. Each of those main metrics likely has smaller metrics that determine whether it's a pass or fail. To determine whether the content is relevant, Google likely parses the page for keywords, related keywords, supporting links, and keyword density. If the webpage passes those sub-metrics, the site will pass the main relevant-content metric. This is the only real way to analyze a page for 200+ factors quickly. I'm looking for anything people have noticed that may be a constant or an interesting variable. -Shane

    | Seoperior
    0

  • Check out Search Engine Roundtable -- they have a tag devoted just to sitelinks and all the weird ways in which they appear. http://www.seroundtable.com/tag/google-sitelinks

    | KeriMorgret
    0

  • Testing different titles to see whether you can improve your rankings for specific keywords, or to focus on different keywords, may be worthwhile, but changing them for the sake of making a change is pointless.

    | CPU
    0

  • Thanks, you are great! I'll try to apply all your advice. Just one more question: if I find a backlink that isn't mine, what can I do to delete it?

    | giordanoshop
    0

  • Over-optimizing is having an [exact keyword] all over a page: the URL is http://url.keyword, the title is the keyword, the H1 is the keyword, the description is the keyword, the body is the keyword, the image alt is the keyword, the footer is the keyword. Most people who overstuff will add a ton of words just so they can include their keywords many times; 1 keyword per 100 words is a good number to go by, so they add 10,000 words and 100 keywords. When everything on the page is designed for a bot, the bot will know. The best practice is to include your [exact] keyword only in places where it makes sense. To avoid over-optimizing, use broad match for some keywords and exact match for others; mix it up and focus on what will be most suitable for your users.

    | SEODinosaur
    0
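The 1-keyword-per-100-words rule of thumb above is easy to measure mechanically. A minimal sketch for a single-word keyword (the helper name is illustrative, and punctuation handling is deliberately ignored):

```python
def keyword_density(text, keyword):
    """Occurrences of a single-word keyword per 100 words, case-insensitive.
    Naive whitespace split: punctuation stuck to a word is not stripped."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

page = "widgets " + "filler " * 99        # 100 words, keyword appears once
print(keyword_density(page, "widgets"))   # 1.0
```

A figure far above that, repeated across every element of the page, is the over-optimization pattern described above.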