We're looking at moving our websites to the cloud. Most services seem to default to providing a dynamic IP address, with static IP addresses being offered as paid extras.
Is there an SEO disadvantage to having a dynamic IP address?
Hi Vinod
I have the same issue with Wordpress sites. The category and archive links mean each page has too many links according to my pro reports.
I would say that if these links are internal and they are genuinely useful from the user's perspective (make the site easier to navigate and find what they're looking for), you shouldn't need to worry too much. However, if you think you can do without some of them, see if you can remove some of the widgets that generate these links.
(disclaimer: I'm not familiar with the Blogger platform, so can't advise on the technical side of removing these links)
Thank you for your response, Megan - I can confirm that I'm now seeing UK results for Google UK in my reports. So that's great!
Thanks
Heather
Hi Peter
It would be helpful to know how you are redirecting these pages (301 etc) and for what reason?
Also whether any of your pages are blocked in your robots.txt file - perhaps to prevent duplicate content?
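For what it's worth, on Apache a permanent redirect is often set up in the .htaccess file - a minimal sketch, assuming Apache and placeholder paths/domain:

```apache
# Hypothetical example: send visitors and search engines from the old
# URL to the new one with a permanent (301) redirect
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

A 301 passes most of the link value through to the new URL, whereas a 302 (temporary) generally doesn't - which is why the redirect type matters here.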
Hi John
I'm not sure - we only target the UK so have never monitored other countries.
We rank in the top 5 consistently for our keywords on www.google.co.uk
However, the ranking report (set to Google UK) shows that we don't appear in the top 50.
When looking in more detail at the keyword in question, it shows that the top-ranking sites are all US sites with .com domains that don't rank at all on google.co.uk.
Can anyone shed some light on this or tell me if there is any way of getting accurate ranking reports via SEOmoz tools for UK sites, so I don't have to do it manually?
Thanks in advance
Hi there
We had a similar issue and just added:
User-agent: *
Disallow: /
into a robots.txt file on the subdomain itself. This worked OK for us.
Hi Atul
As far as I'm aware, Google will allow you to submit a site to Google News, but not individual press releases.
You can find out about the criteria for submitting your site here
Hope that helps
Heather
Hi Dheeraj
My first recommendation would be to download the SEOmoz toolbar. This will tell you the number of links to the page and the number of root domains. It'll also give you the Page Authority and Domain Authority. You might also want to run Open Site Explorer to see who's linking to them (making sure there's no 'bad neighborhoods').
See here for information on the toolbar: http://www.seomoz.org/seo-toolbar
Also - a general look over each site should tell you whether it's good quality or just contains 'thin' content for SEO.
Building up the keyword domain looks like the way to go as Google seems to favour these.
There's a good answer about a similar situation here: http://www.seomoz.org/q/how-to-use-good-keyword-url-to-help-main-site
Hi Talha
We use Hootsuite to manage several Twitter accounts and Facebook pages as well as LinkedIn accounts. You can set an account up for free, so no risk if you decide you don't like it.
This will give you the option of scheduling tweets and updates for any time in the future and sending them to one or more of the Twitter accounts at the same time.
I believe Tweetdeck does a similar thing, but I've not tried it myself.
Heather
Thanks Yumi, I'll take a look at the CSV files to see if I notice any patterns.
Linking root domains have only dropped slightly (around 15%), but I'll take a look into which root domains are now missing from the reports.
Just checked our links for a couple of our sites and noticed that the number of inbound links has dropped from around 55,000 to 13,000 on one and from 6000 to 700 on the other.
GWMT is still showing the previous amounts.
Anyone else experienced this over the last few days?
My understanding is that by using the script given in the post that John mentions above, the comments are crawlable, as it removes them from the iframe and places them on the page as HTML - so Googlebot sees what the user sees.
Unless I misunderstood the post?
Hi Martin
The parameters in the URL are a big giveaway that this is an affiliate/paid link, so it's not a good idea to rely on these solely for your link building campaigns. There are ways round this by using cookies to store the affiliate details and keeping your URL clean, but your current system may not allow this.
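To illustrate the cookie approach: a hypothetical sketch (the function name and the 'aff' parameter are my own assumptions, not from any particular affiliate system) that strips the affiliate parameter from the landing URL, returning the ID you'd then store in a cookie so the URL itself stays clean:

```javascript
// Hypothetical sketch: extract the affiliate ID from an incoming URL and
// return a clean URL plus the ID (which you would persist in a cookie).
// The parameter name 'aff' is an assumption for illustration only.
function cleanAffiliateUrl(rawUrl, paramName) {
  const url = new URL(rawUrl);
  const affiliateId = url.searchParams.get(paramName); // e.g. store this in a cookie
  url.searchParams.delete(paramName); // drop the giveaway parameter
  return { cleanUrl: url.toString(), affiliateId: affiliateId };
}
```

The idea is that any links or bookmarks then point at the clean URL, while the affiliate credit survives in the cookie - though, as noted, your current system may not support this.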
As Ryan says, never have all your link building eggs in one basket. It's much better to diversify. This will look more natural to the search engines and hopefully mean any changes in Google's algorithm shouldn't affect you too badly.
I would be interested to see whether anyone has done any experiments or had any experience of this affecting search engine positions.
I would say that as long as the H1 appears near the top of the page before any h2, h3, etc., so that the markup is correctly formatted as the W3C guidelines state, then it shouldn't really matter. But I could be wrong.
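As a quick illustration of that heading order (content is placeholder):

```html
<!-- Sketch: the h1 sits at the top of the page, before any h2/h3,
     so the heading hierarchy is well formed -->
<body>
  <h1>Main page topic</h1>
  <p>Introductory copy...</p>
  <h2>First subtopic</h2>
  <p>More detail...</p>
</body>
```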
There's quite a bit of difference in terms of how they appear on the user's wall. See this article: http://daggle.com/facebook-button-facebook-share-keeping-1792
And there's a clear distinction between the buttons. I appreciate the article is old, but the latest ranking factors from SEOmoz show that shares have a higher correlation to search engine rankings than likes. Why would the two be separate, if they mean the same thing?
Hi
We're not seeing any for our sites, but I know the stars in the sponsored listings are coming from Google Product Search, and there's a bit more about this here: http://adwords.blogspot.com/2010/06/introducing-seller-rating-extensions-on.html
The stars in the organic listings, I believe, come from marking up your HTML with microformats (more on that here)
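For illustration, the review markup Google documented at the time was the hReview-aggregate microformat - a rough sketch (the class names follow the microformat; the product name and values are placeholders, and Google decides whether the stars actually show):

```html
<!-- Sketch of hReview-aggregate markup; rating and count values are
     placeholders for illustration -->
<div class="hreview-aggregate">
  <span class="item"><span class="fn">Example Product</span></span>
  Rated <span class="rating">4.5</span> out of 5,
  based on <span class="count">27</span> reviews.
</div>
```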
Google Adwords only displays the rating if it's 4 or 5 stars, so I can't see it having a negative impact.
I'd be interested to see if anyone has monitored this on their own site.
Hi everyone,
I'm reading everywhere about the Facebook 'Share' being more powerful than the Facebook 'Like'.
Just yesterday, there's a post on the SEOmoz blog about Facebook Shares http://www.seomoz.org/blog/plus-one-adoption-rates-and-social-sharing-statistics
But I thought the Share button was replaced by the Like button? http://mashable.com/2011/02/27/facebook-like-button-takes-over-share-button-functionality/
All references to the Share button now point you to Facebook's Like button developer page.
So if we want more SEO value, where can we get the Share button from? Or do we have to make do with the Like button from now on?