Hi Mark,
Yes, they do. However, the links are redirected by Joomla via a 303, so they pass less link value than a direct link would.
Hope this helps!
Hi Raphael,
At the moment there are some problems with all the keyword tools from SEOmoz; they reported this on Twitter a couple of hours ago, see here. So send Roger some love, and hopefully the tools will be back and available again soon. No ETA had been given at that point.
Hope this helps!
Hi Igor,
If you take a look at the page, you'll see that the URL can't be found on it.
Hi Mark,
Within your Crawl Diagnostics you're able to export your data to CSV (at the top right of the overview). That way you can find the links to the page you think is incorrect.
Hope this helps!
Hi Amit,
No. By setting the country you target in Google Webmaster Tools, you give Google a better idea of which language and which country you are targeting with your website. This could help your rankings in the specific region you're targeting. Google Webmaster Tools gives an even better explanation of geotargeting, which can be found here.
Hope this helps!
Hi Cody,
The best way is to block Rogerbot within your robots.txt from crawling specific pages of your site; in your case, preventing Rogerbot from seeing the pages with a session ID.
More information on Rogerbot can be found here.
Be cautious and test it out first, but the lines you would have to add to your robots.txt are probably:
User-agent: rogerbot
Disallow: /*sessionid
Hope this helps!
Hi Tourman,
Your first question: this really depends on your hosting and development setup at the moment, but I'd say that in about 90% of cases these sites will be uploaded to the same server.
This could affect your rankings, but only by a really, really small amount, because Google takes server location into account in its algorithms. Personally, I've worked on a couple of sites that didn't have their servers in the same country as the top-level domain, and I couldn't find any difference compared with top-level domains that did have the server in the same country.
So I wouldn't worry too much about the effect of your server's location on rankings.
Hope this helps!
Hi Miarisoa,
This is probably caused by the fact that Roger Mozbot wasn't able to crawl your complete website, and so didn't find all your internal links. As an example, I Googled your site and found this URL on page 17 of Google. When you enter it in OSE, you can see right away that there is no data for the URL, as you can see here.
Hope this helps!
Hi Jon,
It looks to me like Roger Mozbot didn't crawl your complete site, just a couple of pages. That's why some pages don't show up in OSE as internal followed links. However, I found this URL in OSE, which does seem to show a number of internal followed links.
So I wouldn't stress too much about it.
Hope this helps!
Hi David,
It's not great that random URLs don't respond with a 404 status code, but for now I think you have other items to fix first. As long as you're not linking externally or internally to these 'broken pages', Google isn't able to find them either, and so won't think the site has a lot of soft 404 errors. To be sure, the best thing would be to take a look at the crawl errors in your Google Webmaster Tools account. If Google has found a lot of recent (soft) 404 pages/errors, I would move this higher up my to-do list.
I'd prefer to fix them later on myself, but like many things, you probably have a lot of other to-dos you want to take care of first.
Hope this helps!
Hi Gareth,
To determine your rankings, SEOmoz will look at all the pages of your website, not only your homepage.
So you don't need to set up a separate campaign.
Hope this helps!
Hi,
Probably without the www, so: site.edit.com/robots.txt, because otherwise you would have a subdomain of a subdomain ;-). But the rest is perfect!
Hi,
The Google robots will look for a robots.txt in each individual root, so you need a robots.txt in the root of the subdomain, not just in the domain root. That's why it's also possible to include a complete disallow there, rather than just: .edit.com/* .
Example:
User-agent: *
Disallow: /
Hope this helps!
Hi Adam,
I would recommend adding your site to Google Webmaster Tools; under Diagnostics this gives you data about Google's crawl statistics. This will tell you:
Which in the end will look like this, presented with graphs.
If you're worried about the 100 milliseconds, it's OK. A second is 1,000 milliseconds, so there's 1/10 of a second between the click and the trackEvent.
Hi Atul,
This is going to be an intimate morning ;-). You should indeed add this code in the <head> of your page.
Hope this helps (again)!
Hi Atul,
Your second attempt, with the well-written play code, is the correct one.
Hi Atul,
Quite a short answer: this gives you the number of people who clicked on the play link and saves it as an event in Google Analytics.
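For illustration, here's a minimal sketch of how such a play link could be tagged with classic (ga.js) Google Analytics event tracking. The category, action, and label names here are placeholders I've made up, not your actual code:

```javascript
// Classic (ga.js) Google Analytics queues commands on the global _gaq array.
var _gaq = _gaq || [];

// 'Videos', 'Play', and 'Homepage video' are hypothetical names;
// use whatever category/action/label scheme fits your reporting.
function trackPlayClick() {
  // Pushes one event per click on the play link into Google Analytics.
  _gaq.push(['_trackEvent', 'Videos', 'Play', 'Homepage video']);
}

// In your markup you'd then attach it to the link, e.g.:
// <a href="#" onclick="trackPlayClick(); return false;">Play</a>
```

Each click shows up under Content > Event Tracking in your Analytics reports.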
Hope this helps!
Hi Fraser,
Hopefully I can help you out: if you click the settings link of your campaign in your campaign overview, you'll get something like this screenshot. The disconnect button can be found at the bottom right of your screen. When you click 'Disconnect from GA', you'll get an extra JavaScript alert to make sure you really want to disconnect.
Hope this helps!
Hi Tug,
You'll need to provide at least one of video:player_loc or video:content_loc. In a perfect world, providing both would probably be best.
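As a sketch, a video sitemap entry that includes both tags might look something like this, following Google's video sitemap format. All URLs and titles here are placeholders:

```xml
<url>
  <loc>http://www.example.com/videos/some-video.html</loc>
  <video:video>
    <video:title>Example video</video:title>
    <video:description>Short description of the video.</video:description>
    <!-- At least one of the next two tags is required -->
    <video:content_loc>http://www.example.com/video.flv</video:content_loc>
    <video:player_loc allow_embed="yes">http://www.example.com/player.swf?video=some-video</video:player_loc>
  </video:video>
</url>
```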
Hope this helps!