Hi Alan,
It seems like you are only targeting the search page. I would suggest adding a * at the end to capture all variations.
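A sketch of what I mean (assuming your rule currently targets the search page alone; adjust to your actual pattern):
Disallow: /Search.aspx?m=*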
Dan
Hi Alan,
Have a look in Google Webmasters to see if the same 404s are occurring there. If so, it typically gives you a list of the pages that generate the error. It would be best to eliminate these issues at the source first, as they will be offering a poor user experience.
As a preventive measure you could disallow these pages, yes. But I fear that this will make it more difficult to detect these issues in future.
Dan
Hi Heather,
There is no correct answer to this, but personally I would work on a manageable set at a time. Try grouping them, then create content and optimise accordingly. The unfortunate thing about working on many keywords at once is that your effort becomes diluted.
Hope this helps.
Dan
Hi Edison,
Your question isn't very clear, but I will attempt to answer it the best I can.
You can target more than one keyword per page OR multiple pages to a keyword. Generally I would only target one to three keywords per page, depending on how similar the key phrases are and how competitive they are. As a best practice, though, one keyword per page is ideal.
On a first attempt I would only work with one page for any one key phrase. I have seen multiple pages from the one site rank on the first page for competitive keywords, but it is challenging.
Hope this helps,
Dan
Hi Bob,
In cases like this, I take a step back and ask: is this deceptive, and is it already occurring on the web? The answer is no, it's not deceptive, and yes, it's already happening in business today. I am sure there is an example where Google themselves are trying to capture the top and bottom ends of a market (insert example here), but I'll give you this one.
In Australia, a company called Coles Myer has two supermarket chains, Coles and Bilo, which capture the mid-to-top end and the budget end of the supermarket market. They are presented as two completely separate companies, but I'm sure an obscure About Us page still lists them as part of the Coles Myer group.
The moral of my story: run both, but ensure almost all aspects of each site are unique. Separate themes, separate dev teams, unique content, separate hosting, etc. They should be seen to have been created by two separate teams to ensure success.
Hope this helps,
Dan
Hi Alan,
This is a two-part question. For the search results, I would add
Disallow: /Search.aspx?m=*
to your robots.txt file. I would probably extend this to the entirety of search (by removing the ?m= from the line above), but as I don't completely comprehend how your search section works, start with the above. It's very rare that search pages offer new content; generally they dilute other pages through duplication.
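For clarity, the full file would look something like this (a sketch assuming you want the rule to apply to all crawlers; note that Disallow paths are relative to the domain root, not full URLs):
User-agent: *
Disallow: /Search.aspx?m=*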
With your second issue, I imagine there are clues in the aspxerrorpath variable. Although I couldn't get any combination of this string to render, the URL below didn't redirect to the 404 page like the rest did. It's the tilde (~) in the error string that I think offers the biggest clue.
http://www.practicerange.com/Golf-Training-Aids/Golf-Nets/
Hope this helps,
Dan
Hi Micro,
I would suggest practice is your best bet here; use the first free month to get hands-on with the tools. They are easy to use, and there is a group of great people in the community to help with the more specific questions.
Dan
Hi Andy,
I would simply change the image file name and any alt text on the smaller image, as it appears to be a duplicate image from a filename perspective (even though the two are in different folders and different sizes). I would imagine Google sees them as duplicates and favours one, in this case the small image.
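For example (hypothetical filenames and alt text; the point is simply that the two versions no longer share either):
<img src="/images/gallery/blue-widget-large.jpg" alt="Blue widget, full size">
<img src="/images/thumbs/blue-widget-small.jpg" alt="Blue widget thumbnail">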
Hope this helps,
Dan
Hi Matt,
I am going to guess that the only reason this is occurring is that the SEOmoz bots crawled your sites while they were still separate, and that data is still in their index. Potentially you could use a secondary tool to check. Possibly http://ahrefs.com/
If this isn't the case I would question your method of running your 301s, but without eyeballing the site/s, I think this question is far too difficult to resolve with the limited information provided.
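For reference, a site-wide 301 on Apache usually looks something like this in .htaccess (a sketch only; I'm assuming Apache, and the domains below are placeholders for your old and new sites):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-site.com/$1 [R=301,L]
It's worth confirming each redirect actually returns a 301 rather than a 302, as a 302 won't pass the old pages' value.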
Dan
Hi Chandu,
Webmasters will be your friend on this one. I would recommend heading to your site's landing page in Webmasters, then selecting Configuration -> URL Parameters.
The errors showing in your image have a number of parameters in them; try to determine which parameter is forcing the 404 and tell Google what to do with it within the URL Parameters section.
Once complete, test your change, download and store your 404 issues, clear the errors in Webmasters, and then wait until the site is recrawled.
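One way to pin down the offending parameter is to request the URL with and without each parameter and compare the status codes (assuming you have curl available; the URL and parameters below are placeholders):
curl -I "http://www.example.com/page.aspx?sort=price&view=grid"
curl -I "http://www.example.com/page.aspx?sort=price"
If dropping a parameter turns a 404 into a 200, you've found the culprit.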
Hope this helps.
Dan
Hi David,
Forget exact match domains. Firstly, they no longer have the appeal (they are not as easy to rank nowadays). Secondly, more than one domain means you need to work twice (or many times) as hard to get them all to rank.
You are best investing the time in your one site. Generate more content, run a news section or a blog. Guest post on like-minded sites with engaging topics in your industry. Engage in forums; be known as the leader in your industry. One strong site is better than many small sites. You don't see Apple run a separate site for tablets OR phones OR laptops.
Hope this helps,
Dan
Hi Courtney,
Are you suggesting that the sitemap itself is 404ing, OR that Webmasters is indicating your site has 404s for pages that exist on your sitemap?
If it's the sitemap itself, can you navigate to it directly? Does it render in a browser?
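A quick command-line check, if that's easier (assuming curl and xmllint are available; the domain is a placeholder):
curl -I http://www.example.com/sitemap.xml
curl -s http://www.example.com/sitemap.xml | xmllint --noout -
The first should return a 200 status; the second will complain if the XML itself is malformed.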
If it's an error from a page on the sitemap, and the page currently renders, there is a good chance it didn't at some stage. If that's the case you can ask Google to recrawl it as an individual page; see:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1352276
Hope this helps.
Dan
My first reaction was to disagree with you, EGOL, as a company has many different personalities. BUT, you have a very valid point. Unless you can get away from a level working relationship (all members being equal partners), issues will arise over time.
Good insight!
Hey Tiff,
I don't have a complete answer for you cos I am doing some insomnia Q&A (it's 4.30am in Oz), but I really hope you are using robots.txt rather than robot.txt? This may be the reason it's not working.
Can you tell us what the contents of the robot/s.txt file are?
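For reference, the file must be named robots.txt exactly and sit at the root of the domain, so it's reachable at http://www.yourdomain.com/robots.txt. A minimal sketch (the path is a placeholder for whatever you're blocking):
User-agent: *
Disallow: /private/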
Dan
Hi Victoria,
I agree with Andy but would put it this way.
A well constructed site with a lot of great content and some well situated external links (on a variety of dominant and relevant sites in related industries) will outperform many sites with little content all interlinking, any day of the week.
The unknown here is: what is the group going to blog about or sell? If you can conceive of the site as one entity, and all the women are discussing/selling around a similar topic, fantastic. But if the topics are a little abstract (say baby clothes, pets and mountain biking), I would not try running them as one, unless you can build the site to discuss all these topics in a very fluid way (with the example I gave I very much doubt it would be possible, although I could be surprised).
The main takeaway is that I wouldn't simply run a directory of separate sites on the one domain without a strong theme that seamlessly offers them as one site with a variety of categories (for want of a better word). For usability I would also ensure they ran on the same theme.
If you can pull off the one site, I commend you and think as a group you'll go far. Don't forget about Andy and me when you make your first million!
Hope this helps.
Dan
Hi Matt,
It is fair and reasonable for a site to be decommissioned from time to time (or a page or two, or say a category of pages). As the page no longer exists, you can't ask the admin to remove your link (which is what Google expects you to do before disavowing a link), so I would simply ignore these links in your exported list. They will disappear from GWT and OSE over time, as they no longer exist.
Disavowing links, amongst other things, helps Google determine a site's intent on the web, and therefore they only want you to utilise this service if you no longer trust the relevance of the link and have had no joy requesting its removal.
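If you do eventually need to disavow, for reference it's a plain text file you upload through the disavow tool: one URL or domain per line, with # marking comments (the addresses below are placeholders):
# Removal requested, no response from the webmaster
http://spammy-site.example/page-with-link.html
domain:link-farm.example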
Hope this helps,
Dan
Hi Guys,
I edited my response above to clarify the confusion. Can you confirm Matt...
You are discussing a page on an external site that currently 404s. This page used to have a link to your site and was subsequently recorded in Webmasters and OSE. Now the page no longer exists, but the reference still remains in Webmasters and OSE. Correct?
Dan
Hi Matt,
A link (to your site) that is no longer accessible, because the page (on the external site) it sat on now 404s, no longer counts. Why does it still cause you concern; is it still listed in Webmasters or OSE? If so, I would suggest this is because the page has not yet been recrawled.
If you're concerned that the page may reappear, I would recommend writing to the admin of the site to request the link be removed. Disavow should always be your very last option.
Hope this helps.
Dan
Sadly, although this site is big (over 1000 pages), Wolfram Alpha doesn't offer the tab that suggests subdomains...
I checked it on a number of sites (SEOmoz, Harvard, etc.) and it worked well.
Any other ideas?
Dan