Thanks again,
Running this on my Mac, I was able to start nslookup, but when it came to listing the subdomains it told me that 'ls' was not a valid command.
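For anyone else hitting this: nslookup's `ls` was a Windows-only shortcut that requested a DNS zone transfer, and the BIND nslookup shipped with macOS never had it; most nameservers refuse zone transfers anyway. As a rough fallback, a wordlist probe can be sketched in Python — the labels below are a hypothetical starter list, and this needs a live network connection to be useful:

```python
import socket

# Hypothetical starter wordlist; real tools ship lists with thousands of labels.
COMMON_LABELS = ["www", "mail", "blog", "shop", "dev", "staging"]

def find_subdomains(domain, labels=COMMON_LABELS, resolve=socket.gethostbyname):
    """Return the candidate subdomains that actually resolve in DNS."""
    found = []
    for label in labels:
        host = f"{label}.{domain}"
        try:
            resolve(host)  # socket.gaierror (an OSError) means "no such host"
            found.append(host)
        except OSError:
            pass
    return found
```

Run it as `find_subdomains("example.com")`. One caveat: wildcard DNS makes every label appear to resolve, so sanity-check the results against a deliberately nonsense label first.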
Dan
Hi Steven,
Thanks for the quick response. What I should have pointed out is that I am currently proposing a strategy for this site and do not have access to it.
Any further thoughts?
Dan
Hi Mozers,
I am trying to find what subdomains are currently active on a particular domain. Is there a way to get a list of this information?
The only way I could think of doing it is to run a Google search on;
site:example.com -site:www.example.com
The only issue with this approach is that the majority of the indexed pages exist on the non-www domain, so I still have thousands of pages in the results (mainly from the non-www).
Is there another way to do it in Google? OR is there a server admin online tool that will tell me this information?
Cheers,
Dan
No probs, happy to help
Hi Stew,
Firstly, dynamic URLs are often used to assist with searching or filtering on a site; the practice is an inevitable part of offering flexibility to the user.
The issue with overly dynamic URLs is, for example;
If you have three elements in your URL, EG/ http://test.com/search?element1=a&element2=b&element3=c, and each element has 10 options, Google will eventually crawl 10 x 10 x 10 = 1,000 pages. Overly dynamic URLs can create thousands of combinations of a URL very quickly, and each URL will be seen as a unique page by Google.
Most of these pages will have duplicated content (albeit different products in different orders). Depending on how this section works, you may want to block crawling of this search section using robots.txt.
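To illustrate, here is a minimal sketch using Python's standard-library robots.txt parser to sanity-check a rule before you deploy it — the robots.txt content and the test.com URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the faceted-search section of the site:
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())
parser.modified()  # mark the rules as loaded (harmless if parse() already did)

# The search combinations are blocked; normal pages are not.
print(parser.can_fetch("Googlebot", "http://test.com/search?element1=a&element2=b"))  # False
print(parser.can_fetch("Googlebot", "http://test.com/products/widget"))               # True
```

Note that robots.txt only blocks crawling, not indexing of URLs Google already knows about, so it works best when applied before the section is crawled.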
I would also go to Webmaster Tools -> YOUR-SITE -> Configuration -> URL Parameters. From there you can advise Google what to do with each element.
Hope this helps!
Dan
Hi Dana,
Without knowing anything about the site or the keywords, I would deduce that the url_1 page has the best link profile from external sources for KW1...
You will be able to see this by running the three URLs through Open Site Explorer. This will help you understand your off-site optimisation.
I would imagine url_1 has more links than the other two, and/or the sources of these links are more relevant (or are sites with higher Page/Domain Authority), and the ratio of keyword anchor text vs generic anchor text is better (I would aim for 1 keyword backlink to every 3-4 generics).
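To eyeball that 1-to-3/4 heuristic on your own data, a quick sketch like this works on any list of anchor texts exported from a backlink report (the anchors below are made up):

```python
def anchor_text_split(anchors, keyword):
    """Split a list of backlink anchor texts into (keyword, generic) counts."""
    keyword = keyword.lower()
    kw_count = sum(1 for text in anchors if keyword in text.lower())
    return kw_count, len(anchors) - kw_count

# Hypothetical anchors pulled from a backlink export:
anchors = ["blue widgets", "click here", "example.com", "here", "Blue Widgets sale"]
print(anchor_text_split(anchors, "blue widgets"))  # (2, 3)
```

Here 2 keyword anchors against 3 generics is heavier than the 1:3-4 target, which would be a flag to diversify.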
At the end of the day, the reason SEO is such a busy and exciting industry is that there are many signals that help a page rank. If there was an exact answer to your question the industry wouldn't exist.
Hope this helps,
Dan
Hi Ken,
The quick answer, no...
Both SEOMoz and AdWords do all the work at their end, and you only make one request to these sites to obtain the data.
The CAPTCHA appears when multiple requests are run from the one IP address to Google search. This can happen naturally if many people are googling at the same time and your company runs only the one IP. It can also occur if you are using downloaded software to scrape SERP data directly from Google.
Hope this helps,
Dan
Hi,
Scrap Link Building... Treat it as Link Earning, period!
http://lmgtfy.com/?q=site%3Aseomoz.org+link+earning
Best Software? No software is best software!
Hope this helps,
Dan
Hi Semantique,
Hmm, from a quick look I would check the following.
Page Speed - Check out the home page on Pingdom http://tools.pingdom.com/fpt/#!/recfTiYWP/http://www.catwalkqueen.tv/
I would look to crunch the images for a start; they were very slow to load, and the background image alone is almost 300 KB.
Caching images is also recommended, and ask your devs to combine JavaScript files and stylesheets where possible...
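Combining files cuts the number of HTTP round-trips the browser makes. As a rough sketch of the kind of concatenation step a build script might run (the file names are hypothetical, and a real minifier would also strip whitespace):

```python
from pathlib import Path

def combine_assets(paths, out_path):
    """Concatenate several CSS (or JS) files into one to reduce HTTP requests."""
    parts = [f"/* {Path(p).name} */\n{Path(p).read_text()}" for p in paths]
    Path(out_path).write_text("\n".join(parts))
    return out_path
```

E.g. `combine_assets(["reset.css", "layout.css", "theme.css"], "site.css")`, then reference only site.css from the page.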
Dan
Hi Steve,
I think you should rephrase the question...
How can a responsive design harm SEO?
Typically a responsive design is driven by either user agent or screen size... Both of these, if implemented correctly, will not affect the way Googlebot crawls the site. I doubt there will be issues with 404s, as the URL will be the same regardless of the device.
I would suggest it is poor implementation of the design.
Hope this helps!
Dan
Hey Andrew,
I would suggest having a category covering small business and news within your blog, yes... I would also suggest a category for your product, discussing one feature per post and loosely marketing it to your visitors. Have a section to discuss new features you are working on (if you are happy to do so), and request feedback on which other features are of interest. Ideally you want a few different sections; this will give you a number of ideas to write posts on, and it will also diversify your content.
As for "taking great content from elsewhere on the internet and adding it to my blog", I highly recommend against it. However, you can rewrite the content (perhaps combining the details of more than one post).
Duplicate content is a world of hurt you don't want...
Cheers,
Dan
PS> If your question is answered by any of the great responses above, I urge you to mark it as 'Answered'. You can mark more than one response.
Hey again,
There are five main SEO benefits to running a blog, but if you write it purely for SEO purposes you will only take advantage of three, and one will actually work against you.
They are (in no particular order);
I understand this is not an easy thing to hear (I pitch clients this all the time), but a quality blog is worth the time.
Your subject (online invoicing) may not be the most exciting topic, but there are still ways to make it exciting; think outside the box. Think of your audience: you only have to make it interesting for the people who are already looking for online invoicing. Make it interesting for them.
Hope this helps.
Dan
Hi,
It seems like you are treating this more as a chore than a beneficial blog.
I would suggest publishing each post when it is completed (not necessarily all at the same time). I would also create content without being too concerned about length.
A blog should be a naturally occurring, organic site or component of your site. Post as often as you like; there are no rules about frequency, as long as you post occasionally to keep the content fresh.
Remember SEO is not simply a set of rules, focus on generating content to excite and/or educate your visitors and the rest will follow.
Hope this helps!
Dan
Hi,
You have listed this as a question, but haven't asked one...
So I have a question for you: has the website you are talking about ever posted a job on any job-seeker sites?
Dan
Hi Guys,
I think the sentiment being expressed here is the difference between link building and link earning... Submitting to directories is a link-building activity that can be achieved with little friction, and therefore is often subcontracted to other people. But what are the directories? Will they ever be used to find a business like yours, or are they simply directory spam? My guess is the latter.
There are a few directories out there that are worthwhile, but I would recommend making these submissions yourself. Set aside, say, 2 hours on a Saturday morning (or Monday morning if it is simply the company you work for, not own) to find the next one and make your submissions. I would take a top-down approach to directory listings: pick the most beneficial and work your way down, continually asking yourself the question "Would I use a site like this to find a business like mine?"
If you are in the States, Canada or the UK, I would suggest looking into getListed.org for directories worth chasing (please, getListed/SEOMoz team, do Australia next!). Also, SEOMoz has a list of directories they feel are worthwhile that you can access as a PRO user... some are genre-specific directories.
Think about it this way: what happens if your company moves? With listings on 100, 500, 1,000 poor-quality directories, would you bother to update the address on all of them? If you got someone else to do it, would you even have all the login details they used to generate those links? Probably not. If you do it yourself, you have all the details, you have only approached the directories worth targeting, and the task is a lot more manageable...
Running low-quality backlinks (directory listings included) puts you at greater risk from the Penguin update. Once you cross the threshold, you are at risk of being penalised either manually or by the next Penguin refresh. Most SEOs will tell you that the recovery work involved in breaking a Penguin penalty is 2 to 5 times the amount of work that building the poor links took in the first place. Don't risk it!
On a final note, link earning is about chasing the links that have friction associated with acquiring them... EG/ Google Maps with its postcard approval process, etc. You should only pursue a backlink if you believe it will be more valuable for the organic referral traffic it brings your site from the source than for its contribution to your backlink profile.
Thus ends my novel, congrats if you made it this far 
Hope this helps,
Dan
Hi Mark,
I personally would run a clean 301 to the new site, then use something like $_SERVER['HTTP_REFERER'] (in PHP) to determine where the user came from. If it's from the old site, show a small banner in the header (like Hello Bar) to advise your users of the change. This is a lot cleaner for search engines and very user-friendly.
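The same referrer check can be sketched in Python (the PHP version would read $_SERVER['HTTP_REFERER'] the same way); the old domain names here are hypothetical:

```python
from urllib.parse import urlparse

OLD_HOSTS = {"old-example.com", "www.old-example.com"}  # hypothetical old domain

def show_move_banner(referer):
    """True when the visitor appears to have followed the 301 from the old site."""
    if not referer:
        return False  # header missing or empty
    return urlparse(referer).hostname in OLD_HOSTS
```

The Referer header is optional and easily stripped by browsers and proxies, so use it only for cosmetic touches like the banner, never for logic that matters.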
Don't forget to transfer the site's worth using Webmaster Tools...
Hope this helps.
Dan
Hi All,
I am looking to start an eCommerce business and would like to centre the site's user engagement around a forum.
Can anyone suggest a forum platform that adopts good SEO practice?
So far my considerations are;
Anyone used these with great success? Do you have another suggestion?
I am simply at the preliminary stage of sourcing something and am eager to hear your thoughts...
Thanks in advance...
Dan
Hi Andy,
I would suggest running rel="next" and rel="prev" pagination tags with these.
Read more about it here http://yoast.com/rel-next-prev-paginated-archives/
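For reference, these are just <link> tags in the page <head>, one pair per archive page; a sketch of generating them looks like this (the /page/N/ URL scheme is a hypothetical example, and a plugin like Yoast emits them for you):

```python
def pagination_link_tags(base_url, page, last_page):
    """Build the rel="prev"/rel="next" <link> tags for one paginated archive page."""
    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{base_url}/page/{page - 1}/">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base_url}/page/{page + 1}/">')
    return tags

# Middle pages get both tags; the first and last pages get only one.
print(pagination_link_tags("http://example.com/blog", 2, 5))
```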
Also, although All in One SEO is a good plugin, I would suggest sticking with Yoast. I would also point out: do not attempt to run both at the same time.
Hope this helps.
Dan
Hey Mase,
Yeh I'm going to fence sit on this one, but will offer this design tweak.
I think these look spammy, but I don't necessarily think you will be penalised for it. My suggestion would be to design this area to look less spammy: consider a clickable drop-down for each major city in the three sections of links, or a show/hide control for each section, etc. A comma-separated linkfest will attract a manual spam action; improve usability to prevent this.
I would also consider adding more unique content per page, because if you have multiple pages where each listed item appears more than once, the items' content will offer little benefit.
Dan