Posts made by AlexMcKee
-
RE: Does DMCA protection actually improve search rankings (assuming no one's stolen my content)
Some years ago people used to make this claim about W3C validation badges too. A badge is a badge is a badge. It really is unlikely to affect your search rankings one way or the other.
-
RE: Clarifications on the Moz Analytics package (Medium - $149 per month)
Because of the extraneous text at the bottom of your question, I've prefixed each answer with its subject in parentheses. So, in answer to your questions:
1. (tools available) Moz Analytics, Followerwonk, Open Site Explorer, Fresh Web Explorer, Rank Tracker, Keyword Analysis (including keyword difficulty), On-Page Grader.
2. (crawl entire site) Yes, add your site as a campaign and Moz will crawl your site and inform you of any problems and factors affecting your rank. It will also track changes over time without manual intervention.
3. (10 campaigns) Yes, each domain is a separate campaign.
4. (subdomains) This is configurable. You can set in campaign settings whether to track only this subdomain or all subdomains or even just a specific sub-folder.
5. (keywords) The number of keywords relates to how many keywords can be tracked in your account as a whole, across all of your campaigns. 750 across 10 campaigns works out to 75 keywords per campaign. The rank of your site's pages is tracked for each keyword; pages are graded for their ranking ability against specific keywords, and you can check the rank of any page for any keyword you are tracking.
6. (social accounts) You can track the performance of your social media accounts by connecting them to Moz Analytics. You can also track your competitors' accounts to measure your performance against theirs in some respects, such as the level of interaction (comments and shares).
7. (branded reports) Branded reports allow you to export reports on your data with your own branding, a useful feature for agencies and consultants.
I am a Moz Pro subscriber and highly recommend it, Moz Analytics and the various other tools are extremely useful.
-
RE: Need help in understanding why my site does not rank well for our best content?
Your site is competing in a crowded field. I ran your example page's targeted keyword (nokia lumia 830 review) through Moz's Keyword Difficulty tool, and it is highly competitive: the Google India SERP for that keyword is dominated by high domain authority sites with high page authority pages.
-
RE: Weird problems with google's rich snippet markup
I'm afraid it is the obvious. Ensure the rich snippets are relevant to the content of the page, and ensure that your page is ranking for relevant queries to raise the chance of the rich snippet being shown.
-
RE: Weird problems with google's rich snippet markup
Google only shows rich snippets when it thinks it will be useful to the searcher.
As you've said you've had some maintenance issues, check the structured data against Google's structured data testing tool. However, it is more likely that Google isn't showing the rich snippets because it believes your page quality to be low, or that the structured data is not relevant to the user's query.
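For reference, this is the shape of markup the testing tool should validate cleanly - a minimal schema.org microdata sketch, with purely hypothetical values:

```html
<!-- Hypothetical product example - swap in your real values. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span>/5 based on
    <span itemprop="reviewCount">38</span> reviews
  </div>
</div>
```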
-
RE: Csv file for moz local
Great to know that Moz is working on support for other countries!
-
RE: Csv file for moz local
Moz Local presently only supports the United States. I ran into this myself a while ago. Hopefully they'll get around to supporting the UK and Germany soon.
-
RE: Not Getting Clients
Chris has already given you some valuable advice so I will restrict myself to on-site issues.
Your sign-up form asks for an awful lot of private information, which is surely not helping you to get enquiries. In general I would advise you to keep enquiry forms quick and simple to complete; remove any extraneous fields such as religion or marital status. Asking for very private information like dates of birth, national ID numbers and parents' names could be seen as suspicious. You shouldn't need this information to do business, and if you do need some of it, it can be collected later, once you have established a relationship with your prospective clients.
Your warning about terms and conditions does not link to your terms and conditions, which is probably another reason people aren't filling out and submitting that form.
-
RE: Structured Data + Meta Descriptions
Very interesting! I don't recall seeing that before, but I checked the Internet Archive's Wayback Machine entry for that URL and the quoted extract has been there since at least 2013.
Elsewhere, Google has been pretty insistent that structured data be part of the document itself as much as possible, so that advice does seem somewhat contradictory. As you say, perhaps they've simply forgotten to update that particular entry to reflect current thinking.
-
RE: Structured Data + Meta Descriptions
Once upon a time, including some salient structured data in the meta description was possibly a good use of it, but today we have a proper way of marking up structured data. The meta description is best used for compelling, relevant copy that attracts the user to click through to your site, as it is your one best hope of affecting what is shown to the user in the SERPs.
Search engines haven't shown any inclination to parse the meta description, and I doubt they will do so in the future. Structured data belongs in the document itself, marked up accordingly.
-
RE: How many redirects on a redirect can you have?
In 2011 Matt Cutts advised that Google does place a limit on redirect chains - he indicated that Googlebot won't follow more than around 3 or 4 hops. There is no limit on the total number of single-level 301s.
In your specific situation I would redirect both the original and the first replacement to the new replacement so that users and bots can reach the new page in a single redirect hop.
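As a sketch, assuming Apache and purely hypothetical paths, the single-hop setup would look like this in .htaccess:

```apache
# Hypothetical paths - adjust to your actual URLs. Both the original
# page and the first replacement point straight at the final
# destination, so no visitor or bot ever passes through a chain.
Redirect 301 /original-page /new-replacement-page
Redirect 301 /first-replacement-page /new-replacement-page
```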
-
RE: GWT does not play nice with 410 status code approach to expire content? Use 301s?
410 means "Gone" and indicates a resource no longer exists. If it exists, use 200 (OK). If it is outdated, place a notice to that effect but still serve it with 200. Putting a noindex instruction in the robots meta element should be sufficient to remove it from the Google index, though it may take some time. Nofollow is probably not what you want, as it will destroy any link value flowing through those pages. If a page is so outdated that it is considered valueless, it should be deleted and 410'd. A 301 redirect can be used where a new resource that substantially replaces the old one has been created.
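A sketch of those two options, again assuming Apache and hypothetical paths:

```apache
# Deleted, valueless page: answer with 410 Gone (mod_alias).
Redirect gone /retired-article

# Outdated but still useful page: keep serving it as 200 and send a
# noindex hint via a response header (requires mod_headers).
<Files "outdated-report.html">
  Header set X-Robots-Tag "noindex"
</Files>
```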
I'm not sure why you would want to keep Google's index of your site 'lean' unless you have a lot of resources competing for the same keywords and are concerned about cannibalization.
-
RE: How can you perform a simulated search query from another location?
Perhaps you could use a VPN, though it might get a bit expensive if you need multiple locations.
-
RE: Language Usage for SEO in Hong Kong
My apologies for overlooking your response. A good tool is https://github.com/jpatokal/script_detector. It is a Ruby library, and you can load it in the interactive Ruby console (irb) to identify the script of a piece of text easily. If you need help or guidance with that, drop me a reply here or by private message and I'd be happy to help.
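If Ruby isn't convenient, the same kind of check can be sketched in modern JavaScript/TypeScript using Unicode property escapes (the script list below is just an example):

```typescript
// Minimal sketch: report which of a sample set of scripts appear in a
// string. \p{Script=...} needs a Unicode-aware regex engine (ES2018+).
function detectScripts(text: string): string[] {
  const candidates = ["Latin", "Han", "Hiragana", "Katakana", "Hangul"];
  return candidates.filter((name) =>
    new RegExp(`\\p{Script=${name}}`, "u").test(text)
  );
}

console.log(detectScripts("香港 Hong Kong")); // ["Latin", "Han"]
```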
-
RE: Cached status(date & time) not showing
The cache info box shown in Google's cached pages is actually present but is being hidden by the website's design, specifically the div element with the classes "wsb-canvas body".
-
RE: Dashboard shows low number of visits
When you say that you know you've had over 100 visits, how do you know this? Server logs, for instance, record hits, not visits or unique visitors.
If the visitor stats in Google Analytics and Moz match - which they should - then it seems unlikely there is a problem. Remember to filter Google Analytics to the same period as your timeframe in Moz Analytics.
-
RE: What is TF-IDF
TF-IDF (term frequency-inverse document frequency) measures how uncommon a term is against expectations inferred from a large dataset. For a search engine like Google, the large dataset is the Google index. So if Google finds that a given keyword, let's say "non-alcoholic cocktail", is uncommon even among the documents in which it does appear, it might be given a greater weighting on a page where it appears.
The same page containing our example keyword, "non-alcoholic cocktail", might also contain another keyword, for example "cocktail", just as many times and even in the same elements: headings, the meta description and so on. You might therefore expect the two keywords to carry equal importance in the eyes of the search engine, but because of TF-IDF this isn't necessarily true - "non-alcoholic cocktail" would be perceived as more important due to its scarcity relative to the greater dataset of the entire index.
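A toy calculation makes the weighting concrete. All the counts below are invented purely for illustration:

```typescript
// Toy TF-IDF: term frequency in one document multiplied by the log of
// (total documents / documents containing the term).
function tfIdf(termCount: number, docLength: number,
               totalDocs: number, docsWithTerm: number): number {
  const tf = termCount / docLength;
  const idf = Math.log(totalDocs / docsWithTerm);
  return tf * idf;
}

// Both phrases appear 5 times on a 1,000-word page, but
// "non-alcoholic cocktail" is far rarer across the index:
console.log(tfIdf(5, 1000, 1_000_000, 500));    // ~0.038 - rarer phrase, higher weight
console.log(tfIdf(5, 1000, 1_000_000, 50_000)); // ~0.015 - common "cocktail", lower weight
```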
-
RE: Rel = no follow?
I wouldn't advise a client to link out to one site so much unless it was adding significant value to the page. This kind of "list of links" page is not particularly useful to visitors, who could probably have found the WebMD pages anyway, and it doesn't add much to the content of the website. I suspect it was created in a misguided attempt to satisfy a criterion of linking to authority sites.
The page is also titled "Saint Louis Links", but AFAIK WebMD is not a St Louis company.
You need to ask: am I doing this purely for search engines/SEO purposes, or for the visitors? If the latter, you should nearly always go ahead (and of course optimize it for SEO). If the former, you should probably drop it - you should be building for your audience, not for search engine algorithms.
Edited to answer your direct question: if links are adding value to your content you should make them followed.
-
RE: For SEO... - Display Graphs in HTML5 or Image?
Great question. Search engines presently don't index Highcharts or other graphs presented using HTML+JS combinations. However, they can't index the information in images either, just the image itself.
Search engines have become increasingly sophisticated at indexing content rendered using JavaScript, so the day may well come when these charts are indexed. Extracting information, especially structured information, from images is probably going to remain a harder problem than traversing the DOM and interpreting the structure of the charts.
Another factor is the native format of the data. If you use a dynamic charting solution like Highcharts to render data already present in the document, search engines can index the table and access the data. That isn't the case for images. So I would recommend, wherever possible, putting the data in the document as an HTML table and using JavaScript to present it as a dynamic chart. This also means that people who browse without JavaScript enabled will still see your data, albeit in a different presentation.
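A sketch of that pattern, with hypothetical element ids and data:

```html
<!-- Crawlable data lives in the document itself. -->
<table id="sales-data">
  <tr><th>Quarter</th><th>Revenue</th></tr>
  <tr><td>Q1</td><td>120</td></tr>
  <tr><td>Q2</td><td>175</td></tr>
</table>

<script>
  // Read the table into arrays; a charting library such as Highcharts
  // could then draw these as a dynamic chart. Visitors without
  // JavaScript still see the plain table.
  var rows = document.querySelectorAll("#sales-data tr");
  var labels = [], values = [];
  for (var i = 1; i < rows.length; i++) {
    labels.push(rows[i].cells[0].textContent);
    values.push(Number(rows[i].cells[1].textContent));
  }
</script>
```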
-
RE: 404 Pages. Can I change it to do this without getting penalized? I want to lower our bounce rate from these pages to encourage the user to continue on the site
There's no law that says a 404 page has to be dull and unengaging. Back in the palaeolithic era of the web, if we saw a lot of hits to the 404 page in the server logs we rarely knew why (finding broken links was a lot harder in those days), so we tried to capitalize: we added engaging graphics, search boxes, and copy designed to improve the retention of all those poor lost souls.
Working on your 404 page can actually produce a really good experience. With the tools at developers' disposal today, it should be easy to work out the context of the 404 error, show something useful to the user, and win them over.
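For instance, here is a minimal Node/Express sketch (the paths and suggestion logic are hypothetical) that keeps the honest 404 status while still helping the visitor onward:

```typescript
import express from "express";

const app = express();

// Hypothetical list of known URLs to suggest near-misses from.
const knownPaths = ["/products", "/products/widgets", "/blog", "/contact"];

// Very rough similarity: same first path segment.
function suggest(path: string): string[] {
  const firstSegment = "/" + path.split("/")[1];
  return knownPaths.filter((p) => p.startsWith(firstSegment)).slice(0, 3);
}

// Catch-all handler: keep the honest 404 status, but use the requested
// path to point the visitor somewhere useful.
app.use((req, res) => {
  const links = suggest(req.path)
    .map((p) => `<li><a href="${p}">${p}</a></li>`)
    .join("");
  res
    .status(404)
    .send(`<h1>Page not found</h1><p>Were you looking for:</p><ul>${links}</ul>`);
});

app.listen(3000);
```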
All that said, if you find yourself relying on this technique in 2014, it is probably a sign that something has gone wrong with the site's information architecture. Restoring the category page but still serving a 404 is probably a no-no - you're essentially saying "no, this doesn't exist" to automatons (user agents and search crawlers) while showing human users the page they were presumably looking for. Finding yourself sending deceitful HTTP headers is a clear sign something is wrong.
If the pages are useful and visited, restore them and work on making them better. If they aren't useful enough, then you should probably 301 to a relevant, useful page. Don't worry about having too many 301s; redirecting is the technically correct thing to do in such situations, and your search engine of choice can hardly penalize you for using HTTP features correctly.