Questions
Do you avoid the use of stop words in your keyword optimization?
Excellent question and a great response. I guess the original query is still out there - do stop words have any real SEO impact?
Keyword Research | DenverKelly
Always include the targeted keyword in the alt tag?
There is some good advice here already. Scott is right that your example is not inappropriate, and EGOL is steering you very well too. Most people do not take full advantage of what alt text can do for you. You can use around 140 characters to give a good description of the image (as if someone blind were hearing it) while working in keyword-relevant text. I recently read that images with good alt text that rank in image search can have an effect on organic rankings. It wasn't Google saying it, but to a degree it makes sense that it would have some effect; how much is anyone's guess. If your example is not close to what you are putting up, go with EGOL's advice and get a new image (I suggest geo-tagging it if at all possible as well). Hope this helps.
Intermediate & Advanced SEO | RobertFisher
Does having several long title tags hurt you?
The characters beyond the first 70 can still help - Google DOES index them: http://www.seomoz.org/blog/title-tags-is-70-characters-the-best-practice-whiteboard-friday If you can, make sure the first 70 characters are the most relevant to the page content and readable for the user; if searchers don't like what they see, or it doesn't make sense without the rest of the title, they're unlikely to click through. You don't mention whether your brand name is included in the tag; unless you're really well known in your niche, it usually makes sense to have the brand at the end of the title tag (if at all). I'm an SEO for a couple of news websites where the titles are often well over 70 characters (with the brand names at the end). It doesn't seem to cause any serious problems, and we get strong traffic from SERPs for articles with longer titles. I advise the editors to put the main keywords near the beginning of the title if possible, with readability (for the human!) being the most important thing.
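As a rough way to audit this at scale, a small helper can flag titles likely to be cut off. The function name is hypothetical, and the 70-character cutoff is the approximation discussed above, not an official limit:

```python
# Hypothetical helper: flag <title> values likely to be truncated in the
# SERP snippet. The 70-character display limit is an approximation.
DISPLAY_LIMIT = 70

def title_report(title, limit=DISPLAY_LIMIT):
    if len(title) <= limit:
        shown = title
    else:
        # Cut at the limit, then back off to the last whole word.
        shown = title[:limit].rsplit(" ", 1)[0] + "..."
    return {
        "length": len(title),
        "truncated": len(title) > limit,
        "displayed": shown,
    }

report = title_report("Short and readable title | Brand")
print(report["truncated"])  # -> False: short titles fit as-is
```

Running this over a crawl export of your titles would quickly show which pages lean on the part of the tag searchers never see.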
Intermediate & Advanced SEO | Alex-Harford
How important is it to clarify URL parameters?
If you know right away what the parameter does, I'd go ahead and tell Googlebot what it is and how to treat it. It's one more signal in the fight against duplicate content, and I'd take advantage of every chance to tell Google how I want my site crawled and ranked.
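To illustrate the duplicate-content problem that parameter settings help Google avoid, here is a minimal sketch (the parameter names like sessionid are made-up examples) showing how several URL variants can collapse to one canonical URL:

```python
# Sketch: many URL variants, one canonical page. Parameter names are
# illustrative examples, not a definitive list.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

IGNORED_PARAMS = {"sessionid", "utm_source", "utm_medium"}

def canonicalize(url):
    parts = urlparse(url)
    # Drop tracking/session parameters and sort the rest for a stable key.
    query = sorted(
        (k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS
    )
    return urlunparse(parts._replace(query=urlencode(query)))

print(canonicalize("http://example.com/shoes?sessionid=abc&color=red"))
print(canonicalize("http://example.com/shoes?color=red"))
# Both collapse to the same URL -- the duplicate Google would otherwise crawl twice.
```

Telling Google which parameters are meaningless is essentially handing it this mapping instead of making it work the duplicates out on its own.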
Intermediate & Advanced SEO | KeriMorgret
Block all but one URL in a directory using robots.txt?
Robots.txt directives can conflict, and crawlers resolve conflicts differently: the original standard says the first matching rule wins, while Google applies the most specific (longest) matching rule. Either way, the safe approach here is to disallow everything and explicitly allow the one path you want:

User-agent: *
Allow: /test
Disallow: /

Caveat: this is NOT the way robots.txt was originally meant to work. By design, robots.txt is for disallowing, and technically you shouldn't ever have to use it for allowing; Allow is an extension, though the major engines support it. That said, this should work pretty well. You can check your work in Google Webmaster Tools, which has a robots.txt checker (Site Configuration > Crawler Access). Just type in your proposed robots.txt, then a test URL, and you should be good to go. Hope this helps!
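You can also sanity-check a proposed file locally with Python's built-in robots.txt parser. Note the Allow line is listed first here, because the stdlib parser applies the first matching rule (per the original draft spec); Google's longest-match rule agrees for this particular file:

```python
# Quick local check of the "disallow all, allow one path" pattern
# using the standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Allow: /test",   # first match wins in this parser, so Allow goes first
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "/test"))        # -> True: the allowed directory
print(rp.can_fetch("*", "/test/page"))   # -> True: subpaths inherit the Allow
print(rp.can_fetch("*", "/private"))     # -> False: everything else is blocked
```

This is no substitute for the checker in Webmaster Tools, but it catches ordering mistakes before anything goes live.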
Intermediate & Advanced SEO | Cyrus-Shepard
Site looks messy in a text browser, but all the text is visible - is that a problem?
No, that is not a problem; having "clean"-looking text in a text browser is not very important for SEO. A few months ago I thought I had to optimize (and by optimize I mean make it look pretty) for a text-only browser as well. However, after conducting some competitive research, I found that some of our top competitors (ranked #1-5) were in far worse condition from a text-only browser standpoint. In a nutshell, as long as the text is visible, you are good to go. TIP: To see how Google views your website, type the following into the search bar: cache:www.yourwebsite.com then click on "Text-only version".
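If you want to approximate that text-only view programmatically rather than through the cache link, here is a minimal stdlib-only sketch (the class name is illustrative) that strips tags, scripts, and styles, leaving roughly what a text browser renders:

```python
# Rough approximation of a text-only view of a page: drop markup,
# scripts, and styles; keep visible text. Stdlib only; class name
# is illustrative.
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())

page = ("<html><head><style>p{color:red}</style></head>"
        "<body><h1>Hello</h1><script>var x=1;</script><p>World</p></body></html>")
parser = TextOnly()
parser.feed(page)
print(" ".join(parser.parts))  # -> Hello World
```

Feeding your own pages through this (or any competitor's) makes the "is the text visible?" question easy to answer in bulk.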
Intermediate & Advanced SEO | Desiree-CP
Why specify robots instead of googlebot for a Panda affected site?
Hi there, I'm not sure what percentage of sites specify all bots as opposed to just Google, but I have to assume it depends on the site's territory. For instance, few sites in the UK would think of Bing or Yahoo, as those search engines have a tiny market share there. In the US, Japan, and other places where non-Google search engines have more share, the consideration will be much higher.
Intermediate & Advanced SEO | JaneCopland
Best way to de-index content from Google and not Bing?
Hi michelleh, The solution given by Dan above is the most reliable method, as robots.txt will not keep pages out of the index if Googlebot finds external links to them. Given the reasoning behind your desire to noindex, reliability is extremely important. Also, you want "noindex, follow" rather than "noindex, nofollow"; the nofollow would trap any link value coming into the pages (from both internal and external links) and stop it from flowing through the site. Hope that helps, Sha
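Another option worth noting, assuming you can set response headers: Google supports user-agent-scoped X-Robots-Tag values, so a "googlebot: noindex, follow" header de-indexes the page in Google while Bing, which ignores the googlebot-scoped directive, keeps it. A minimal sketch with an illustrative function name, WSGI-style header tuples assumed:

```python
# Sketch: Google-only noindex via a user-agent-scoped X-Robots-Tag
# response header. Function name is illustrative; headers are
# WSGI-style (name, value) tuples.
def add_google_noindex(headers):
    """Return headers with a googlebot-scoped noindex appended."""
    return headers + [("X-Robots-Tag", "googlebot: noindex, follow")]

headers = add_google_noindex([("Content-Type", "text/html")])
print(headers[-1])  # -> ('X-Robots-Tag', 'googlebot: noindex, follow')
```

The header route is handy for non-HTML resources (PDFs and the like) where a meta robots tag isn't an option.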
Intermediate & Advanced SEO | ShaMenz
Can you pass social signals with a 301 re-direct?
Gee, good question. It was always said that anchor text and page relevancy were not passed through a 301, but Bing says they are, though only through one hop. Actually, reading it again, that's not quite what it says; it says a 301 will "pass page rankings and other relevant data", so what is relevant data? I can tell you it does pass anchor text, as tests have been done: an easy test is to link with a keyword through a redirect and see if you rank for it. Others have done this, and the result is yes. But page relevancy and other signals? I can't tell you; it would be hard to test. http://perthseocompany.com.au/seo/reports/violation/the-page-contains-unnecessary-redirects
Intermediate & Advanced SEO | AlanMosley
Best way to find broken links on a large site?
Great extension for one page, but what about checking a site with 50000+ pages? Any suggestions?
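If you end up rolling your own for a site that size, the core is just link extraction plus status checks. Here is a minimal stdlib sketch of the extraction step; the crawl/status-check loop is deliberately left out, since at 50,000+ pages you'd want batching and crawl-delay handling, or a dedicated tool (Screaming Frog and Xenu are common choices):

```python
# Sketch of the link-extraction half of a bulk broken-link checker.
# Stdlib only; pair it with batched HEAD requests (and politeness
# delays) to check each extracted URL's status code.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL.
                    self.links.append(urljoin(self.base_url, value))

def extract_links(base_url, html):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

links = extract_links("http://example.com/a/", '<a href="/x">x</a> <a href="y">y</a>')
print(links)  # -> ['http://example.com/x', 'http://example.com/a/y']
```

Seeding this from the XML sitemap rather than a full crawl keeps the job tractable on very large sites.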
Intermediate & Advanced SEO | nicole.healthline
Does using robots.txt to block pages decrease search traffic?
If you block the pages from being crawled, the search engines can no longer read their content, so in most cases they will stop ranking them (note that robots.txt is not a true noindex, though: a blocked URL can still appear as a bare listing if other sites link to it). So yes, the traffic numbers from organic search will change if you block the pages in robots.txt.
Intermediate & Advanced SEO | KeriMorgret
How to see which site Google views as a scraper site?
If the other site is outranking yours, it means that for whatever reason Google has decided it is better quality, but not necessarily that it has determined one or the other to be a scraper. That could be based on any combination of the hundreds of factors Google uses to determine position in search results. It may just be that it has more or better inbound links. If you focus on keeping only quality original content on your site, earning good links, and making sure all of your on-page SEO is in good shape, then you shouldn't have to worry about scrapers. Google will find and devalue the duplicates eventually. If this is one of those cases where yours is clearly the original and the scraper is outranking you anyway, you may want to consider filing a DMCA report with the site's hosting company. You can usually find that info with a whois search. You can also try submitting a spam report to Google here https://www.google.com/webmasters/tools/spamreport
Intermediate & Advanced SEO | Nick_Ker
Examples of sites other than Hubpages that have used subdomains to recover from Panda?
I hear a lot of people saying they are doing so, but I have not seen any results. I wonder if it is that easy, given that Matt Cutts has said there is no difference between subdirectories and subdomains. I noticed that sitelinks include links to subdomains, and recently GWMT has started showing subdomain links as internal links unless the subdomain is verified under another account.
Intermediate & Advanced SEO | AlanMosley
How is Google choosing which authorship profiles to display?
The people I've seen it work for were "complaining" about it on Google+ and Twitter.
Intermediate & Advanced SEO | caseyhen
Best way to block a search engine from crawling a link?
Hi there, I'm assuming you are trying to do PageRank sculpting (or something related), which was made a little tougher in recent years. I'll base my answer on this assumption, so feel free to correct me if that isn't the case. There are several methods to make a link uncrawlable: AJAX - Googlebot will not read calls made through AJAX, so if you load your link through an external call, it is completely hidden. Javascript - obfuscate links with Javascript that masks the link. Any number of solutions work here, including using tags with a title attribute of your URL which, upon clicking, go to that URL. Simple and effective. Redirects - I haven't tested this last idea, and it may not work: you might be able to redirect to another page on your website that is set to not be indexed, then redirect on to the intended page through a query string. In theory it should work, but obviously not as well as the previous methods. Let me know if you have questions. I'd be glad to help further. Cheers!
Intermediate & Advanced SEO | deltasystems