Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Moving Duplicate Sites
A pleasure Wayne! (If you get stuck with the web hosting issue I'm sure we can arrange a whip round for you! ;o)
| Hurf0 -
Which is the best wordpress sitemap plugin
That's good to know - I'll give it a try. Thanks.
| simoncmason0 -
Why are apostrophes and other characters still showing as code in my titles?
Gabriel - It's no problem for these codes to appear in other sections of the page. There are many character sets out there, and what appears fine to you (without codes) can look like gibberish to someone else. The codes help make for a uniform experience for everyone. For instance, there are no "smart quote" keys on your keyboard, yet some programs (like Word) will automatically insert these for you. To show these on a web page, most systems will convert them to codes for you. The ampersand is a special case, since it's often the escape character that introduces other codes. So the place we're still really most concerned about is the meta title. I'm not sure why the mb_convert_encoding() function didn't work for you. If you'd like me to take a closer look, feel free to message me. This is solvable - I've done it before on WP sites.
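The code-to-character conversion discussed above is straightforward to do right before the meta title is written out. The thread mentions PHP's mb_convert_encoding(); here is the same idea sketched in Python for illustration (the title string is a made-up example), using the standard library's entity decoder:

```python
import html

# A title stored with HTML entity codes, e.g. as pulled from a CMS database.
encoded_title = "Ben &amp; Jerry&#8217;s &#8220;Best&#8221; Flavors"

# html.unescape converts named and numeric entities back to real characters,
# which is what you want before writing the <title> tag on a UTF-8 page.
decoded_title = html.unescape(encoded_title)
print(decoded_title)  # Ben & Jerry’s “Best” Flavors
```

Whatever language you use, the key is to decode entities once, at output time, and serve the page with an explicit UTF-8 charset so the decoded characters render correctly everywhere.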
| blu42media0 -
Redirects on Window Servers - .htaccess equivalent for IIS
Here are three I found. I am not an IIS person at all, so any help on whether these are what I want would be appreciated. http://knowledge.freshpromo.ca/seo-tools/301-redirect.php http://blog.wmsmerchantservices.com/windows-server-issues/301-redirecting-on-a-windows-server-same-as-mod_rewrite-htaccess/ http://www.isapirewrite.com/ Thanks again!
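For reference, on IIS 7+ with Microsoft's URL Rewrite module installed, the .htaccess-equivalent place for a 301 is the site's web.config. A minimal sketch (the rule name and paths are placeholders):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 a single old URL to its new location -->
        <rule name="Redirect old page" stopProcessing="true">
          <match url="^old-page\.html$" />
          <action type="Redirect" url="/new-page" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

redirectType="Permanent" is what makes this a 301 rather than a 302, which is the part search engines care about.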
| Getz.pro0 -
Struggling to get my lyrics website fully indexed
You need more unique content. Your site is great; I like it much better than the other lyric sites. But I can't see any content at all that you have written yourself.
| Fubra0 -
How to configure mobile website?
What I am planning to do is create a small function that renders content based on the user agent: if the user agent is a smartphone's, it will render the mobile version of the website, and if it is a desktop's, it will render the desktop version. I hope this does not amount to cloaking. Please correct me if I am wrong.
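The approach described above can be sketched as follows. This is an illustrative Python sketch, not production code; the token list and template paths are assumptions. The part that matters for cloaking concerns is serving crawlers the same content a matching real user would get, and sending a Vary: User-Agent header so caches and search engines know the response differs by device:

```python
# Hypothetical user-agent based serving. Template names are made up.
MOBILE_TOKENS = ("iphone", "android", "ipod", "blackberry", "windows phone")

def pick_template(user_agent: str) -> str:
    """Return the template to render for this user agent."""
    ua = user_agent.lower()
    if any(token in ua for token in MOBILE_TOKENS):
        return "mobile/home.html"
    return "desktop/home.html"

def build_headers() -> dict:
    # Tell caches and search engines the response varies by user agent.
    return {"Vary": "User-Agent"}

print(pick_template("Mozilla/5.0 (iPhone; CPU iPhone OS 5_0 like Mac OS X)"))
```

As long as the content selection depends only on the device, and a crawler with a mobile user agent gets exactly what a mobile visitor gets, this is device detection rather than cloaking.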
| IM_Learner0 -
Are there any tools to measure backlinks to email addresses?
If you are just interested in internal links to an email then the free Xenu link sleuth should do the job. Cheers Marcus
| Marcus_Miller0 -
Impact on domain when using a subdomain for majority of site content
Not strictly, no; we may have to compromise by losing a degree of control over the content. Thanks for the response Damien, very helpful. Jan
| Urbanfox0 -
How to use Author tag - Please help
The general idea of the rel attribute is to link authors to content. Check out How to use rel=author tags for SEO... the article goes on to explain that this is part of a new "AuthorRank" system for determining content legitimacy.
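The markup the article describes boils down to linking the byline on the content to an author profile, with a reciprocal link back. A minimal sketch (the name and URLs are placeholders):

```html
<!-- On the article page: link the byline to the author's profile -->
<a rel="author" href="https://plus.google.com/1234567890">Jane Doe</a>

<!-- On the author's profile: a link back to the site confirms authorship -->
<a rel="me" href="http://www.example.com/blog/">My articles</a>
```

The two links together are what lets a search engine tie the content to the author with some confidence.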
| blu42media0 -
Having some weird crawl issues in Google Webmaster Tools
There are website crawlers you can employ to scan the site and hunt for specific parameters such as: http://home.snafu.de/tilman/xenulink.html
| Dan-Petrovic1 -
Robots.txt Syntax
Rodrigo - Thanks, and thanks for the follow-up. To be honest with you though, I have not seen or experienced anything about this. I tend to follow the suggested rules with code, so my answer is "I don't know". Anyone else know? I also agree with you on the meta tags. Robots.txt is best used for disallowing folders and such, not pages. For instance, I might do a "Disallow: /admin" in the robots.txt file, but would never block a category page or something to that effect. If I wanted to remove it from the index, I'd use the meta "noindex,follow" attribute instead. Good point!
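To illustrate the split described above (folder names are placeholders): robots.txt handles whole folders you don't want crawled, while an individual page you want out of the index gets the meta tag:

```
# robots.txt - block crawling of whole folders
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
```

```html
<!-- in the <head> of the individual page you want de-indexed -->
<meta name="robots" content="noindex,follow">
```

Note the two mechanisms don't mix well: if robots.txt blocks a page, the crawler never fetches it, so it never sees the noindex tag on it.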
| dohertyjf0 -
Site not being Indexed that fast anymore, Is something wrong with this Robots.txt
I am not sure why you are setting a disallow on file types. Google would not index .wmv or .js files anyway, as it cannot parse those file types for data. If you want to coax Google into indexing your site, submit a sitemap in Webmaster Tools. You could also set nofollow on the anchors for the pages you want to exclude, and keep robots.txt cleaner by including just top-level subdirectories such as admin. There just seem to be a lot of directories in there that do not relate to actual pages, and Google is only concerned with renderable pages.
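If you go the sitemap route suggested above, a minimal sitemap.xml looks like this (the URL and date are placeholders); you can submit it in Webmaster Tools, or point crawlers at it from robots.txt with a "Sitemap:" line:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
</urlset>
```

One `<url>` entry per page you want crawled; only `<loc>` is required, the rest is optional.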
| oznappies0 -
Are there any SEO implications if a page does two 301s and then a 304?
Thanks man, I think this might be the video you are referring to: http://www.seomoz.org/blog/whiteboard-interview-googles-matt-cutts-on-redirects-trust-more
| RodrigoStockebrand0 -
Browser Pop Ups - Can it be SEO Friendly and how?
Thanks for the information guys, very useful. Any ideas as to what this page uses? Not sure if it's using what you recommend, or if there are any potential issues here; this is one of many that I find throughout my normal navigation. https://online.citibank.com/US/JRS/pands/detail.do?ID=Checking Thanks, Rodrigo
| RodrigoStockebrand0