Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
New sitelinks - can we control the number?
Thanks guys, I guess it's a case of keyword relevancy so I'll have to be happy with just the 8! Cheers,
On-Page / Site Optimization | | Confetti_Wedding0 -
SEOmoz Crawl CSV in Excel: already split by semicolon. Is this Excel's fault or SEOmoz's?
Thanks a lot for this question and answer. I've been cursing over this issue for months and months. The solution suggested by Barry is TOP! Download OpenOffice, even if it's just to convert CSVs to columns correctly.
Moz Tools | | Jacobe2 -
Google local - is 7 pack really gone
Once again, a very detailed explanation. Thanks Ryan.
Vertical SEO: Video, Image, Local | | seoug_20050 -
Duplicate Content on Blog
This is something a lot of websites do, and something the crawlers can recognise. If you have, say, a 150-word block that's the same on every page and 400+ words of unique content elsewhere, you won't have any problems at all. There's no particular science to those numbers, by the way; I'm just pulling them out of the air. But there's a helpful quote from here: http://www.seomoz.org/blog/beat-google-panda

"Here at SEOmoz, our PRO platform uses a 95% threshold to judge duplicate content. This means if 95% of all the code on your page matches another page, then it's flagged as a duplicate."

To check your own ratios, try this nifty duplicate content tool.

What you suggest in the last paragraph is cloaking. You need to be very careful when you're serving different versions of the same page to crawlers and users.
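If you want a rough feel for how such a ratio works, here's a minimal sketch using Python's standard-library difflib. To be clear, this is not Moz's actual algorithm — it's just an illustration of comparing two pages where shared boilerplate is a small fraction of the total content:

```python
from difflib import SequenceMatcher

def similarity_ratio(page_a: str, page_b: str) -> float:
    """Return a 0-1 similarity ratio between two page sources."""
    return SequenceMatcher(None, page_a, page_b).ratio()

# Shared template block plus plenty of unique copy on each page.
boilerplate = "About us | Contact | Terms of service. " * 5
page_one = boilerplate + "Unique article text about wedding confetti. " * 10
page_two = boilerplate + "Completely different copy about forum software. " * 10

ratio = similarity_ratio(page_one, page_two)
print(f"Similarity: {ratio:.0%}")
```

With mostly unique body text, the ratio lands well under a 95% duplicate threshold; two near-identical pages would score close to 100%.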
Intermediate & Advanced SEO | | Alex-Harford0 -
I have a site that has both http:// and https:// versions indexed, e.g. https://www.homepage.com/ and http://www.homepage.com/. How do I de-index the https// versions without losing the link juice that is going to the https://homepage.com/ pages?
Thanks John for the suggestion. Unfortunately the https pages aren't separate pages from the http version; one is secure and one isn't but the actual code is identical. The rel canonical tag would appear on both the http and https version. Are you sure Google wouldn't have any issues with the http pages having a rel canonical tag that points to itself?
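For reference, since the code is identical on both versions, the same tag would be served on the secure and non-secure URL alike — something along these lines (using the homepage.com example from the question; a canonical tag that points to the page's own URL on the http version is generally treated as harmless self-reference):

```html
<!-- Served on BOTH http://www.homepage.com/ and https://www.homepage.com/ -->
<link rel="canonical" href="http://www.homepage.com/" />
```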
Technical SEO Issues | | fthead90 -
Keyword cannibalization in ecommerce sites
Ok, thanks! I had read that blog post and received good information from it, but I was hoping to hear some newer tactics that were floating around since the post was written in 2007.
Keyword Research | | Hakkasan0 -
What is the best way to close my blog?
Re #2: it can be complicated. I refer to htaccess hacking as a black magic art of website building. But for straight 301 redirects the code is fairly straightforward: in your .htaccess file there will be a line for every page that looks like so: Redirect 301 /page1.html http://www.mynewsite.com/page1.html (note that Apache's Redirect directive takes the old path, not the full old URL, as its first argument). This is the brute-force method. You can also use regular expressions if there are patterns to the addresses, but then you're getting into the esoteric. Re #4: they will be instantly redirected.
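To illustrate the regular-expression route mentioned above, here's a sketch using mod_alias's RedirectMatch. The /blog/ path and the target site are placeholders — adjust the pattern to whatever structure your old blog actually used:

```apache
# Send every old blog URL to the matching path on the new site in one rule,
# instead of one Redirect line per page.
RedirectMatch 301 ^/blog/(.*)$ http://www.mynewsite.com/blog/$1
```

The $1 backreference carries the captured remainder of the old URL across to the new one, so /blog/page1.html lands on the new site's /blog/page1.html.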
Technical SEO Issues | | AdoptionHelp0 -
A Blogs appearance
I guess a pro for a separate themed site is that you wouldn't be stuck with the constraints of keeping things on brand. E.g. you could stray from standard fonts for the sake of a 'pretty' design and generally not have to worry about making one thing look like the other. It would likely give you a lot more freedom. Another pro is that your blog could be perceived on its own merits as well as those of your main site. E.g. perhaps you have topics and content that you don't want directly associated with your original brand. If it's not directly associated with an established brand, it might attract a different audience that doesn't see the content as influenced by a parent brand. There are pros and cons for both here. Technically, of course, there's no issue I can see. I guess it all boils down to brand recognition and best-practice UX.
Content & Blogging | | Daylan0 -
Page titles and descriptions
Hi Bryce, Thanks for the input. I was thinking in that direction. It's nice to have some affirmation from time to time. Cheers!
On-Page / Site Optimization | | APICDA0 -
Duplicate Page Title/Page Content Errors return URLs showing strange string of characters
Chad, try speaking directly with the software's developers. They probably offer a forum or other means of contact, and they will surely recognize this issue instantly even if the developer you are working with does not. This type of issue does not fix itself: I expect your next crawl report to show the same problem, unless the issue is with the crawl itself or an action was taken on your site to correct it.
Technical SEO Issues | | RyanKent0 -
Keyword Finder
I'm surprised to see that SEOmoz doesn't have a keyword research tool like Trellian's software or Wordtracker! I would definitely recommend Trellian's "Keyword Discovery" tool. I actually canceled my subscription with them when I started with SEOmoz, thinking there would be a keyword research tool that digs into historical and global data for keywords and gives all related keywords that you can then export into a CSV file. I'm impressed with SEOmoz for sure, and just surprised there isn't a keyword tool like this... unless I just haven't found it yet!
Keyword Research | | blinddrop1 -
OPINION on forum software phpbb OR vBulletin?
Can you elaborate on the security issues? There were some issues with phpbb 2 (many years ago), but phpbb 3 is extremely secure. I think your information is very outdated.
Content & Blogging | | WhygoSEO0 -
Do sites really need a 404 page?
Also, some site owners will run link checking tools on their sites that look at their off-site links and see if any return 404s. If you're not returning a 404, the site owners may not know (via an automated tool) that the link is broken.
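A minimal sketch of what such an automated check does (the function and stub data here are hypothetical; real link checkers do far more, and in practice the status lookup would be an actual HTTP request via urllib or similar):

```python
from typing import Callable, Dict, List

def find_broken_links(urls: List[str],
                      fetch_status: Callable[[str], int]) -> Dict[str, int]:
    """Return the urls whose HTTP status code signals a broken link.

    fetch_status is injected so the sketch stays self-contained; a real
    tool would issue a HEAD/GET request per URL.
    """
    broken = {}
    for url in urls:
        status = fetch_status(url)
        if status >= 400:  # 404, 410, 5xx all count as broken
            broken[url] = status
    return broken

# Stub "server" responses. Note that a site returning 200 for missing
# pages (a soft 404) would hide its broken links from this kind of check.
statuses = {"/about": 200, "/old-page": 404}
result = find_broken_links(list(statuses), lambda u: statuses[u])
print(result)  # {'/old-page': 404}
```

This is exactly why a proper 404 status matters: the tool can only flag what the server honestly reports.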
Technical SEO Issues | | KeriMorgret0 -
Block a sub-domain from being indexed
Keep in mind that Google indexes everything it can crawl. Even if you put a block in the robots.txt, the URLs can still end up in the index (robots.txt stops crawling, not indexing). You can require a password for that subdomain and keep big G out. This is easy to do if you have a site with cPanel access: just go to Manage Permissions and password-protect that directory with an .htaccess password.
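For anyone doing this by hand rather than through cPanel, the protection boils down to a few lines of basic auth in the subdomain's .htaccess — the AuthUserFile path below is a placeholder for wherever your .htpasswd file actually lives:

```apache
# Hypothetical .htaccess for the subdomain's document root.
AuthType Basic
AuthName "Private - staging area"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The .htpasswd file itself is generated with the htpasswd utility (or cPanel does it for you), and crawlers that hit the 401 challenge can't fetch or index anything behind it.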
Technical SEO Issues | | X-X0