Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
GWT crawl errors: How big a ranking issue?
Hi Jeepster, I saw this question wasn't answered yet, so here I come with one:

What I would do in your case is check when the rankings or the traffic dropped. Since, as you said, you have a child to look after, you might not have a ranking history. If you have Moz PRO, go to: Campaigns -> Ranking tab -> change the "Full rankings report to PDF" option to "Entire keyword ranking history to CSV", export it, and then look at the Keyword, Date, Traffic, and Google Ranking columns. If you are not a PRO member, or you don't have a campaign set up for this particular website, you can analyze your traffic in Google Analytics and check approximately on which date your organic traffic dropped. Once you have that, check the Google Algorithm Change History on Moz.com, compare the dates, and try to figure out which update took you down.

Regarding the 17 server errors / 575 soft 404s / 17 not founds / 1 access denied / 4 others:
- Export the error data and check the URLs manually (sorry, I can't give you a tool for this; checking them manually is best).
- Yes, these errors can lead to ranking problems. A while ago, a fairly large website of a previous employer dropped in rankings; after resolving these issues, we got back.
- Cleaning up your GWT so it comes back clean will help you.

I really hope it works out for you! Cheers, Istvan
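The date-comparison step above can be sketched in a few lines of Python. This is only a sketch: the column names (`Date`, `Traffic`) follow the export columns mentioned above, and it assumes the export uses ISO-formatted dates (YYYY-MM-DD), so check them against your actual CSV header:

```python
import csv
from collections import defaultdict

def steepest_traffic_drop(csv_path):
    """From a keyword-ranking-history CSV, return the date on which
    total tracked traffic fell the most versus the previous date."""
    totals = defaultdict(float)
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            # Sum traffic across all keywords for each date.
            totals[row["Date"]] += float(row["Traffic"] or 0)
    dates = sorted(totals)  # ISO dates sort chronologically as strings
    drops = [(totals[prev] - totals[cur], cur)
             for prev, cur in zip(dates, dates[1:])]
    return max(drops)[1] if drops else None
```

The date this returns is the one to line up against the algorithm change history.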
| Keszi0 -
Google webmaster showing 0 indexed, yet I can see them all them Google search?
For sure. I just tested this by resubmitting some of our sitemaps, which proved to me that it can take some time for GWT to refresh the data: I submitted 25 pages, but it says it indexed 76 of them ;-). I usually give it a day and a half to see if the results differ. Hope this answers your question.
| Martijn_Scheijbeler0 -
Linking domains on the same C Block together
Hi Tom, If they are two very different websites, why would it matter whether they are on the same C block or not? Personally, I would not get too worried about trying to hide the connection from Google; it is a losing game in most cases. That being said, most of the content you mention seems relevant to the ecom site, so I would be tempted to host it on the same domain. Use the blog to add relevant reviews and content and to help boost the overall domain authority of the main site. Just because the blog is hosted on the ecom site doesn't mean the reviews cannot be unbiased! My two cents!
| LynnPatchett0 -
Google ranks my sitemap.xml instead of blog post
Hi Kent, It sounds like your sitemap's URL is included in another sitemap file or linked from a page. Based on my research, you can add an X-Robots-Tag header via your .htaccess file to keep the sitemap itself out of the index. This is an old discussion, but I believe it's still relevant: http://www.webmasterworld.com/google/3971249.htm Thanks, Christine DeGraff
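As a sketch of the .htaccess approach mentioned above (this assumes Apache with mod_headers enabled, and that the file is named sitemap.xml; adjust both to match your setup):

```apache
# Send a noindex header for the sitemap file: Google can still crawl
# and use it, but will stop showing the file itself in search results.
<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>
```

After the header is in place, the blog post should replace the sitemap in the results once the sitemap URL is recrawled.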
| ChrisDeGraff0 -
Duplicate content + wordpress tags
William, I wouldn't do that. Make sure to just noindex tags, and also noindex subpages of archives. Categories can stay indexed; it's usually the subpages that cause issues (i.e. /page/2/, etc.). -Dan
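As a rough sketch of the end result: tag archives and paginated archive pages should emit a robots meta tag like the one below in their head, while main category pages emit none (or `index, follow`). Most WordPress SEO plugins can do this per archive type; the exact setting depends on your plugin, and the URLs below are just examples:

```html
<!-- On /tag/example/ and on archive subpages like /category/news/page/2/ -->
<meta name="robots" content="noindex, follow">
```

`follow` keeps the links on those pages crawlable even though the pages themselves stay out of the index.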
| evolvingSEO0 -
Links under Meta Description when performing a search
Just to add to Mihai's response: you can always demote sitelinks from within GWT if irrelevant links are showing up for your website.
| SEO5Team0 -
What are some good ways to define yourself as a store to Google?
There are many things one can do to help Google recognize your business as a store. First, if your business has a centralized location, be sure to create an optimized listing for your company at Google Places for Business. As well, be sure to use selling-related terms when optimizing your pages for keywords, e.g. instead of simply listing “hard drives”, use “hard drive retailer” or “hard drives for sale” along with your company name. Ensure your URL and landing page reflect your business wares as well by keeping the URL short and clear and using reflective keywords within the first 100 words of text on your site. It will always be a struggle to compete in Google against major corporate retailers, but by following these steps, you should begin to see a change in your results.
| SEO5Team0 -
Keyword Difficulty Tool
I love reading these in-depth answers from the CEO of Moz. It's so nice to see a hands-on CEO taking time to deal with even the more mundane things, like discussions around support issues.
| MrFrost2 -
During my last crawl suddenly no errors or warnings were found, only one, a 403 error on my homepage.
Not that I know of. Just wait for the next crawl; they run one weekly, so you shouldn't worry too much about it.
| FedeEinhorn0 -
Doubt between sub-directory and sub-domain for develop Blog for my business website
Thanks @Federico Einhorn and @Marcus Miller. I am already aware of these things, but you can find other articles that contain big contradictions. I know it will bring many benefits, but I still wanted to strengthen my point. Thanks a lot!
| Perfect0070 -
Using the Moz to weed out bad backlinks
Just using OSE? I would sort the links you export by PA: start with the lowest numbers and work your way up, removing the good-looking links so you are left with the bad ones. If you want to use other tools, you can use Scrapebox, or even http://nielsbosma.se/projects/seotools/, to check their PR first (or better yet, check whether the link is indexed in Google before checking PR). If a linking page is deindexed, you don't have to waste your time on it; put it on a separate list and contact the sites for removal once you are done sorting the links. If the link is no longer present but the site is bad, disavow it. PS: I would also export links from Google Webmaster Tools.
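The first step above (sort the export by PA, lowest first) can be sketched in Python. The column names (`URL`, `Page Authority`) are assumptions about the export header, so match them to your actual CSV:

```python
import csv

def links_by_page_authority(csv_path):
    """Read a link-export CSV and return its rows sorted by
    Page Authority, lowest first, so likely-bad links come up first."""
    with open(csv_path, newline="", encoding="utf-8") as fh:
        rows = list(csv.DictReader(fh))
    # Column name is an assumption; match it to your export's header row.
    return sorted(rows, key=lambda row: float(row["Page Authority"]))
```

You would then review the list top-down, striking off the good-looking links as you go.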
| DennisSeymour0 -
We are using Hotlink Protection on our server for jpg mostly. What is moz.com address to allow crawl access?
Hi there! Thanks for reaching out to us! I can certainly understand your need to have our crawler accepted by your hotlink protection. Unfortunately, our crawler doesn't operate from a single URL; we use a collection of IP addresses behind the scenes to mimic a search engine crawler and provide the best diagnosis possible. That said, a lot of our customers have had success allowing our crawler at the server level, either through their .htaccess file or by other methods. Unfortunately, I am not a web developer or a server admin, so I can't tell you exactly how to implement it. I would recommend posting a new question asking for a workaround for your particular software. Thanks for your time; I hope that helps. Peter, Moz Help Team.
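As a sketch of the server-level workaround mentioned above, assuming Apache mod_rewrite hotlink protection: the IP range below is a documentation placeholder, not an actual crawler address (get the real IP list from the crawler vendor), and example.com stands in for your own domain:

```apache
RewriteEngine On
# Let a whitelisted crawler IP range through before the hotlink check.
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.
# Block image requests whose Referer is set but is not our own site.
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(jpe?g|png|gif)$ - [F]
```

The whitelist condition has to come before the Referer checks so matching crawler requests skip the block entirely.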
| Peterli0 -
Issue: Duplicate Page Content > Wordpress Comments Page
Hi Mihai, Pagination is another great option for blogs with extensive comments. Thank you!
| DomainUltra0 -
Site not coming up even when I search with the .com
The forum you are posting on was created on the backbone of software built to examine backlinks. As for the rest of your questions: you're on the right path. Let's see how we can keep you moving along said path: http://bit.ly/16NVY3R
| jesse-landry0 -
Using Google Adwords is good?
There is no relationship between AdWords and organic. I've spent over $30MM in AdWords, and have several relationships with dedicated AdWords reps, and have never been able to get anything special from the organic team (I've tried several times, with 0 success). I could not even get the organic side to expedite a reconsideration request.
| Branden_S0 -
Any need to worry about spammy links in Webmaster Tools from sites that no longer exist?
Thanks, Tom! That's what I assumed, but very helpful to hear from someone more experienced.
| CobraJones950