Questions
-
SEO Issues
I am not 100% sure about your question. I think you may feel like you are gaining value by this, but unless you provide moving services, you are not. Are you doing anything wrong by that? Not necessarily. Did the client pay you to make the tool? Are these keywords relevant to the client? You are keeping the tool from being indexed with their site, where it may have value, and instead putting it on your site, where it has no value. You are getting hits for traffic pulls, but no SERP is going to see that as traffic for your content, and if it does, it will probably be counted as a bounce, because once the iframe is pulled, the visitor is gone. If you are doing it because you have faster hosting, or simply for more domains and faster page loads, I would suggest using something like jQuery's get() to cross-domain fetch the tool from a deep directory with noindex directives at the feeder location, so that it can be indexed within the site where it belongs. The tactic is not wrong; just check your thinking. Think about the end user, and think about being transparent. You want to be up front both with Google and with the end consumer. Nothing gains trust faster than truly having nothing to hide.
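The jQuery approach suggested above might look something like this minimal sketch; the URLs and element IDs are hypothetical, and it assumes jQuery is loaded on the page and that the feeder host sends the appropriate CORS (Access-Control-Allow-Origin) header for the client's domain:

```html
<!-- On the client's page: an empty container the tool is injected into -->
<div id="tool-container"></div>
<script>
  // Fetch the tool's markup from a deep, noindexed directory on the
  // feeder domain, then render it inside the client's own indexable page.
  $.get("https://feeder.example.com/tools/noindex/mover-tool.html", function (html) {
    $("#tool-container").html(html);
  });
</script>
```

This way the tool's content renders as part of the client's page rather than living in an iframe on your domain.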
Technical SEO Issues | | yeagerd0 -
What is wrong?
I don't read the language, which makes it quite a bit harder. One thing I did notice is that there are still some really thin pages in the index that also look like duplicates. Look at:
http://www.enakliyat.com.tr/hata.aspx?aspxerrorpath=/nakliye-firmalari/yilmaz-atlas-42
http://www.enakliyat.com.tr/hata.aspx?aspxerrorpath=/gaziantep-evden-eve-nakliyat-fiyatlari-55
http://www.enakliyat.com.tr/hata.aspx?aspxerrorpath=/manisa-evden-eve-nakliyat-fiyatlari-37
It looks like these are error pages that shouldn't be indexed. It might be best for you to find an SEO who understands your language and is able to help you out.
Technical SEO Issues | | KeriMorgret0 -
Meta Robots Noindex and Robots.txt File
Hi, let us take a scenario: test.html has been blocked using the robots.txt file. In this scenario, bots that respect the robots exclusion protocol will not crawl the page, and so will never encounter the page-level robots noindex attribute. It therefore does not make sense to use both the page-level robots exclusion method and the server-side robots.txt file for the same page. Here you go for more: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93708 And here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710 Best regards, Devanur Rafi.
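The interaction can be demonstrated with Python's standard-library robots.txt parser; the domain and paths here are hypothetical. A compliant bot never requests the blocked URL, so a meta noindex on that page is never seen:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking test.html for all user agents.
robots_txt = """\
User-agent: *
Disallow: /test.html
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler will not fetch the blocked page at all, so any
# page-level noindex attribute on it goes unread.
print(parser.can_fetch("Googlebot", "https://example.com/test.html"))   # False
print(parser.can_fetch("Googlebot", "https://example.com/other.html"))  # True
```

If you want the noindex to be honored, the page must remain crawlable in robots.txt.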
Technical SEO Issues | | Devanur-Rafi0 -
Crawl Diagnostics Summary Problem
Hey there, thanks for the question. The way you have your robots.txt set up is actually preventing all bots, not just the engines, from even touching those pages. If you had a directive allowing RogerBot access to those pages, it would be able to crawl them and report that they are blocked from the search engines in the robots.txt. Since our crawler strictly adheres to the robots.txt file, you won't have anything populated there. I hope that makes sense. Feel free to reach out if you need more information. Cheers, Joel.
Moz Tools | | JoelDay0 -
Google Webmaster Tools: MESSAGE
Generally, if you've been hit by a penalty, blocking with robots.txt isn't enough; you have to remove or improve those pages entirely. As you can see, most of these pages are still in the index. When looking at your site, I found many pages like this: http://www.enakliyat.com.tr/hata.aspx?aspxerrorpath=/manisa-evden-eve-nakliyat-fiyatlari-37 And also "thin" pages without much unique content. If these pages are valuable to your customers, you should consider updating them with fresh, unique content, then file a reconsideration request with Google to lift the penalty.
Technical SEO Issues | | Cyrus-Shepard0 -
SSL Certificate
Having an SSL certificate won't remove your (not provided) keywords. As long as users are arriving from Google's HTTPS service, their keyword data will be stripped out, regardless of your own SSL status. There's no compelling SEO reason to have or not to have an SSL certificate - as Kevin points out below, it's really more a matter of user trust/data security. If users are providing you any kind of personal information (email, address, CC info, etc) you probably want an SSL certificate at least for the pages where you're collecting and sending that data. Your users will appreciate the extra security. One thing to watch out for is that http and https versions of the same page may be counted as duplicate content - so make sure that one redirects to the other.
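If you decide the https version should be the canonical one, the http/https duplicate-content issue mentioned above is usually solved with a server-side 301 redirect. A minimal sketch, assuming the site runs Apache with mod_rewrite enabled (placed in .htaccess):

```apache
RewriteEngine On
# If the request did not arrive over HTTPS, 301-redirect it to the HTTPS version
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

The same pattern works in reverse if you prefer http as canonical; the point is that one scheme should permanently redirect to the other.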
Search Engine Trends | | RuthBurrReedy0 -
Robots.txt
Ah, it's difficult to see anything on the page because I can't read Turkish. The one thing you should know is that every single page in a website should have unique content. If two pages are exactly, or almost exactly, the same, Google will treat them as duplicate content.
Technical SEO Issues | | WesleySmits0 -
Robots.txt
And this disallows everything in the /details/ folder, so if there are exceptions to the rule (some pages or subfolders within that folder), you would need to add some Allow directives, or write more specific Disallow rules.
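For illustration, a hypothetical robots.txt along those lines, blocking everything under /details/ except one subfolder (Googlebot honors the more specific Allow rule, though not every crawler supports Allow):

```
User-agent: *
Disallow: /details/
Allow: /details/public/
```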
Technical SEO Issues | | irvingw0 -
Remove Directory
Hi, if all the content you want to remove is part of a separate folder/directory, then use the approach from: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427 with one exception: instead of using robots.txt, set all those pages to noindex. Don't use robots.txt! (Very important.) In case you have some pages in the same folder that you still need in the index, but their number is low relative to the ones you want to remove, it is better to remove the whole folder and then get those few pages indexed again, rather than going through one by one and removing only the ones you don't need; it's all about the numbers. Cheers.
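Since robots.txt should be avoided here, one way to apply noindex to a whole directory without editing every page is an X-Robots-Tag response header. A minimal sketch, assuming the site runs Apache with mod_headers enabled (placed in an .htaccess file inside the directory to be removed):

```apache
# Every response served from this directory carries a noindex directive,
# which crawlers can only see because the pages remain crawlable.
Header set X-Robots-Tag "noindex"
```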
Technical SEO Issues | | eyepaq0 -
Noindex meta tag
GWT will take a while to update sometimes, as mentioned above. Are you still having trouble with this, or have the errors started to decrease? Aside from that, if Moz isn't reporting a duplicate title or duplicate meta description issue, I would guess that GWT is simply still reporting old data, or something along those lines. I'm going to mark this as answered for the moment, but jump back on here and let me know if you're still having trouble; happy to take a look.
Technical SEO Issues | | KaneJamison0 -
Page Title
I have used unique identifiers to create unique page titles before, but only on a huge site that required automation. It's not the most elegant way to add dynamic unique content (though it does ensure uniqueness, which is why it's a good technique for huge sites), and it could reduce CTR. The bigger problem I see here is that both title tags are competing for the same thing. If users are able to generate similar descriptions resulting in duplicate title tags, that is probably very rare, and you can probably go in and edit the duplicates by changing one character.
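A minimal sketch of the unique-identifier approach, with the function name and format assumed for illustration: appending a record's ID to an otherwise templated title guarantees that no two pages share one.

```python
def make_title(base_title, record_id, site_name="Enakliyat"):
    # Appending the record's unique ID guarantees a unique title across
    # thousands of auto-generated pages, at some possible cost to CTR.
    return f"{base_title} #{record_id} | {site_name}"

print(make_title("evden eve nakliye", 4263))  # evden eve nakliye #4263 | Enakliyat
```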
On-Page / Site Optimization | | irvingw0 -
Page Content
I would recommend using hubspot.com with their Pro package, using Salesforce as well. You are running a business that does not need to have pretty URLs; why would you want them to be indexed? If I am a customer of yours and I fill out a request form to have my belongings moved to a different location, that should be handled in the CRM through a workflow, using a product similar to Salesforce. That means the customer will receive the completed form in an e-mail (Adobe Forms is an excellent source for high-quality forms as well), and of course your company will receive that form/request/lead too. Put it into HubSpot's workflow so that when somebody fills out the form, it does not get indexed by Google and has a lead number on it, so http://hubspot.example.com/company–date–lead-number/ would be your URL. The reason this is the best way is that you're going to use this in multiple e-mails and in your customer relationship management software; it should not be indexed by Google if you can help it. If you need to make a customizable form (meaning you can add more fields), you should be able to do that with Adobe Forms, Gravity Forms, Google Forms, or Wufoo, so if customers need to add an extra end table they can do it. The link below will take you to some less expensive software; it is a combination of Wufoo forms and Microsoft CRM 2011, it's hosted, and it looks to me like it would be right up your alley. If http://www.forms2crm.com/ cannot help you and Hubspot.com cannot help you, you can use some of the links below; they all do similar things. This is not really an SEO question in my opinion; it is a lead question, about how you handle leads when they come in. A new way to interact with your customers, built for Microsoft Dynamics CRM, powered by Wufoo forms: http://www.forms2crm.com/
What can you do with forms2crm? Lead capture, surveys, order processing, case logging, event registration, lead capture for trade shows, field technician reports, field sales reports, real estate management.
Your site's three steps, as I read them: 1. Handling the tender form: the user fills in where they are moving from and to, along with information about the goods, and gets prices from firms. 2. Buy offers: the user gets price quotes with ratings and reviews of many companies and chooses one of the offers. 3. Company and service rating: the user rates the company and evaluates the service.
As you can see in this demo below, that is a moving company they're referencing:
https://crm.zoho.com/crm/ShowHomePage.do#tab_Dashboardsbck
http://msdn.microsoft.com/en-us/library/cc150850.aspx
http://www.zoho.com
http://www.zoho.com/crm/sales-force-automation.html
If you have more questions or I can give more help, please let me know. Sincerely, Thomas
Technical SEO Issues | | BlueprintMarketing0 -
Crawl Diagnostics and Duplicate Page Title
When we manually check the page source, we see that there is no duplicate or missing title tag, and we do not understand why. GWT shows us this title: "evden eve nakliye". When we manually check the page source, we see this title: "Enakliyat: evden eve nakliye 4263. ihale"
Technical SEO Issues | | iskq0 -
Rel Canonical
There is no SEO software that can help you determine whether you have implemented canonical links correctly (that is, which page you are targeting with optimization). What it can do is notify you that a canonical link is present on a specific page, and that data you can export and analyze. Unfortunately, this means a lot of manual work for you or your team. Think about the fact that you might point a canonical link from Page A to Page A and from Page B to Page B (because a script points each canonical to itself); if these two pages are effectively the same, that will be quite confusing to a search engine, right? Or Page A -> Page B, Page B -> Page C, and so on; that's also something you would like to avoid. Another case: Page A -> Page B, Page B -> Page A. By exporting the data that SEOmoz gives you and analyzing it in Excel (or a similar program), you will have the chance to catch these problems. I hope this helped and cleared the picture a little bit. Istvan
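The Excel-style analysis described above can also be sketched in a few lines of Python; the data format (a mapping of page URL to its canonical target, as exported from your crawl) and the function name are assumptions for illustration:

```python
# A minimal sketch: given page -> canonical mappings from an export,
# flag canonical chains (A -> B -> C) and loops (A -> B -> A), both of
# which search engines may ignore or misinterpret.
def audit_canonicals(canonical_map):
    issues = []
    for page, target in canonical_map.items():
        if target == page:
            continue  # self-referencing canonical: fine
        nxt = canonical_map.get(target)
        if nxt == page:
            issues.append(("loop", page, target))
        elif nxt is not None and nxt != target:
            issues.append(("chain", page, target))
    return issues

data = {
    "/a": "/b",
    "/b": "/a",   # loop: /a <-> /b
    "/c": "/d",
    "/d": "/e",   # chain: /c -> /d -> /e
    "/e": "/e",   # self canonical, OK
}
print(audit_canonicals(data))
# [('loop', '/a', '/b'), ('loop', '/b', '/a'), ('chain', '/c', '/d')]
```

Running this over an exported canonical report surfaces exactly the A -> B -> C and A <-> B cases described above without checking each page by hand.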
Technical SEO Issues | | Keszi0 -
Duplicate Page Title
Yes, no doubt. You should change all your duplicate page titles, and if you have duplicate content, change that as soon as possible as well.
Technical SEO Issues | | Perfect0070 -
Google Webmaster Tools: Quality Issues on http://www.enakliyat.com.tr/
Invest in content. Ensure that you have a good amount of unique content on each page, push for photography, and balance the ratio of pre-generated content (sidebars, footers, etc. that are shared across pages) to content unique to the page so that it's more favourable. I imagine that users are creating these pages rather than you, so perhaps you need to take an editorial stance here. One option would be to "nofollow" links to these pages by default (or perhaps NOINDEX the pages entirely) unless you're satisfied that they're acceptable links. Alternatively, you might want to consider asking more precise questions when users fill out the form, so that there's little room for them to write 'bad content' and you can provide more valuable information in a consistent format to your users.
Search Engine Trends | | AndieF0