Category: Alternative Search Sources
Find information about alternative and less common search sources.
-
Main Russian Search Engines
Presumably submitting an English-language site is pretty pointless?
| bjs20100 -
Do we need to submit another sitemap just for the images?
Thanks for your answer Irving; however, I dropped image optimization completely after I read some posts on the traffic drop from Google Images after the UI update in late January 2013. Apparently the traffic from there is down by ~80%, and is no longer worth it. Lily
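For anyone who does still want their images crawled, Google doesn't require a separate file: image entries can live inside a regular sitemap using the image extension namespace. A minimal sketch (the URLs below are placeholders, not from the original thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page the image appears on -->
    <loc>http://example.com/nikon-dslr-camera.html</loc>
    <image:image>
      <!-- The image file itself -->
      <image:loc>http://example.com/images/nx600-front.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

You can also keep images in their own dedicated sitemap file if you prefer; either way it gets submitted through Webmaster Tools like any other sitemap.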
| edamam0 -
How would you promote a "Literary Contest"?
Many thanks Kane, I've found your links very useful!
| YESdesign0 -
Has anyone used Curation tools for SEO benefit?
Hey there. If they're using Curata to gather a lot of useful articles and then create their own article from that, then that's great. It will likely result in content that is rich for the user and could earn shares and links.

If, however, they are just using Curata to dig up content which they then post on their site, the SEO effect won't be positive. Google ignores syndicated content and won't give the site any credit for taking others' work and simply reusing it. At best, it won't have any effect, because Google will understand that it's being done for the user, not to game the algorithm. At worst, Google could start thinking that you're scraping sites and replicating content in order to pass it off as your own - in which case, you could quite easily fall into a duplicate content/Panda penalty.

Use tools like that to draw inspiration and add your own take on things in a unique article that appeals to your demographic. Otherwise, I can't see this having a positive SEO benefit at all, and it could even be negative.
| TomRayner0 -
Alternative to Google Analytics
There are a whole bunch of alternatives, all with their pros, cons and prices. Here are the ones I can think of off the top of my head: ChartBeat, GetClicky, Kissmetrics, Woopra, Reinvigorate, Piwik, Open Web Analytics, Mint, GoSquared, StatCounter.
| generalzod0 -
Appearing in Universal Results drops us from Organic Results
Thanks for your answers! So our local universal 7-box result links to our homepage rather than our Place page - great, but sometimes I see Google decides to link to a company's Place page instead. Why? And how can we ensure it keeps linking to our homepage rather than the Place page? Do we know what factors influence this? And is it OK to get Place reviews, or might this cause it to show the Place page instead? Thanks!
| emerald0 -
Should URL Follow Navigation Of A Site?
Thanks Mike, for your interesting views on other issues that come with a short URL. So true! I agree on the domain/DX600 and the domain/NX600. This is a WP site, so it should be fairly easy to go with, for example, www.domain.com/nikon-DSLR-camera-NX600. Not that my personal opinion matters here, but there is value for a visitor in seeing that short URL, rather than www.domain.com/cameras/compact/all-in-one/DSLR/20mp/2013/Nikon/NX600, right? As logical as those folders and subfolders may be, it is a blur for a visitor and could very well be confusing for a search bot with all those folders it now has to verify. The question is purely about the bot: would a bot find it better to have the folders, or would it not matter to the bot?
| Discountvc0 -
Spider Simulator. Does it work?
Thank you, CleverPhd. I was able to locate a few of those 404s.
| polarking0 -
How does Alexa ranking work?
They base their rankings on data collected via a toolbar. Since this is biased and a small sample, don't rely on it. Focus on providing a great user experience and off-page SEO, and you should naturally rise in the Alexa rankings. If you do not, don't worry about it at all...
| KevinBudzynski0 -
Why would so many links be appearing in the source code of this page - but not on the page itself?
Thanks Don! (and Chris too). I think your guess on the platform is probably correct. I appreciate both of your responses very much!
| danatanseo0 -
What are realistic goals for local search results?
Thanks Littlesthobo! I feel I am doing the right things socially, as you described. I have all the accounts and I dedicate a couple of hours every Friday to social. "Also ensure you have cast-iron on-page, pick a keyword per page and make sure the page fully reflects that keyword" - I am going to start a new topic about this because I have an interesting problem. Since I service a rural area, my ideal keywords consist of four cities and two services. That's tough sledding when trying to build a site that is easy to navigate, visually appealing and optimized. I assume it's not wise to put "city Y service, city X service, city Z service" in the title and then try to naturally work all those keywords into said service page. I need to decide whether to limit myself to one city or build multiple pages, one per city. Each has its pros and cons from my inexperienced point of view. Multiple pages look amateurish and take away from navigation/appeal (Lakeville lawn care page, Eagan lawn care page, Apple Valley lawn care page). One city limits my potential customers. As a start-up business, I am leaning toward one city and then worrying about expanding the scope of the site later, but am certainly open to suggestions.
| dwallner0 -
Deleting a duplicate Google Places listing
Hi MediaCom, You could try calling Google's new phone number for data issues to see if that gets you faster action: http://localsearchforum.catalystemarketing.com/google-local-important/2022-huge-news-phone-support-google-local-data.html It may not yield results, but if you haven't tried it yet, it could be worth a try!
| MiriamEllis0 -
Client is being approached by company claiming to get lots of video views
YouTube definitely will remove views, and even the videos occasionally, if the views were obtained in a black-hat way that goes against their TOS. It's just not worth the risk, and most clever people can tell that there's no way a video about something relatively obscure got so many (real) views so quickly.
| adventurenick0 -
Redirection of domain may be affected by Google
Hi, Instead of redirecting your traffic and bots to the new site, why don't you use Google's Disavow tool to remove all the bad links pointing to your clinicadentalbarcelona.net website? Although 301 redirects do transfer SEO ranking and link juice, they don't transfer ALL of it. Moreover, in my opinion, I like to keep things simple for bots and visitors, so I don't really like redirecting. Finally, even if you redirect everything (including the bad links) to clinicadentalbarcelona.es/blog, it will still affect the whole website, since the blog is part of the website.
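If you do go the disavow route, the file Google expects is just plain text: one entry per line, either a full URL or a `domain:` directive, with `#` for comments. A minimal sketch (the domains below are placeholders, not the actual bad links from this thread):

```text
# Disavow file - uploaded via Google's Disavow Links tool
# Spammy directory we could not get removed manually
domain:spam-directory-example.com

# A single bad page, rather than the whole site
http://bad-links-example.org/page-linking-to-us.html
```

Disavowing whole domains is usually safer than listing individual URLs, since spam sites often link from many pages.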
| TommyTan0 -
How to find the categories a keyword belongs to?
A tool where you type "carpet cleaning" and it returns categories such as home services, local businesses, living, fabric protection, etc.
| Elchanan0 -
Does PPC Help your Organic Placement?
As Keri has suggested, no. There has been a study, however, suggesting that a PPC ad appearing next to natural SERP results increases searcher trust and, as a byproduct, improves CTR for your organic listing.
| Martin_Harris0 -
Does anyone know of good SEO vs PPC CTR history studies
I have read that one in the past, and it seems different from a lot of the other ones I have read. Do you have any other examples like the one they have on SEOmoz? Thanks,
| DoRM0 -
Ajax Server Snapshot Setup...
First off, the 100-links thing isn't a law written in stone. SEOmoz's tools do yell about it if you go over 100 links. This "100-link lore" comes from a Matt Cutts blog post: http://www.mattcutts.com/blog/how-many-links-per-page/ If you look closely, you may notice that there are more than 100 links even on the page where Matt wrote about this. It's kind of a loose guideline in my eyes. From my own professional experience, if every page on your site has 500 links, you're going to hurt for it. But if you have 125 links on quite a few pages, or put out a blog post that's just an insane resource linking to a few hundred people, you'll still be just fine.

What it seems like they're going for here is just an indicator of quality and usability. People really do abuse internal linking, and Google needs to make sure that those people hurt for it. Without having seen your site, I'd just say: think about what your users really need on every single page of the site (usually it is much less than 100 links). Often, a simpler navigation is the better one. Look at which pages people are actually reaching in Google Analytics, or set up a heatmap to follow their behavior. Tweak it. Annotate it in Google Analytics. See if pageviews or other key goals improve.

As for only seeing 1k pages out of 6k in the index, I'd again take a close look at what is actually of value. If you have a lot of duplicate/thin content, you may be best off just using noindex,follow tags on some of it, to improve Google's perception of your domain's quality as a whole. If they are all of value, you could have other IA issues. One test site of mine has roughly 1,000,000 pages without noindex,follow, and that's exactly how many appear in the index. If it's really good, useful stuff, you should definitely be able to get it indexed.
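If you want to sanity-check how many links a given page actually carries, you don't need a crawler: the count can be tallied with the standard library alone. A minimal sketch (the example markup is made up for illustration):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute (i.e. actual links)."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; only count real hyperlinks
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Named anchors without href are skipped; only the two real links count.
sample = '<p><a href="/a">one</a> <a href="/b">two</a> <a name="top">no href</a></p>'
print(count_links(sample))  # → 2
```

Point it at each template's rendered HTML and you'll quickly see which page types blow past whatever per-page budget you decide on.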
| CoreyNorthcutt0