Have you set OSE to the maximum amounts? Target >> this root domain, Link Source >> all pages, Link type >> all links?
You might be missing some if you're checking only external, certain types of links, only the current page, etc. Cheers!
With more localized searches I'd focus on the Local Search Ranking Factors (http://moz.com/local-search-ranking-factors) so that your real-world presence is tied closely to Zurich, and I'd also focus on getting solid social signals from the same area (Google+ would be one of the best in this case). Optimizing for mobile would also be good, since search trends shift from desktop to mobile the more local the intent becomes. Some hyper-local paid campaigns within Facebook or LinkedIn probably wouldn't hurt either.
Finally, you could expand your networking and link building within Switzerland, whether by sponsoring local charities, listing with local business directories, contributing to non-design blogs, and so on.
A link from a high powered Swiss site would probably be better than the same from a German site in a side-by-side comparison when ranking on something like google.ch, but at least your German links aren't hurting you. Cheers!
I think this is partly due to the size of the navigation menu. If you look at the page in a text browser, there are several pages of the same text and links before you get to the main body text. From a UX perspective the menu is very unobtrusive and the body text is the center of focus, but a crawler is going to see the header, sidebar, and footer when calculating the totals.
I agree with Travis. In short, yes it's an excellent link. Like Travis mentions, getting caught up in the numbers can be misleading at times, and for a short hand of the sites and people you want to work with it's better to think of them as relationships. In this case, being connected to an official site that's reputable, spam-free, and exclusive is an excellent connection.
Yes. Redirection done correctly will pass most authority over to the new site. Ideally you should modify your .htaccess file, as it's a root-level file that's accessed before your CMS. Redirecting multiple domains is an acceptable practice as well, and there are no downsides beyond the usual warnings and precautions that apply to any amount of redirecting. To really help amplify the new domain, it'd be best to promote the change as much as possible via bloggers and press she's worked with in the past. That way the signals that the domain is new aren't just technical, but PR, link, and social as well. Cheers!
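As a rough sketch of the .htaccess approach (the domain names here are placeholders, and this assumes an Apache server with mod_rewrite enabled), a root-level 301 redirect that sends every path on the old domain to the same path on the new one might look like:

```apache
# Hypothetical example -- replace olddomain.com / newdomain.com
# with the real domains before use.
RewriteEngine On
# Match requests for the old domain, with or without www
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
# Permanently (301) redirect each URL to the same path on the new domain
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```

The R=301 flag is what signals a permanent move, which is the signal search engines use to transfer authority; a temporary (302) redirect would not.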
Hmm... This is really tough to automate because it can become overly inclusive; often there are one or two degrees of separation within search results, where your client or their competitor is discussed within the top ten results. For example, the search could pull up a review site that discusses client/competitor products and then links to them. While clicks from such a site would show up as referral traffic, the value associated with its ranking should also be part of a report like that.
What you might be able to do to convey the same idea, but with quicker iterations, would be to cut the number of keywords you're reporting on down to a much smaller set and then cluster them into representative batches. If they become a client you can bump up the reporting details, but at the initial stages you're really pitching your knowledge, service, and system. I think you could get that across with fewer keywords and deeper dives into the marketplace. That way you could present options like advertising on sites that feature reviews of your competitors, print options in smaller / low-cost markets, image-map-knowledge box results, etc. Cheers!
Hi Charles. I don't work for Moz, so my recommendations are my own.
Also, I pay for a pro subscription here as well, and I combine resources from multiple sources when I'm really trying to get into the minutiae of each and every backlink. Ultimately it's a lot cheaper than creating my own crawler and index to try to duplicate Google. In your example with local clients, you were likely involved in developing those links, so why not keep track of them in-house rather than via 3rd-party tools? Personally, the only time I need 100% of backlinks reported is when I run into a manual action client and need as robust a disavow list as possible. Oftentimes you can find these sorts of links via Google searches alone, due to repetitive exact-match keyword usage.
The most value I get out of OSE is when I'm comparing the back link profiles of my sites versus those of the competition that are appearing in the rankings. I can usually come up with more than enough work to keep myself busy when using the tool in this manner.
If you're specifically trying to manage local listings, Moz Local may be a better tool for your purposes. Cheers!
Reviewing old content is always a good idea, as you can trigger positive signals like freshness of content, higher specificity toward searches, and better linking. Too many people think of links as a one-way street, trying to gain links to their site without giving much thought to who they should link out to. For example, let's say you have two articles on two sites, both identical (forget about duplicate content for a moment) and with the same numbers when it comes to inbound link strength, PA, DA, etc. The article that links out to better resources (high-trust sites, specific examples, sites with frequently updated information) is going to rank higher than the article that doesn't link to any external resources. In other words, Google can understand not only the content of your page, but also the quality of that content by its two-way link associations.
At the least, I'd add Webmaster Tools, as I've never seen a downside to doing that. Plus, that will give you more insight into what's helping drive the growth. If I had to guess, they're probably a fairly old site that's increasing in traffic due to brand-strength signals being emphasized within Google.
Duplicate content, query parameters, and indexation issues might end up being not that big of a combined issue, based on how many pages they have indexed out of their total. Google is pretty good at figuring out a site's structure and parameters, and duplicate content is often not as severe an issue when it's all housed within one domain.
Mostly look into their strengths and why that's working so well. Why, exactly, is their organic traffic increasing so well? That's something that you want to help even further. Play to their strengths.
Rand recently did a whiteboard Friday on this very thing: http://moz.com/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday, the pertinent part on your question being:
You're asking, "Should I put my content on a subdomain, or should I put it in a subfolder?" Subdomains can be kind of interesting sometimes because there's a lot less technical hurdles a lot of the time. You don't need to get your engineering staff or development staff involved in putting those on there. From a technical operations perspective, some things might be easier, but from an SEO perspective this can be very dangerous. I'll show you what I mean.
So let's say you've got blog.yoursite.com or you've got www.yoursite.com/blog. Now engines may indeed consider content that's on this separate subdomain to be the same as the content that's on here, and so all of the links, all of the user and usage data signals, all of the ranking signals as an entirety that point here may benefit this site as well as benefiting this subdomain. The keyword there is "may."
I can't tell you how many times we've seen this, and we've actually tested it ourselves by first putting content on a subdomain and then moving it back over to the main domain with Moz. We've done that three times over the past two years. Each time we've seen a considerable boost in rankings and in search traffic, both long tail and head of the demand curve, and we're not alone. Many others have seen it, particularly in the startup world, where it's very popular to put blog.yourwebsite.com, and then eventually people move it over to a subfolder, and they see ranking benefits.
If at all possible, make it part of the domain in a subfolder.
There are various plugins available for download to help you create a WordPress sitemap that you can upload to Google Webmaster Tools. GWT allows multiple sitemaps to be uploaded.
You're also likely missing the Google Analytics code on your blog, which is leading to the weirdness within Analytics. Yoast makes several WordPress plugins that help with SEO, which you can find here: https://yoast.com/wordpress/plugins/; they cover both sitemaps and Google Analytics. There are others, but those two things are most likely what will address your questions above. Cheers!
Hi Sarah. In the bottom right hand corner of this page: http://silodistillery.com/tag/windsor/ you have a list of social icons. The Yelp one contains the link to the 404 page. Cheers!
Rand recently did a WBF on this, and one of the comments by Matt in the post has an interesting back of the napkin result for his tests that were similar to what you're describing above here: http://moz.com/blog/are-on-topic-links-important-whiteboard-friday#comment-325645, mainly, "Category 3 (Off Topic, Strong Links) actually started taking sites DOWN with it. If we had a strong but off topic/spammy link, it hurt the site more than helped. We tested a few "generic PBN" sites that had decent effect on some sites, negative effect on others. It seemed to stem from relevance with what went up & down."
If it's not something that's likely (high percentage chance of interest) to help a user visiting the page, I'd avoid it. Cheers!
It could be that the way the content is being loaded dynamically might prevent it from being seen by Google. Did you check their cached, text-only version of the page?
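For reference (example.com is a placeholder for the actual page), the text-only version of Google's cached copy can usually be viewed by adding strip=1 to the cache URL, which removes CSS, images, and scripts so you see roughly what the crawler extracted:

```
http://webcache.googleusercontent.com/search?q=cache:example.com&strip=1
```

If the dynamically loaded content is missing from that view, that's a strong hint Google isn't seeing it either.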
There might be some, but if the domain is off-brand and not a name you want to use, you're probably better off universally redirecting it to your present domain. It's important to investigate the history of the domain, though. Is it associated with spam, either as a source of spam email or via a spammy link profile? Try to get as much insight in that regard as you can.
I wouldn't worry too much about the nofollow link, especially since a complete lack of nofollow links in a big profile would itself be a warning sign of link manipulation. Google also knows when a site with really high trust and authority, like Wikipedia, uses nofollow links to perhaps too high a degree. That said, nofollow links can still bring search value. See: http://moz.com/blog/the-hidden-power-of-nofollow-links and the first comments at the end of the post. Cheers!
Hi Michael. Sites can freely employ a NOINDEX, FOLLOW directive on low-quality content pages or other non-critical pages. It's fairly trivial, easy-to-change work that can be handled in-house. Obviously other things like high-quality content, linking, and freshness will go much further in terms of overall strategy, but this technique is valid. See: https://support.google.com/webmasters/answer/79812. Cheers!
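In practice, the directive is a single robots meta tag per Google's documentation linked above; as a minimal sketch, the tag placed in the head of each low-quality or non-critical page would look like:

```html
<!-- In the <head> of the page you want kept out of the index: -->
<!-- "noindex" tells crawlers not to index this page;      -->
<!-- "follow" tells them to still crawl and pass value via its links -->
<meta name="robots" content="noindex, follow">
```

Because it's just template markup, it can be added or removed at any time without developer-heavy changes, which is what makes it easy in-house work.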
OK. If you decide that you're going to buy the domain, once you do, use a 301 redirect to send all former links and references from it to your present domain. Here's a guide on that: http://moz.com/learn/seo/redirection.
This is also a question of usability / user experience. If your page incorporating in-article anchor links is laid out well and has engaged users clicking those links to explore the different sections of the page, you're likely getting a lower bounce rate and fewer users bouncing back to the search results from your page.
In ATP's case, those links were likely causing navigational confusion and/or potential keyword-stuffing flags. Cleaning them up served his site better.
In either case, I don't think it's a simple question of solely how many links count, especially in real world usage. Cheers!
I see. One thing that might help you with the customer is looking at the Analytics and highlighting the performance of the low quality pages. If they're never being seen you could make the case for getting the key information from those pages, adding it to the better pages, and redirecting. Cheers!