Questions
Google My Business
If you set up "rel=publisher" and link it to your Google+ page, this will help. Here are some examples I set up:

https://www.google.co.uk/search?q=the+body+matters
https://www.google.co.uk/search?q=ddr+surrey

These are not big brands, just well-optimised for local search. There's a guide here: http://blog.woorank.com/2014/06/how-to-implement-rel-publisher-tag/
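For reference, the tag the linked guide describes is a single link element in the page head. This is only a sketch; the Google+ URL below is a placeholder, not a real page:

```html
<!-- Place in the <head> of each page; replace the href with your own Google+ page URL -->
<link rel="publisher" href="https://plus.google.com/+YourPageName/">
```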
Local Strategy | Silkstream
Question on Google analytics
What Semalt is doing is also known as "referral spam." There's more information in the Google Product Forums, including how to block Semalt. However, don't go to Semalt's site and try to do it with their removal tool; block them via your .htaccess file instead: https://productforums.google.com/forum/#!topic/webmasters/VdmF4xDYnDE My favorite line from that thread: "The company is engaged in old school stupid customer acquisition techniques (referrer spam). As to their links from their site, they belong to a class of links that I call fungus (automated crap). As to the referrer traffic, it is crap traffic that is meaningless to you. If you can isolate the IP addresses used by the company, block them via .htaccess (but beware that they may spoof their IP address and you don't want to block a legitimate IP range)." The thread contains instructions for adding code to your .htaccess file that will block them. Hope that helps!
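As a rough illustration of the .htaccess approach the thread describes, a commonly used pattern is to deny requests by referrer. This assumes Apache with mod_rewrite enabled, and the domain listed is just an example; check the linked thread for the exact rules it recommends:

```apache
# Block referral spam (e.g. semalt.com) at the server level.
# Add one RewriteCond per spam domain, joined with [NC,OR] if listing several.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_REFERER} semalt\.com [NC]
  RewriteRule .* - [F]
</IfModule>
```

Note this only stops requests that actually hit your server; spam that is injected directly into Analytics without visiting your site needs a filter in Analytics itself.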
Online Marketing Tools | danatanseo
Best Practices For Local SEO For A Nation Wide Property Company?
Hi Matthew,

From your description, these appear to be your client's options:

1. Rank locally (in the local pack) for their single physical office via NAP, content and SEO work on the website, plus their Google+ Local listing and citations.
2. Rank organically for the other cities in which they vend properties via content, on-site SEO and link building.

PPC may be a necessary component, too, given how competitive this market typically is. It shouldn't be a goal for the client to rank locally for anything but their city of location, and it's forbidden to list for-sale properties in Google's local product, so this isn't a way to get around the lack of a physical location either. Basically, it's going to come down to the organic strength of the business to build a presence for the various cities in which they sell properties. You'll be cleaning up duplicate content and developing new, unique content for each of their major cities, and wanting to earn links to that content. A blog could be a BIG asset here if the client has the resources to blog in a hyperlocal fashion about properties, local communities, and on-topic subjects like buying/selling a home. I think you're receiving some good advice on this thread. I hope my suggestions are helpful, too.
Local Listings | MiriamEllis
Time based search positions, are they a thing?
Hi Matthew,

You write: "... my client operates as a financier. As a result, the company offers online finance applications, which are supported by an effective call centre." I'd like to verify two things:

1. Does this business actually make face-to-face contact with its customers?
2. Are the ranking changes you are noticing in the organic results or in the local pack of results?
Local Listings | MiriamEllis
Perplexed - Errors increasing, moz rank dropping, conflicting data from other sources. Please Help.
Hi Matthew,

Without getting into the specifics of your site, the first step would be to identify the duplication errors Moz reports, check whether they are actually valid, and then determine what is causing them and take steps to address them. Within the duplicate content reports, there should be a number listed next to each URL showing the number of other pages with duplicate content. If you click on this number, you'll get a list of URLs. By comparing these URLs against each other, you can start to identify the problem. Does that make sense? Perhaps you've already taken this step, but it's hard to tell from the question.

If you find that the URLs are fishy, and you suspect a problem with your CMS, it's often helpful to determine where the Moz crawlers discovered the links to these URLs. You can do this by downloading the entire CSV report. The last column contains a URL showing where the link was discovered. Knowing this is often helpful in rooting out problems. Hope this helps! Best of luck.
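If the CSV is large, grouping the flagged URLs by where the crawler discovered them can make the pattern jump out. Here's a minimal sketch of that idea; the column names and sample rows are hypothetical, since the real Moz export has more columns, but the approach is the same: read the CSV and tally flagged URLs per referrer.

```python
import csv
import io
from collections import defaultdict

# Hypothetical excerpt of a crawl CSV export. In the real file, substitute
# the actual header names; the key idea is that the final column records
# the page on which the crawler discovered the link.
SAMPLE_CSV = """url,duplicate_count,referrer
/products/widget?sort=asc,3,/products/
/products/widget?sort=desc,3,/products/
/products/widget,3,/sitemap.xml
"""

def group_by_referrer(csv_text):
    """Group flagged URLs by the page where the crawler found the link."""
    groups = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        groups[row["referrer"]].append(row["url"])
    return dict(groups)

groups = group_by_referrer(SAMPLE_CSV)
for referrer, urls in groups.items():
    print(f"{referrer}: {len(urls)} flagged URL(s)")
```

If most duplicates trace back to one template or one page type, that usually points straight at the CMS feature (sort parameters, session IDs, print views) generating them.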
Moz Tools | Cyrus-Shepard
Rogerbot's crawl behaviour vs google spiders and other crawlers - disparate results have me confused.
Thanks for your response. I was beginning to think this question had been left to rot. I'm not getting any errors in WMT. What is concerning is that Roger is returning almost 300 duplicate content errors, which is obviously a problem. Screaming Frog is no longer finding the pages (they've been blocked in robots.txt). I guess what I'm trying to ask here is: how can I be sure that my duplicate content has been effectively blocked from Google's spider? Is there any way to check? Thanks for your help.
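One quick way to sanity-check the robots.txt rules themselves is to run them through a parser and ask whether Googlebot may fetch a given URL. This is only a sketch with a made-up robots.txt and example URLs; substitute your own rules and paths:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the kind of Disallow rule described above.
ROBOTS_TXT = """User-agent: *
Disallow: /duplicate-section/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot honours the generic "*" group unless it has its own section.
blocked = not parser.can_fetch("Googlebot", "https://example.com/duplicate-section/page")
allowed = parser.can_fetch("Googlebot", "https://example.com/other-page")
print("blocked:", blocked, "| allowed:", allowed)
```

One caveat worth knowing: robots.txt blocks crawling, not indexing, so URLs that were already discovered can still appear in the index; a noindex directive (which requires the page to remain crawlable) is the usual way to remove them.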
Moz Tools | KJDMedia