Moz keeps disconnecting from Google Analytics, making the SEO reporting useless. The Google Analytics account information has never changed, yet nearly every day I'm reconnecting accounts...
Can someone at Moz please help?
Thanks. The reason I ask is that we're updating an SEO audit snapshot tool, and we want to use the minimum number of data points to get a score. I was hoping that DA could be used to represent link/social signals as a whole, since scoring social is a bit tricky.
We want to get away from trying to score everything, since the more things you try to score, the more room for error there really is.
This would serve as a snapshot of SEO performance and not an in-depth analysis.
We broke our snapshot analysis into four parts:
We felt that if we could score these areas, it'd give us a snapshot of how a website might be doing. A deeper analysis would be required to get a more accurate picture.
I agree.
Still waiting to hear from someone at Moz.
I was curious if Domain Authority considers social signals.
"We calculate this metric by combining all of our other link metrics—linking root domains, number of total links, MozRank, MozTrust, etc.—into a single score."
Is social included in the "etc."? I know you display some social data in the results.
Pull the site's top searches via semrush.com. From that report, take everything out except the landing page and traffic % columns. Then consolidate the duplicate pages and sum the traffic % using Excel's Consolidate feature. You'll be left with the top landing pages sorted by estimated search traffic %.
It's not perfect, but it's not a bad estimate either... especially if SEMrush has a lot of data available for the site.
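If you'd rather skip the Excel step, the same consolidate-and-sum can be scripted. A minimal sketch in Python using only the standard library; the column headers ("Landing Page", "Traffic %") and the inline CSV rows are assumptions standing in for your actual SEMrush export:

```python
# Consolidate duplicate landing pages and sum their traffic share,
# mirroring the Excel Consolidate step described above.
import csv
from collections import defaultdict
from io import StringIO

# Stand-in for your exported SEMrush CSV (made-up pages and numbers)
csv_data = StringIO("""Landing Page,Traffic %
/pricing/,4.2
/blog/widgets/,3.1
/pricing/,2.7
/,6.0
/blog/widgets/,1.2
""")

totals = defaultdict(float)
for row in csv.DictReader(csv_data):
    totals[row["Landing Page"]] += float(row["Traffic %"])

# Top landing pages sorted by estimated search traffic %
top_pages = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
for page, pct in top_pages:
    print(f"{page}\t{pct:.1f}%")
```

Point `csv_data` at your real export file and adjust the header names to match it.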
You don't really want to "build links"; you want to earn them. A link earned is worth 1,000x more than a link built.
Sticking out as a dating site in this day and age is a tough one. You really need to make your site unique, and you need to do something newsworthy. If you're focused on a geographical area, do an infographic about that area, or conduct a survey and share the results.
Now, with all that said, if you're still looking for "link building," offer a coupon and share it on reputable coupon sites... but then again, those links are "earned," since you're offering something.
Yeah, it seems the most logical answer is that each location page needs unique content developed for it, even though it still feels a little forced.
Goes to show that Google has really pushed SEO firms to think differently about content; when you have to do something just for SEO purposes, it now feels icky.
Yes, creating unique content for each location page can be seen as useful to users, but it feels a little icky because the user would probably be satisfied with the core content. We're creating unique location-specific content mostly to please Google... not the user.
For example, what if Walmart came to this same conclusion? Wouldn't it be a little forced if Walmart developed pages for every location with that location's weather, facts about the city, etc.?
Due to its brand, it's able to get away with the thin-content version of location pages: http://www.walmart.com/store/2300/details (they don't even use the markup)... but any SEO knows you can't really follow what's working for a giant brand like Walmart.
Google has this page on location pages. It's very useful, but it doesn't say anything about handling the duplicate content a location page might have, seeing as the locations may offer very similar services.
Let's say they have example.com/location/boston, example.com/location/chicago, or maybe boston.example.com, chicago.example.com, etc.
These are landing pages for each location, housing that location's contact information and showing the same services/products as every other location. This information may also live on the main domain's homepage or services page.
My initial reaction agrees with this article: http://moz.com/blog/local-landing-pages-guide - but I'm really asking: what does Google expect? Does this location pages guide from Google tell us we don't really have to make each of those location pages unique? Sometimes creating "unique" location pages feels like you're creating **doorway pages**: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names".
In a nutshell, Google's Guidelines seem to have a conflict on this topic:
- **Location Pages:** "Have each location's or branch's information accessible on separate webpages"
- **Doorway Pages:** "Multiple pages on your site with similar content designed to rank for specific queries like city or state names"
- **Duplicate Content:** "If you have many pages that are similar, consider expanding each page or consolidating the pages into one."
Now, you could avoid making it a doorway page or a duplicate-content page if you just put the location information on the page. Each page would then have a unique address, phone number, email, contact name, etc. But then the page would technically be in violation of this guideline:
**Thin Pages:** "One of the most important steps in improving your site's ranking in Google search results is to ensure that it contains plenty of rich information that includes relevant keywords, used appropriately, that indicate the subject matter of your content."
...starting to feel like I'm in a Google Guidelines Paradox!
Do you think this guide from Google means that duplicate content on these pages is acceptable as long as you use that markup? Or do you have another opinion?
It can be used for reporting as well, helping set expectations for SEO clients. If Client A has a very low Domain Authority and wants to compete for very competitive keywords, a score like this can help set expectations about which keywords are realistic for them versus which are out of their grasp, helping you develop a campaign that targets keywords within their reach.
Here's how Keyword Difficulty works now: let's say you get a keyword difficulty of 50. It's 50 no matter what domain you're optimizing for; it's 50 if you're optimizing wikipedia.com and it's 50 if you're optimizing a brand-new website. This equation changes the keyword difficulty score based on the Domain Authority of the site you're optimizing.
Moz's Keyword Difficulty tool is great, minus one thing... it doesn't take the domain itself into its equation.
Of course, it doesn't take a lot of other things into consideration either, like the relevancy of the domain, but let's at least start by adding Domain Authority to the equation.
I've come up with simple math that allows you to take the domain authority of the target URL into consideration.
**The Equation**
KD = Moz Keyword Difficulty
DA = Domain Authority
DS = Domain-Specific Keyword Difficulty
DS = (KD / DA) * KD
**The Equation Applied**
URL: atari.com
Keyword: classic video games
KD: 69
DA: 77
DS: 62
(69 / 77) * 69 ≈ 62
You can end up with numbers larger than 100, but that makes sense: if your Domain Authority is 10 and you're trying to compete on a difficulty of 90, you should be scared to compete for that term, and the number should reflect that.
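The equation is trivial to script. A minimal sketch in Python, using the atari.com numbers from above plus a made-up low-DA example:

```python
# Domain-specific keyword difficulty: DS = (KD / DA) * KD
def domain_specific_difficulty(kd: float, da: float) -> float:
    """kd: Moz Keyword Difficulty, da: Domain Authority of the target site."""
    return (kd / da) * kd

# The atari.com example: KD 69, DA 77
print(round(domain_specific_difficulty(69, 77)))   # 62

# A weak site (DA 10) chasing a hard term (KD 90) blows past 100
print(round(domain_specific_difficulty(90, 10)))   # 810
```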
Thoughts? Other ideas?
Yeah, this is in no way a perfect equation, because the only way to get that would be access to Google's own data, but I think it can serve as a good starting point. Even if we took SEO out of the equation: if 50% of your organic visitors are bouncing within 15 seconds, that doesn't sound good.
I've created a custom Google Analytics segment, which you can find here.
Segment Settings:
To calculate the pogo rate, I compared pogo sessions against total organic sessions and took the percentage of pogo sessions.
I tested a group of 20 sites and got 53% on average, with a high of 87% and a low of 33%.
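Spelled out in code, the calculation is just pogo sessions divided by total organic sessions. A quick Python sketch; the session counts are made-up illustration numbers, not real data:

```python
# Pogo rate: pogo-stick sessions as a percentage of all organic sessions.
def pogo_rate(pogo_sessions: int, total_organic_sessions: int) -> float:
    return 100 * pogo_sessions / total_organic_sessions

# Hypothetical site: 530 pogo sessions out of 1,000 organic sessions
print(f"{pogo_rate(530, 1000):.0f}%")  # prints 53%
```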
What do you get? Any interesting insights?
Site A:
- Root
- /services/
- /about/
- /contact/
- /blog/
  - /post1/
  - /post2/
  - /post3/
Site B:
- Root
- /services/
- /about/
- /contact/
- /post1/
- /post2/
- /post3/
Do you think Google would potentially treat these two sites differently in any way?
Do you think putting blog posts on the root poses a disadvantage?
Root: example.com/post-name/
Folder: example.com/blog/post-name/
Does anyone know of a tool that can get you the total, or close to it, social shares an entire domain has received?
I know of...
http://www.sharedcount.com/ - but this only gets shares for the entered URL, not the entire domain.
http://www.pagesort.com/ - this seems to dig a little deeper, but not deep enough.
I know Open Site Explorer gets some social data as well, but it still seems very limited.
Thanks for mentioning our Content Strength Audit. I would have suggested WordPress SEO as the easiest solution to this problem, but it seems it's already been suggested and that you're already using it.
We updated this today to include a new type of content to look out for...
Broken Content (Extremely Dangerous) – Broken Content refers to pages that are broken in some way. Ask yourself: if a visitor arrived at your page, would they notice something was broken? This typically includes broken links (internal and external), broken images, embedded content that no longer works, and improper use of HTML or CSS.
We added this after seeing that Google typically filters out, or treats unfavorably, "broken content."
Let us know what you think!
Good idea. I was considering putting this one on YouMoz but decided it was better to get it up ASAP for our clients' sake.