It makes sense semantically, and search engines (particularly Google) are able to understand language better than most people think. Go ahead and add "in", and furthermore, if Royal Oak is a city, use proper grammar and add the comma before the state. It's not going to be a problem to add it, but it may be a problem for you to have poor grammar on your site, as proper spelling and grammar are actually a ranking factor.
Best posts made by brettmandoes
-
RE: Keywords and stop words
-
RE: Is there any advantage to including a subdirectory in a URL?
There are two good reasons to avoid the structure you've proposed: user experience and SEO. As your website gets larger, having more and more and more links from the homepage to each individual article is going to be a massive navigation headache, and will be confusing to users. If you're planning on orphaning that content so it's not accessible from the homepage, then again, you're creating a confusing navigational structure that will not be beneficial to users or to you from an SEO perspective.
The subdirectories are providing a no-nonsense approach to finding information quickly and efficiently in a manner that people are accustomed to. If the issue is that the subdirectories themselves are confusing, I would just rework the content so they make more sense and facilitate navigation.
There's also a loose rule about the number of links on a single page - try to stay below 100.
But if you have a small website and can organize the links in a manner that doesn't look like a hoarder designed it, then the structure you're proposing may be workable. I would just be very cautious about implementing a flat architecture like that for a medium - large site.
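If you want to sanity-check a page against that loose 100-link rule, a short script can count the anchors for you. This is just an illustrative sketch using Python's standard library; the threshold of 100 is the rule of thumb above, not an official Google limit.

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href="..."> anchors in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only count anchors that actually link somewhere
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

html = '<nav><a href="/">Home</a><a href="/blog">Blog</a></nav><a name="anchor">no href</a>'
print(count_links(html))  # 2 - only anchors with an href attribute count
```

Run it against your homepage HTML and see how close you come to that soft ceiling.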
-
RE: Keyword Stuffing
Hi Edwyn, can you share some more details? If you're not comfortable with a link to the page, it would at least be helpful to know more, like how often the keyword is mentioned on the page.
Sometimes, that keyword count metric is just off. If you have a golfing ecommerce site for example, you probably have a ton of mentions for the terms "golf ball" or "golf bag", especially on category level pages, and that's beneficial to your business and the user experience. In a situation like that, the keyword count might be very high but it's not necessarily bad for SEO.
Now, if you've written a paragraph about golf balls on that same page, and you mention "golf balls" 17 times, then trim it back. If you want to know how often you should mention a particular keyword, here's an easy exercise.
1. Pick your target keyword and google it.
2. Open the top 5 sites
3. Use your browser's find function to see how many times those top 5 sites mention the keyword on their page

Using the golf ball example, I just did this in about 2 minutes and came up with this:
position 1: 14 mentions
position 2: 131 mentions
position 3: 3 mentions
position 4: 74 mentions
position 5: 64 mentions

As you can see, these sites have many, many mentions of golf balls on their top pages, include some big names like Dick's Sporting Goods and Amazon, and rank perfectly fine. A keyword count metric would probably warn them that they mention the target keyword too many times, but that doesn't appear to be hurting them. So go ahead and try this with your target keyword. If you're coming in below the top results on Google, then I wouldn't worry about keyword stuffing, as long as your design legitimately uses the target keyword, such as in a product name or description.
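If you want to automate step 3, a short script can count keyword mentions in a page's visible text. This is a rough sketch using only Python's standard library; in practice you'd fetch the competitor pages first (e.g. with urllib), and the counting here is a simple case-insensitive phrase match, not anything Google-specific.

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def count_mentions(html, keyword):
    extractor = TextExtractor()
    extractor.feed(html)
    text = " ".join(extractor.parts)
    # Whole-phrase, case-insensitive count (also matches inside plurals)
    return len(re.findall(re.escape(keyword), text, flags=re.IGNORECASE))

page = "<h1>Golf Balls</h1><p>Shop golf balls by brand.</p><script>var x=1;</script>"
print(count_mentions(page, "golf ball"))  # 2
```

Run it over the top 5 results and your own page, and you have the comparison table above in seconds.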
Hope that helps!
-
RE: Differentiating Franchise Location Names to better optimize locations
Hi Jeff, I think I can help you with this, but to clarify, it looks like you have three separate questions:
1. What is best practice for naming different locations to optimize for local SEO?
2. What is the best URL structure to optimize for local SEO?
3. Should geo-specific terms be used in blogs?

Be sure to let me know if I'm missing the mark. I'm also going to go heavy on industry jargon and assume you know what it means, so feel free to ask questions if I go over your head at any point.
1. For local SEO, it's important to start with a good foundation. This means you have citations claimed for each location, with consistent NAP information on your GMB profile, your listings, and the landing page on your website for that location. So if your name includes the geo on the website, it should also include the geo on your GMB profile and citations. It's preferable to use the specific city name the location is in. For example, if you're in Flower Mound, TX, be sure to use Flower Mound, not Dallas. Some local SEOs get tripped up by targeting the metro area they're in, and that can tank results. If some of your locations are in the same city, dividing them up somehow as North/South, East/West, etc. is fine. Google typically picks one or both in those circumstances to display in search.
2. For URL structure, using subpages the way you have laid out is fine. For enterprise local SEO my agency uses a proprietary, scalable CMS to build unique, local websites that rank very well, so I'm more familiar with that structure, but one of the tricks we use is to include a geo variable in the URL, which helps rank for some terms like "glass repair dallas tx", because we can get picked up on the exact match. Every little bit helps.
3. For blogs, I would recommend you completely ignore the geo unless your blog is very unique and specific to the location. You should really only target the location when it's a page that you're trying to rank for local queries and you typically don't have that in a blog. For example, a blog about "what to expect in a hundred year old house" will typically not rank for keywords that trigger the local algorithm, so there's no reason to add the geo. It just gets in the way of the content, and inferior content doesn't rank well. Now a blog like "what to plant in your [location] fall garden" just may have some localization to it, because what you plant in the fall in Des Moines is different than Atlanta. But I find these cases to be few and far between.
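Since consistent NAP data across GMB, citations, and landing pages is the foundation in point 1, a normalization step makes consistency checks much less brittle. A minimal sketch; the normalization rules here (lowercasing, stripping punctuation, digits-only phones) are my own simplification, not a standard.

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a Name/Address/Phone tuple for consistency checks."""
    def norm_text(s):
        # Lowercase and drop punctuation so "Main St." == "main st"
        return re.sub(r"[^a-z0-9 ]", "", s.lower()).strip()
    digits = re.sub(r"\D", "", phone)  # keep digits only
    return (norm_text(name), norm_text(address), digits)

def nap_consistent(listings):
    """True if every (name, address, phone) listing normalizes identically."""
    normalized = {normalize_nap(*listing) for listing in listings}
    return len(normalized) == 1

listings = [
    ("Acme Glass Repair", "123 Main St., Flower Mound, TX", "(972) 555-0100"),
    ("ACME Glass Repair", "123 Main St Flower Mound TX", "972-555-0100"),
]
print(nap_consistent(listings))  # True - both normalize to the same tuple
```

Feed it the NAP tuples scraped from each citation source for a location, and any mismatch jumps out immediately.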
Hope that helps, let me know if you have questions.
-
RE: Indexed Pages Increase and Major Drop June 25th and July 16th?
Hi Kwilgus,
We've seen major fluctuations across several tools and our own proprietary data sets. One of the tools you could use to help keep track of ranking fluctuations is mozcast (http://mozcast.com/). Click on over to metrics, set to 90 days, and compare domain diversity with daily big 10 - the two seem to have an inverse linear relationship. You'll see that Google is monkeying around with their algorithm and giving preference to fewer sites.
Within our own data set, we're noticing ranking fluctuations in the local SERP for August, and in other tools we've noticed some ranking fluctuations in the mobile SERP toward the end of July and beginning of August.
So you're not going crazy, things are a little bouncy right now.
-
RE: Spam Flags on my minutedrone.com
No problem! One more thing that caught my eye. You need a native speaker to reword your English page. Some of the sentences don't translate very well, and apart from killing your conversion rates it's also a bad look for your SEO.
Best of luck, looks like you're offering a cool service!
-
RE: Will there be problems in the future with a mobile dedicated site?
While Google has been reluctant to just come right out and say "Responsive is the way to go", they've dropped enough hints to make it fairly obvious that's what they want you to do. It makes sense - provide the same content to mobile users as you do desktop so it's accessible from anywhere. And the simplicity of it allows Google to efficiently crawl the site.
You're likely to realize greater efficiency by moving to a responsive design, which in turn will allow you to improve rankings more expediently. If SEO is a concern, go responsive and ditch the worry.
-
RE: Is there any advantage to including a subdirectory in a URL?
I can't think of anything beyond what you've already explored. Instead of completely deleting those pages, just add in a 301 redirect to something relevant so you minimize any loss from those pages disappearing.
Best of luck on your new design!
-
RE: Engagement - but high bounce rate?
98% bounce rate is more indicative of ghost spam in your analytics than actual users. Moz writes a great article about what ghost spam is and how to stop it here: https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
A high bounce rate isn't helping the site, so at best it's a null effect, at worst your superior is correct and it's impacting your rankings. Again, Moz has a great article on the topic: https://moz.com/blog/traffic-engagement-metrics-their-correlation-to-google-rankings
From the article: "**Bounce rate:** At first glance, with a correlation of -0.08, the correlation between bounce rate and rankings may seem out-of-whack, but this is not the case. Keep in mind that lower bounce rate is often a good indication of user engagement. Therefore, we find as bounce rates rise (something we often try to avoid), rankings tend to drop, and vice-versa."

Keep in mind that's measuring correlation, not causation. Since no one has been able to prove that Google uses engagement metrics as ranking signals, we can't definitively say whether or not bounce rate has a direct impact, but for obvious reasons we want to keep bounce rate low.
I hope this answers your question more fully.
-
RE: Client wants to repackage in-depth content as PowerPoint files and embed on site. SEO implications?
Hi there, I think your specific question is: will embedding PowerPoint slides into the website hurt their site or help it? I'm going to try to break this down for you.
If the slides are indexable, one of two things will happen:
1. Rankings go up for new, related terms that you either weren't ranking for before or were ranking poorly for
2. The PowerPoint cannibalizes rankings for the other pages that were previously built out

I'm going to assume you know how to track for this since it's pretty straightforward.
If the slides are not indexable then there should be no reason it would negatively impact rankings. It's essentially invisible to Google.
SlideShare slides can be crawled and indexed. I would expect that to be the default behavior unless you find documentation that shows otherwise. If you don't want it indexed, embed on a page and mark that page noindex.
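A quick way to verify the embed page really is excluded is to check for the robots meta tag in its HTML. A sketch with Python's standard library; note it only checks the meta tag, not X-Robots-Tag response headers or robots.txt.

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Detects <meta name="robots" content="...noindex..."> in a page."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        d = dict(attrs)
        if d.get("name", "").lower() == "robots" and "noindex" in (d.get("content") or "").lower():
            self.noindex = True

def is_noindexed(html):
    checker = RobotsMetaChecker()
    checker.feed(html)
    return checker.noindex

page = '<html><head><meta name="robots" content="noindex, nofollow"></head><body>Slides embed here</body></html>'
print(is_noindexed(page))  # True
```

Run it against the fetched embed page after you publish, so you know the directive actually made it into production.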
Let me know if that helps!
-
RE: Drop in rankings after AMP implementation because of lack of facebook comments
I'd be interested in seeing the research you did showing that Google is using Facebook comments as a ranking signal. I think a correlation definitely exists, and there is supporting evidence for it, but to date I don't believe anyone has proven causation.
The best research I've found on the topic is here: http://buzzsumo.com/blog/social-shares-and-inbound-links-insights-from-new-correlation-data/ In short, it states that some content that is shared garners more backlinks depending on the type of content and where it is shared. You should compare this with your research and see if it follows the trend noted here.
If you're trying to decide whether to move to AMP or not, consider that you could receive a ranking boost as some have, but you could also be sacrificing your ability to convert as highly as you used to.
I like to follow this rule of thumb: publishers yes, everyone else no. We all have to make money somehow and stripping out conversions is not a great way to do it.
-
RE: I can get hundreds of natural links from real estate agent sites, but should I?
As long as the links are marked nofollow you don't have to worry about being penalized for unnatural link building. To stay in line with Google's webmaster guidelines you should make that a part of your strategy. If you're getting hundreds of dofollow links in a matter of days, Google is going to know those are unnatural whether they have a high DA or not.
Google has stated in the past that they may pass value through a nofollow link, so don't consider it a dead end for SEO efforts. I would consider any link from a legitimate website to be of value, regardless of what their moz score is. Remember that any score from moz or any other third party is an approximate measurement of Google's algorithm and does not guarantee anything. There may be something valuable to your website on other sites with a low DA.
-
RE: Is hiring bloggers to review my products while back linking to my website bad for SEO?
Hey Debashish, I'll try to answer your questions in order so they make the most sense, because they're good ones and they mirror perfectly the same questions we all see in this industry.
1. We really need to define penalization here first. Unless you're getting a message in Google Search Console telling you that your site has been manually penalized, then you haven't been penalized. Next we're going to make an assumption, which is that you gained some ranking benefit from illicitly garnered backlinks. And finally, we're going to assume that Penguin caught on and devalued those links. This means you haven't been penalized per se, but you're no longer deriving optimization benefits from those links. So 'recovery' in this sense is going to mean something different.
Recovery from a manual penalization does not mean you will return to your previous rankings and organic traffic. It means that you are no longer being blacklisted by Google and your site is being indexed again. But in both cases, what a client wants to know is "when will my previous rankings return", and the answer is that they will not. If you were ranking highly before due to inadvisable link schemes, and those beneficial links were devalued, disavowed, or removed, then you aren't going to see a return to those strong rankings. You're going to have to fix things so you're obeying webmaster guidelines, then begin competing in another way that's within guidelines to bring yourself back to the top.
John Mueller gives this response every time he's asked this question in his Hangout series, and he gets asked it A LOT. You can see those videos here: https://www.youtube.com/playlist?list=PLKoqnv2vTMUMqH8IyzOMBc_Z8i-S6-tVJ. This channel is a really fantastic resource for learning some intermediate to advanced SEO tactics via Q&A with John Mueller himself.
2. You can submit a disavow file via Google Search Console. Again, this is going to tell Google that you don't want any of those links to be associated with your site for the purpose of ranking. What some SEOs are guilty of doing when clients were hit with Penguin penalties was to disavow ALL backlinks instead of just the naughty ones, and this got them back in the index but completely nullified a lot of valuable and credible links.
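The disavow file itself is just plain text: comment lines starting with #, individual URLs, or whole hosts prefixed with domain:. Here's a small sketch that builds one targeting only the bad hosts rather than nuking the whole link profile; the domains and URL are hypothetical examples, not real offenders.

```python
def build_disavow_file(bad_domains, bad_urls=()):
    """Build the text of a Google disavow file.

    Lines starting with '#' are comments; 'domain:' disavows a whole host,
    and bare URLs disavow single pages.
    """
    lines = ["# Disavow file - paid/spammy links only, not the whole profile"]
    lines += [f"domain:{d}" for d in sorted(bad_domains)]
    lines += sorted(bad_urls)
    return "\n".join(lines) + "\n"

# Hypothetical offenders identified in a link audit
text = build_disavow_file(
    bad_domains={"spammy-links.example", "paid-reviews.example"},
    bad_urls=["http://blog.example/paid-post"],
)
print(text)
```

Save the output as a .txt file and upload it through the disavow tool in Search Console; keeping it generated from an audited list makes it easy to review before submitting.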
Consider that it takes 4-12 months to implement and begin to reap the benefits of well thought out white-hat optimization strategies, and use that as your timeline for 'recovery', because again you're not going to see an immediate return to your previous rankings and will need to implement new tactics.
-
RE: Will more comprehensive content on product pages help improve ranking?
Definitely add more comprehensive content on product pages, but make sure it's unique. Meaning if you use an API to bring in the manufacturer's specs, and everyone else selling or renting your equipment is doing the same thing, then it's not unique, and Google won't see you as having something extra to provide to users. You should still have that content, but add on to it.
If you want an example of this technique working in the wild, just search any product and you'll find that Amazon ranks highly for thousands and thousands of products that others sell. They have a lot of unique content in the form of user generated content (which Google counts as unique content) and this helps them rank.
So yes, adding content will help you rank, but make it useful content (scrap the fluff), make sure it's unique content, and mark it up with schema for that extra oomph in the SERP.
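On the schema piece, product pages typically use JSON-LD with the schema.org Product type. A minimal sketch of generating that markup; the field values are placeholders, and real pages would usually add more properties (brand, reviews, availability).

```python
import json

def product_jsonld(name, description, sku, price, currency="USD"):
    """Return a <script> tag carrying schema.org Product markup as JSON-LD."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "description": description,
        "sku": sku,
        "offers": {
            "@type": "Offer",
            "price": f"{price:.2f}",
            "priceCurrency": currency,
        },
    }
    return '<script type="application/ld+json">' + json.dumps(data, indent=2) + "</script>"

print(product_jsonld("Tour Golf Balls (12-pack)", "Three-piece tour golf balls.", "GB-1234", 39.99))
```

Drop the resulting tag into the page head or body, then validate it with Google's structured data testing tools before rolling it out.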
-
RE: Keywords and content query
Hello, I think your question can be broken down like this:
1. Is it a problem if I can't add text/content?
2. Is there a certain word count I should aim for?
3. Is there a specific number of keywords on page I should aim for?

So I'll try to answer this as best I can, and if you have more questions, just fire back.
1. This could be a problem if the content on the page is something you'll need to rank well. It seems counterintuitive to many because "content is king" has been parroted as SEO wisdom for years, but there are times when content is NOT the primary driver of rankings, and the secret is in the intent of the searcher. Think about it like this: if you're searching "best ac repair service near me", you probably just want a short list of the best HVAC companies near you. A 3,000-word article is less helpful here than a short list of the best, and indeed, when I run this very search, the top 5 results are all lists. The number one result has fewer than 600 words, but all of them have user-generated content in the form of reviews. Another example where content may not matter: "buy golf balls". You're going to get a lot of ecommerce listing-style pages that are short on content but allow people to easily buy golf balls. I know this because I just ran this search yesterday to help another Mozzer. But if your page is meant to be informative, you may need the ability to modify, add, or remove content, so this could be a problem. Try to match the searcher's intent with the page, and that will help you determine if this is truly an issue.
2. As we just demonstrated in example one, no specific word count is recommended for all queries. However, there was a study performed in September 2016 by Backlinko that analyzed about a million queries and one of their findings was this:
"In fact, the average word count of a Google first page result is 1,890 words."
This would indicate that longer content is better, but as I discovered early in my career, if you write content just to have the length, it will flop. We tried it at scale, writing content just to hit a word count for about 120 websites. It performed exactly the same as the content it replaced, which was about 500 words. So don't do that.
3. This one is short and easy. The answer is no. The metric you're referring to is keyword density, and it was short-lived, debunked back when Matt Cutts was still at Google. The myth lives on, but it's a garbage metric that doesn't correlate with success. Avoid using or even referencing this.
Hope that helps, let me know if you need more info.
-
RE: How much does doing google search queries dilute your search console data
Hi Fishe, thanks for sharing this. I had never really thought about filtering out your own IP traffic from Search Console data. I typically work with websites with a high enough volume that the filtering wouldn't likely impact my work, but it's good to know for my newer clients who may not have much brand presence and are spending a lot of time googling themselves out of anxiety. I can definitely see a use case for that scenario. Good work!
-
RE: Is hiring bloggers to review my products while back linking to my website bad for SEO?
1. It's great that you haven't been manually penalized! Unfortunately, yes, I have seen pages drop off the map entirely for specific keywords. Most keyword tracking tools only search the first 100 results, so if you don't make it inside that bubble, they will display a null value like (--). It basically means you need to up your game. Most often you can make it well inside those first 100 results by applying on-site SEO tactics. Update your metadata, make sure relevant pages are linking internally to the page you're trying to rank, and improve the amount of unique, quality content on the page. Put some focus on the user experience.
2. You can disavow them, but I would strongly recommend you first check which ones are marked nofollow. You don't want to disavow those because they're already compliant with webmaster guidelines. For the rest, communicate with the bloggers you've paid and see if they can switch it to nofollow. Give them a couple of weeks. I don't think you need to jump straight to the disavow file if you can get a significant percentage of those links marked nofollow. I generally avoid using disavow files as much as possible because it's like the nuclear option. Last resort.
Moz has a pretty sweet tool with their spam score. I love how easy it makes it for the novice-intermediate SEOs to do a good job keeping their site quality up without needing a ton of oversight from a director or SEO manager. Check out Moz's article on spam score to get an idea how it works: https://moz.com/blog/spam-score-mozs-new-metric-to-measure-penalization-risk
I think if you have 4 or fewer, that's pretty rock solid. It's an approximate value, so don't hyperventilate if you can't get it down to 0. Even this blog has a spam score of 1/17.
-
RE: How Can I Redirect an Old Domain to Our New Domain in .htaccess?
There are several resources around the net that show how to do this. Here's a code snippet adapted from http://www.inmotionhosting.com/support/website/redirects/setting-up-a-301-permanent-redirect-via-htaccess (with the dots in the hostnames escaped, since RewriteCond patterns are regular expressions and an unescaped dot matches any character):

```apache
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com$ [NC,OR]
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.net/$1 [L,R=301,NC]
```

If this doesn't work for you, here are some other resources:
https://www.internetmarketingninjas.com/blog/search-engine-optimization/301-redirects/
https://css-tricks.com/snippets/htaccess/301-redirects/
http://www.bruceclay.com/blog/how-to-properly-implement-a-301-redirect/

The first code example is more canonical-friendly, because it carries each old path over to the matching page on the new domain. The simpler version I see used a lot and recommended in the other resources, which sends everything to the new homepage, looks like this:

```apache
RedirectMatch 301 ^(.*)$ http://www.xyz.com
```

-
RE: How will changing the phone number on my website affect SEO?
I've seen it advised that you should not use letters - so 1-800-eat-cows would be less optimal than 1-800-328-2697. I don't know if it's true from an SEO standpoint, but it's certainly true from a usability perspective. Customers do not enjoy having to translate letters into numbers, especially anyone who is even slightly visually impaired. Takes forever.