Posts made by brettmandoes
-
Is the Content Suggestions section under Page Optimization a TF-IDF Analysis?
If you start a campaign in Moz, go to Page Optimization, enter a URL and keyword, and scroll to the bottom where it says "Content Suggestions": is that basically a TF-IDF analysis? I want to make sure I understand how that works. Thanks!
-
How Can I Batch Upload URLs to get PA for many pages?
Howdy folks, I'm using advanced search operators to generate lists of long-tail queries related to my niche, and I'd like to upload the batch of URLs I've gathered so I can see the PA for each one. This would help me determine which long-tail queries are receiving the most love and links, and inform my content strategy moving forward.
But I can't seem to find a way to do this. I checked out the Moz API, but it's a little confusing: it says there's a free version, but then it looks like it's actually not free, and when I try to use it, it says I've gone over my limit even though I haven't used it yet.
If anyone can help me with this, I'd really appreciate it. If you're familiar with SEMrush, they have a batch analysis tool that works well, but I ideally want to run these URLs through Moz because it's better for this kind of research. Thanks!
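For reference, here's roughly what I'm trying to do in code. This is just a sketch based on my reading of the Moz Links API v2 docs, so double-check the endpoint and field names against the current documentation; the credentials and URLs are placeholders.

```python
# Hedged sketch: batch Page Authority lookups via what I understand to
# be the Moz Links API v2 "url_metrics" endpoint. Verify names against
# Moz's docs. Credentials and URLs below are placeholders.
import requests

ACCESS_ID = "mozscape-xxxxxxxxxx"  # your Moz access ID
SECRET_KEY = "xxxxxxxxxx"          # your Moz secret key

urls = [
    "https://www.example.com/long-tail-page-1",
    "https://www.example.com/long-tail-page-2",
]

resp = requests.post(
    "https://lsapi.seomoz.com/v2/url_metrics",
    auth=(ACCESS_ID, SECRET_KEY),
    json={"targets": urls},
    timeout=30,
)
resp.raise_for_status()

for result in resp.json().get("results", []):
    print(result.get("page"), "PA:", result.get("page_authority"))
```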
-
RE: Optimal URL Structure for a Multi-City Directory
I think you should consider how your users are interacting with your website and how they search for your services/products/locations and follow that. For example, Yelp is focused on local reviews. People will filter first to their city, then the category naturally. You would never filter down to restaurants first, because if you're in Huntington Beach, CA you really don't care what's in Portland, OR. If location is secondary to your product, then it makes sense to start with the category. For example, let's say you sell ATVs and other off-road vehicles and gear, but some showrooms only have ATVs while others also carry dirt bikes. Customers who are looking for a dirt bike care more about reaching a showroom with dirt bikes, so that category structure would be preferable.
Note that I'm assuming in both of the above examples that your navigation follows the structure of your website for usability purposes. In terms of structure, one way is not inherently better than the other from a ranking/algorithm perspective, but if your structure is confusing it can be detrimental to SEO. For example, outreach is a lot harder if you have a garbage navigation that contributes to a poor user experience on your website. Any piece of Google's algorithm that measures user satisfaction (RankBrain, pogo-sticking, etc.) will affect you, directly or indirectly, depending on how user-friendly your website is.
One last thing: in both instances you have the geography in the URL, so if you're hoping for a boost for local phrases from an exact match URL I think you're already tapping that. EMDs are nowhere near as effective as they were in years past, so I wouldn't make that my focus.
-
RE: Differentiating Franchise Location Names to better optimize locations
Hi Jeff, I think I can help you with this, but to clarify, it looks like you have three separate questions:
1. What is best practice for naming different locations to optimize for local SEO?
2. What is the best URL structure to optimize for local SEO?
3. Should geo-specific terms be used in blogs?
Be sure to let me know if I'm missing the mark. I'm also going to go heavy on industry jargon and assume you know what it means, so feel free to ask questions if I go over your head at any point.
1. For local SEO, it's important to start with a good foundation. This means you have citations claimed for each location, with consistent NAP information on your GMB profile, your listings, and the landing page on your website for that location. So if your name includes the geo on the website, it should also include the geo on your GMB profile and citations. It's preferable to use the specific city name the location is in. For example, if you're in Flower Mound, TX, be sure to use Flower Mound, not Dallas. Some local SEOs get tripped up by targeting the metro area they're in, and that can tank results. If some of your locations are in the same city, dividing them up as North/South, East/West, etc. is fine. Google typically picks one or both in those circumstances to display in search.
2. For URL structure, using subpages the way you have them laid out is fine. For enterprise local SEO my agency uses a proprietary, scalable CMS to build unique local websites that rank very well, so I'm more familiar with that structure, but one of the tricks we use is to include a geo variable in the URL (sketched after this list), which helps us rank for terms like "glass repair dallas tx" because we can get picked up on the exact match. Every little bit helps.
3. For blogs, I would recommend you completely ignore the geo unless the post is very specific to the location. You should really only target the location on pages you're trying to rank for local queries, and you typically don't have that in a blog. For example, a blog about "what to expect in a hundred-year-old house" will typically not rank for keywords that trigger the local algorithm, so there's no reason to add the geo. It just gets in the way of the content, and inferior content doesn't rank well. Now, a blog like "what to plant in your [location] fall garden" may have some localization to it, because what you plant in the fall in Des Moines is different from what you plant in Atlanta. But I find these cases to be few and far between.
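Since I mentioned the geo variable in point 2, here's a minimal sketch of the idea. The service, cities, and URL pattern are all invented for illustration.

```python
# Minimal sketch: build location-page slugs that can exact-match
# queries like "glass repair dallas tx". Inputs are invented.
import re

def slugify(*parts):
    text = " ".join(parts).lower()
    return re.sub(r"[^a-z0-9]+", "-", text).strip("-")

for city, state in [("Dallas", "TX"), ("Flower Mound", "TX")]:
    print(f"/locations/{slugify('glass repair', city, state)}/")
# -> /locations/glass-repair-dallas-tx/
# -> /locations/glass-repair-flower-mound-tx/
```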
Hope that helps, let me know if you have questions.
-
RE: Hi. One of our competitors is ranking ahead of us on Google. Our site has a much stronger authority and much more quality links than this competitor. Would anyone have any explanations for this? Thanks
Hi barryhq, Google has in the past called the top three components of their algorithm Content, Links, and RankBrain, without naming a particular order. It sounds like you've worked hard on links, so good job! Your problem is therefore potentially related to content (ignore RankBrain; you can't really optimize for it).
So let's talk about content. When you do your keyword research, try to focus on the searcher's intent. What tasks are they trying to accomplish that the pages Google is surfacing address but yours doesn't? You can learn this by studying other top results, related searches, People Also Ask, etc. For example, if one of your keywords is "best running shoes", you know people are comparison shopping, so including content that compares the top running shoes on your page will help you rank for it. And it's entirely possible that you'll discover in this process that your website is not a suitable match for the keywords you've targeted. I've seen this happen with clients who pick a phrase without doing the research required to make an informed decision and end up targeting something they can never rank for.
It's also possible that you have technical SEO issues, like canonicalization or poor internal link structure or cannibalization that's making it harder to rank, but assuming that your technical SEO game is on point I would recommend focusing on content.
-
RE: Footer no follow links
Hi seoman, it's definitely outdated and was never accurate to begin with. The "nofollow" attribute was always designed to be applied to external links and modern advice is to never apply a nofollow link to your own internal links. If you're concerned about passing authority from a page like your homepage down into your footer links instead of more important pages, you should know that Google tags the links on your site so that they're weighted differently, i.e. a link in your body content is worth more than a link in your footer, image links don't pass as much authority, etc.
In short, I don't think you're going to move the needle by altering your footer links to nofollow.
-
RE: 301 Redirect and Canonical link tag pointing in opposite directions!
Canonicals are not absolute directives, so Google will eventually sort out which of the two signals is more important. My guess is that the redirect takes precedence: if Google displayed the canonical URL in search, it would be sending users through a redirect, which is a poor experience, and they take pains not to do that.
When there are confusing signals like this, Google will do its best to sort them out, and John Mueller has repeatedly stated "we do a pretty good job" of figuring it out, but he almost always adds the disclaimer that it's "better" to have a less confusing structure.
In plain English: it's not a catastrophic error, but it is something you should clean up as part of your optimization efforts.
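If you want to audit this at scale, the check is scriptable. Here's a minimal sketch of the idea; the URL is a placeholder, and it assumes the requests and beautifulsoup4 packages.

```python
# Minimal sketch: see where a URL's 301 points, then see what
# canonical the destination declares. If the canonical points back at
# the original URL, the two signals conflict. URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/page-a"  # hypothetical page

def canonical_of(page_url):
    html = requests.get(page_url, timeout=10).text
    link = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    return link.get("href") if link else None

resp = requests.get(url, allow_redirects=False, timeout=10)
redirect_target = resp.headers.get("Location")
print("Redirect from", url, "points to:", redirect_target)

if redirect_target:
    print("Canonical on target:", canonical_of(redirect_target))
```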
-
RE: Proper URL Structure. Feedback on Vendors Recommendation
Hi there, I've got a few thoughts on this, but I want to make sure I answer your specific question first, then address what I think are the lead-up or follow-up questions that are either already on your mind or that you'll arrive at eventually.
There are specific instances where you may favor one URL structure over the other. For example, our landing pages are similar to your current structure, and the rest of the website is more similar to your vendor's proposed structure. Folders are a great way to categorize your content and help both Google and users navigate and understand it. However, you do not want to lose the hyphens. Dropping them makes the URL difficult for users to read in search when they're deciding on a page to view, and difficult for Google to parse. Let's say your URL has an acronym in it; maybe you're writing about basketball and NBA is in the URL. So your URL becomes website.com/sports/hownbaistakingcharge or website.com/sports/basketballnbakobe. Are either of those readable? You have two stakeholders, Google and users, and your URL structure should support both. Compare the above to website.com/sports/how-nba-is-taking-charge or /basketball-nba-kobe. That's much better for Google, which can clearly read the different words and make sense of them, and much better for users who are trying to quickly scan the URL in search. I would push back on the vendor: the hyphenation is necessary.
I've listed a few other questions below that I would have for my vendor and team if we were proposing a major restructuring of the site's content.
A new URL structure means a few other things will likely change.
1. Have you thought about creating a redirect map for every page that is going to move? (There's a quick sketch of one after this list.)
2. How will the new URL structure interact with breadcrumbs on your site?
3. If you move to folders, are you going to need to create head pages? E.g., website.com/sports/how-nba-is-taking-charge sits under a main "sports" page that maybe doesn't exist yet. You WILL have users who attempt to reach the head page whether it exists or not, and they'll be sent to a 404 instead.
4. Will changing your URL structure alter your main and sub navigation elements on the site? (In almost every instance, it should.)
And then my final question, knowing how much work it takes to improve a healthy site by changing the URL structure alone, is this: what is the expected value? Why are we doing this? Sometimes there's a legitimate reason and sometimes it's pure vanity. The SEO upside to a major restructuring like this isn't normally enormous, but the effort involved can be titanic. So be sure your expectations are realistic going into it and get the details fleshed out as much as possible ahead of time.
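Circling back to point 1: a redirect map can be as simple as a spreadsheet of old path, new path, and status code, one row per moved page. A minimal sketch, with paths invented to match the examples above:

```python
# Minimal sketch of a redirect map: old path -> new path, exported as
# a CSV your dev team can implement as 301s. Paths are invented.
import csv

redirect_map = {
    "/hownbaistakingcharge": "/sports/how-nba-is-taking-charge",
    "/basketballnbakobe": "/sports/basketball-nba-kobe",
}

with open("redirects.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["old_path", "new_path", "status"])
    for old, new in redirect_map.items():
        writer.writerow([old, new, 301])
```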
Best of luck, let me know if I can answer any more questions.
-
RE: Related Keywords: How many separate pages?
Instead of trying to group pages by keyword, try thinking about searcher intent and task accomplishment. Can you write one comprehensive page that addresses the searcher's needs and includes all the keywords? Or does it make more sense to break into a couple different areas, such as a page that's specific to a plaintiff and a page specific to a defendant?
Try this: create a Venn diagram of the different audiences that may visit the section of the site you're contemplating building out, group the keywords you suspect each audience would use, and see where the overlap is. If there are areas that are completely blank, you don't need a page for that specific audience or task. Doing this will help you determine which pages need to cover which keywords for the right audience. For example, for an optometrist there are probably searches involving "contacts", "glasses", and "lasik". You might be able to address all three on the same page, but it's probably a horrible experience for someone who is just looking for a specific eyeglass style to wade through long text about the benefits of LASIK. There's very little overlap because the audiences and intent are different, so they get different pages, and that shows up in the Venn diagram.
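If you'd rather skip the drawing, the same exercise works with plain sets. A toy sketch using the optometrist example; all the keywords are made up.

```python
# Toy sketch of the Venn-diagram exercise: group keywords by audience
# and check pairwise overlap. Keywords are invented placeholders.
audiences = {
    "contacts": {"buy contacts online", "daily contacts", "eye exam near me"},
    "glasses": {"eyeglass styles", "buy glasses online", "eye exam near me"},
    "lasik": {"lasik cost", "is lasik safe", "lasik recovery time"},
}

names = list(audiences)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        shared = audiences[a] & audiences[b]
        # Overlap suggests one page might serve both audiences;
        # an empty set suggests separate pages.
        print(f"{a} & {b}:", shared or "no overlap -> separate pages")
```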
Hope this helps!
-
RE: Quick Fix to "Duplicate page without canonical tag"?
The simplest solution would be to mark every page in your test environment "noindex". This is normally standard operating procedure anyway, because most people don't want customers stumbling across the wrong URL in search and seeing a buggy page that isn't supposed to be live.
Updating your robots.txt file would tell Google not to crawl the pages, but if they've already been crawled and added to the index, Google will simply retain the last crawled version and never recrawl them. You have to direct Google to "noindex" the pages instead. It will take some time as Google refreshes its crawl of each page, but eventually you'll see those errors drop off as the pages are removed from the index. If I were consulting a client, I would tell them to make the change and check back in two or three months.
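If the test environment runs on something you control, a blanket X-Robots-Tag header is an easy way to noindex everything at once. Here's a minimal sketch assuming a Flask app; adapt it to whatever serves your test site.

```python
# Minimal sketch: blanket "noindex" for a staging environment via the
# X-Robots-Tag response header (equivalent to a meta robots noindex on
# every page). Assumes Flask; adapt to your stack.
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_noindex_header(response):
    # Don't also block these pages in robots.txt: Googlebot has to
    # crawl them to see this header and drop them from the index.
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/")
def home():
    return "Staging environment"

if __name__ == "__main__":
    app.run()
```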
Hope this helps!
-
RE: How do you use Moz to research related topics?
Hey Dave, thanks for the response! I should have updated my question earlier today, but I was able to find a better way to do this type of research using Moz Pro's page optimization tab. The section in there that was previously labeled "related topics" was renamed "content suggestions", but it worked great. I was able to double content length and put in some genuinely useful information (I hope) that should help it rank better (I hope).
It's a darn sight faster than what I was doing before, which was manually copy/pasting all the body copy from the top ten results for high-volume keywords into an n-gram analyzer and looking for patterns. The results were actually pretty similar, but good gravy, was it boring.
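For anyone curious, the manual version amounts to something like this. The file names are made up; paste the scraped body copy into them.

```python
# Rough sketch of the old manual workflow: count n-grams across
# competitor body copy saved to plain-text files. File names invented.
from collections import Counter
import re

def ngrams(text, n=2):
    words = re.findall(r"[a-z0-9']+", text.lower())
    return zip(*(words[i:] for i in range(n)))

counts = Counter()
for path in ["competitor1.txt", "competitor2.txt"]:  # hypothetical files
    with open(path, encoding="utf-8") as f:
        counts.update(ngrams(f.read(), n=2))

for gram, freq in counts.most_common(20):
    print(" ".join(gram), freq)
```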
-
How do you use Moz to research related topics?
Like most of the folks here I'm a pretty big fan of the content that comes out through Whiteboard Fridays, and I try to apply the things I learn, but one of the WBF videos that I'm following along with does not do a stellar job of detailing execution using Moz KW Explorer.
https://moz.com/blog/related-topics-in-seo-whiteboard-friday
Now granted, this came out in 2016, but I still feel the core strategy results in a higher-quality piece of content and is still relevant to discovering and understanding searcher task-completion requirements and drafting content that fulfills them. Towards the end, Rand sort of mentions that you'll be able to do this with KW Explorer, but I'm not really seeing the functionality.
The steps I followed: I entered the keyword in KW Explorer, went to Keyword Suggestions, selected "based on closely related topics", and ran it, but received no suggestions; it came up blank. I then selected "based on broadly related topics" and the same thing happened. I tried this with the keyword "r22", keeping it very broad to start, but that didn't seem to work.
So what do you all do to perform this sort of research within Moz? Or do you even feel it's relevant in today's RankBrain-driven world?
-
RE: Javascript and SEO
Thanks for the response Nikki, I'll try to be as thoughtful about this as I can, but I am somewhat skeptical that your problem is JavaScript. It may be a contributing factor, but in general the concern most SEOs have with JS is that Google can't crawl it, so the content rendered by JS is effectively invisible and impossible to rank, and yeah, that is a real risk. The fact that you're on page 1 right now for a competitive term, though, means that isn't likely your issue. And you're on a WordPress site, so most of the JS issues aren't going to be a problem for you unless you're using an Angular-integrated theme or something.
That doesn't mean there aren't any technical issues holding you back. I ran your page through a couple of tools and found that it's very heavy, slow to load, and scores very poorly on page load times, and part of that is how JS-heavy the page is. I would recommend running your page through any of the free tools out there. The Lighthouse extension for Chrome isn't great, but it was developed by Google, so it gives you an idea of how they might be measuring your page. Your page scored a performance rating of 4 out of 100, which again is a big indication you have speed problems related to your JS that could be tied to your rankings.
I think you're on the right track investigating technical performance issues, but the easiest way to track this down is to start by making sure all of your content is being indexed. From there you should be able to see if any JS is blocking content from rendering for Googlebot. If Google is crawling and indexing the content, your JS is okay from a visibility perspective and you can focus on the performance aspect.
If Google displays the page completely with fetch and render, you're probably okay, but try going into Chrome DevTools, disabling the cache, and reloading the page. Watch for any errors, and try running Lighthouse with DevTools open. You'll probably catch errors that way.
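One more quick check you can script: see whether a known sentence from your body copy appears in the raw, pre-JS HTML. The URL and snippet below are placeholders.

```python
# Minimal sketch: check whether a text snippet appears in the raw,
# pre-JavaScript HTML. If it only shows up after JS runs, Google must
# render the page to see it, which is worth verifying with URL
# Inspection in Search Console. URL and snippet are placeholders.
import requests

url = "https://www.example.com/some-page"
snippet = "a unique sentence from your body content"

html = requests.get(url, timeout=10).text
if snippet.lower() in html.lower():
    print("Found in raw HTML: visible without JS rendering.")
else:
    print("Not in raw HTML: likely injected by JavaScript.")
```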
Good luck!
-
RE: What Moz reports would you suggest running when pitching SEO services to a client?
I just thought of one report that's quick to run and always good to share.
1. Go into keyword explorer and change the drop down to root domain
2. Enter the client's domain and hit run
3. A new screen should pop up with the option to add more domains. Add one or two of his competitors in the field below his URL and hit the "compare sites" button
4. Take a screenshot of the resulting graph.
What's nice about this is that you're providing something visual you can talk to without overwhelming them with data. You can talk about the problem you see in this graph and how you can address it.
Hope that helps!
-
RE: How much does doing google search queries dilute your search console data
Hi Fishe, thanks for sharing this. I had never really thought about filtering your own IP traffic out of Search Console data. I typically work with websites with high enough volume that the filtering wouldn't likely impact my work, but it's good to know for newer clients who may not have much brand presence and are spending a lot of time Googling themselves out of anxiety. I can definitely see a use case for that scenario. Good work!
-
RE: Javascript and SEO
Hey Nikki, I think your specific question is more centered on "Will having a website that is only fully enabled with Javascript be harmful to SEO?"
First, there's a lot of mythology about this in SEO land. There are outdated resources, and it looks like you've read some of them. Google's ability to crawl and understand JS and the content behind it has advanced considerably, and the tools you may use as proxies for Google's capabilities aren't so effective.
But before I move on, I want to verify something with you. When you're talking about JavaScript, are you specifically looking for answers regarding a website like Wix, built with AJAX? Because that can change my answer significantly.
-
RE: What Moz reports would you suggest running when pitching SEO services to a client?
Hi Rupert, this is kind of a tricky question. The tools and reports provided by Moz are really meant to give SEOs the knowledge to do their jobs, not to serve as a sales tool. This means the information you get from Keyword Explorer will be more useful during execution, and will be confusing to a prospective client who has only a vague familiarity with SEO.
I would encourage you to use the tools that Moz provides to create a preliminary strategy, and only show the back end as an auxiliary. It's not important for them to walk away with a report that shows them a bunch of metrics, it's important for them to walk away feeling like you're the person that can do the job.
Ultimately, as a consultant you're not just a mechanic turning wrenches. Your prospects have human problems that you need to address while presenting SEO services (the mechanical problem). If they're the business owner, for example, you can ask directly: what do you think SEO will do for your business? What kind of timeframe are you expecting for results? Is there any pressure to improve rankings in the near term? Asking these types of questions can often get to the root of another issue they're having, e.g. revenue is down the past quarter and they think SEO is a quick-fix solution because they don't understand it takes time.
I've sold more SEO services and avoided more headaches by addressing these types of very human problems than I ever have by showing someone a report. In my above example (which was a real client), I addressed the client's need by redirecting them to another form of advertising that could generate quicker results (SEM) and still got the SEO contract. I used Moz to show how I gather intelligence on keywords and showcase my expertise. But I avoided going in depth or handing them anything during the initial consult.
There's an art to this that can't really be fleshed out in a forum, but there are a ton of books and courses out there on consulting. I can recommend Flawless Consulting by Peter Block. Fantastic book that will help you avoid bad deals and close more often just by being authentic and using your expertise appropriately.
I know this answers your question very indirectly, but I hope it's more useful. I didn't want to just tell you "spit out these reports and you'll be fine" because I've gone that route before and that business never stayed with me.
Good luck in your next pitch!
-
RE: Client wants to repackage in-depth content as PowerPoint files and embed on site. SEO implications?
Hi there, I think your specific question is: will embedding PowerPoints into the website hurt the site or help it? I'm going to try to break this down for you.
If the slides are indexable, one of two things will happen:
1. Rankings go up for new, related terms that you either weren't ranking for before or were ranking poorly for.
2. The PowerPoint cannibalizes rankings for the other pages that were previously built out.
I'm going to assume you know how to track this since it's pretty straightforward.
If the slides are not indexable then there should be no reason it would negatively impact rankings. It's essentially invisible to Google.
SlideShare slides can be crawled and indexed. I would expect that to be the default behavior unless you find documentation that shows otherwise. If you don't want them indexed, embed them on a page and mark that page noindex.
Let me know if that helps!
-
RE: New link explorer
Roman is spot on. Links are still a big part of the game, but there are specific instances where you can rank on content alone. Local SEO is a prime example: the only link building I bother with there is citations, because I consistently rank just by focusing on content, technical/on-page SEO, and citations.
For my clients with a national presence, outreach is necessary. Think healthcare and finance - smaller guys are competing with massive banks who have a ton of authority and history. Content won't win in that space alone (I wish it did). There are some real easy outreach wins early on, but once you've used up the easy stuff (like getting into directories or fixing broken backlinks) then you have to do a content inventory, find your best stuff, and promote it.
OR
You have to create amazing content, then promote that and earn some backlinks. Like Roman says, the term "great content" is overused and oversold. Most people who show me their great content are showing off mediocre content.
Best of luck!
-
RE: Local SEO - 2 Locations
Hey there, we do this **A LOT** at my agency (I'm currently managing three enterprise local SEO clients), so I think I can help you.
1. Your citations are composed of your NAP+W information, so the best situation is to make sure that it's as unique as possible between the two separate locations.
a. Name - this will probably be the same unless your client has a naming convention like "FroYo Blast San Diego" and "FroYo Blast Sacramento".
b. Address - this will be unique
c. Phone - This **can be** unique and should be. I know some clients send everything through a call center, and that's suboptimal.
d. Website - create location specific landing pages and link to those.
If you follow this, then the only non-unique item in there is potentially the name. What we've found across something like 350 websites/locations is that the more unique this information is, the better rankings tend to be.
2. For local SEO we've never needed to actively build links outside of citations, and we rank on page 1, often position 1, for highly competitive queries. Relevant content is more important, so make link building a lower priority. You may need to work on backlinks if you are in a very competitive space, but small local businesses generally have a hard time getting backlinks, which is probably one reason why it's not as important a signal there. If it were, then the only HVAC businesses showing up in search would be the ones paying SEOs for link building services, which I think Google has realized.
3. Put the locations into your footer and wrap them in schema (see the sketch after this list). You could do the header too, I suppose, but from user testing we've found it's better to keep the header area decluttered. Start putting too many phone numbers up top and people get confused.
4. We build a unique website for each location. When you can't do that, your best bet is to build landing pages optimized for the location. On one of our programs we have about 1,500 of those landing pages, and we rank on page 1 for a little over half of our 18,000+ targeted keywords with that strategy. It's harder if you're not physically in a location, but since you have a physical location, that makes it easier. Make sure you're mentioning the target location in your metadata, like title and h1 tags, where appropriate. That helps!
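On point 3, here's a minimal sketch of what wrapping a footer location in schema can look like, generated as JSON-LD. Every business detail is a made-up placeholder; check schema.org's LocalBusiness type for the full property list.

```python
# Minimal sketch: generate LocalBusiness JSON-LD for a location
# footer. Every business detail below is a made-up placeholder.
import json

location = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "FroYo Blast San Diego",
    "telephone": "+1-619-555-0123",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "San Diego",
        "addressRegion": "CA",
        "postalCode": "92101",
    },
    "url": "https://www.example.com/locations/san-diego/",
}

print('<script type="application/ld+json">')
print(json.dumps(location, indent=2))
print("</script>")
```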
Best of luck!