Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
Rel=canonical and redirect on same page
Hello, the canonical should be enough in this case, as it helps Google determine which page is the original among the many duplicates. A 301 is used if you want the duplicate pages gone and their link authority transferred to the original page. From Search Engine Watch:

301 – "Hey, Search Engines: My page is no longer here and has permanently moved to a new page. Please remove it from your index and pass credit to the new page."

Canonical – "Hey, (most) Search Engines: I have multiple versions of this page (or content); please only index this version. I'll keep the others available for people to see, but don't include them in your index, and please pass credit to my preferred page."

https://searchenginewatch.com/sew/how-to/2288690/how-and-when-to-use-301-redirects-vs-canonical
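To make the difference concrete, here are two minimal sketches using a hypothetical example.com; the exact syntax depends on your CMS and server, so treat these as illustrations rather than drop-in rules. The canonical is just a tag in the duplicate page's head, and the page stays live:

```html
<!-- On each duplicate page: visitors can still see this page,
     but search engines are asked to credit the preferred URL. -->
<link rel="canonical" href="https://example.com/preferred-page/" />
```

whereas a 301 removes the old URL from service entirely, e.g. in an Apache .htaccess:

```apache
# The old URL permanently redirects; both visitors and
# search engines are sent to the new page.
Redirect 301 /old-page/ https://example.com/new-page/
```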
Technical SEO Issues | | nhhernandez1 -
Never-ending new links and our rank continues to plummet
Hi Nikki, thanks for the response. I too was under the impression that, with Penguin being live, big G had figured out how to discount spam links and they were no longer an issue. On the flip side, the only real thing that's changed for us over the past 6 months is a massive increase in these spam links. I'll bullet out a response so I don't write a novel:

- Yes, I have checked whether the site is hacked, and it is not.
- I have no manual actions listed.
- I did just look for weird traffic sources, and out of the 48 that I do have, the only odd ones are from URLs with the spammy links.
- Also of note, our ranking has not dropped in any other search engine, only Google. In fact, we have improved in Bing and Yahoo.
- I have disavowed a few porn backlinks we have gotten from sites like gay.interracialsexcinema, among others.
- In the past 7 days I have gotten 33 new links from sites like zenlaser.c0oo - thefreestates.commm prowp.neeet and lilucia02.coooommmm slash cantilever-coffee-table dot html.
- Many of the domains linking to us were literally created (per WHOIS lookup) 15–30 days before we get a link. All of the domains are registered private (except one that I caught just before it went private, and it was in China), and 90% of them are hiding behind Cloudflare.
- An annoying thing is that over 90% of these links are on pages whose URL path is the same URL path or title tag that we are using on our page, along with our H1s and H2s, with a "." anchor or the actual path as the anchor, linking back to either our page or an image on our site.
- I updated the disavow file a few weeks ago but have not added all the domains to it, as I wanted to knock it down little by little. I have been hesitant to take a one-shot, one-kill approach, as I want to be cautious in case I make a mistake and cause an even larger issue.

Any further thoughts? Thanks again!
White Hat / Black Hat SEO | | plahpoy1 -
Consolidate URLs on Wordpress?
Like you said, you're going to want to redirect all http versions to https, and then redirect all www to non-www – which you can usually accomplish by setting your non-www as the primary domain. Depending on where your site is hosted and where your domains are registered, there will be a few different ways you might go about this. My agency uses WPEngine to host all of our sites, and they make it super easy. Regardless of where my site is hosted or who my registrar is, I've always had a lot of luck chatting with their support team. For more specific redirects, we utilize the Redirection plugin. Is that helpful at all?
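If you're on a generic Apache host rather than a managed platform like WPEngine (which handles this through its own settings), a minimal .htaccess sketch for both redirects might look like the following; mod_rewrite is assumed, and you should test carefully before deploying:

```apache
RewriteEngine On

# Force HTTPS first
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Then force non-www
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [L,R=301]
```

Chaining the rules in this order means a request for http://www.example.com/page ends up at https://example.com/page in at most two hops.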
Intermediate & Advanced SEO | | brooksmanley1 -
What happens when a de-indexed subdomain is redirected to another de-indexed subdomain? What happens to the link juice?
Exactly as you said. I wonder what ranking fluctuation or dip we can expect on the main domain due to this de-indexing of subdomains. Someone claims that rankings will drop, but how? Subdomain "B" will still be there with all its backlinks, so technically the backlinks remain. Please let me know your valuable thoughts on this. Thanks
Search Engine Trends | | vtmoz1 -
Should apartment management companies have a separate website for each of their properties?
Those are all the same brand. Something like the following on the same site makes more sense to me from both a branding and SEO perspective:

- modernapts.com
- modernapts.com/locations/ - overview page
- modernapts.com/locations/seattle/ - city/location page
- modernapts.com/locations/los-angeles/ - city/location page
- modernapts.com/locations/seattle/capitol-hill-penthouses/ - building page

This is more or less the same as any retail/physical chain that has location pages. Here is an example of an actual brand that has locations on the same domain, city-level pages, and building-level pages:

- http://www.equityapartments.com/san-francisco-bay-apartments - city/location page
- http://www.equityapartments.com/san-francisco-bay/mission-bay/azure-apartments - building page

I have not reviewed their overall SEO setup, but this is a great example from a URL point of view, and they have lots of useful information on their actual building-level pages. The city-level pages are great for SEO and for targeting city-level keywords as well.
Local Listings | | KaneJamison1 -
Free Trial Query
Hi there! Sam from Moz's Help Team here - sorry about any confusion! Your credit card will not be charged at any point during your 30-day trial, but we do perform a $1 authorization on your account to ensure your credit card is valid, which is then immediately voided. After your 30-day trial, you will be charged each month thereafter. If you decide that Moz isn't for you, you can cancel anytime, both during and after your trial. I hope this helps to explain - do let me know if you have any follow up questions!
Getting Started | | samantha.chapman0 -
No Index thousands of thin content pages?
Egol, Thanks for this. I did consider the sub-domain option and I'm going to discuss this as an option with my team. Ken
Intermediate & Advanced SEO | | KenSchaefer0 -
Issues with Multiple H1 tags on homepage?
Thanks Paul, Agreed. It's for the hero image only on the home page, and I'm in discussions with the theme provider to see if there's a way to have only one H1 tag but have the image change behind it as you move between devices. We'll see how I get on with that! Thanks for your help, it's greatly appreciated! Mike.
On-Page / Site Optimization | | Veevlimike0 -
Why Do Different Tools Report 404s Differently?
One more side note: you can also use inbound-link auditing tools like Moz Open Site Explorer, Ahrefs, or especially CognitiveSEO to collect as many of your incoming links as possible, then filter them for the ones that 404; that way you'll know which external pages contain the broken incoming links. CognitiveSEO is especially well set up to do this, but it's not a cheap tool (it does have a free trial, though). Hope that helps? Paul
Moz Tools | | ThompsonPaul0 -
Why are recently deleted pages still appearing in the latest MOZ crawl?
Hi Billy! Thanks so much for the great question! I'm so sorry for any confusion. I'd be happy to look into this and do some digging. Can you please send an email over to help@moz.com with some examples of the pages which are still showing up in your Site Crawl data after having been deleted? That way we can take a look and see what's going on. Looking forward to hearing from you!
Link Explorer | | meghanpahinui0 -
Did Moz have a citation report ?
Hi there! Thanks so much for reaching out! Are you looking for a local citation report? If so, you may want to check out our Moz Local tool! You can get started by heading to https://moz.com/local/search Another resource which may help: https://moz.com/help/guides/local/local-reporting If this isn't what you're looking for, feel free to send an email on over to help@moz.com and we'll see about sending you some more resources to get you what you need. I hope this helps!
Getting Started | | meghanpahinui0 -
ExpressUpdate.com Down, any other options for updating NAP
If by NAP you mean name, address, and phone, well, there are a lot of options, and some of them are mandatory. Google My Business and Bing are the standard. Then, depending on your niche, location, and the services or products you are offering, there are several options like Yelp or Angie's List; here is a very useful list: 50 Online Local Business Directories. There are hundreds of tools to manage your local visibility; for me, the best is Moz Local.
Local Listings | | Roman-Delcarmen0 -
Search Visibility Dip
Looks like we are in the same situation. I have yet to find an explanation, but if I do, I'll follow up here to keep you in the loop.
Local Listings | | Dions0 -
Moz Crawl only crawling the top level page (1 page)
Hey there! Tawny from Moz's Help Team here. For really specific, in-depth questions about why a Campaign isn't working quite right, it's easiest to help you out through our normal support channels. Please write in to us at help@moz.com with all the details of what you're seeing, and we'll do our very best to help you figure things out. Thanks!
Other Research Tools | | tawnycase0 -
Need help understanding API
There's no single API that does all of this, but you can chain a couple of them together to get what you want. As Tawny pointed out, it's a technical task and you may need the help of a web developer.

As you have access to SEMRush, you can use their "organic results" API call which, given the keyword you're interested in, will return the top-ranking URLs. You can see the documentation specific to that call here. So that gets you from your starting point (being interested in a query) to having the top URLs. You can limit the number of rows returned by the SEMRush API; it sounds like you'd only want the first 10 rows (i.e. the top 10 results for the keyword).

Now, taking that list, you can send it to the Moz API's UrlMetrics call. This will give you back DA, PA, Trust Flow, and so on, for each URL.

Neither tool will tell you the word count. If you need to calculate that, you'll have to crawl the pages somehow. It depends whether you really need to completely automate everything. If "semi-automation" is good enough, I'd suggest that your script, after fetching the top-ranking URLs from SEMRush, writes them out to a CSV as well. Then you can use Screaming Frog in list mode to crawl all of the URLs listed in the CSV. So everything would be automated except for gathering the word counts; you'd have to stitch together the results from your Screaming Frog crawl with the data you got back from Moz.

If everything must be automated, and you really need word count or other on-page information, your script will also need to crawl the pages itself. For this you'll certainly need a developer familiar with technologies like Selenium for crawling and scraping web pages. In almost all use cases that's overkill, so I'd suggest focusing on the SEMRush and Moz APIs for now.
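To make the "semi-automation" idea concrete, here's a minimal Python sketch of the SEMRush half of the chain. The endpoint and parameter names (`phrase_organic`, `display_limit`, `export_columns`) follow the shape of SEMRush's public API docs but may differ for your plan, and `session` is assumed to be something like a `requests.Session`; treat this as a starting point, not a drop-in script:

```python
import csv
import io

# Assumed endpoint shape for SEMRush's analytics API; check your
# account's documentation before relying on it.
SEMRUSH_ENDPOINT = "https://api.semrush.com/"


def parse_semrush_rows(raw_text):
    """SEMRush API responses come back as semicolon-delimited text with a
    header line; return the ranking URLs in rank order."""
    reader = csv.DictReader(io.StringIO(raw_text), delimiter=";")
    return [row["Url"] for row in reader]


def top_urls_for_keyword(session, api_key, keyword, limit=10):
    """Fetch the top `limit` organic results for `keyword` (sketch only;
    parameter names are assumptions based on the public docs)."""
    params = {
        "type": "phrase_organic",
        "key": api_key,
        "phrase": keyword,
        "database": "us",
        "display_limit": limit,
        "export_columns": "Po,Ur",  # position and URL
    }
    resp = session.get(SEMRUSH_ENDPOINT, params=params)
    resp.raise_for_status()
    return parse_semrush_rows(resp.text)


def write_url_list(urls, path):
    """Write one URL per line, ready for Screaming Frog's list mode."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for url in urls:
            writer.writerow([url])
```

The CSV this produces can go straight into Screaming Frog's list mode, and the Moz UrlMetrics call would be chained in the same way, looping over the parsed URLs and merging the metrics back onto each row.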
API | | StephanSolomonidis0