Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
Ensuring that Google Displays my Meta Descriptions
Delete all the other content on the page... Nah, just kidding. There's really no way to guarantee that, other than writing a meta description which Google would prefer to render over the rest of the page's content.
On-Page / Site Optimization | | effectdigital0 -
Query for paginated URLs - Shopify
I mostly agree with Robin here. Also, be sure NOT to mix 'noindex' and canonical tags. Google will (in most cases) end up picking rel=canonical over noindex when you use both, so it is very possible that even when using 'noindex', your pages will appear in search results. Canonicalising all your paginated pages to the first one is not good practice. We all just found out that Google hasn't been using rel=next/prev for a couple of years now, yet most pagination was still indexed correctly. So doing nothing is maybe not that bad an option. If you see things going wrong, you can further evaluate and test other possibilities.
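The conflicting-signal case described above (a page carrying both a robots noindex and a rel=canonical pointing elsewhere) can be caught in a crawl. A minimal sketch, using only the standard library; the page URL and HTML below are hypothetical examples:

```python
# Flag pages that send Google both a noindex and an off-page canonical.
from html.parser import HTMLParser

class SignalParser(HTMLParser):
    """Collect robots-noindex and rel=canonical signals from a page."""
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots" \
                and "noindex" in (a.get("content") or "").lower():
            self.noindex = True
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

def conflicting_signals(doc: str, page_url: str) -> bool:
    """True when the page is noindexed AND canonicalised to another URL."""
    p = SignalParser()
    p.feed(doc)
    return p.noindex and p.canonical is not None and p.canonical != page_url

page = ('<head><meta name="robots" content="noindex,follow">'
        '<link rel="canonical" href="https://example.com/page/1"></head>')
print(conflicting_signals(page, "https://example.com/page/2"))
# → True
```

Running a check like this across paginated URLs makes it easy to spot pages where Google may ignore the noindex in favour of the canonical.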
On-Page / Site Optimization | | Mat_C1 -
Our Domains DA has dropped from 52 to 17
Hi! The above answers are all great! Just to add on, a DA drop can be attributed to a few things: 1. Links we previously discovered are now marked as lost. 2. You've earned more links, but the highest-authority sites have grown their link profiles even more. 3. The links you've earned are from sites that we haven't seen correlate well with higher Google rankings. 4. We've done a better or worse job crawling sites/pages that have links to you (or don't). It's a bit difficult to isolate the exact cause without your own SEO consultant or developer taking a deeper dive into this, but you can definitely read more specific information here if you'd like to understand more about the process. :) Best, Eli
Link Explorer | | eli.myers1 -
Which is the better keyword strategy? Specifying page or seeing what ranks?
If a person considers his business and the industry where he will compete, the keywords that are most profitable or most precious are usually obvious. Having identified them, the work is then to build one very high-quality page for each of those keywords, and then to form that page into a finely-crafted arrow with characteristics that will defeat all opponents. In difficult contests, multiple subpages are then built, each to attack a subkeyword, and these are prepared so that they provide deep content resources, offer backlinks to the page attacking the root, and demonstrate to Google and all visitors that yours is the world's best resource. The attack might be short, but if the prize is worthy, years might be required. For this, deep knowledge and strong writing skills will be required of you, and possibly assistance from allies. The tracking most valuable for all of the above is the ranking position of your root and primary subkeywords. As time progresses, you will hopefully see increases in the rankings as you put your work forth. Those ranking increases will be the fuel that keeps you moving forward and keeps your mental energies high. Nothing is more valuable than mental energy in this type of competition.
Moz Pro | | EGOL1 -
How to deal with parameter URLs as primary internal links and not canonicals? Weird situation inside...
Hmmm. This is tricky. Some ideas - hope something here is helpful:
- Have you tried "inspect URL" in Search Console? That has information about canonical selections these days and may be helpful.
- Are the canonical URLs (and no-parameter URLs) included in the XML sitemap? Might be worth cleaning that up if there is any confusion.
- Cookies could work - but it sounds to me as though that would go against your client's preferences, as the non-cookie version would have to remove / work without parameters, which you indicated they weren't prepared to do.
- Failing all of that, what about picking one category to be the primary category for each product and canonicalising to that (which will have internal links) instead of to the version with no parameters? Could that work? It might nudge towards the canonical being respected.
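The sitemap clean-up idea above is easy to audit automatically: a sitemap that lists parameterised URLs can contradict the canonical tags. A short sketch, assuming a standard sitemaps.org-format file (the sample XML is hypothetical):

```python
# Flag sitemap <loc> entries that carry query parameters, since the
# sitemap should only contain the canonical, no-parameter URLs.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def non_canonical_entries(sitemap_xml: str) -> list[str]:
    """Return sitemap <loc> values that contain a query string."""
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]
    return [u for u in locs if urlparse(u).query]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/widgets</loc></url>
  <url><loc>https://example.com/widgets?cat=trinkets</loc></url>
</urlset>"""

print(non_canonical_entries(sample))
# → ['https://example.com/widgets?cat=trinkets']
```

Anything this flags is a candidate for removal from the sitemap (or for swapping to its canonical equivalent) before resubmitting in Search Console.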
Intermediate & Advanced SEO | | willcritchlow0 -
How can I avoid duplicate brand name in the title serp?
Hi there, as effectdigital says above, Google has a tendency to append the brand to the end of a title tag where it's confident of it. I think the question here is: how much do you mind it being added to the end? Unless it's causing active harm (which I think it probably isn't), it's probably most effective to just leave it and focus on other things. Wish we had the answer you were looking for!
Technical SEO Issues | | R0bin_L0rd1 -
Google June Core update massive drop in visibility and rank
Hi, I have seen it impact many sites. Most down, a few up. It is the continuation of the roll-out, or tweaking, of the medic update - so the first major update in August '18, then March '19, and then the June one referenced. It was poorly named the medic update - it really should be called the "entity update" or the "trump update". Most are saying it is about content - it is far more complicated than that. We have recovered a few sites from the August update. If your business sits within "YMYL" verticals, then there is plenty to be done. The largest job, in terms of time, is usually a content audit and then a re-write, as the content usually comes up short on audit. The technical elements are just as critical, from schema markup to uniform citations etc. Not sure it helps, but you're not alone.
Search Engine Trends | | ClaytonJ1 -
After 301 redirection non-English keyword points to English language pages
Hi, so the situation looks like this: the domain1.co.uk website is not regional - it targets only the UK, but had pages in multiple languages. The Polish-language pages on domain1.co.uk were redirected (301) to domain2.com, but with a /pl "bit", so: domain1.co.uk/page1 > domain2.com/pl/page1 and domain1.co.uk/page2 > domain2.com/pl/page2. When a user in the UK searches for Polish keywords, for some keywords the result is correct (it shows domain2.com/pl/page2), but for other Polish keywords the UK user gets served domain2.com/page2. Please note that in both cases it is the same location, just different Polish keywords. Pages on both sites were mostly "mirrored", meaning each article in Polish had an English equivalent, so even when a Polish user lands on an English page it's still relevant - though it likely increases bounce rate, as the user won't necessarily understand it. Would you need more info to get a clearer picture of the situation? Thank you.
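The redirect rule described above (every domain1.co.uk path should 301 to domain2.com with the /pl prefix) can be expressed as a tiny helper, which makes it easy to compare each live redirect's target against what it should be. A sketch, using the domains from the question:

```python
# Build the expected /pl redirect target for an old domain1.co.uk URL,
# so each live 301's Location header can be checked against it.
from urllib.parse import urlparse

def expected_target(old_url: str) -> str:
    """Map a domain1.co.uk URL to its /pl equivalent on domain2.com."""
    path = urlparse(old_url).path
    return f"https://domain2.com/pl{path}"

print(expected_target("https://domain1.co.uk/page1"))
# → https://domain2.com/pl/page1
print(expected_target("https://domain1.co.uk/page2"))
# → https://domain2.com/pl/page2
```

Crawling the old URLs and comparing each actual redirect destination with `expected_target` would quickly surface any redirects that land on the English page instead of the /pl version.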
On-Page / Site Optimization | | Optimal_Strategies0 -
Site moved. Unable to index page : Noindex detected in robots meta tag?!
That's hugely likely to have had an impact. No-indexing pages before they were ready was a mistake, but the much bigger mistake was releasing the site early, before it was 'ready'. The site should only have been set live once ALL pages were ported to the new staging environment. Also, if all pages weren't yet live on the staging environment, how can the person looking at staging / the old site have done all the 301 redirects properly? When you no-index URLs you kill their SEO authority (dead). Often it never fully recovers and has to be restarted from scratch. In essence, a 301 to a no-indexed URL moves the SEO authority from the old page into 'nowhere' (cyber oblivion). The key lesson is: don't set a half-ready site live and finish development there. WAIT until you are ready, then perform your SEO / architectural / redirect manoeuvring. Even if you hadn't no-indexed those new URLs, Google checks whether the content on the old and new URLs is similar (think Boolean string similarity, in machine terms) before 'allowing' the SEO authority from the old URL to flow to the new one. If the content isn't basically the same, Google expects the pages to 'start over' and 're-prove themselves'. Why? Well, you tell me why a new page with different content should benefit from the links of an old URL which was different, when the webmasters who linked to that old URL may well not choose to link to the new one. And because those new URLs were incomplete, their content was probably placeholder content (radically different from the content of the old URLs on the old site) - so it's extremely likely that even without the no-index tags, the move would still have fallen flat on its face. In the end, your best course of action is to finish all the content, make sure the 301s are actually accurate (which, by the sounds of it, many of them won't be), lift the no-index tags, and request re-indexation.
If you are very, very lucky, some of the SEO juice from the old URLs will still exist and the new URLs will get some shreds of authority through (which is better than nothing). In reality, though, the pooch is already screwed by this point.
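Before requesting re-indexation, it's worth verifying that the noindex tags really have been lifted from every redirect target. A minimal sketch; the URLs and page bodies below are hypothetical, and in practice you would fetch each page over HTTP:

```python
# Report which pages still carry a robots-noindex meta tag.
import re

NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    re.IGNORECASE)

def still_noindexed(pages: dict[str, str]) -> list[str]:
    """Return the URLs whose HTML still contains a robots noindex tag."""
    return [url for url, body in pages.items() if NOINDEX_RE.search(body)]

staged = {
    "https://example.com/new-page": '<meta name="robots" content="noindex">',
    "https://example.com/ready-page": '<meta name="robots" content="index,follow">',
}
print(still_noindexed(staged))
# → ['https://example.com/new-page']
```

Any URL this reports is still telling Google to drop it from the index, so a 301 pointing at it is sending authority into that same 'cyber oblivion'.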
Intermediate & Advanced SEO | | effectdigital0 -
HTML entity characters in meta descriptions
Thank you for the response, Zee. I've been told that Google is "familiar" with character escaping and will interpret the description correctly. Is there a way to confirm this? We're programmatically populating meta descriptions for one of our applications and having trouble converting the HTML entity characters. Thank you, Ellen
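For the conversion trouble mentioned above, Python's standard library handles both directions. A sketch, assuming the stored descriptions already contain encoded entities (the sample string is hypothetical): decode once to plain text, then escape exactly once when writing the attribute, which avoids double-encoding.

```python
# Round-trip HTML entities in a stored meta description.
import html

stored = "Fish &amp; Chips &#8211; London&rsquo;s best"

decoded = html.unescape(stored)          # back to plain text
print(decoded)
# → Fish & Chips – London’s best

safe = html.escape(decoded, quote=True)  # escape once for the attribute
print(f'<meta name="description" content="{safe}">')
```

Google renders decoded text in the SERP snippet either way, but serving cleanly (singly) escaped attributes removes any ambiguity.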
Technical SEO Issues | | ellenu0 -
Keywords Declination
Don't worry too much about keywords that are that similar. Google's algorithms are smart enough to recognize these small variations. Moreover, as you say yourself, this will lead to awful content and thus a bad user experience. Unless these two words mean totally different things (which they don't), don't start rewriting everything because it simply won't change your rankings.
Technical SEO Issues | | Mat_C0 -
Domain SEO
2, because it's the easiest to remember. In 2019, exact-match domains have less impact on SEO; it's more about 10x content and demonstrating a solid value proposition (watch up to the point where issue #1 is fully outlined). SEO is a pretty vast field in modern times. Coding tweaks and URL slugs are still somewhat important, but they provide slight, slight bonuses on top of your core value proposition (the value-add of your site to the internet). I don't think engagementrings.com is too bad, but without a solid 'idea' and value prop behind it, the URL won't magically make it rank alone.
White Hat / Black Hat SEO | | effectdigital1 -
SEO - New URL structure
Hi there! There seems to be a bit of confusion in this thread between URL structure and information architecture. Having more folders in a URL doesn't itself reduce authority, but pages with more folders in the URL tend to be deeper in the site's linking architecture, which means they tend to have less authority because they aren't as close to the surface. The difference between internal links and URL format is an important one. There's a blog post here which explains in more depth. From my perspective, here are the benefits of having pages within folders:
- There is an opportunity to put more relevant keywords in the URL without stuffing.
- Easier folder-level reporting in Google Analytics, Search Console etc.
- Some increased understanding for Google of how pages hang together - there is some evidence that Google uses folder structure for ranking before it knows much about the page, for example.
In terms of managing authority for pages and signals of relevance, I'd be looking much more towards the internal linking to those pages. I wouldn't rely on Google intuitively understanding the topical connection between two pages unless both of those pages target that topic or have relevant links between them. So for example, say you have two pages: site.com/widgets and site.com/doodads. If those pages are both subcategories of trinkets, you could reformat them to be site.com/trinkets/widgets and site.com/trinkets/doodads. Having "trinkets" in the URL might help both pages rank for "trinkets"-type keywords, like "doodad trinkets" for example. However, I wouldn't rely on this change to help Google understand that widgets are related to doodads - you can handle that much more effectively with relevant internal links between /widgets and /doodads that make the relation clear. In terms of whether there is a risk to making this change - this is essentially a migration, and it definitely comes with associated risks, even if all of your redirects are 1:1 and direct.
It'll take time for Google to find the redirects and new pages, and, as a rule of thumb, link equity isn't passed perfectly along a 301 redirect, so I wouldn't expect these new pages to just inherit the strength of the old ones. I think it comes down to weighing up whether the benefits I listed above outweigh the risk of an in-site migration. If you think the keyword-targeting opportunities will make enough of a difference, then great - but I wouldn't rely on URL structure as a way to get Google to understand your site differently; the impact of internal links is going to be a far greater factor.
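Since the migration risk above hinges on every redirect being 1:1 and direct, it can help to generate the redirect map programmatically rather than by hand. A sketch using the hypothetical trinkets/widgets/doodads example from the thread:

```python
# Build a 1:1 301 map from the old flat URLs to the new folder URLs.
SUBCATEGORIES = {"widgets": "trinkets", "doodads": "trinkets"}

def redirect_map(base: str) -> dict[str, str]:
    """Return {old_url: new_url} pairs for the folder restructure."""
    return {
        f"{base}/{slug}": f"{base}/{parent}/{slug}"
        for slug, parent in SUBCATEGORIES.items()
    }

for old, new in redirect_map("https://site.com").items():
    print(f"{old} -> 301 -> {new}")
# → https://site.com/widgets -> 301 -> https://site.com/trinkets/widgets
# → https://site.com/doodads -> 301 -> https://site.com/trinkets/doodads
```

Generating the map from the category data means no URL gets missed or chained, and the same map can drive both the server redirect rules and a post-migration crawl check.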
Technical SEO Issues | | R0bin_L0rd1 -
Link to webdesign bureau in footer on follow or nofollow
Test it. No-follow a portion of the links (if you have designed hundreds of sites, maybe try 10%). See if your results go up, stay the same or go down. If your results go down, remove the no-follow tags again. Even if the results don't instantly come back (and Google keeps the no-follow reference even when the coding is removed), you won't have lost much, as you kept your sample small. SEMrush and Moz toxicity ratings rely a little too much on linguistic, 'semantic' relevance (e.g. "a link from a car manufacturer to a car insurance site is relevant as they're both about cars"). Deeper relevance (that which Google is actually looking for) is more to do with "why is it relevant for the user to click on this link?" Toxicity scores may simply reflect that you have loads of links from sites which are 'thematically' irrelevant. But that doesn't necessarily make the links themselves irrelevant! It may well be useful for people to Google the sites you made, think they are cool, and wonder who designed them. The truth is, neither SEMrush nor Moz knows exactly what Google thinks a good / bad link is. They basically look for patterns in backlink profiles and linking sites which have been involved in penalties that did occur on Google (which they know from their keyword / ranking indexation data). If suddenly 2-3 sites drop out of the rankings and they all shared similar backlinks, SEMrush and Moz can estimate that those linking sites (under certain circumstances) may be bad. But it's not a 100% guarantee; indeed, if you disavow or no-follow loads of links based on these ratings alone, you often do see little performance dips (without doing more forensic, more holistic research). If your main concern is that site-wide linking may be negatively affecting you, there could be a simple cure for that.
Your idea of producing case studies on your own site is great, but it stops you getting free traffic and leads from the sites you designed - if those sites stop helping you rank well, or stop linking to you. Instead, you could create new pages on the clients' sites. Yeah, seems crazy, but hear me out - I have some logic behind this which might create a good compromise and would be very interesting to test. In the footer on your client's sites, leave a link saying "Webdesign by Conversal". When users click that link, instead of taking them directly to your own site, you could point that link to a page on the client's site with some design sketches and a bit of blurb about how you approached the project. THAT page (on your client's site) could then link to you directly. This way, you'd only get ONE link from each site, but the footer link would remain (though it would become an internal link) and continue to serve you. Maybe this could be a decent solution, but I've never tested it (sorry). The links from these pages on your clients' sites (accessible only from the footer links) could connect with the case-study URLs on your own site, creating a unified experience which leads people down a funnel - to buying a site design from you. I might try that on a few sample clients and monitor the results. If the results didn't drop, I'd at least feel better insulated against Penguin, and would probably then roll out another batch.
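The "no-follow a 10% sample" test suggested above works best if the sample is reproducible, so the same sites can be reverted or the experiment re-run. A sketch with a hypothetical client list; the seed makes the selection deterministic:

```python
# Pick a reproducible 10% sample of client sites whose footer links
# will be switched to rel="nofollow" for the test.
import random

def nofollow_sample(sites: list[str], fraction: float = 0.1,
                    seed: int = 42) -> list[str]:
    """Return a reproducible random sample of sites for the nofollow test."""
    k = max(1, round(len(sites) * fraction))
    return random.Random(seed).sample(sites, k)

clients = [f"https://client{i}.example" for i in range(1, 101)]
picked = nofollow_sample(clients)
print(len(picked))
# → 10
```

Keeping the seed fixed means you can regenerate exactly the same test group later, which matters when deciding whether to remove the nofollow tags again.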
Intermediate & Advanced SEO | | effectdigital0 -
How long does it take until I can see Keyword Ranking results on Moz?
Hey, Thanks for contacting us! Any changes in settings for a related section (Tracked Keywords, Google Analytics profile, search engines, competitors, etc.), mean you will have to wait until the next scheduled update for that section to see fresh data. In most cases, that means it should be reflected in the Campaign's next weekly update. You can check on when that update is scheduled from your Campaign Dashboard — just look for the little italicized date underneath the Search Visibility tile. Any further questions, please feel free to reach out to us direct at help@moz.com Best, Eli
Getting Started | | eli.myers1 -
How to rank a transactional query
Hi, Graphic design is the visual presentation of ideas. It is used to design advertisements for newspapers, magazines, and product packaging, and likewise to design web pages and T-shirts. Graphic designers create advertisements and labels for products that are appealing and easy for consumers to understand. The field of graphic design is continually evolving, and if you are a creative person brimming with visual ideas, you should consider a career in this field. Are you the kind of person who starts pondering the creative process every time you see an advertisement? Are you keen on designing websites that stand out? Do you enjoy reading comics and admiring the artwork? Then you should consider entering the field of visual design. If you fall into any of the above categories and want to make a mark on society, you should consider this career. You can begin by enrolling for a bachelor's degree in arts or design, whether a full degree or a multi-year course. This is not to say you won't find a job if you already have the skills, but the degree will teach a great deal of what you will need to succeed as a creative artist.
This includes using desktop publishing tools like Photoshop, PageMaker, FrameMaker, QuarkXPress, and Acrobat Exchange. You will also gain the extra experience that comes with the internship that is part of the degree. Here is a rundown of the top ten colleges in the United States: 1. Carnegie Mellon University 2. Virginia Commonwealth University 3. Rhode Island School of Design 4. Pratt Institute 5. Maryland Institute College of Art 6. Cranbrook Academy of Art 7. College of Delaware 8. California Institute of the Arts 9. Corcoran College of Art and Design 10. Drexel University. These schools offer a multi-year degree in Fine Arts; you can also choose a longer course and achieve a higher level of qualification. A career in this field means you must continually improve your interpersonal skills, because of the nature of the work: you must sit down with the client, understand the client's needs, and translate their ideas into drawings and plans. This field requires consistent hard work and you may find yourself working long hours, but the work will be satisfying and fulfilling.
Intermediate & Advanced SEO | | alihassanblogger0 -
404 vs 410 Across Search Engines
Thank you both. Upon further digging in Bing's documentation, they do state that either a 404 or a 410 can be used.
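Since both search engines accept either status, the practical question is just when to serve which: 410 signals the removal is permanent, while 404 is a plain "not found". A minimal sketch of that routing decision; the paths and sets below are hypothetical:

```python
# Serve 410 for URLs known to be permanently removed, 404 otherwise.
GONE_FOR_GOOD = {"/old-product", "/retired-category"}
KNOWN_PATHS = {"/", "/products"}

def status_for(path: str) -> int:
    """Pick the HTTP status code for a request path."""
    if path in KNOWN_PATHS:
        return 200
    if path in GONE_FOR_GOOD:
        return 410  # gone: removed on purpose, won't be back
    return 404      # not found: no claim about permanence

print(status_for("/old-product"), status_for("/typo-page"))
# → 410 404
```

In a real server this mapping would feed the response status in your framework's handler; the set of permanently removed paths could come from a retired-products table rather than a hard-coded set.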
Intermediate & Advanced SEO | | sb10300