Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
Google Search Console actually has a URL removal tool built into it; unfortunately it's not really scalable (mostly it's one-at-a-time submissions), and on top of that the effect of using the tool is only temporary (the URLs come back again).

In your case I reckon that changing the status code of the 'gone' URLs from 404 ("not found, but it might come back") to 410 ("gone for good") might be a good idea. Google might digest that better, as it's a harder de-indexation signal and a very strong crawl directive ("go away, don't come back!").

You could also serve the meta noindex directive on those URLs. Obviously you're unlikely to have access to the HTML of non-existent pages, but did you know noindex can also be fired through X-Robots, via the HTTP header? So it's not impossible: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404 (Ctrl+F for "X-Robots-Tag HTTP header")

Another option is this form to let Google know outdated content is gone, has been removed, and isn't coming back: https://www.google.com/webmasters/tools/removals ... but again, one URL at a time is going to be mega-slow. It does work pretty well, though (at least in my experience).

In any eventuality I think you're looking at a week or two for Google to start noticing in a way that you can see visually, and then maybe a month or two until it rights itself (caveat: it's different for all sites and URLs, so it's variable).
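To illustrate the 410-plus-header approach, here's a minimal sketch of the server-side decision logic. The paths and function name are hypothetical, purely for illustration; how you wire this in depends on your stack (web server config, CMS plugin, etc.):

```python
# Hypothetical sketch: respond 410 Gone with an X-Robots-Tag noindex
# header for URLs removed after the hack, instead of a plain 404.
# The path list is illustrative, not from any real site.
REMOVED_PATHS = {"/hacked-page-1", "/hacked-page-2"}

def build_response(path):
    """Return (status_code, extra_headers) for a request path."""
    if path in REMOVED_PATHS:
        # 410 says "gone for good" (a stronger signal than 404), and
        # X-Robots-Tag applies noindex without needing any page HTML.
        return 410, {"X-Robots-Tag": "noindex"}
    return 200, {}
```

Most servers let you achieve the same effect declaratively (e.g. a rewrite rule plus a header directive) rather than in application code.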
Intermediate & Advanced SEO | | effectdigital0 -
Site name in page title - leave it or remove it?
I always include the site name in the title. It also helps from a brand-search perspective, encouraging people to click through to your page.
On-Page / Site Optimization | | Kelly-Anne1 -
Is Bing also ignoring meta Keywords tags?
No, Bing doesn't use it. Source: Bing's Duane Forrester. Also here: "But as the engines get smarter with and about signals, and as new, trustworthy signals are grown and adopted, the SEO of yore becomes a bit less relevant. No one really cried when we all walked away from meta keywords tags after they were inundated with spam. No one cried when keyword density became a passé topic, largely covered up in the then somewhat novel approach of "making quality content"."
On-Page / Site Optimization | | KevinBudzynski0 -
Downsides on shortening article title?
Thank you for your replies José, Christy and Salience. It looks like you are right, and the ('SEF URL') is fixed: it will not change when I change the title: https://screencast.com/t/XR6lS6YdL For now, changing URLs sounds a bit too risky for me to start with yet. I'm just trying to create the best articles, content-wise. It feels a bit odd that URLs are going to be different from the titles, though. But I guess that's a better situation than having titles that are too long.
On-Page / Site Optimization | | RaoulWB1 -
Business not being found via Moz Local on Facebook
Thanks, Lauren, for the reply and the advice! I will talk to my boss when we are back in the office tomorrow to see what we can fix and will keep you updated. Thanks again! Andrea
Moz Local | | olsonah822 -
Does google sandbox aged domains too?
My keywords now start to show on Google's second and third pages. I think I should wait to see some more improvement. Only a few links are showing in Search Console. Moz and Ahrefs show 300+ referring domains. I should wait until all the referring domains start to show in Search Console.
White Hat / Black Hat SEO | | Steven231 -
Removing the Trailing Slash in Magento
You could always force trailing slashes instead of removing all trailing slashes. What you really want to establish is which structure has been linked to more often (internally and externally). A 301 redirect, even a deeper, more complex rule, is seldom the answer in isolation. What are you going to do (for example) when you implement this, then realise most of the internal links use the opposite structure to the one you picked, and all your internal links get pushed through 301s and your page-speed scores go down?

What you have to do is crawl the site now, in advance, and work out the internal structure. Spend a lot of time on it, days if you have to; get to grips with the nuts and bolts of it. Figure out which structure most internal/external links utilise, and then support it.

Likely you will need a more complex rule than 'force all' or 'strip all' trailing slashes. It may be the case that most pages contain child URLs or sub-pages, so you decide to force the trailing slash (as traditionally that denotes further layers underneath). But then you'll realise you have embedded images in some pages with URLs ending in ".jpg" or ".png". Those are files (hence the file extension at the end of the URL), so with those you'd usually want to strip the slash instead of forcing it. At that point you'd have to write something that says: force the trailing slash unless the URL ends with a file extension, in which case always remove the slash (or similar).

Picking the right structural format for any site usually takes a while and involves quite a bit of research. It's a variable answer, depending upon the build of the site in question and how it has been linked to externally, from across the web. I certainly think that too many people use the canonical tag as a 'cop-out' for not creating a unified, strong, powerful on-site architecture.

I would say do stick with the 301s and consolidate your site architecture, but do some crawling and backlink audits; really do it properly, instead of just taking someone's 'one-liner' answer online. Here at Moz Q&A there are a lot of people who really know their stuff, but there's no substitute for your own research and data.

If you're aiming for a specific architecture and have been told it could break the site, ask why. Try to get exceptions worked into your recommendations which flip the opposite way, i.e. "always strip the trailing slash, except in X situation where it would break the site; in X situation, always force the trailing slash instead".

Your ultimate aim is to make each page accessible from just one URL (except where parameters come into play; that's another kettle of fish to be handled separately). You don't have to have EVERYTHING on the site one way or the other in 'absolute' terms. If some URLs have to force the trailing slash whilst others remove it, fine. The point is to get them all locked down to one accessible format, but you can have varied, controlled architectures inside one website.
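As a rough sketch of the "force the trailing slash except for file extensions" rule described above (the extension list is illustrative; a real Magento/Apache/nginx rewrite rule would express the same logic declaratively):

```python
import re

# Hypothetical normalization rule: page URLs get a trailing slash forced;
# URLs ending in a file extension get the slash stripped instead.
# The extension list below is an example, not exhaustive.
FILE_EXT = re.compile(r"\.(jpg|jpeg|png|gif|pdf|css|js)$", re.IGNORECASE)

def normalize(path):
    """Return the single canonical form of a URL path."""
    path = path.rstrip("/")
    if FILE_EXT.search(path):
        return path            # file: no trailing slash
    return (path + "/")        # page: force trailing slash
```

The point of a function like this is that every variant of a path collapses to exactly one canonical form, which is what the 301 rule should target.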
Intermediate & Advanced SEO | | effectdigital0 -
HomePage Stopped Ranking For Brand on Aged Site
I think Google might actually have a problem with my site title. I just googled "BlowFish SEO West Palm Beach" and this is the result I got.
Local Website Optimization | | BlowFish-SEO0 -
Too many SEO changes needed on a page. Create a new page?
Firstly, see what Google thinks of the page. On-page SEO checkers require wider experience to use accurately and efficiently; you can't just take insights from online tools and run with them.

If the page has no associated keywords / search queries (in Ahrefs, GSC, Google Analytics, SEMrush etc.) then that would show Google isn't really that interested. Because keyword data (even from Google) is always heavily sampled, you'd also want to check whether, as a 'landing page' (in Google Analytics), the page has been receiving any traffic from Google (Organic Search segment).

If the page is doing well despite what online tools say, rules be damned. If the page isn't performing well (or at all), then just re-build it from scratch in line with best practices, and 301 redirect the old URL to the new one. If the URL stays the same, then just rebuild on the active URL; that's fine too (but don't publish until the complete re-build is 100% finished and you are 110% happy with it).
Intermediate & Advanced SEO | | effectdigital1 -
Geo-location by state/store
If you consistently see the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are many options for this, like blocking the resources that handle it (i.e. the JavaScript .js file associated with personalization based on history or geo-location), or what was suggested above. Blocking that piece of script in the robots.txt file is less likely to be seen as cloaking.

All of this begs the question, though: if you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode using Chrome (and thus not accept cookies), would I see the same site and product assortments no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.

Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the visitor has a cookie, but the URL should probably never change, and the content shouldn't change by IP if they don't have a cookie:

1. Check the IP
2. Embed their location in a cookie
3. Set the cookie
4. If the cookie is accepted and thus exists, personalize. If the cookie does not exist, do not personalize.

You can show a message that says you must accept cookies to get the best experience, but don't make it block any major portion of the content.
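The four steps above can be sketched as a single request-handling decision. This is a hypothetical illustration (the cookie name and return values are invented, not from the poster's site); the key property is that a cookie-less visitor, including Googlebot, always gets the default content:

```python
# Hypothetical sketch: personalize only when a location cookie exists.
# Bots and cookie-less visitors always see the same default content,
# so nothing varies by IP alone (which avoids cloaking concerns).
def choose_content(cookies, ip_location):
    """Return (content_variant, cookie_to_set) for one request."""
    loc = cookies.get("store_location")
    if loc:
        # Cookie present and accepted: safe to personalize.
        return f"store:{loc}", None
    # No cookie (Googlebot, incognito, first visit): serve the default,
    # and offer a cookie derived from the detected IP location.
    return "default", ("store_location", ip_location)
```

On a second request, the visitor who accepted the cookie gets the personalized variant, while a client that never returns the cookie keeps seeing the default.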
Local Website Optimization | | Everett1 -
Drop in performance
Impressions don't really mean much at all, as Google often experiments with ranking sites for new keywords, then decides they are not relevant and takes the rankings away again. What we really need to see is 1yr+ of traffic (Google Analytics) and clicks (Search Console). Even that may not be enough to define exactly what's going on.

The site is mainly tagged with the FA language, which is Persian. As far as I know, most Persian speakers live in Iran, which used to be called Persia (and which is still named Persia by some people, though on the international stage the nation is referred to as Iran).

Right now there's a lot going on in the news between the USA (where Google is based) and Iran. It does make me wonder, does make me consider: could that be part of the issue? Obviously it would be impossible to gain clarification, but... One thing I know is that Google is foremost an American company, and that right now the USA has "engaged in a campaign of maximum financial pressure on the Iranian regime and intends to enforce aggressively these sanctions that have come back into effect" - source

Who knows what's going on behind the scenes. Right now, Google is really clamping down on 'soft' medical practices within their SERPs, which we know from all the YMYL / Medic updates. I know that Google has only a limited presence in Iran (as you can see there, they won't even give Google-Iran a TLD; they use parameters in the URL structure to sort of generate a relevant page). This could in part be due to internet censorship in Iran. We know that even the app market, Google Play, is extremely locked down in Iran.

Without taking sides or making any judgements on the international level (something we wouldn't do), it does seem that Google has difficulties operating in Iran in the same way that they operate in the West.

The USA is clearly sending signals to Iran right now on the international stage (which are also being returned); as such, it's not hard to see that an Iranian site (especially one with potential Medic / YMYL issues) might fail to rank on an American search engine.

Your site seems to use the "Netmihan Communication Company Ltd" ISP, which would confirm that the site is based in Iran (rather than just being built for an Iranian audience by those who may be external to Iran). I have the city down as Rasht.

Taking no sides here, it's possible that your site has become a casualty of international conflict (at least on the communication and economic level) and additionally of the YMYL / Medic updates, which may have stung you regardless of your location.

Hope this is helpful to you; hope you have a great day.
Technical SEO Issues | | effectdigital0 -
Free trial Subscription
Hi, According to the conditions, no. You have to make sure to cancel it before the 30 days are up. Anyway, you can get in touch with Moz Help at https://moz.com/help/contact Regards
Technical Support | | josellamazares0 -
My Company Doesn't Appear in the Search Results on the Right When I Search for It
Hi VELV, So glad you came back to the thread. Can you clarify some things for me, please? Is this your company: https://www.ifixappliancesla.com/ Is it in any way related to this company: http://ifixappliancerepair.com/ When you are searching for just "iFix Appliances" and not seeing either your Van Nuys or Beverly Hills Google Business Profile come up, where are you physically located when doing those searches? If you are standing in your location in Van Nuys and doing the search, what do you see? And the same with Beverly Hills. Is there absolutely no Google Business Profile (what you called the knowledge panel) coming up even if you are standing in the location of each while searching? When you do add the geo-modifier (Van Nuys or Beverly Hills) does the GBP then come up? How long ago did you create the listings for these 2 locations?
Local Listings | | MiriamEllis1 -
Will I lose SEO value if I rename my URLs to be more keyword-friendly?
I will check those guides and see how I can work on optimizations. Thank you.
Intermediate & Advanced SEO | | Spiros.im0 -
Getting Google to index our sitemap
Now I can see the sitemaps; loading takes a long time, and they look weird, but maybe they're OK. There is content in them, though, which I would not like to have in Google's index. Nonetheless, what is the message in GSC? (I opened them in Chrome, Firefox and on my Pixel as well. The first one looks good; all the linked ones once had the error, and now they differ from each other - with line breaks or without, with spaces or without - but at least they contain links.) Is the sitemap on a subdomain for pages on a different domain? (I didn't see that.) That makes it way trickier ...
Intermediate & Advanced SEO | | paints-n-design0 -
Reducing Negative Impact of Webpage Login Form
It's horrible that you have been forced to adopt this, as it's basically forced suicide for the website. You're likely up against large aggregator sites who have more commercial properties (inside and outside of New York) on their books than you do. Places where users can go and process just one request to see 20-30% of all available online results. These sites have a massive USP, as for most of us who are lazy, they are one-stop shops we learn to trust.

To assail and overcome these giants, even around the more niche fringes, you need solid value propositions and value-add for your site. Why go to you, instead of an easier, larger supplier? Maybe you have some reasons:

- Better prices
- Better customer service
- Easier-to-use website once you register

BUT - being forced to register is a massive value-subtraction measure. It subtracts HUGE amounts of bonus points from your value proposition, which is ultimately what you ride on to rank well.

Not only are you having bounce rate issues, your search traffic may soon tumble into the dirt. Right now, your organic search (SEO) performance seems pretty solid, looking at the charts on Ahrefs (which I still think are superior to the SEMrush ones by a long shot). This single change could jeopardize all of that.

By the way, even if I change my user agent to Googlebot, the form still comes up - so they will definitely be aware very soon (if not already) of this user obstruction. It's such a shame, because it looks as if you guys started lifting in May 2018 and have been on the up-and-up ever since. To have such a change forced is literal insanity.

Maybe there are some small CRO things you can do to get 3-15% more conversions or leads. But if your SEO traffic plummets 60%, you will just be left thinking "wow, that was a terrible, terrible deal".

You've already given people the easy options by allowing them to log in with Google or Facebook. But people only really like to do that for big brands they can trust; otherwise they feel they are somehow connecting their personal accounts with a site they are not 'quite sure' they trust yet.

Sorry to say, but I think this was a bit of a Doomsday maneuver. Your only real option is to incentivise account creation with deals and discounts or something like that. People are pretty astute these days; they will want something back for their trouble.

In reality it would be better to wait until users were about to convert (going through whatever conversion funnel you have set up), and then at the very end, when you already have all of their info (which they need to give you to properly contact you), set up an auto-filled form (based on the data they have already input) and ask if they want to create an account in a simple, easy, 1-click style. That would (probably) have increased your uptake in account creations. But whoever was making the decisions got too greedy and got forceful. It won't work, and it may have severe repercussions.
Conversion Rate Optimization | | effectdigital2