Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!
Correct use of schema for online store and physical stores
The head office is also the e-commerce store, and there are physical stores that sell the same products in person. So we do want visibility for the HQ as the main entity; if anyone has a problem, they contact either the shop or the HQ/e-commerce store. With that in mind, I still need clarification on which schema to use.
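As a rough illustration (not from the original thread), one way to model this is to mark up the HQ/e-commerce operation as the main entity and each physical shop as a Store pointing back to it via parentOrganization. Everything below (names, URLs, address) is a hypothetical placeholder, and the OnlineStore type is an assumption; a plain Organization would work the same way:

```html
<!-- Hypothetical sketch: HQ as the main OnlineStore entity, with one
     physical shop linked back to it via parentOrganization -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "OnlineStore",
      "@id": "https://www.example.com/#organization",
      "name": "Example HQ (e-commerce store)",
      "url": "https://www.example.com/"
    },
    {
      "@type": "Store",
      "name": "Example Store, High Street",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 High Street",
        "addressLocality": "Exampletown",
        "postalCode": "EX1 2AB",
        "addressCountry": "GB"
      },
      "parentOrganization": { "@id": "https://www.example.com/#organization" }
    }
  ]
}
</script>
```

Putting each Store block on its own location page, with the HQ entity referenced site-wide, keeps the HQ as the primary entity while still giving each branch its own local markup.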
Technical SEO Issues | MickEdwards
Optimal use of keywords in header tag
Hi Bek, thanks for your reply. Below is the message I get. I've checked my source code, which is `<h1>Browse 164 live paralegal jobs</h1>`, and as you can see I have the keyword 'paralegal jobs' once only.

'Why it's an issue: Although using targeted keywords in H1 tags on your page does not directly correlate to high rankings, it does appear to provide some slight value. It's also considered a best practice for accessibility and helps potential visitors determine your page's content, so we recommend it. Over-using keywords, however, can be perceived as keyword stuffing (a form of search engine spam) and can negatively impact rankings, so use keywords in H1 tags two or fewer times. To adhere to best practices in Google News and Bing News, headlines should contain the relevant keyword target and be treated with the same importance as title tags. See Four Graphics to Help Illustrate On-Page Optimization. How to fix it: Use your targeted keywords at the beginning of your H1 headers once or twice (but not more) on the page. Optimal format: keywords in my headers. Sample: # The Moz Blog.'

Let me know your thoughts please, Bek. I'm quite confused about this error message and really need to get it sorted, because I get the same error message for other pages on my site, www.purelegaljobs.com. Thanks a lot, Serg
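For reference, the pattern the tool is asking for (a single H1 with the target keyword used once, near the start) would look something like this; the second heading is a hypothetical example, not from Serg's page:

```html
<!-- One H1 containing the target keyword ("paralegal jobs") exactly once -->
<h1>Browse 164 live paralegal jobs</h1>
<!-- Hypothetical supporting heading; keyword re-use kept to a minimum -->
<h2>Latest paralegal vacancies by region</h2>
```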
Technical SEO Issues | Serg155
Need only tens of pages indexed out of hundreds: is robots.txt okay for Google to proceed with?
Hi vtmoz, Given the limitations you've described, I'd give noindex in robots.txt a try. I've run some experiments and found that a Noindex rule in robots.txt works: it won't immediately remove those pages from the index, but it will stop them from showing in search results. I'd suggest you use the rule with care, and run some experiments of your own. My first test would add only one or two pages, the ones that cause the most trouble by being indexed (maybe due to undesired traffic or to ranking on undesired search terms). Hope it helps. Best of luck! GR
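For reference, the unofficial syntax being described looks like the sketch below; the paths are hypothetical placeholders. Note that this directive was never part of the robots.txt standard, and Google has since announced that it ignores Noindex rules in robots.txt, so verify current behaviour before relying on it:

```
# Hypothetical robots.txt using the unofficial Noindex directive
# (never officially supported; test before relying on it)
User-agent: Googlebot
Noindex: /internal-search/
Noindex: /old-campaign-page/
```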
Search Engine Trends | GastonRiera
Can you please let me know which of these two SEO tools you think is better?
Both tools sound really great in theory. I will definitely use them, as they will help with the simple onsite mistakes that can happen frequently, especially if you're working with huge and unstable websites where onsite errors might not even be your fault. In the case of online stores, TOOL 1 could prove really valuable, as you have thousands of pages and a platform that can literally do anything at any given time. If it can really flag an error quickly enough, that would be priceless. The second tool sounds awesome too, but I will go with the first one, as it seems like an error-prevention tool that will definitely be useful.
Online Marketing Tools | alexspur
Webinar Copy
Yes, it is very likely that the company hosting the webinar will send you a recording or a link to an on-demand video after the event. This could be immediately afterwards or a few days later. If you don't receive a copy of the webinar after several days, I recommend reaching out to the company to request a recording; I'm sure they would be more than happy to send you a copy.
Educational Resources | abbiplunkett
How long does it take to refresh the information?
Hi there! Thanks so much for the great question! I just want to make sure I get you the best answer I can: are you referring to 404 pages in your Campaign's Site Crawl, or are you seeing 404 pages in the Top Pages of Link Explorer? I'd be happy to look into either of those for you! Can you send an email on over to help@moz.com with some examples of the pages you're working with and where you're seeing them marked as 404s? Thanks so much!
Other Research Tools | meghanpahinui
Multilingual Sitewide Links
Without any indicators that Google does think the links are spammy, I wouldn't worry about this too much. If you start to notice performance issues which you can isolate to these footer links, then I'd nofollow them right away (see the sketch after this answer).

Usually site-wide links are only an issue between different domains, and even then, only if it's not a multi-domain site. A multi-domain site is usually where you have exactly the same site with linguistic differences spread across multiple domains (so instead of having site.com/fr/ and site.com/en/, you have site.fr and site.co.uk). As long as the templates are highly similar and Google begins linking the 'brand entity' across those sites, there shouldn't be a problem.

Lots of sitewide links placed in footers across the web (cross-domain) are paid links to manipulate SEO rankings. Those are bad. If your links are 'editorial' in nature (e.g. the site owner or editor decided they were required for user benefit) then I wouldn't be so concerned. There's always the chance Google's algorithm could get it wrong, though, and you could eventually have a problem.

What you need to decide is: would you rather accept some small performance issues now (by removing the links or nofollowing them) and prevent any further 'possible' action in the future, or take a small risk and keep your results solid? No one 100% knows how Google's algorithms work (not even Googlers). As such, there are elements of chance at play here and only you can decide what you are happy with:

A) Undo or nofollow the links now, for a high chance of mild devaluation and some affected results, but an almost 100% guarantee of stopping any site-wide linking penalty (which could wipe out all results) from occurring. The damage of that would be devastating, but the chance of it occurring in the first place is low.

B) Leave the links as they are. Experience no mild devaluations or performance issues at all, for now. But possibly in the future you get struck with a penalty and lose everything. The chances of that seem very low, but if it does happen... ouch.

Sometimes both your choices are less than ideal, but you still have to choose! If it were me, I think (with the information you have supplied thus far) I'd leave it alone for now, but watch performance like a hawk.
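For clarity (this snippet isn't from the original answer), nofollowing a sitewide footer link just means adding rel="nofollow" to the anchor; the domains below are hypothetical placeholders:

```html
<!-- Hypothetical cross-domain footer links; rel="nofollow" asks search
     engines not to pass ranking credit through them -->
<footer>
  <a href="https://www.example.fr/" rel="nofollow">Our French site</a>
  <a href="https://www.example.co.uk/" rel="nofollow">Our UK site</a>
</footer>
```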
Technical SEO Issues | effectdigital
How to overcome Connection Timeout Status Error?
This means that Screaming Frog is not 'waiting long enough' before returning the time-out error. Just do this: https://d.pr/i/D7Cj5M.png (screenshot) https://d.pr/i/N6xdLA.png (screenshot). Raise that number until you don't get 0 / Time Out any more. Note that if it still fails a lot on moderate crawl settings, there are likely underlying page-speed issues (either that, or the machine you are crawling from has poor bandwidth). It could also be that your crawl frequency is too high: go to Configuration -> Speed and lower the thread count (to 2-3) and the URI/s (to one or two). Finally, it might be that the Screaming Frog user agent is blocked, so go to Configuration -> User Agent and switch it to Chrome. To help you, I did the crawl for you; here is your crawl data: https://d.pr/f/a1ux4b.zip (archive of crawl file and some exports). You can use my crawl file as a starting point (by double-clicking it) when you want to re-crawl in future. It should be useful to you.
Feature Requests | effectdigital
Local Search filter or penalty
Hi David, Yes, unfortunately, Google's support is often not the best. That's why I'm recommending you take this to the forum and ask for the help of the expert volunteers there.
Local Listings | MiriamEllis
Refund
Hi there, Jo here from the Moz Help team. All queries of this nature should be directed to help@moz.com. I can see you've already sent us a request there, so I'll reply to you directly. Cheers! Jo
Technical Support | jocameron
Duplicates
Hi there, Sam from Moz's Help Team here! Sorry for the confusion! The "Check Listing" tool is populated with all of the various NAP (name/address/phone) matches that we could find out on the web among all of the different directories that we report on. So, while we can't amend that list directly, the process of submitting your accurate business information to each of these directories will inherently work to consolidate the list. Additionally, going through the "Duplicates" section of a listing and closing extraneous listings can also aid this effort!

When selecting a listing to purchase, you'll want to start with the most correct "Verified" listing available; those are the ones whose information is being pulled directly from a verified Facebook Places or Google Maps page. From there, you'll begin correcting and assimilating listings to that chosen set of correct NAP info. Seeing multiple "verified" sources will indicate multiple Google/Facebook listings, which you would have to close or remove directly so you have one result for each service. This is where we "import" your NAP data from to submit to our data partners.

Let me know if you have any other questions!
Moz Local | samantha.chapman
How Have You Managed GDPR?
We actually found that, whilst it requires strict management in terms of file transfers, GDPR wasn't as scary as everyone said it would be.

One thing we did was to sign up for Wizuda, a GDPR-compliant file-transfer system (previously we just sent stuff to clients through Dropbox links, Sync.com links or WeTransfer links). It's important to note that a compliant file-transfer system doesn't 'make' all your file transfers GDPR compliant. It provides a platform which records certain info and erases files past a certain date, thus 'enabling' you to be GDPR compliant (but not guaranteeing that your actions will make it so).

We also asked clients who wanted to transfer data to us to sign up to it and to send a covering note (through Wizuda mail) on every single file which they fired through to us. If they don't include the note, we delete the file and reject the transfer. The note they must send to us goes something like this: https://d.pr/i/tIhQBK.png (screenshot from Wizuda Mail - redacted).

We also initially got a lot of pressure whereby our Account Managers were going directly to analysts (who were, at the time, managing GDPR transfers) and trying to push through 'stuff that the client just wanted', without the client having properly proven that they owned the data and had the 'right' to transfer it to us for marketing activities. Needless to say, we immediately clamped down on that with full force, by creating an interactive (digital or printable) 'fillable' PDF form which AMs have to get filled in (by the client) before we accept ANY inbound data which contains any PID: https://d.pr/i/1nkG5F.png (PDF screenshot - redacted).

Since only Account Managers have a relationship with a client and can tell them 'no, you do not have permission to legally do this, and we will not support you with illegal data transfers', it made sense to unburden those physically transferring the data and leave it up to higher-level AMs / ADs and clients to sort out between themselves. We have now adopted more advanced approaches, but all this stuff was an integral stop-gap.

This all prevented two things: 1) us transferring data which was not GDPR compliant to clients, and 2) clients being able to get us to work on illegally transferred data, which would make us an accessory to their malpractice.

Some think we went crazy and went way too far, but I'm pleased that we're taking more steps every day to ensure full GDPR compliance. That being said, even our initial steps were really strong. The truth is, no one knows whose practices are or are not safe. Most of this GDPR stuff hasn't worked its way through the courts yet, and until that happens, who's to say which approach is most compliant? I think we're doing well, though.

At the beginning we were quite scared that our email marketing would die off. But actually that's not the case! It just has much less churn than before. To be honest, the people who were targeted before GDPR came into play, who may not have given explicit permission for our client(s) to share their data, were the group who never really converted anyway. The people who signed up to be contacted, who demonstrated their interest, supplied far more of our client(s)' conversions. So in a way it was kind of irrelevant; it just means we spend less on firing out emails in the first place. Most ethical, strong-performing email marketing is retargeting, and usually users have to interact and give consent for that to happen anyway (subscribe to our newsletter, etc.).
International Issues | effectdigital
Is there a Risk Around Creating a Website for Each Country in The World?
Unfortunately, yes. We have a number of clients who went 'geo-mad', and in almost all situations it has caused problems for them.

Sometimes it has created colossal site footprints which Google doesn't care to index (unless you're a household name, don't expect Google to care about your hundreds of thousands of URLs). Sometimes that has also caused server-load issues for them, irrespective of Google. Other issues include Google ignoring their canonical tags and setting one language URL as the 'canonical' result (and thus de-indexing the other language URLs). This can happen due to link signals and similar content, stuff like that.

Many clients in such a position have **seen their pages devalued as a result of going against Google's content guidelines** (and simplicity guidelines). If you're not super important, Google doesn't want to waste 4x, 6x or 20x crawl budget on your site just because you decided to serve more combinations of language and geo-location. Even with perfect hreflang deployments, a lot can go wrong if you go nutty, so cherry-pick your language/geo combinations and don't be greedy with it (a minimal hreflang block is sketched below).

If your brand is powerful online and you have loads of SEO authority / ranking power, then you can deploy hreflangs extensively, and usually you can make real gains. Not everyone is in that position; most aren't. Having unique content (not powered by some crappy auto-translate plugin) per deployment is strongly, strongly recommended. By the way, if you have less ranking power than most sites which have 'successful' broad-reaching hreflang deployments, you need to adhere to Google's guidelines more strictly than those sites do. You need to make up for your lack of trust and authority by doing things by the book.

Too many people look at big international sites and say: "well, Google lets them use relatively thin content, so I should be alright too". Nope. You are likely standing upon a platform of radically different stature to those guys, so don't over-reach too quickly or you'll stumble and fall.

Also, if you are planning to use canonical tags to 'canonical' from one language to another, don't do that. If a page points to another, separate URL with its canonical tag, then it tells Google that it (the active page) is the non-canonical version, and it usually de-indexes itself.

Be very careful how you proceed. If you increase your footprint too far, all the great authority you have built up may bleed out over a sprawling site, and you could end up with nothing.
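For reference (not from the original answer), a minimal hreflang block for a two-locale deployment might look like the sketch below; the domains and locales are hypothetical placeholders. Each page variant must carry the full set of annotations, including a self-referencing entry:

```html
<!-- Hypothetical hreflang annotations, repeated on every page variant -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr-fr/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```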
International Issues | effectdigital