Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
Can a page that's 301 redirected get indexed / show in search results?
In response to your question, it really depends on how long his current page has been active and how long it has been indexed by Google. If there are links pointing to his current bio, it will stay active in the SERPs longer. Over time the original page will stop showing in the SERPs and will be replaced by the new page with his middle initial. It is always better to have more information than not; it is just like long-tail keywords. If you type his full first, middle, and last name into search, he will most likely rank for all three queries as long as his domain has relevant authority. I hope this helps!
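For reference, the 301 itself is usually a one-line server rule. A minimal Apache .htaccess sketch (the paths here are hypothetical placeholders, not from the question):

```
# Permanently redirect the old bio URL to the new one (Apache mod_alias)
Redirect 301 /team/john-smith /team/john-q-smith
```

Once Google recrawls the old URL and sees the 301, the new URL gradually replaces it in the index, which is the behaviour described above.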
Technical SEO Issues | | Colemckeon0 -
Can I track Page Authority on a custom report?
Hi there! Sam from Moz's Help Team here! Yes, you can indeed add Page Authority as a module; however, you will only be able to do this at the root domain level. In the Custom Reports area > Add Modules, head to the Links module and then 'Inbound Links'. You can then add the first module that appears, for your own site as well as for your competitors. If you're running into trouble finding this, just shoot us an email at help@moz.com and we'll be happy to assist!
Feature Requests | | samantha.chapman1 -
Meta descriptions in other languages than the page's content?
Thanks Andreas, that makes sense. In this case I think I'll apply the noindex strategy for those pages; the website has great potential and I wouldn't want to risk a bad reputation.
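For anyone implementing that strategy, noindex is set per page in the document head. A minimal sketch:

```html
<!-- Keeps this page out of the index while still letting crawlers follow its links. -->
<!-- The page must remain crawlable (not blocked in robots.txt), or Google never sees the tag. -->
<meta name="robots" content="noindex, follow">
```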
Local Website Optimization | | Andreea-M0 -
How often does Google review Featured Snippets? What do you think?
Hi, I had the same question. My site LO+EIR had several pages with and without featured snippets. I even tried asking Google through GSC to review the page multiple times, but sometimes it worked and sometimes it didn't...
Search Engine Trends | | lomaseir1 -
Should I exclude my knowledge center subdomain from indexing?
Well, the advantage of having a subdomain is that you can target a specific audience with that specific subdomain, since Google will treat it as its own unique site. The biggest disadvantage is that if you don't do it right, you will not get the expected results; you could even draw traffic away from your main domain.
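If you do decide to exclude it, one common approach is a blanket robots.txt on the knowledge-center subdomain. A minimal sketch (the subdomain name is a hypothetical placeholder):

```
# Served at https://kb.example.com/robots.txt - blocks crawling of the entire subdomain
User-agent: *
Disallow: /

# Note: pages blocked here can still end up indexed if external links point to them;
# if they are already indexed, a noindex meta tag (with crawling allowed) is the safer removal path.
```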
Intermediate & Advanced SEO | | jasongmcmahon2 -
High Fives for new Moz Pro interface
So happy to hear that you are enjoying the new updates! I will be sure to pass this feedback over to the team in charge of the changes.
Feature Requests | | lauren.s3 -
If I migrate to a new domain, does my Domain Authority score get migrated also?
Hi there, Sam from Moz's Help Team here! Sorry for any confusion. The thing is, for us to update any 301'd link, we need to re-discover the old pages to be able to follow the redirects and update the metrics. As our index continues to grow and links are rediscovered, links which have changed should be marked as “lost” for the original domain and “discovered” for the new domain. In order for our crawler to update metrics to reflect the new domain, it needs to recrawl the pages linking to your site and discover the links which redirect. The discrepancies between the DA of your old domain and new domain are probably due to previously discovered external links that haven't moved over just yet or haven't been rediscovered. I hope this helps to explain; definitely let me know if you have any follow-up questions!
Link Explorer | | samantha.chapman1 -
Value of Links? What is each link worth?
We run "in house" websites. That gives us knowledge of many other businesses in the industry. For businesses that we don't know, we can easily check the professional credentials of individuals and companies. Google is able to check professional credentials of individuals that include: government-issued licenses, professional registrations and memberships, college degrees, Wikipedia articles, awarded patents, registered copyrights, and publication history in professional journals and on industry websites. For companies, Google can aggregate the professional credentials of their staff, company certifications such as ISO (and similar), stock exchange listings, Wikipedia articles, awarded patents, and registered copyrights. DA is a very primitive tool in comparison.
Link Building | | EGOL1 -
Moz not able to crawl our site - any advice?
Hey, we crawled the site just fine with Screaming Frog, so perhaps this is a Moz issue? It looks like they are well aware of it, though. Thanks!
Getting Started | | Salience_Search_Marketing1 -
A mass of spammy links pointing to our domain from dynu.net
I would just disavow them to be careful. Who wants to keep looking over their shoulder?
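If you go that route, the disavow file is just a plain-text upload to Google's disavow tool. A minimal sketch (dynu.net is the domain from the question; the commented URL is a hypothetical example):

```
# Disavow all links from the spammy domain
domain:dynu.net

# Individual URLs can also be listed one per line, e.g.:
# https://spam.example.com/bad-page.html
```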
Moz Tools | | waqid0 -
Will Google Count Links Loaded from JavaScript Files After the Page Loads
Good answer. I completely abandoned the banner I was thinking of using. It was from one of those directories that will list your site for free if you show their banner on your site. Their code of course had a link to them with some optimized text. I was looking for a way to display the banner without becoming a link farm for them. Then I decided that I did not want that kind of thing on my site, even inside a JavaScript onload event, if Google is going to crawl it anyway, so I chose not to add it.

Then I started thinking about user-generated links. How could I let people cite a source in a way that the user can click on, without exposing my site to hosting spammy links? I originally used an ASP.NET LinkButton with a ConfirmButtonExtender from the AJAX Control Toolkit that would display the URL and ask the user if they wanted to go there. Then they would click the confirm button and be redirected. The problem was that the URL of the page was in the head part of the DOM.

I replaced that with a modal popup that calls a JavaScript function when the link button is clicked. That function makes an AJAX call to a web service that gets the link from the database, and the JavaScript then writes an iframe into a div in the modal's panel. The result should be the user being able to see the source without leaving the site, but a lot of sites appear to block the frame using things like X-Frame-Options, so I'm probably going to use a different solution that keeps the modal without the iframe. I am thinking of using something like cURL to grab content from the page and write it into the modal panel along with a clickable link. All of this happens after the user clicks the link button, of course, so none of it will be in the source code when the page loads.
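For anyone wanting to try the same pattern, here is a minimal browser-side sketch. The /api/get-link endpoint, the data-link-id attribute, and the element IDs are all hypothetical, and sites sending X-Frame-Options will still refuse to render in the iframe:

```html
<a href="#" class="cited-source" data-link-id="42">View cited source</a>

<div id="source-modal" hidden>
  <p id="source-url"></p>
  <button id="confirm-open">Load in frame</button>
  <div id="frame-holder"></div>
</div>

<script>
  // Fetch the real URL only after a click, so it never appears in the
  // HTML that crawlers see on page load.
  document.querySelectorAll('.cited-source').forEach(function (link) {
    link.addEventListener('click', async function (event) {
      event.preventDefault();
      var response = await fetch('/api/get-link?id=' + link.dataset.linkId); // hypothetical endpoint
      var url = await response.text();
      document.getElementById('source-url').textContent = url;
      document.getElementById('source-modal').hidden = false;
      document.getElementById('confirm-open').onclick = function () {
        // Sites that send X-Frame-Options / frame-ancestors will block this frame.
        document.getElementById('frame-holder').innerHTML =
          '<iframe src="' + encodeURI(url) + '" width="100%" height="400"></iframe>';
      };
    });
  });
</script>
```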
On-Page / Site Optimization | | CopBlaster.com1 -
I think i hv mistakenly subscribed for the service, and paid for two months already
I feel you. I once found out that I had been paying US Search for over a year. I sincerely hope they have a refund policy if you have not even been using it. US Search told me I could only get the last month back.
Technical Support | | CopBlaster.com0 -
Is it necessary to have unique H1's for pages in a pagination series (i.e. blog)?
Read what EGOL wrote. It depends upon the nature of your blog pagination. There are a few reasons you could have pagination within the blog area of your site:

1. Your articles have next buttons and different parts of the article are split across multiple URLs, so the content across the paginated elements is distinct.
2. Your post feeds are paginated, purely so people can browse to pages of 'older posts' and see what you wrote way back in your archives.
3. Your blog posts exist on a single URL, but when users comment on your posts, your individual posts gain paginated iterations so that users can browse multiple pages of UGC comments (as they apply to an individual post).

In cases 2 and 3 it's not necessary to have unique H1s or page titles on such paginated addresses, except under exceptional circumstances. In case 1 you should make the effort!
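To illustrate cases 2 and 3: paginated archive URLs can safely repeat the H1, and if you want to disambiguate anything, a page number in the title tag is plenty. A minimal sketch (the URL and site name are hypothetical):

```html
<!-- https://example.com/blog/page/2/ -->
<title>Blog Archive - Page 2 | Example Site</title>
<link rel="canonical" href="https://example.com/blog/page/2/">

<!-- Same H1 as page 1 of the archive; no need to make it unique. -->
<h1>Blog</h1>
```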
Search Engine Trends | | effectdigital0 -
Correct Localisation of my website on Google
Did you add the correct data for your business? I'm also working on a gaming blog, which you can see here. And I want to use a localization service.
Local Website Optimization | | fatetmpwcosl1 -
Product schema GSC Error 'offers, review, or aggregateRating should be specified'
Really interested to see that others have been receiving this too; we have had this flagged on a couple of sites / accounts over the past month or two.

Basically, Google Search Console's schema error view is 'richer' than that of Google's stand-alone schema tool, which has been left behind a bit in terms of changing standards. Quite often you can put the pages highlighted by GSC (Google Search Console) into Google's schema tool and they will show as having warnings only (no errors), yet GSC says there are errors (very confusing for a lot of people).

Let's look at an example:

https://d.pr/i/xEqlJj.png (screenshot step 1)
https://d.pr/i/tK9jVB.png (screenshot step 2)
https://d.pr/i/dVriHh.png (screenshot step 3)
https://d.pr/i/X60nRi.png (screenshot step 4)

... basically the schema tool separates issues into two categories, errors and warnings. But Google Search Console's view of schema errors is now richer and more advanced than that (so adhere to GSC specs, not schema tool specs, if they ever contradict each other!).

What GSC is basically saying is this: "Offers, review and aggregateRating are recommended only and usually cause a warning rather than an error if omitted. However, we are now taking a more complex view. If any one of these fields / properties is omitted, that's okay, but one of the three MUST now be present, or it will change from a warning to an error. So to be clear, if one or two of these is missing, it's not a big deal; but if all three are missing, to us at Google the product no longer constitutes a valid product."

So what are the implications of having schema which generates erroneous, invalid products in Google's eyes? This was the key statement I found from Google. Google have this document on the Merchant Center (all about Google Shopping paid activity): https://support.google.com/merchants/answer/6069143?hl=en-GB

They say: "Valid structured markup allows us to read your product data and enable two features: (1) Automatic item updates: Automatic item updates reduce the risk of account suspension and temporary item disapproval due to price and availability mismatches. (2) Google Sheets Merchant Center add-on: The Merchant Center add-on in Google Sheets can crawl your website and uses structured data to populate and update many attributes in your feed. Learn more about using Google sheets to submit your product data. Prevent temporary disapprovals due to mismatched price and availability information with automatic item updates. This tool allows Merchant Center to update your items based on the structured data on your website instead of using feed-based product data that may be out of date."

So basically, without 'valid' schema mark-up, your Google Shopping (paid) results are much more likely to be rejected, at a higher frequency, as Google's organic crawler passes data to Google Shopping through schema (and presumably they will only do this if the schema is marked as non-erroneous). Since you don't use Google Shopping (PLA - Product Listing Ads) - well, you haven't said anything about this - that 'primary risk' is mostly mitigated.

It's also likely that without valid product schema, your products will not appear as 'product' results within Google's normal, organic results. As you know, occasionally product results make it into Google's normal results. I'm not sure if this can be achieved without paying Google for a PLA (Product Listing Ad) for the hypothetical product in question.
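To make the fix concrete, here is a minimal Product snippet that clears the 'at least one of offers / review / aggregateRating' requirement by including an offer (all values are hypothetical placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://example.com/images/widget.jpg",
  "description": "A placeholder product used to illustrate the markup.",
  "offers": {
    "@type": "Offer",
    "url": "https://example.com/widget",
    "price": "19.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

On my reading of the GSC message, adding review or aggregateRating as well is still recommended, but any one of the three should be enough to turn the error back into a warning.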
If webmasters can occasionally achieve proper product listings in Google's SERPs without PLA, e.g. like this: https://d.pr/i/XmXq6b.png (screenshot) ... then be assured that, if your products have schema errors, you're much less likely to get them listed in such a way for free. In the screenshot I just gave, they are clearly labelled as sponsored (meaning that they were paid for). As such, I'm not sure how much of an issue this would be.

For product URLs which rank in Google's SERPs but do not render 'as' products: https://d.pr/i/aW0sfD.png (screenshot) ... I don't think that such results would be impacted as heavily. You'll see that even with the plain-text / link results, sometimes you get schema embedded, like those aggregate product review ratings. Obviously if the schema had errors, the richness of the SERP may be impacted (the little stars might disappear or something).

Personally I think that this is going to be a tough one that we're all going to have to come together and solve collectively. Google are basically saying: if a product has no individual review they can read, no aggregate star rating from a collection of reviews, and no offer (a product must have at least one of these three things), then to Google it doesn't count as a product any more. That's how it is now; there's no arguing or getting away from it (though personally I think it's pretty steep, and they may even back-track on this one at some point, due to it being relatively infeasible for most companies to adopt for all their thousands of products).

You could take the line of re-assigning all your products as services, but IMO that's a very bad idea. I think Google will cotton on to such 'clever' tricks pretty quickly and undo them all. A product is a product, a service is a service (everyone knows that). Plus, if your items are listed as services they're no longer products and may not be eligible for some types of SERP deployment as a result.

The real question for me is: why is Google doing this? I think it's because marketers and SEOs have known for a long time that any type of SERP injection (universal search results, e.g. video results, news results, product results injected into Google's 'normal' results) is more attractive to users, and because people 'just trust' Google, such results get a lot of clicks.

As such, PLA (Google Shopping) has been relatively saturated for some time now, and maybe Google feel that the quality of their product-based results has dropped in some way. It would make sense to pick 2-3 things that really define the contents of a trustworthy site which is being more transparent with its user base, and then to re-define 'what a product is' around those things.

In this way, Google will be able to reduce the number of PLA results, reduce the amount of 'noise' they are generating, and keep the extrusions (the nice product boxes in Google's SERPs) for the sites that they feel really deserve them. You might say: well, if this could result in their PLA revenue decreasing, why do it? Seems crazy. Not really, though, as Google make all their revenue from the ads that they show. If it becomes widely known that Google's product-related search results suck, people will move away from Google (in fact, they have often quoted Amazon as being their leading competitor, not another search engine directly). People don't want to search for website links any more. They want to search for 'things'.
Bits of info that pop out (like how you can use Google as a calculator or dictionary now, if you type your queries correctly). They want to search for products, items, things that are useful to them. IMO this is just another step towards that goal. Thank you for posting this question, as it's helped me get some of my own thoughts down on this matter.
Technical SEO Issues | | effectdigital1 -
Can images with a company logo get included on featured snippets?
This is also relevant to knowledge-graph boxes and the images which Google compiles into those. Not quite the same thing as featured snippets, but still pretty neat.

Typically it's a good one for industrial materials or chemical compounds, e.g.:

https://www.google.com/search?q=Poly%28methyl+2-methylpropenoate%29
https://d.pr/i/lQLl1v.png (screenshot)

or

https://www.google.com/search?q=polyethylene
https://d.pr/i/HwWXoc.png (screenshot)

... there are lots of 'material'-based knowledge graph entries, which pull images in from other sites in order to build up a good view of what the material is. Some people actually find the images from their sites which Google injects, and edit them to use a small and unobtrusive watermark (the trick is not to get too greedy, or Google notices and replaces the image from your site with an image from another site!).

Obviously where branded products and compounds / materials converge, it's easier to get some branding showing in the tiny little images: https://www.google.com/search?q=cbd+oil https://d.pr/i/jLn5uQ.png (screenshot)

A lot of these actually come through Google's image search results. You don't see so many successful injections in this particular area these days, though. This one is quite a neat example: https://d.pr/i/582BzT.png (screenshot) and this one also: https://d.pr/i/kaibeC.png (screenshot) ... a bit of free advertising for BirdsEye and Green Giant there.
Intermediate & Advanced SEO | | effectdigital0 -
Meta description
Hey there, Sam from Moz's Help Team here - thanks so much for reaching out and sorry about the trouble! Could you please pop an email about this over to help@moz.com, along with the name of the campaign and the URL for the page you're seeing this occur with, as well as the old and new meta descriptions so we can take a look at this for you? Thank you!
Link Explorer | | samantha.chapman1 -
Content change and variations in ranking
Well, is there way more competition for the keyword or keywords you are optimizing this old page for than for the keyword(s) the new page is optimized for? We would say it's common not to see changes when making changes to content. For every query, Google has several hundred thousand pages (at least) it could return in the SERPs. Sometimes content tweaks alone aren't enough to move the needle.
Intermediate & Advanced SEO | | Nozzle1