Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Sorry to say, but you're somewhat gaming the aggregated reviews schema. Google states that aggregated reviews are fine as long as they refer to a specific product, but you can't use schema markup to refer to a category. Well, you can, but Google simply won't show your markup, since that's not how aggregateRating is supposed to work; no one will get angry, but you'll get no reward for your effort. Detailed info about review snippets is here: https://developers.google.com/search/docs/data-types/reviews#review-snippet-guidelines. As an alternative, you may want to consider other options like aggregate price.

    | mememax
    0
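
    A minimal sketch of the product-level markup described above (JSON-LD; the product name and numbers are illustrative):

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89"
      }
    }
    </script>
    ```

    The key point is that the aggregateRating is attached to one specific Product, not to a category or listing page.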

  • Yes, I would link. Yes, it would change breadcrumbs.

    | DmitriiK
    0

  • Hey there,

    Personally, I don't look at footers as duplicate content, whether they are the same across multiple domains or multiple subdomains. More often than not, footers are pretty generic and contain social links as well as a few links to secondary pages on the site, such as 'terms and conditions'. Unless you're packing your footer with a large chunk of text, I wouldn't worry about duplicate content issues, as there aren't going to be any.

    Duplicate content on main pages across subdomains will ALWAYS be the bigger issue, and that should be concentrated on above site footers. I would have a word with your outsourced specialists and ensure they're concentrating on the biggest wins for the site rather than the nitty-gritty bits that would have little to no impact.

    All the best,
    Sean

    | seanginnaw
    0

  • Hi there,

    They are both competing for the same search terms. Most likely, one will never outrank the other, and you have double the work to do. Also, the duplicate content might be hurting both sites.

    My advice is to create a second directory on the older, global site, then use hreflang to tell Google the country targeting. That way you can use some of the leverage the older domain has. I'd do: sample.com (global) and sample.com/uk/ (for the UK).

    Here are some resources: International SEO - Moz.com; FAQ: Internationalisation - Google WMT Central; The International SEO Checklist - Moz Blog.

    Best of luck,
    GR.

    | GastonRiera
    0
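
    The hreflang setup suggested above might look like this in the <head> of both versions (sample.com is the placeholder domain from the answer; the language codes are illustrative):

    ```html
    <link rel="alternate" hreflang="x-default" href="https://sample.com/" />
    <link rel="alternate" hreflang="en-gb" href="https://sample.com/uk/" />
    ```

    Note that hreflang annotations need to be reciprocal: both the global page and the /uk/ page should carry the same set of tags.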

  • Hi,

    1. No, just find one or two pictures that would be lazy loaded on the page and check whether those requests are made by the tool. My guess is that they shouldn't be in the initial load, as they're hidden in the code most of the time.
    2. Hard to say; probably not.

    | Martijn_Scheijbeler
    0

  • Hi there. It's known that comment sections are not looked on favorably by Google, typically due to spamminess, so the attention paid to such sections is much lower than to other content on the page. Also, as far as I remember, Disqus is loaded through deferred JS into an iframe, meaning it wouldn't even be considered part of your website. Plugins like that are usually loaded with deferred JS, which means your loading times shouldn't be affected either. Therefore, whatever you do with it should have little to no impact on your SEO. Concentrate on your own content, my friend.

    | DmitriiK
    0

  • Although Google can now render JS, I would still be nervous about choosing a theme/CMS that uses lazy loading. According to John Mueller from Google: "Is Googlebot able to trigger lazy loading scripts - lazy loading images for below-the-fold content?" - "This is a tricky thing." On lazy-loaded images, John says to "test this with Fetch as Google in Webmaster Tools" and to "imagine those are things that Googlebot might miss out on." http://www.hobo-web.co.uk/john-muellers-seo-advice/

    | MickEdwards
    1
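
    One common hedge against crawlers missing lazy-loaded images (a sketch; the data-src attribute name varies by lazy-loading library) is a plain <noscript> fallback:

    ```html
    <!-- JS swaps data-src into src once the image scrolls into view -->
    <img class="lazy" data-src="/images/hero.jpg" alt="Hero image" />
    <noscript>
      <!-- Crawlers that don't run the script still see a normal image -->
      <img src="/images/hero.jpg" alt="Hero image" />
    </noscript>
    ```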

  • Donnleath,

    I wouldn't worry too much about this. In fact, Google has been rewriting title tags for about 3 or 4 years now. Google, for whatever reason, has decided to rewrite your title tag for you if a different one better matches the actual search query being used by the searcher. It's quite possible that your home page, for example, will have its title rewritten for one search query but not for another.

    Google will use, from time to time, your brand name, company name, and even elements of your site's navigation to rewrite your title tag. I've seen them take internal links from a site's navigation and breadcrumb trail and use those words in the title tag.

    In your case, what you need to decide is whether the title tags that Google is rewriting are better or worse for your searchers. If they're worse, then you might consider looking at your site's navigation and breadcrumb trail to see if there's something you can fix to influence Google to rewrite them another way. If you see that Google is rewriting ALL of those title tags, though, on all pages, then you might want to take a look at your site's title tags and see if they do need to be rewritten, taking what Google is suggesting into account.

    | becole
    0

  • Appreciate the response, and that was what I was leaning towards. Thanks for taking the time to answer this question!

    | WebPT
    0

  • If there are redirects on your site from http to https, I believe the best practice is to leave your old sitemap in place in the http property, as well as submitting the new https sitemap to that property. We learned this AFTER recently moving from http to https, with most of our URL structure changing as well. However, with 301s set up for as many site URLs as possible, we saw the new URLs gain indexation and rank rather quickly, even without the old http sitemap.

    | Dmilliman
    0
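
    For reference, a site-wide http-to-https 301 like the one described above might look like this in Apache (a sketch; it assumes mod_rewrite is enabled, and rules vary by server setup):

    ```apache
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
    ```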

  • Need to know which of the two is best. I also read somewhere that JSON is becoming the standard. Confused! Please help.

    | sachin.kaushik
    0

  • Can I re-open this line of questioning? We are interested in a similar strategy in order to help educate some customers who are falling victim to an unscrupulous operator in the market. We know we can get links to our site for their brand name, because there are a lot of scandals surrounding this operator, with government bodies and charities trying to raise awareness of what it means to get involved with this player. They are very good at marketing to people who feel they don't have many options under their circumstances; however, alternatives exist, and our goal is to raise awareness. Does anyone know the legality of using competitors' brand names on our site, provided the information is factual and transparent?

    | TrueluxGroup
    1

  • Bernadette said it well. It won't necessarily hurt you, but user experience is very important. It's helpful to see unique information about each page in the meta when a user searches for something; that way, they know to click on your page for their search. Also, it's a great best practice to give every page a unique meta description and title no matter what.

    | BlueCorona
    0

  • Also, here is the link to Moz's (Peter Meyers) title tag preview tool, so you can see how the different alternatives will look.

    | Linda-Vassily
    0

  • Thanks Ipegiro... My point exactly in saying "unless they are creating value within the url", as obviously "bunny-on-stool" has a completely different meaning than "bunny-stool". Cheers -Jake

    | HiveDigitalInc
    0

  • Hi David, Good question! When you are logged into Moz.com, if you click on your user avatar in the top, righthand part of the main site nav, you will see a link to your private messages. (I've called it out with a hot pink box in the attached screenshot.) E6PZR

    | Christy-Correll
    0

  • I noticed this with some of our competition. It doesn't seem to hurt them, given that they are ranked number 1 and 2.

    | Tyler-Brown
    2

  • Thanks Egol, that's great advice.

    | GrouchyKids
    2

  • There is nothing "bad practice" about allowing a non-existent page to 404. People oftentimes forget that a 404 isn't a signal that something is broken and needs fixing; it's just a status code that returns "Not Found". Sometimes it makes sense for things not to be found on your site, because they were never there in the first place. 404s eventually stop being crawled and indexed. You shouldn't just bulk-redirect things to your homepage, though. It's always best to have a 301 point to the most relevant page based on what the original page was. If there is no most relevant page, have you considered 301-ing them one step up in the site navigation (i.e. to a category page or hub page)?

    | MikeRoberts
    0
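
    Pointing removed pages one step up, as suggested above, could be sketched like this in Apache (the paths are hypothetical):

    ```apache
    # 301 discontinued product URLs to their category (hub) page
    RedirectMatch 301 ^/widgets/discontinued-.*$ /widgets/
    ```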

  • Thanks for writing in - I think I can help here! Any items listed as "notices" in the Site Crawl section aren't necessarily errors or issues that need to be addressed; they are just pieces of information that might be helpful or interesting for you to know. In this case, it looks like the tools are just letting you know that there's a rel="canonical" tag in place on the page. There's nothing wrong with that, and in many cases it's a good thing, as it tells search engines which page to prioritize and index if there appears to be duplicate content or more than one version of a page. This isn't an issue you need to take an action to resolve, it's just some information that might be useful to you. I hope this helps! Let me know if you have any other questions or if there's anything that needs clarifying!

    | JordanRailsback
    0
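
    For context, the rel="canonical" tag the notice refers to is just a single line in the page's <head> (URL illustrative):

    ```html
    <link rel="canonical" href="https://www.example.com/preferred-page/" />
    ```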