Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hey Ben, it's been quite a while; I'd like to know if this has been resolved and, if so, how. I have a client who has millions of images, and we have created sitemaps, but Google has not indexed any URLs from them. I'm guessing it might be an information-architecture/link-equity issue, but any more information on the matter is gladly welcomed.

    | alphonseha
    0

  • I added it to the robots.txt about two weeks ago. Thanks for the tip!

    | Storesco
    0

  • My sitemap file has them, but only because Screaming Frog and WordPress include that information when they generate the sitemap. Reading the Google forum, there is a reply from John Mueller from 2014 in which he specifically says change frequency and priority do not affect the ranking of URLs once they are indexed: https://productforums.google.com/d/msg/webmasters/w9o8bTgqb9c/7hS7hog3ZIYJ He also states that the lastmod date is good for Google to know. We all know you can take things said by Google with a grain of salt, but this is an instance where it is specifically stated, so I would tend to believe it to be true.
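    For example, here's a minimal sketch (the URL and date are placeholders) of building a sitemap entry that keeps lastmod and simply omits changefreq and priority:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

def sitemap_entry(loc, lastmod):
    """Build a <url> entry with only loc and lastmod.

    changefreq and priority are left out: per John Mueller's comments,
    they don't affect ranking once a URL is indexed, while lastmod is
    still useful for Google to know.
    """
    url = ET.Element(f"{{{NS}}}url")
    ET.SubElement(url, f"{{{NS}}}loc").text = loc
    ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod
    return url

urlset = ET.Element(f"{{{NS}}}urlset")
urlset.append(sitemap_entry("https://example.com/page", "2017-03-01"))
xml = ET.tostring(urlset, encoding="unicode")
```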

    | Shawn_Huber
    0

  • I use JSON-LD and have had no issues since implementing it. It's easier, and it's Google's recommended format for structured data.
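    For anyone curious, a minimal JSON-LD sketch (the business details are made up) of the kind of script tag you'd paste into the page head:

```python
import json

# Hypothetical values for illustration only; swap in your own entity data.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Store",
    "url": "https://example.com/",
    "telephone": "+1-555-0100",
}

# JSON-LD lives in a single script tag, decoupled from the visible HTML,
# which is what makes it easier to maintain than inline microdata.
script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(markup, indent=2)
    + "</script>"
)
```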

    | MarketingChimp10
    0

  • Thanks, valuable advice! I will put it to good use.

    | Bob_van_Biezen
    0

  • The OP would like to keep the URLs and queries in confidence, but here is what I found upon investigation, for those who might have the same issue in the future: The main issue looks to be a recent URL change from [second most linked-to URL on the domain] to the URL in question. There is a 301 redirect in place (as you mentioned), and the new URL is indexed, but it might be some time before it takes the place of the old URL. For the time being, Google for some reason thinks the other URL is the most relevant page, which is odd, as there are more topical pages on the site, but I'm not a robot. The good thing is that I feel a change coming: if you do a site: search for your domain with "term in question", Google sees the right URL as the most relevant page after the homepage. The homepage being first makes sense, since it has the term on the page and is the strongest page on your site. A few things you might try in the meantime:

    1. Resubmit a sitemap of your old URLs that have 301 redirects.
    2. Resubmit a sitemap of the new URLs.
    3. Get the three external links to the old URL mentioned above changed to the new URL (three external links according to Moz Open Site Explorer).
    4. Your title tag and H1s are a little overstuffed with keywords right now. Dial that back, and keep only one visible H1 on the page; the H1 should be what people currently see as the title on the page, rather than hidden in an extra header and in the body copy.

    Try those items and give it a little more time.

    | katemorris
    0

  • Hey Chris, it looks like you got some responses to this over on the Google Help thread. This is an interesting question. My understanding is that you should include all relevant XHTML, image, and video elements within their corresponding <url> tag in the XML sitemap itself. This is discussed a bit over at Allotment Digital, and they even include an example XML file that shows the best practice. Let me know if that's helpful or if you get any further clarification over on the Google thread.
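    To illustrate, here's a rough sketch (all URLs are placeholders) of nesting an image element inside its corresponding <url> entry, using Google's image sitemap namespace:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"
ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("image", IMAGE_NS)

urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")

# Each page gets one <url> entry; its images nest inside that same entry
# as <image:image> children, rather than in separate <url> entries.
url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = "https://example.com/page"
image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = (
    "https://example.com/img/photo.jpg"
)

xml = ET.tostring(urlset, encoding="unicode")
```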

    | mediawyse
    0

  • I submitted a request through The Guild to find out why my original answer above was not delivered to you. Sorry about that. As to your question about the average Moz Domain Authority: DA is made up of "MozRank", "MozTrust", and your link profile. Of those three, the one we can most easily quantify is "MozRank". Moz actually says the average MozRank (on a 1-10 scale) for a site page is 3. Taking just that information, it's easier to see why the average Domain Authority would be on the lower end (30-40) and not 50 or higher. It also explains why it is easier to move from 30 to 40 than it is to move from 70 to 80. Good luck with the above; let me know if I can provide any other help.

    | mediawyse
    0

  • It's possible that Google might give Data Highlighter information more credence when generating a rich snippet, but the success rate for that is far from 100%, as Andy and Bas have mentioned. It's just as likely that Google will be confused about which data is correct and either not generate a rich snippet at all or generate one based on the schema markup. I haven't had much success adding markup via Data Highlighter or via Google Tag Manager; hard-coding it always seems to work better. If your schema markup is incorrect, I would not recommend using Data Highlighter to try to fix it without changing the code - your best bet is just to change the code, especially since Data Highlighter won't help you out on Bing or Yahoo.

    | RuthBurrReedy
    0

  • Hi Tormar, You tend to find that almost every site has an XML sitemap tied in with Search Console, but an HTML sitemap doesn't really do much for Google; it's more of a navigation aid to help people find pages on large sites. You certainly won't get a penalty or a benefit whether or not you have one. Add one if you feel it will be beneficial, but don't if not. -Andy

    | Andy.Drinkwater
    0

  • Indeed: thanks for the update. AngularJS is a great framework, but it's relatively new and so different from the techniques we've gotten used to working with over the past couple of years that this makes for an interesting case. Good luck with the project, and cool to see more people using AngularJS!

    | BasKierkels
    0

  • Hi Robert, They don't hurt, so leave them in place for some time. It might be quite a while before the index is completely as you want it to be, so just let the redirects do their thing in case Google, Bing, etc. keep indexing the old URLs. Hope that helps. Bas

    | BasKierkels
    0

  • Just checked my GA data and you're right. Referral data from mountainjade.co.nz is there. Thanks for the heads up. I've decided to make the switch to https, so will be organising that with dev in the coming few weeks. I'll keep you posted! Cheers for the help again Logan, I owe ya.

    | Jacobsheehan
    0

  • Multilingual sites are sites that target languages independently of where they are used. Multi-country sites are sites that target specific country-and-language markets. So, in your case, if you want to target all Spanish-speaking users, you should go multilingual, and the hreflang annotation should use the language-only value "es". Moreover, it would be better to change the subfolder URL to http://example.com/es/, because maintaining the es-ar subfolder will make people think that its content specifically targets Argentinian users.
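    For illustration, a rough sketch (example.com is a placeholder) of generating language-only hreflang link tags:

```python
# Hypothetical URLs. hreflang="es" targets all Spanish speakers
# regardless of country, while "es-ar" would target Argentina only.
alternates = {
    "en": "http://example.com/",
    "es": "http://example.com/es/",
}

def hreflang_tags(alternates):
    """Render a <link rel="alternate"> tag for each language version."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{href}" />'
        for lang, href in alternates.items()
    )

tags = hreflang_tags(alternates)
```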

    | gfiorelli1
    1

  • Hi Matt! Did Martijn's response help? We'd love an update.

    | MattRoney
    0

  • I agree with Stephan: you actually bought the businesses, so you can decide to link to your main site. It is of course a good thing to look out for penalties, but I don't see what you would be doing wrong by displaying a link from the purchased companies back to the parent company. As you can read, Andy wrote that the chance is probably very slim in this case. Bas

    | BasKierkels
    2

  • I believe that Google ranks your pages on the basis of what they deserve with minor modifications.  All of my sites have double listings where they are strong.  Some have triple or quadruple listings where they are outstanding on multiple facets of a topic. There are no "tricks" to getting these.  You need content that answers the query, there can be multiple ways of answering the query, and your content needs to blow the competition out of the water.

    | EGOL
    0

  • Hi CJ, I can't imagine a situation where I'd want to do this, but I'm sure you've done the analysis to determine that it's better for your UX. Just make sure you consider this: I'm not sure what you're using to make the nav dynamic, but you should make sure Google sees it the way you intend. You can use Fetch and Render in Search Console. You can also use the first part of this infographic to set up a browser to view your site in a live environment the way search engines see it. You'll want to make sure that once you've disabled CSS and JS, your navigation is still visible. A lot of authority/juice/whatever you want to call it is passed through your nav. Having your nav visible this way also ensures quicker crawling and indexing of new content on your site.
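    One way to sanity-check this offline: parse the raw HTML source, before any CSS or JS runs, and confirm the nav links are there. The snippet below uses a made-up page as a stand-in for what a crawler would fetch:

```python
from html.parser import HTMLParser

# Stand-in for the raw HTML a crawler fetches, before any JS executes.
RAW_HTML = """
<nav>
  <a href="/products">Products</a>
  <a href="/blog">Blog</a>
</nav>
<div id="app"><!-- JS-rendered content would appear here --></div>
"""

class NavLinkParser(HTMLParser):
    """Collect hrefs of <a> tags that appear inside a <nav> element."""

    def __init__(self):
        super().__init__()
        self.in_nav = False
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "nav":
            self.in_nav = True
        elif tag == "a" and self.in_nav:
            self.links.extend(v for k, v in attrs if k == "href")

    def handle_endtag(self, tag):
        if tag == "nav":
            self.in_nav = False

parser = NavLinkParser()
parser.feed(RAW_HTML)
# If parser.links is empty, the nav only exists after JS runs and
# may not be seen (or crawled as quickly) by search engines.
```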

    | LoganRay
    0

  • Converting all of those articles will take time. I would design the site architecture and template and then immediately publish each article as soon as it is ready.  This will get the articles flowing out into the search engines and get the money flowing in.

    | EGOL
    0

  • Hi Karl, Google doesn't mind a bit of content that is the same from page to page - they have said as much. If the page requires that a bit of the content is duplicated, as long as it serves a purpose, you will be absolutely fine. Look at the pages and see if what is there would be considered beneficial to the page. Issues with duplication tend to arise when you are copying huge swathes of text from one page to another, plagiarising others' work, or even spinning. From what you have said, it sounds like you will be absolutely fine, but feel free to post an example here or PM it to me if you wish and I will happily take a look. -Andy

    | Andy.Drinkwater
    0