Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Incorrect Youtube metadescription in Google SERP
Thank you, Matthew. It's strange that Google would not allow more control over what is being listed under the meta-description in rankings. I appreciate your feedback.
| JaredBroussard0 -
Duplicate Content from long Site Title
Hi Mkyhnn

I think your problem stems from the fact that in WordPress and other CMS systems the site title is auto-generated and appended to the end of every page title on your website. On a contact page, for example, this creates a title like 'Contact - Planit NZ: New Zealand Tours, Bus Passes & Travel Planning', so most of your short-titled pages are being swamped by the main site title, causing duplication. If this is the case, then:

Home page: 'Planit NZ: New Zealand Tours, Bus Passes & Travel Planning' - this is fine.

All other pages: write unique titles that relate to the content on the page. Using Yoast or similar, you do not have to have the main site title appended to every page. You can write each title uniquely and just add 'Planit NZ' at the end to stay within the 60-70 characters that Google recommends.

This should fix your problem. Regards Nigel
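To illustrate the idea, here is a short sketch of appending a brand suffix only when the combined title stays within the recommended length. The helper name, brand string, and the 70-character cap are assumptions for illustration, not Yoast's actual template logic:

```python
# Hypothetical sketch: build unique page titles with a short brand suffix,
# staying within the ~60-70 characters Google typically displays.
BRAND = "Planit NZ"
MAX_LEN = 70

def build_title(page_title: str, brand: str = BRAND) -> str:
    """Append the brand only if the combined title still fits."""
    combined = f"{page_title} - {brand}"
    return combined if len(combined) <= MAX_LEN else page_title

print(build_title("Contact Us"))
# Contact Us - Planit NZ

# A title that is already long keeps its unique text instead of truncating:
print(build_title("New Zealand South Island Self-Drive Tours and Bus Pass Itineraries"))
```

The point of the sketch is simply that the unique, page-specific part comes first and the brand is a short suffix, rather than the full site title being appended everywhere.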
| Nigel_Carr0 -
WP URL issue - Concatenated URLs (LOTS of them)
Hi

I don't know anything about your website, but it looks to me like the links on the page do not have https://www. in front of them and are therefore being appended to the page they are on instead of linking to another page.

So look at the links on the page. If the page is https://www.atouchofrust.com/terms-of-use/ and you add a link to it like this: atouchofrust.com/vendor-news, then the result will be a concatenated string: https://www.atouchofrust.com/terms-of-use/atouchofrust.com/vendor-news, which will 404.

If you change the link on the page to **https://www.**atouchofrust.com/vendor-news, the link will resolve to the page it is meant to.

I think that is your problem. Go through all the internal links and add the https://www. Regards Nigel, Carousel Projects.
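For anyone who wants to see why browsers behave this way, the URL resolution rule can be reproduced with Python's standard `urllib.parse.urljoin` (the URLs below are the ones from the question):

```python
from urllib.parse import urljoin

page = "https://www.atouchofrust.com/terms-of-use/"

# A link written without a scheme is treated as a relative path and is
# appended to the current page's path, producing the broken URL:
broken = urljoin(page, "atouchofrust.com/vendor-news")
print(broken)
# https://www.atouchofrust.com/terms-of-use/atouchofrust.com/vendor-news

# With the scheme and host included, the link resolves as intended:
fixed = urljoin(page, "https://www.atouchofrust.com/vendor-news")
print(fixed)
# https://www.atouchofrust.com/vendor-news
```

A root-relative link (`/vendor-news`, with a leading slash) would also resolve correctly; the broken case is specifically a bare hostname-plus-path with no scheme or leading slash.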
| Nigel_Carr0 -
Forbes Contributor - Bio Link (Yes or No?)
Thanks everyone for your input, I'll keep the link on there. Your responses summed up my opinion, but I wanted to double-check with some other experienced folks.
| tnace0 -
I want to load my ecommerce site xml via CDN
Hello Micey123, That sounds good except you should put the sitemap reference for xyz.abcd.com within that subdomain's robots.txt file as well: xyz.abcd.com/robots.txt, as each subdomain should have its own robots.txt file.
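As a sketch, the subdomain's robots.txt might look like this (the paths are hypothetical; adjust to wherever the sitemap actually lives):

```text
# Served at https://xyz.abcd.com/robots.txt
User-agent: *
Allow: /

Sitemap: https://xyz.abcd.com/sitemap.xml
```

The key point is that the Sitemap line in each subdomain's robots.txt references that subdomain's own sitemap URL.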
| Everett0 -
Drop in keyword rankings with a multi-region website
Ah, all very helpful, thanks! Some interesting bits to pull apart here, I think:

Hreflang tagging is unlikely to improve your rankings for any given language/territory/page/keyword; rather, it's more likely to prevent the wrong content from showing up in the wrong territory. Make sense?

I'd try to manage expectations around "restoring" rankings. What if your performance dropped because of strong competitor activity, changing consumer behaviour, or other factors? Whilst your international setup is part of that picture, the reality is much more complex. I'd try to shift the conversation away from working out ways to / waiting for your rankings to "restore" via a magic-bullet fix, and start talking about the many strategies and tactics you might deploy to improve rankings gradually moving forwards.

I'd be _really_ nervous of outsourced link building. If you're handing money to a third party to get you links, you're only a small amount of semantics away from buying links outright. What are they doing, exactly? Are you producing exceptionally high-quality, useful information and resources, which they're helping to shine a spotlight on, or are you paying them a fee to magically acquire links? Of all the possible risks and causes of your problems, this feels like the area I'd want to scrutinise the most; in the meantime, start looking into ways in which your pages can earn links without having to pay a mysterious third party!
| JonoAlderson0 -
Handling Pages with query codes
Hi Richard

These are parameters that sit after the main URL and often include 'sort' and 'page'. (They can also be created on some eCommerce pages as 'products', but those should be dealt with via a mod_rewrite to show properly constructed URLs with category name and title.) There are a number of ways of dealing with them:

1. Google Search Console - you have to be very careful messing with the rules in parameter handling, but for some parameters this is the way.
Sort - 'sort' narrows the content on the page; you can tell Google this and then choose to let Googlebot decide or block the URLs. I often block them as they just create thin and duplicate content.
Pagination - 'page' - you can tell Google that this paginates and then let Google decide. Look at rel=prev/next tags on those pages as well.
Attributes - like size and colour - I generally block those as they just create thin duplicates of main categories.
Others - like 'catalog' - depending on what platform you use, there could be other parameters being created. I block most of them as they create useless URLs.

2. Robots.txt - you can use this file to keep these pages out of the index, depending on the parameter, by excluding them from being crawled by the search bots. Once again, be very careful, as you don't want to accidentally block useful areas of the site. Your bskt page should be dealt with like this, but read about the other options as well: https://moz.com/learn/seo/robotstxt

3. Canonicals - if you are able to, a great way of dealing with attributes like size and colour is to canonicalize back to the non-size-specific URL. This is a great way of maintaining the link juice for those URLs, which may otherwise be lost if you blocked them altogether. You add a rel=canonical tag pointing to the non-parameter version: https://moz.com/learn/seo/canonicalization

4. As a last resort you can 301 redirect them, but frankly, if you have dealt with them properly you shouldn't have to. It's also bad practice to have live 301 redirects in the internal structure of a website - best to link to the correct URL directly.

There is more reading here:
https://moz.com/community/q/which-is-the-best-way-to-handle-query-parameters
https://moz.com/community/q/do-parameters-in-a-url-make-a-difference-from-an-seo-point-of-view
https://moz.com/community/q/how-do-i-deindex-url-parameters

Regards Nigel
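As a rough sketch of the canonical logic in point 3 above - which parameter variants should canonicalize where - here is one way to express it in Python. The parameter names ('sort', 'colour', 'size', 'page') and URLs are illustrative assumptions, not from the original question:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative rule: pagination is kept, while filter/sort parameters are
# stripped so those variants canonicalize to the clean category URL.
KEEP_PARAMS = {"page"}

def canonical_url(url: str) -> str:
    """Return the canonical target for a parameterised category URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in KEEP_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://example.com/shoes?sort=price&colour=red&page=2"))
# https://example.com/shoes?page=2

print(canonical_url("https://example.com/shoes?size=9"))
# https://example.com/shoes
```

The computed URL would go into the page's rel=canonical tag, so size/colour/sort variants consolidate onto the non-parameter version while paginated pages keep their own canonical.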
| Nigel_Carr1 -
Content in Accordion doesn't rank as well as Content in Text box?
Google will not treat content that is concealed behind tabs, accordions, or any other element where JavaScript is used to reveal it in the same way as content that is visible by default. However, it will still be indexed, so pages may rank for search phrases related to content contained within the hidden sections. Why does Google devalue hidden content? Google's focus is on ensuring that the user experience within its search results is as good as possible. If the algorithm gave full weight to content hidden using JavaScript, this could be compromised. For example, say a user searches for a term that is matched on a page, but only in the hidden section. The user clicks the search result to go through to that page but can't immediately see the information they're looking for because it's hidden. They give up and return to the search results or head to another website. This, in Google's assessment, would not be a high-quality user experience, and the content within the hidden sections is therefore down-weighted. In summary: hiding content within tabs, accordions, or other elements that rely on JavaScript to reveal it to users is likely to be treated differently by Google and assigned far less importance. Websites should therefore take a considered approach and use this method only to hide content that is of secondary importance to the primary topic of the page, or that covers related topics.
| Roman-Delcarmen1 -
404 or rel="canonical" for empty search results?
Noindex sounds like a great idea. But should those empty search pages return HTTP status 404 or 200?
| haghadi1 -
Overdynamic Pages - How to Solve it?
Hi, In Example 1, each page should use self-referencing canonical tags in addition to pagination tags. So for example:

URL: http://urbania.pe/buscar/venta-de-propiedades?page=2
<link rel="canonical" href="http://urbania.pe/buscar/venta-de-propiedades?page=2" />

And so on.

In Example 2, for pages with pagination + filtering, canonical tags should point to the relevant page of results without filtering. So for example:

URL: http://urbania.pe/buscar/venta-de-propiedades?bathroomsNumber=2&services=gas&commonAreas=solarium
<link rel="canonical" href="http://urbania.pe/buscar/venta-de-propiedades" />

URL: http://urbania.pe/buscar/venta-de-propiedades?bathroomsNumber=2&services=gas&commonAreas=solarium&page=2
<link rel="canonical" href="http://urbania.pe/buscar/venta-de-propiedades?page=2" />

Hope this isn't too confusing. Thanks, Matt
| matt-elshaw0 -
Question re: spammy internal links on site
First, you need to remove those links. To delete them from search engines, go to Search Console > Google Index > Remove URLs.

The main reason for this is probably that your site has been hacked and is being used as a zombie website. I'm pretty sure the problem is the theme or some nulled plugin - this kind of problem is very common when someone gets a theme from an untrusted website. So make a backup of your site, then delete everything and add a fresh version of your theme and plugins.

Once your site is clean, follow these tips to secure your website: https://wordpress.org/plugins/all-in-one-wp-security-and-firewall/
More security tips: https://yoast.com/wordpress-security/
| Roman-Delcarmen0 -
Stuck trying to deindex pages from google
Ruchy,

Yes, it might have helped for a few weeks. But internal links from your site are not the only way your pages get crawled - remember that other sites may be linking to those pages too.

B - Absolutely, adding noindex will help. There is no way to know for sure how long it will take; give it a few weeks. It could also help to manually remove all those pages with Google Search Console, as Logan said.

Hope it helps! GR
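For reference, the noindex directive mentioned above goes in the head of each page that should be dropped from the index. A minimal sketch:

```html
<!-- Place in the <head> of each page to be removed from the index -->
<meta name="robots" content="noindex">
```

For non-HTML resources (PDFs, images), the same directive can be sent as an `X-Robots-Tag: noindex` HTTP response header instead. Note that the pages must remain crawlable for Googlebot to see the directive and drop them.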
| GastonRiera0 -
Subpage with own homepage and navigation good or bad?
Thank you for your answer Cesare,

I don't want Google to see and rate it as an independent website. It's just to make a clear separation between the two services. The services have different pricing, different How-it-works pages, etc. So when you are in the import zone and you click on Pricing, you will see a page with the pricing of our import service. I'm sorry, struggling a bit to explain it in English.

Looking forward to hearing from more people!
| RobertvanHeerde0 -
Do you Index your Image Repository?
I guess it makes sense to have the images indexed, especially if, in your context, images could appear as search results. If not, it won't hurt either. Google will be able to find your images through the links to them when you use them in your blog posts.

Add a keyword to the ALT attribute of your images; I would also take care with the file name (the same or another keyword?). I wouldn't use the same keyword in the title tag of the image again, in order not to overdo it. You could even add the images to your sitemap so Google would be able to find them for sure. But honestly, if images usually don't appear as search results, I wouldn't go that far.

Hope this helps. Cheers, Cesare
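If you do add images to a sitemap, a minimal sketch using Google's image sitemap extension might look like this (the URLs are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page the image appears on -->
    <loc>https://example.com/blog/some-post/</loc>
    <!-- Each image used on that page -->
    <image:image>
      <image:loc>https://example.com/images/keyword-filename.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Each `<url>` entry lists the page, and one `<image:image>` block per image on it; this just gives Google an explicit path to images it might not otherwise discover.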
| Cesare.Marchetti0 -
Any SEO benefits of adding a Glossary to our website?
Hi Roy, I thought about just explaining what each term means, but you seem to suggest I should dedicate more space to each term - am I right? My idea was to list all the terms that visitors might need explained and link them to the pages where they are used. I believe each term would be explained in a sentence or two. Is this not enough? Is there a better SEO solution? I know for sure there won't be any videos or guides for now; it will be just a simple page. The page would sit in our Help Centre, and we would have a Glossary link in the footer. Many thanks. Katarina
| Katarina-Borovska1 -
Hey guys, for some reason my homepage has gone down in rankings though other pages on my site have not.
It was due to the re-indexing in Google Search Console.
| HappyApple840 -
Implications of Disallowing A LOT of Pages
Perfect, that's my intent. Thanks so much for your help!! I really appreciate it.
| rachelmeyer1 -
'domain:example.com/' - is this line, with a '/' at the end of the domain, valid in a disavow file?
Hi Rob ! Thank you very much for taking the time to write a really detailed and clear answer, I really appreciate it. This was insightful and answered my question. Cheers.
| LabeliumUSA0