Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Hi Ian, the details URL should ideally have keywords in it; getting the property name into the details-page URL would be a great help, e.g.: https://realla.co/to-rent/details/Office-to-let-John-Eccles-House-Robert-Robinson-Avenue-Oxford-Science-Park-Oxford-OX4-4GP As for the categories (locations in your case), you are submitting too many of them; your URL structure needs to be restructured, so there is work to be done there, and the sitemap updated accordingly. For example, https://realla.co/to-rent/commercial-property/ can be changed to https://realla.co/commercial-property-to-rent/ I hope this helps; let me know if you have further queries. Regards, Vijay

    | Vijay-Gaur
    0
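  A sketch of how the URL restructure suggested above could be handled without losing existing rankings, assuming an Apache server (the old and new paths are taken from the post; serving from `.htaccess` is an assumption):

    ```apache
    # Permanently redirect the old category path to the new, keyword-first structure.
    # Uses mod_alias's Redirect directive, available on standard Apache installs.
    Redirect 301 /to-rent/commercial-property/ https://realla.co/commercial-property-to-rent/
    ```

  A 301 tells Google the move is permanent, so the old URL's equity is consolidated onto the new one; the sitemap should then list only the new URLs.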

  • Hello there, as Joseph says, a lot of people think that Google doesn't penalize them for spammed links. That is NOT true. I have been testing this, and even though Google knows a site is a spam site, it will still penalize you until the current domain / server IP is blacklisted by Google. There is really only one way to get spammers blacklisted: disavow the links coming from them. The more site owners who disavow the same domains / IPs, the faster Google will blacklist them. If you aren't sure how to disavow, be careful not to disavow URLs from sites that Google regards as good, because that can also cause your rankings to suffer a penalty. A good tip is to check the page authority and domain age: if it is an old domain with a lot of authority, you should first ask the site owner to remove the links. If they don't remove the link after that, you can disavow, along with a message to Google describing your dialogue with the website owner who refused to remove the link to your domain.

    | Jesper-Bak-SEO-specialist
    0

  • Hreflang will help guide Google to understand the association between the pages. Rather than adding another sitemap for the English pages, I recommend updating your current sitemap to include hreflang tags for the English content. Guide here: https://support.google.com/webmasters/answer/2620865?hl=en

    | katemorris
    0
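  As a minimal sketch of what the Google guide linked above describes, a sitemap entry carrying hreflang annotations looks like this (the URLs and language pair are hypothetical):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://example.com/en/page/</loc>
        <!-- Each URL lists every language variant, including itself -->
        <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/page/"/>
        <xhtml:link rel="alternate" hreflang="nl" href="https://example.com/nl/page/"/>
      </url>
      <url>
        <loc>https://example.com/nl/page/</loc>
        <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/page/"/>
        <xhtml:link rel="alternate" hreflang="nl" href="https://example.com/nl/page/"/>
      </url>
    </urlset>
    ```

  Note that the annotations must be reciprocal: every language version repeats the full set of alternates.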

  • Webmaster Tools / Search Console: https://www.google.com/webmasters/tools/home?hl=en https://search.google.com/structured-data/testing-tool/u/0/ You can use this as a template, or use this great tool: https://jsonld.com/json-ld-generator/ (remember that with JSON-LD, the marked-up content must also be present in the HTML if you add it via JSON-LD).

    | BlueprintMarketing
    0
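  For reference, a minimal JSON-LD block of the kind those generators produce (the organization details are placeholders); it is placed in the page's HTML, and the content it describes should also be visible on the page:

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Co",
      "url": "https://www.example.com",
      "logo": "https://www.example.com/logo.png"
    }
    </script>
    ```

  Pasting the result into the Structured Data Testing Tool linked above confirms the markup parses before it goes live.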

  • Lots going on here, so, a laundry list of follow-up questions and thoughts for you... Are you seeing AMP results showing up in Search Console? Are you seeing them indexed as intended? If you're doing Native AMP, you won't be able to diagnose pages by /amp URL formatting. It might be worth firing off an event, or a custom dimension in GA, for AMP = Yes / No or something like that. For the sitewide view, have you tested loading pages in a private browser and an incognito mobile browser and checking whether they show up in GA real-time in each of the 3 views when they're supposed to? It looks like you might be using Cloudflare; I haven't dealt with an AMP site that uses it, but have you checked whether there are compatibility issues or anything you need to activate? Are any Google Tag Manager tags set to fire on HTTPS only? Are any GA filters in place that specify HTTP/HTTPS and need to be broadened? Your AMP Analytics code seems to match the one on a site that is functioning as intended, so I don't think it's a formatting issue. For the GA view filter: it seems like you should be able to simply include/exclude traffic to shop.winefolly.com; why the added complexity beyond that?

    | KaneJamison
    0

  • Hi, has the issue been fixed? We are facing the same problem, so any assistance on it would be appreciated. Thanks

    | Devtechexpert
    0

  • I've got the answer myself but didn't know how to delete my question - sorry https://developers.google.com/search/docs/data-types/article#non-amp

    | AL123al
    0

  • The reason I ask about the seo folder is because "seo" isn't usually something that's separate from the rest of your site, in folders or otherwise. Rather, it's something that is integrated into the site's words, pages, and programming so as to not even be identifiable as SEO; it's just proper best practices. When a site's services pages are optimized, for example, they end up being just plain ol' services pages; it's just that keywords have been properly researched and utilized, internal link anchor text has been researched and properly implemented, page titles and metadata have been thought out, etc. The same goes for the homepage and all the site's other pages. You may already know all this and I'm being overly simplistic here. If so, my apologies, but I wasn't quite sure from your question what you were looking for.

    | Chris.Menke
    0

  • Not yet. Let me try this. Thanks for the recommendation.

    | dhananjay.kumar1
    0

  • Hi, not sure why an & would throw an error. Maybe try encoding it as &amp; It's also worth checking that the rest of the syntax is correct and that all tags are opened and closed properly, in case it is not the & character that is actually causing the problem.

    | LynnPatchett
    0
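  As a sketch of the fix described above: in XML (sitemaps included), a bare & must be escaped as &amp; Assuming the file is generated programmatically, Python's standard library can do the escaping (the URL below is a made-up example):

    ```python
    from xml.sax.saxutils import escape

    # A URL containing a bare ampersand, which is invalid inside an XML <loc> element.
    raw_url = "https://example.com/search?q=shoes&page=2"

    # escape() replaces &, <, and > with their XML entities.
    safe_url = escape(raw_url)
    print(safe_url)  # https://example.com/search?q=shoes&amp;page=2
    ```

  Escaping at generation time avoids hand-patching entities after a validator complains.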

  • I have CloudFlare on about the same number of domains. You can put a separate SSL cert with a 15-year lease on each one if you have access to Apache; you just have to link it in your sites-available configuration file for each site. Sharing one SSL certificate among a bunch of sites will not hurt your SEO; it's just not as secure, but realistically it doesn't matter, lol. One will suffice.

    | TucsonAZWebDesign
    0
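  A sketch of the sites-available wiring mentioned above, assuming Apache with mod_ssl (the domain and file paths are placeholders):

    ```apache
    <VirtualHost *:443>
        ServerName example.com
        DocumentRoot /var/www/example.com

        SSLEngine on
        # Each site points at its own certificate and private key
        SSLCertificateFile    /etc/ssl/certs/example.com.crt
        SSLCertificateKeyFile /etc/ssl/private/example.com.key
    </VirtualHost>
    ```

  One such block per site in sites-available, enabled with `a2ensite`, gives each domain its own certificate.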

  • Hello! For product URLs I would go with the "flat" structure and just do shop.com/product-name (Option 2)

    | evolvingSEO
    0

  • I certainly wouldn't use 2 sets of images. I typically use a framework like Bootstrap or Materialize and would just use the fluid image class. However, when I'm working on a client site with some super annoying theme where I don't want to add unnecessary plugins, or when the framework would cause complications, I'll just use responsive CSS rules with max- and min-width and specify different optimized images at certain breakpoints: https://www.w3schools.com/css/css_rwd_mediaqueries.asp

    | TucsonAZWebDesign
    0
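  As a sketch of the breakpoint approach from the link above, with hypothetical image files; the browser applies only the rule whose media query matches, so each viewport gets one appropriately sized background image:

    ```css
    /* Default: small, optimized image for narrow screens */
    .hero {
      background-image: url("hero-small.jpg");
      background-size: cover;
    }

    /* Swap in a larger image once the viewport is at least 768px wide */
    @media (min-width: 768px) {
      .hero {
        background-image: url("hero-large.jpg");
      }
    }
    ```

  The 768px breakpoint is an assumption; pick breakpoints where the design actually changes rather than per device.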

  • Apparently it won't be considered a link, according to John Mueller: https://twitter.com/Errioxa/status/1005024345650094080?s=19 Also, that was from a few months ago, so things might have changed.

    | Saijo.George
    0

  • Hi Nigel, The website https://www.residentiebosrand.be/ is supposed to be ranking on 'residentie bosrand'. I know authority plays a big part in ranking on keywords. But other sites which are not as relevant as this one rank higher. Do we just have to wait for the website to rank on this keyword?

    | conversal
    0

  • Hi again Kate, you can "fetch as Google" that url on Webmaster Tools https://www.google.com/webmasters/tools/googlebot-fetch and then "request indexing". Here you have more information about it: https://support.google.com/webmasters/answer/6065812 Greetings!

    | paupastorlopez
    0

  • Hi Gianluca! Thanks for the answer. I see that these images and attachments keep showing in Google. Is this hurting our ranking score? Will we have to redirect ALL of them to the home page? Or is it ok for us to wait and let Google recrawl the website and remove them from the index?

    | conversal
    0

  • It definitely passes link juice. One of my anti-extortion activism groups was recently trying to stop an individual from monetizing a domain that publicly shames people. What the black-hat hacker had done was identify dead links in a NY Times article, wanting the DoFollow links from such an authority. The individual purchased the domains and redirected them to his site; luckily we caught it on a backlink check, and one of the group members was a journalist who was able to get them to remove the links.

    | TucsonAZWebDesign
    0

  • Hi AshShep1, I came across this same issue while using the new Link Explorer. I think it's mainly because the new Link Explorer's index is larger than the previous OSE and thus showing us more links that were not indexed previously by OSE. Regarding your question, if you feel that the links are undesirable, you can always use the Disavow tool provided by Google in the Google Search Console to disavow those links. https://www.google.com/webmasters/tools/disavow-links-main Moz has a pretty good article on this tool here: https://moz.com/blog/guide-to-googles-disavow-tool Hope this helps!

    | NgEF
    0
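  For reference, the file Google's disavow tool accepts is plain text with one entry per line: `domain:` disavows an entire site, a bare URL disavows a single page, and lines starting with `#` are comments. A sketch with placeholder domains:

    ```text
    # Spammy directory whose owner refused a removal request
    domain:spammy-directory.example

    # A single unwanted page rather than the whole site
    https://another-site.example/bad-links-page.html
    ```

  As the Moz guide linked above stresses, disavow sparingly; removing links to legitimate sites can hurt rather than help.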