Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
How Much Does Domain Age Matter in Ranking?
The idea that the age of your domain impacts how well you rank in the SERPs is an SEO myth. John Mueller and Matt Cutts have said the following:
• Domain age is not a ranking factor.
• New domains may be dampened for a few months to help fight spam.
• Old domains may correlate with better rankings due to other factors.
Source(s): John Mueller (April 12, 2017), Matt Cutts (Oct 26, 2016), John Mueller (Jan. 15, 2016).
| Kelly-Anne0 -
Allow Embedding on a YouTube but Only for Specific Sites
The content owner role is something separate, where you register the content as unique and original. Theoretically, that way you would have the "right" to determine where the content can be posted or shared, you would have the right to remove the video if other people share it, or you could even leave it up and collect the ad revenue that the video generates. You can learn more about Content ID here. Daniel Rika - Dalerio Consulting https://dalerioconsulting.com/ info@dalerioconsulting.com
| Dalerio-Consulting1 -
.co.uk or .com for a UK geo location domain?
I would generally recommend buying the .co.uk if your business is targeting a UK customer base, whereas if you intend to trade overseas and don't want to be seen primarily as a UK company, I would suggest picking .com.
| Kelly-Anne0 -
Page with metatag noindex is STILL being indexed?!
Google might not be seeing the "noindex" tag because it has not crawled the page recently. Check the latest cache date and confirm that the noindex tag appears in the cached version. It is also very important to make sure you are not blocking the URL in robots.txt: if Google cannot get past the block in robots.txt, it will never see the noindex tag.
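That interplay is easy to check mechanically. Here is a rough Python sketch of the logic (the URLs, robots.txt rules, and HTML snippet are invented for illustration; a real audit would fetch and parse the live page):

```python
from urllib import robotparser


def can_google_see_noindex(url: str, robots_txt: str, page_html: str) -> bool:
    """Return True only if Googlebot can crawl the URL AND the page carries noindex."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    # If robots.txt blocks the URL, Google never fetches the page,
    # so it can never see the noindex tag on it.
    if not rp.can_fetch("Googlebot", url):
        return False
    # Crude substring check for illustration; use a real HTML parser in practice.
    return "noindex" in page_html.lower()


robots = "User-agent: *\nDisallow: /private/"
html = '<meta name="robots" content="noindex">'
print(can_google_see_noindex("https://example.com/private/page", robots, html))  # False: blocked, tag invisible
print(can_google_see_noindex("https://example.com/public/page", robots, html))   # True: crawlable, tag seen
```

The first case is exactly the trap described above: the noindex tag exists, but the robots.txt block means Google can never read it.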
| Kelly-Anne0 -
Any Tips for Reviving Old Websites?
If you are reviving an old website, make sure it is mobile-friendly. Then refresh the content and update the page titles and meta descriptions. Also make sure you add new content regularly.
| Kelly-Anne1 -
Link Juice
It will depend on the subdirectory structure. Generally, it is suggested that the post sit directly after the domain, and that the subdirectory be used for categories. I see it as a keyword opportunity: if you think the category keyword can help the post's optimisation and ranking, then use the category subdirectory freely. For example, if you are running a course site and you'll be posting a course on "Python", then "www.site.com/course/python/" would be a more optimised URL than "www.site.com/python/", as you are targeting people who want to find a Python course. However, I usually refrain from using the subdirectory structure due to the different targeting between posts. Daniel Rika - Dalerio Consulting https://dalerioconsulting.com/ info@dalerioconsulting.com
| Dalerio-Consulting1 -
Removed everything from my webpage still not de-ranked
Thank you, I will have a look at all that.
| seoanalytics0 -
During a major update, do rankings seem to be on pause?
Hi seoanalytics, Could you please reply with the articles or comments where that is said? Actually, the complete opposite happens: when an algorithm update rolls out, we see big movement in rankings. Google have said that they deploy several updates each week, but most of them are minor. Here is a recent official tweet: https://twitter.com/searchliaison/status/1194365017073700864 Hope it helps. Best of luck, Gaston
| GastonRiera0 -
Near Duplicate Title Tag Checker
I think the best solution for this might be Google's search operators (e.g., allintitle: keyword(s) site:domain.com) or Google Advanced Search (https://www.google.com/advanced_search).
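If you would rather check a whole title list in bulk than run operator searches one at a time, a rough Python sketch using the standard-library difflib can flag near-duplicate pairs (the URLs, titles, and 0.85 threshold below are invented for illustration):

```python
from difflib import SequenceMatcher
from itertools import combinations


def near_duplicate_titles(pages, threshold=0.85):
    """Return (url_a, url_b, similarity) for pages whose titles are suspiciously alike."""
    pairs = []
    for (url_a, title_a), (url_b, title_b) in combinations(pages, 2):
        # ratio() is 1.0 for identical strings, near 0.0 for unrelated ones.
        ratio = SequenceMatcher(None, title_a.lower(), title_b.lower()).ratio()
        if ratio >= threshold:
            pairs.append((url_a, url_b, round(ratio, 2)))
    return pairs


pages = [
    ("/red-widgets", "Red Widgets | Acme"),
    ("/red-widget", "Red Widget | Acme"),
    ("/about", "About Us | Acme"),
]
print(near_duplicate_titles(pages))  # flags the first two URLs as near-duplicates
```

You could feed this the Page Title export from any crawler; tune the threshold to taste.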
| AndyRSB0 -
Idle Connection Timeout for Server Load Balancer
Is the timeout a response code you are getting when querying your own website in some way? Usually it means you are crawling a site too fast and it is refusing to respond (or it cannot respond in time because it has too many requests).
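If the timeouts do come from crawling too fast, slowing down with exponential backoff between retries usually helps. A minimal sketch (the base delay and cap are illustrative, not recommendations):

```python
import random


def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Exponential backoff with jitter: roughly 1s, 2s, 4s, ... capped at 60s."""
    delay = min(cap, base * (2 ** attempt))
    # Random jitter avoids many clients retrying in lockstep against the balancer.
    return delay * random.uniform(0.5, 1.0)


for attempt in range(5):
    print(f"attempt {attempt}: wait up to {min(60.0, 2 ** attempt):g}s before retrying")
```

Sleeping for `backoff_delay(attempt)` after each timeout keeps the crawler under whatever idle/response limits the load balancer enforces.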
| effectdigital0 -
Event Schema for Multiple Occurrences
That does sound tricky. Maybe you could consider them to be sub-events: https://schema.org/Event https://schema.org/superEvent https://schema.org/subEvents ...but I am unsure as to whether subEvents can have specified dates (start / end).

I might look more to something like EventSeries: https://schema.org/EventSeries "An EventSeries is a collection of events that share some unifying characteristic. For example, "The Olympic Games" is a series, which is repeated regularly. The "2012 London Olympics" can be presented both as an Event in the series "Olympic Games", and as an EventSeries that included a number of sporting competitions as Events. The nature of the association between the events in an EventSeries can vary, but typical examples could include a thematic event series (e.g. topical meetups or classes), or a series of regular events that share a location, attendee group and/or organizers."

This would seem to be a better schema to use in your situation. This is the JSON-LD example of implementation from Schema.org: https://d.pr/f/WDYKni.txt (TXT file). It looks like it could be re-engineered to do what you want.

Whilst Google don't explicitly state that they support EventSeries yet, IMO their documentation cycle for what they do support is wildly out of whack. I have seen front-end instances of them experimenting with loads of schema that isn't in their official documentation, so I wouldn't be overly bothered by that. At the end of the day, the home of schema is Schema.org. I often push for schema which Google don't explicitly state that they cover, and I'm often pleasantly surprised. It doesn't always yield fancy rich snippets, but it does help Google gain contextual awareness and rank pages more appropriately.
In fact, you can read about that here: https://www.searchenginejournal.com/google-follow-our-structured-data-requirements-to-ensure-rich-result-eligibility/329679/ "Independently, you’re always welcome to use structured data to provide better machine readable context for your pages. Which may not always result in visible changes, but can still help our systems to show your pages for relevant queries."
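To make the shape of that markup concrete: each sub-Event inside an EventSeries can carry its own startDate/endDate, which addresses the multiple-occurrences question. A rough sketch, built here as a Python dict purely for readability (the event names, venue, and dates are all made up):

```python
import json

# Hypothetical EventSeries with per-occurrence dates on each sub-Event.
event_series = {
    "@context": "https://schema.org",
    "@type": "EventSeries",
    "name": "Weekly Python Meetup",
    "location": {"@type": "Place", "name": "Town Hall"},
    "subEvent": [
        {
            "@type": "Event",
            "name": "Weekly Python Meetup - March 7",
            "startDate": "2020-03-07T18:00",
            "endDate": "2020-03-07T20:00",
        },
        {
            "@type": "Event",
            "name": "Weekly Python Meetup - March 14",
            "startDate": "2020-03-14T18:00",
            "endDate": "2020-03-14T20:00",
        },
    ],
}

# Serialise to the JSON-LD you would embed in a <script type="application/ld+json"> tag.
print(json.dumps(event_series, indent=2))
```

Treat this as a starting point and validate the output against Schema.org's own examples before shipping it.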
| effectdigital0 -
I am temporarily moving a site to a new domain. Which redirect is best?
If it's 2-3 months, the 302s might decay into being treated as permanent. Personally, I'd probably 301 to the temporary domain and then 301 back again when the move is reversed (especially if it's nearer the full 12 weeks / 3 months). Wait to see what others say, though.
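If you do take the 301-then-301-back route, the Apache rules are one line each way. A minimal .htaccess sketch (the domain names are placeholders):

```apache
# On the original domain, while the site lives at the temporary one:
Redirect 301 / https://temporary-domain.example/

# Later, on the temporary domain, when moving back:
Redirect 301 / https://original-domain.example/
```

Remember to remove the first rule before adding the second, or the two domains will redirect in a loop.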
| effectdigital0 -
New website issues: duplicate URLs and many title tags. Is it fine for SEO?
For the first issue, you don't have multiple Page Titles. Only the text inside the <title> tag is your Page Title. On the other lines you have an OG title (which is for Open Graph / social and messenger sharing). You also have a meta title, which is just another form of title. It should be fine to have these three essentially the same, though you'd probably want the ability to customise each of them if desired (sometimes the same link text that compels people to click on Google isn't so effective on Facebook or WhatsApp, so at the least you'd want to be able to specify a custom OG title vs your regular <title> tag).

The second issue could be a real problem, especially if there are internal links which point to paginated URLs which don't exist and which also don't 404. If you can only create these infinite pagination URLs via manual browsing, it shouldn't be too big of a deal. If the internal link structure 'creates' them, that's another kettle of fish (and I'd take immediate action).
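To make the three-titles distinction concrete, here is a small Python sketch pulling each variant out of an example head section (the tag values are invented, and the regexes are deliberately crude; use a real HTML parser for production work):

```python
import re

head = """
<title>Red Widgets - Buy Online | Acme</title>
<meta property="og:title" content="The Red Widgets Everyone Is Sharing">
<meta name="title" content="Red Widgets - Buy Online | Acme">
"""

# Crude regex extraction purely for illustration.
page_title = re.search(r"<title>(.*?)</title>", head).group(1)
og_title = re.search(r'property="og:title" content="(.*?)"', head).group(1)
meta_title = re.search(r'name="title" content="(.*?)"', head).group(1)

print(page_title)  # what search engines show as the Page Title
print(og_title)    # what Facebook/WhatsApp show when the URL is shared
print(meta_title)  # a duplicate of the title; harmless but optional
```

Here the OG title is customised for social sharing while the other two match, which is exactly the setup described above.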
| effectdigital0 -
Duplicate content in Shopify - subsequent pages in collections
The advice is no longer current. If you want to see what Google used to say about rel=next/prev, you can read it at this archived URL: https://web.archive.org/web/20190217083902/https://support.google.com/webmasters/answer/1663744?hl=en

As you say, Google are no longer using rel=prev/next as an indexation signal. Don't take that to mean Google are now suddenly blind to paginated content. It probably just means that their base crawler is now advanced enough not to require in-code prompting.

I still don't think that de-indexing all your paginated content with canonical tags is a good idea. What if, for some reason, the paginated version of a parent URL is more useful to end users? Should you stop Google from ranking that content appropriately by using canonical tags? (Remember: a page whose canonical tag points to a different URL declares itself non-canonical, making it unlikely to be indexed.) Google may not find the parent URL as useful as the paginated variant they might otherwise rank, so using canonical tags in this way could reduce your number of rankings or ranking URLs. The effect is likely to be very slight, but personally I would not recommend de-indexation of paginated content via canonical tags (unless you are using some really weird architecture that you don't believe Google would recognise as pagination). The parameter-based syntax of "?p=" or "&p=" is widely adopted; Google should be smart enough to think around it.

If Search Console starts warning you of content duplication, maybe consider canonical deployment. Until then, it's not really worth it.
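The self-referencing approach argued for above can be expressed as a simple rule: page 1 canonicalises to the clean collection URL, deeper pages keep themselves as canonical. A Python sketch (the URLs are placeholders; Shopify's actual Liquid templating differs):

```python
from urllib.parse import urlsplit, parse_qs


def recommended_canonical(url: str) -> str:
    """Paginated URLs self-canonicalise; only ?p=1 collapses to the parent URL."""
    parts = urlsplit(url)
    page = parse_qs(parts.query).get("p", ["1"])[0]
    if page == "1":
        # Page 1 is a duplicate of the parent, so canonicalise to the clean URL.
        return f"{parts.scheme}://{parts.netloc}{parts.path}"
    # Deeper pages keep themselves as canonical so they remain indexable.
    return url


print(recommended_canonical("https://shop.example/collections/shoes?p=1"))
print(recommended_canonical("https://shop.example/collections/shoes?p=3"))
```

The first call collapses to the parent collection URL; the second returns the paginated URL unchanged, leaving page 3 free to rank on its own.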
| effectdigital0 -
URL ranks in US pretty well except in target city area. Please advise.
Hi Davit, I see! Yes, I do think you should start a new thread specific to technical issues, as they are different than the original scope of this thread. You could get some fresh eyes on it!
| MiriamEllis1 -
My direct traffic went up and my organic traffic went down. Help!
Hey Arnold Ambiel, I came across this thread because I was having a problem like yours. My organic and direct traffic were reversed. I saw that you had this problem in 2016, and mine occurred in June. Did you ever find a solution to it?
| douglasfaria101 -
Confusing mixture of cross-domain and multi-language - HREFLANG
Have you had any luck figuring this thing out? I have a similar scenario and I can't find any answers to that.
| moz-maddesigngroup0