Best posts made by iugac
-
RE: Can Google crawl dynamically generated links?
Well, it can crawl anything found on a web page. If you mean a page whose links are generated dynamically in the sense that you build them before serving the page (in PHP, for example), then yes: if Googlebot reaches that page at all (it is not blocked, etc.), your links will be crawled as well.
-
RE: Content change within the same URL/Page (UX vs SEO)
Hello. You really need separate pages if you want to rank with all of them. Think of the title, for example: how do you want to index a specific region if you have only one page? How should Googlebot understand that you have multiple pieces of content, and which content/section should it show when a person runs a specific query? Escaped fragments could have been used in the past, but they were not a great solution and the scheme was discontinued (https://webmasters.googleblog.com/2015/10/deprecating-our-ajax-crawling-scheme.html). As such, I would provide separate pages with as much quality content as possible and with strong internal linking.
-
RE: Google+ Best Practice
Well, I would make the company page the rel=publisher target and the author page the rel=author target, and would share in the way that I proposed. That way, everyone gets some authorship, I think.
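As a rough illustration of that split (the Google+ URLs below are placeholders, not real profiles, and this authorship markup has since been phased out by Google):

```html
<!-- In the site-wide <head>: tie the site to the company's Google+ page -->
<link rel="publisher" href="https://plus.google.com/+ExampleCompany"/>

<!-- In an article's byline: credit the individual writer's Google+ profile -->
<a rel="author" href="https://plus.google.com/+JaneDoeExample">Jane Doe</a>
```

The idea is that rel=publisher is a single, site-level association with the brand, while rel=author is set per article to the person who wrote it.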
Does anyone else have some insight on this?
-
RE: NOINDEX,NOFOLLOW Mistake
I'm really sorry for what happened! I recently had an employee who made the same mistake, but at domain level.
We monitor our websites' rankings, and visibility fell from 21 percent to 4! Chaotic. One month after correcting the problem, we are only back to 10 percent visibility, so I think you may have to wait a while for a full recovery. Bear in mind, however, that my team did not notice the problem for a month, since we did not have an optimization subscription for that client. Also, the domain did not have many linking pages or much authority, so I hope your page recovers much faster than in the scenario I am describing. Good luck!
-
RE: Http > https Switch Before Platform Migration?
Hi Nicola! What are your reasons for splitting the migration? Is your site processing sensitive data? If yes, https is an ASAP problem for you, since Chrome will already give your users warnings when they navigate your website. If not, I think you would be better off with a single migration. But again, could you share some details about your thoughts and reasoning here? There could be multiple aspects that influence this decision.
-
RE: Help with force redirect HTTP to HTTPS
Hello. You would typically put it before the # BEGIN WordPress block, so that the https redirect runs before WordPress's own rewrite rules (which stop processing for matching requests). The first settings (WP Rocket) basically optimize your website's speed. You would also want to update the site URL in the admin panel and force the admin area to SSL. You can read more here -> https://premium.wpmudev.org/blog/ssl-https-wordpress/. Let me know if you have any other questions.
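A minimal sketch of what that can look like, assuming Apache with mod_rewrite enabled (one common placement is above the # BEGIN WordPress block, so the redirect fires first):

```apache
# Force HTTPS -- placed above "# BEGIN WordPress" so it runs before
# WordPress's own rewrite rules
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
</IfModule>

# BEGIN WordPress
# ... leave the existing WordPress rules untouched ...
# END WordPress
```

For the admin side, WordPress supports `define('FORCE_SSL_ADMIN', true);` in wp-config.php to force SSL on login and admin pages.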
-
RE: Http to https - Have we done it correctly?
The big problem is your redirection. At the moment, you DO NOT redirect people to the https website. Read more about the changes you have to make here -> https://moz.com/learn/seo/redirection. Basically, if you run on Apache, you need to modify your .htaccess file so that everyone who lands on the non-SSL version is redirected to the https one. A quick Google search will give you examples of rules to include in the file, for example: https://uk.godaddy.com/help/redirect-http-to-https-automatically-8828.
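A hedged sketch of such a rule, assuming Apache with mod_rewrite (example.com is a placeholder for your own domain):

```apache
RewriteEngine On
# If the request did not arrive over SSL, permanently redirect it to https
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The 301 (permanent) status matters here: it tells search engines the https URL is the canonical one, so ranking signals consolidate on it.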
In terms of the questions you asked:
-
You should modify the settings of the website and set https as the preferred version. You shouldn't have two different sitemaps; the non-SSL one should not even work (it should be redirected, as mentioned above).
-
Of course your robots.txt should reference the https links. Again, the http version of the file should already be redirected.
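For illustration, a minimal robots.txt served from the https site might look like this (example.com is a placeholder; the Sitemap line is the part that matters here):

```
User-agent: *
Disallow:

# Single, https-only sitemap
Sitemap: https://www.example.com/sitemap.xml
```

With the redirect in place, requesting the file over http simply lands crawlers on this https copy, so there is only one version to maintain.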
Hope this helps.
-