Questions
Is it possible that Google is pulling the description from third-party websites and displaying it in the description section of an organic result?
Thanks. I think the on-page content is pretty good and strong, as we have just done an upgrade of the website. Just recently, Google decided to show this description from DMOZ.
Intermediate & Advanced SEO | | Malika10 -
Toggle Tabs on pages - How to present information to users
I can't really help from a coding aspect, I'm afraid, and I don't know of a way to do what you suggested there. However, if both benefits and side effects are equally important, could these not be displayed on the page where they don't sit in a tab? I would try to keep the page as straightforward as possible rather than showing different tabs depending on search intent. -Andy
Web Design | | Andy.Drinkwater0 -
AngularJS - How to check?
Awesome! That's helpful! I have also heard that Screaming Frog can crawl AJAX sites, but I am not sure how to use that data for a website built in AngularJS.
Search Engine Trends | | Malika10 -
Setting A Custom User Agent in Screaming Frog
Setting a custom user agent can determine things like whether HTTP/2 is used, so there can be a big difference if you change it to a user agent that can't take advantage of something like HTTP/2. Apparently, HTTP/2 support is coming to Pingdom very soon, just as it did for Googlebot: http://royal.pingdom.com/2015/06/11/http2-new-protocol/ This is an excellent example of a user agent's ability to modify the way your site is crawled, as well as how efficient that crawl is: https://www.keycdn.com/blog/https-performance-overhead/ "It is important to note that we didn't use Pingdom in any of our tests because they use Chrome 39, which doesn't support the new HTTP/2 protocol. HTTP/2 in Chrome isn't supported until Chrome 43. You can tell this by looking at the User-Agent in the request headers of your test results." [image: pingdom-user-agent.webp] Pingdom user-agent Note: WebPageTest uses Chrome 47, which does support HTTP/2. Hope that clears things up, Tom
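As a rough illustration of why the user agent matters here, the Chrome-version check described in the quoted article can be sketched as a small helper. This is a hypothetical function (not part of Screaming Frog, Pingdom, or any library), using the thresholds from the article: Chrome gained HTTP/2 support in version 43.

```python
import re

# Per the quoted KeyCDN article: Chrome supports HTTP/2 from version 43 onward,
# so a test tool pinned to Chrome 39 (as Pingdom was at the time) won't use it.
HTTP2_MIN_CHROME = 43

def chrome_supports_http2(user_agent: str) -> bool:
    """Return True if the Chrome version in the UA string is new enough for HTTP/2."""
    match = re.search(r"Chrome/(\d+)", user_agent)
    if not match:
        return False  # not Chrome, or no version stated
    return int(match.group(1)) >= HTTP2_MIN_CHROME

print(chrome_supports_http2("Mozilla/5.0 ... Chrome/39.0.2171.95 ..."))  # False (Pingdom-era)
print(chrome_supports_http2("Mozilla/5.0 ... Chrome/47.0.2526.73 ..."))  # True (WebPageTest)
```

This is exactly the kind of check you can do by eye on the User-Agent header in your test results, as the article suggests.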
Intermediate & Advanced SEO | | BlueprintMarketing0 -
Ecommerce website - Transition from one subdomain to another. Please help!
Personally, I'd say to make 100% sure all your 301 redirects are placed properly, and that each old URL is redirected to the absolute best new URL to get the user what they're looking for.
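One way to sanity-check that mapping before going live is to keep the old-to-new pairs in a simple table and assert that no redirect points at another old URL (which would create a chain). A minimal sketch with hypothetical URLs and helper names:

```python
from typing import Optional

# A hypothetical 301 map: each old URL points to the single best new URL.
redirect_map = {
    "/old-category/widget-a": "/widgets/widget-a",
    "/old-category/widget-b": "/widgets/widget-b",
    "/old-category/":         "/widgets/",
}

def resolve(old_path: str) -> Optional[str]:
    """Return the new URL an old path should 301 to, or None if unmapped."""
    return redirect_map.get(old_path)

# No redirect target should itself be an old URL, or you'd 301 twice.
for old, new in redirect_map.items():
    assert new not in redirect_map, f"chained redirect: {old} -> {new}"

print(resolve("/old-category/widget-a"))  # /widgets/widget-a
```

Catching chains like this matters because every extra hop costs the user a round trip and dilutes how cleanly equity passes through.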
Technical SEO Issues | | MattRoney0 -
Links: Links come from bizarre pages
Yes, I believe it's been happening for ages. These techniques are mostly used by competitors to sabotage websites. But how do they hack into such websites? It's very annoying. I am keeping an eye on the backlink profile. Ahrefs didn't show any such links, but I got this from Google Search Console. These third-party link pages are not even indexed in Google, yet Google still has them in its Search Console report while Ahrefs is unable to detect them. Very strange!
Intermediate & Advanced SEO | | Malika10 -
Angular JS - Page Load
Did you read the prerender documentation? https://prerender.io/documentation/install-middleware#apache There you can find two examples (Apache and nginx): https://gist.github.com/thoop/8072354 https://gist.github.com/thoop/8165802 How do they work? Simply put, bots receive a proxied version from this URL: http://service.prerender.io/http://example.com/url This works by switching your server into a specific proxy mode called a reverse proxy. A normal (forward) proxy caches requests from a few computers on a network out to the internet: the computers are clients, they send requests, the proxy goes out to the internet, executes them, and returns the results. A reverse proxy works the other way around: the internet is the client, and the proxy serves requests to your internal infrastructure. This allows you to hide the internal infrastructure, scale easily, or even build a complex site out of several internal servers (one handling /blog, another /shop, a third /support, and so on). But this prerendered version is served only to bots; normal clients (those not on the bot list) receive the AngularJS version of the HTML. Since everything is served from your own server, you shouldn't hesitate. Second: do not (!!!) send the prerendered version to regular clients, because then prerender can't load pages from your server to render them, and you can easily overload your server, and the prerender servers too, in a redirect loop.
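The reverse-proxy decision described above boils down to: if the User-Agent is a known bot, fetch the prerendered snapshot; otherwise serve the normal Angular app. A minimal sketch of that routing logic (the bot list and service URL follow prerender.io's documented approach, but the helper function itself is hypothetical):

```python
# Crawler substrings a prerender middleware would typically match on.
BOT_AGENTS = ("googlebot", "bingbot", "yandex", "baiduspider", "facebookexternalhit")

PRERENDER_SERVICE = "http://service.prerender.io/"  # from the docs linked above

def upstream_for(user_agent: str, requested_url: str) -> str:
    """Return the URL the server should fetch: the prerendered snapshot for bots,
    the original (client-side Angular) URL for everyone else."""
    if any(bot in user_agent.lower() for bot in BOT_AGENTS):
        # Reverse-proxy mode: ask the prerender service for a rendered copy.
        return PRERENDER_SERVICE + requested_url
    return requested_url  # normal clients get the AngularJS version

print(upstream_for("Googlebot/2.1", "http://example.com/shop"))
```

Note how regular browsers never hit the prerender branch, which is exactly the "do not send the prerendered version to clients" warning above: if they did, the prerender service could end up requesting pages that redirect back to itself.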
Intermediate & Advanced SEO | | Mobilio0 -
AngularJS - What To Consider?
While developing that new website with AngularJS, you need to keep SEO and crawler behaviour in mind. Put simply, you should always render the page server-side on first load, in a crawler-friendly way, and use Angular client-side for UX. Loading content via AJAX is bad for SEO no matter which JS framework or library you use; from old jQuery to Angular or React, it's always the same story: don't load SEO-valuable content through AJAX, or crawlers will likely ignore it entirely. So make sure you give visitors and crawlers valuable SEO content right away on first load, server-side, and use Angular to manipulate it according to user actions. And while doing that, don't be tempted to cloak, or Google's axe will hit you. I have been developing websites with AngularJS for years now and have always had great SEO results. Good luck.
Intermediate & Advanced SEO | | max.favilli0 -
Sitemaps and dynamic pages
Hey Malika, If you're talking about HTML sitemaps for a site with thousands of pages, then I would stay away from them and go for XML sitemaps, which you could add to speed up indexation. The keyword-focused ones won't help you enough for now to make speeding up indexation any easier, and they might even cause bigger issues, as you'll have hundreds or thousands of links on one page. Martijn.
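A minimal sketch of the XML-sitemap route suggested above, assuming the URL list comes from your own database (function and variable names are hypothetical):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal XML sitemap for the given absolute URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap(["https://example.com/", "https://example.com/blog/"]))
```

Keep in mind the sitemap protocol caps a single file at 50,000 URLs, so a site with many thousands of pages may need a sitemap index pointing at several files like this.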
Intermediate & Advanced SEO | | Martijn_Scheijbeler0 -
Hundreds of 301 Redirects. Remove Pages or Not?
Hi Malika, From the info you've provided, removing those 301s is going to be a good idea. Below is a relevant section from a Google Developers article. The article specifically refers to mobile site speed, but the point applies to desktop as well. "(2) Number of redirects should be minimized Additional HTTP redirects can add one or two extra network roundtrips (two if an extra DNS lookup is required), incurring hundreds of milliseconds of extra latency on 3G networks. For this reason, we strongly encourage webmasters to minimize the number, and ideally eliminate redirects entirely - this is especially important for the HTML document (avoid “m dot” redirects when possible)." If you're 100% sure there are no backlinks or traffic heading to those old domains, there's no reason to keep the redirects in place, particularly if the redirected pages are old enough that it's unlikely anyone has them saved in an old email or under their browser favorites.
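The latency cost the Google article describes is roughly additive: each redirect hop costs at least one network round trip, plus one more if it forces an extra DNS lookup. A back-of-the-envelope sketch, assuming a 300 ms round trip on 3G (the round-trip figure is an illustrative assumption, not from the article):

```python
def redirect_latency_ms(hops: int, rtt_ms: float = 300.0, dns_lookups: int = 0) -> float:
    """Estimate extra latency from redirects: one round trip per hop,
    plus one more per extra DNS lookup, per the article's reasoning."""
    return hops * rtt_ms + dns_lookups * rtt_ms

# A single "m dot" redirect that also needs a fresh DNS lookup on 3G:
print(redirect_latency_ms(hops=1, dns_lookups=1))  # 600.0
```

Even one hop with a DNS lookup can cost over half a second on a slow connection, which is why eliminating stale redirects entirely is worthwhile.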
Intermediate & Advanced SEO | | ChrisAshton0 -
Http to Https implementation - What happens to the backlinks?
You could just redirect all HTTP links to HTTPS using a .htaccess document: https://www.captiga.com/linux/redirect-http-to-https/
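What that blanket .htaccess rule does can be expressed as a simple URL transform. A sketch of the equivalent logic (the 301 preserves host, path, and query, which is why backlink equity passes through to the HTTPS version):

```python
from urllib.parse import urlsplit, urlunsplit

def to_https(url: str) -> str:
    """Rewrite an http:// URL to https://, preserving host, path, and query,
    as a blanket .htaccess 301 redirect would."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(to_https("http://example.com/page?x=1"))  # https://example.com/page?x=1
```

Because every old http:// URL maps one-to-one to its https:// twin, existing backlinks keep resolving; the 301 tells Google to consolidate signals onto the HTTPS version.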
Intermediate & Advanced SEO | | IsaCleanse0 -
Https Implementation - Weird Redirection After Putting 's' in http://
Thanks, Chris Ashton, for your response. Yes, I agree with you on the benefit bit. I don't think there is any sort of impact apart from confusing the users.
Web Design | | Malika10 -
If Robots.txt have blocked an Image (Image URL) but the other page which can be indexed has this image, how is the image treated?
May I ask why you/your webmaster would have blocked your images in robots.txt in the first place?
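To answer the underlying question mechanically: Python's standard urllib.robotparser can tell you whether a given URL is blocked for a given crawler. A small sketch with a made-up robots.txt; note that blocking the image file stops the crawler fetching the image itself, while the page embedding it remains crawlable:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks the images directory.
robots_txt = """\
User-agent: *
Disallow: /images/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The image file itself cannot be crawled...
print(parser.can_fetch("Googlebot-Image", "https://example.com/images/photo.jpg"))  # False
# ...but the page embedding the image can, so the page can still be indexed;
# Google just cannot fetch the blocked image's content.
print(parser.can_fetch("Googlebot-Image", "https://example.com/products/widget"))  # True
```

This is the distinction the question hinges on: a robots.txt Disallow prevents crawling of the blocked URL, not indexing of pages that merely reference it.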
Intermediate & Advanced SEO | | alphonseha1