Sounds like a job for NinjaPopup, OptinMonster, PopupDomination or Subscribers Magnet.
If none of them fills all your needs, you can get a free and open-source plugin like "Displey Pop" and tweak it for your needs.
So you need to get all links from:
Once you have everything, you need to compile a mega sheet in Excel (or Google Sheets), and then you can do a "link profile audit".
But you also need to do a "Panda audit". And this is a little bit weird, because you need to read almost everything from many authorities on the subject. Of all of them I can recommend Glenn Gabe and Josh Bachynski; the first can be found at http://www.hmtweb.com and the second at http://themoralconcept.net/pandalist.html. Of course they also write for other sites like Moz, SEL, SEJ, etc. And of course this doesn't mean there aren't other authors, like:
https://moz.com/blog/have-we-been-wrong-about-panda-all-along
So let's go back to the Panguin tool. You need a very in-depth inspection in Analytics, matching your traffic dates against the algorithm release dates:
https://moz.com/google-algorithm-change
and make a detailed analysis of your situation. IMHO there is also a big chance that you were hit by both algorithms.
First, there is definitely some kind of "algo filter" on your website.
Second, you need to audit the website, because there can be crossing updates:
https://moz.com/blog/the-danger-of-crossing-algorithms-panda-update-during-penguin-3
And that's why you can't be sure, from some interactive chart alone, which filter is applied to your site. Panda and Penguin now apply their filters much more quickly, and the site needs a refresh to escape the filtering.
About disavowing: I strongly recommend reading Marie Haynes' articles:
https://moz.com/blog/guide-to-googles-disavow-tool
https://moz.com/blog/5-spreadsheet-tips-for-manual-link-audits
(also check her other articles about this)
https://moz.com/blog/my-story-how-psd2html-worked-to-have-a-manual-penalty-revoked
https://moz.com/blog/how-wpmuorg-recovered-from-the-penguin-update
https://moz.com/blog/2-become-1-merging-two-domains-made-us-an-seo-killing <- this is a continuation of the previous article.
https://moz.com/community/q/disavow-straightaway-urgent <- just check the question and my second answer; it's too long to paraphrase here. But you can't get by with Ahrefs alone; you also need other tools.
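For reference, the disavow file itself is just a plain text file: one domain or full URL per line, with "#" for comments. A minimal sketch with made-up domains:

    # low quality directories (hypothetical examples)
    domain:spammy-directory.example
    domain:scraped-image-site.example
    # a single URL can be disavowed too
    http://www.low-quality-site.example/page-with-my-link.html

You then upload that file through the Disavow Tool described in Marie's first article above.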
"1. Focus Your META Description
Let's say that, for some reason, we really wanted that SEOmoz blog post to rank for "January 19". One solution is to make sure that phrase appears in our META description for the relevant page. If Google can find the matching copy in your description, they're more likely to use the tag as is. It's also just a good exercise – figuring out what your core target keywords are and targeting them naturally in your META description (don't just make it a list of keywords, of course) will help you focus your overall on-page SEO efforts."
https://moz.com/blog/why-wont-google-use-my-meta-description
And I didn't see GlobeCar in your meta description. That's why, for that one keyword, Google didn't display your meta description and instead showed something from the page code that contains GlobeCar. Also, your title says "Globe Auto" while the domain is different; this will confuse users, so Google didn't display the original description.
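A quick sketch of what I mean (the wording here is hypothetical, only the structure matters): keep the brand name and the target phrase together in both tags, so Google is more likely to reuse your description as-is:

    <title>Your Target Phrase | GlobeCar</title>
    <meta name="description" content="GlobeCar - your target phrase in one natural sentence about this exact page, not a keyword list.">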
I also forgot this one:
https://support.google.com/webmasters/answer/35624?hl=en
"Google will sometimes use the meta description of a page in search results snippets, if we think it gives users a more accurate description than would be possible purely from the on-page content."
It seems they think your description is different from the on-page content.
Google recommends keeping this as a 302:
https://googlewebmastercentral.blogspot.com/2014/05/creating-right-homepage-for-your.html
So keep it as it is now with the 302; just add hreflang to both pages after the redirect. Also, on the original domain, add a "Vary: Accept-Language" header.
I think one redirect is better than two redirects, especially for mobile users.
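If the original domain runs on Apache with mod_headers enabled (an assumption on my part), that header is a one-liner in the .htaccess or vhost config:

    # tell caches and crawlers that the response depends on the visitor's language
    Header append Vary "Accept-Language"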
Why make a 301/302 redirect to the other site when you can just set up cross-domain hreflang between the sites?
Then Italian users will see the Italian version in the SERPs and English users will see the English version.
Here is how to do it (there's also a small markup sketch after these links):
https://moz.com/learn/seo/hreflang-tag
https://moz.com/blog/hreflang-behaviour-insights
https://moz.com/blog/using-the-correct-hreflang-tag-a-new-generator-tool
https://moz.com/blog/open-source-library-tool-check-hreflang
http://www.searchenginejournal.com/getting-a-better-understanding-of-hreflang/60468/
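The cross-domain part is just reciprocal link tags in the <head> of both pages; here with placeholder domains (adjust to your real .com/.it URLs):

    <!-- on the English page, e.g. http://www.example.com/ -->
    <link rel="alternate" hreflang="en" href="http://www.example.com/" />
    <link rel="alternate" hreflang="it" href="http://www.example.it/" />

    <!-- the Italian page, e.g. http://www.example.it/, carries exactly the same pair,
         so the annotation is reciprocal -->
    <link rel="alternate" hreflang="en" href="http://www.example.com/" />
    <link rel="alternate" hreflang="it" href="http://www.example.it/" />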
You also need to verify both sites in Search Console; there you can set the Italian site to target Italy and leave the English site targeting worldwide.
No, since you didn't do any DOM manipulation, so it isn't "hidden".
But it's a little bit confusing for the user. Please run an A/B test and see whether the scrolling works or not.
1. There will be some kind of turbulence, but if the domains are OK (not penalized) then you shouldn't hesitate.
2. This loop is wrong. You should do xxx.com -> xxx.it and yyy.com -> xxx.it; never enter a redirect loop, because it makes an even bigger mess than you expect. Also consider adopting HTTP/2 and switching to HTTPS, because this will be a major move in 2016 and sooner or later you will do it too.
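With the placeholder names above, the clean mapping (no loop) looks like this on Apache, assuming mod_alias is available:

    # on xxx.com - send everything to the Italian domain
    Redirect 301 / http://xxx.it/
    # on yyy.com - same single target, so no chains and no loops
    Redirect 301 / http://xxx.it/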
Yup!
Use "javascript" on client site to do pinging. Or Java app running from web as applet. Or Flash.
There are two major problems - javascript doesn't support cross-platform post without hacks. And not all computers comes with Java. Same is with Flash.
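A tiny sketch of why the JavaScript route is painful (the ping endpoint below is made up): the browser will block you from reading the response of a cross-domain POST unless that endpoint explicitly allows it via CORS:

    // hypothetical ping endpoint on a foreign domain
    fetch('http://ping.example.com/ping', {
      method: 'POST',
      body: new URLSearchParams({ url: 'http://www.example.com/new-post/' })
    })
      .then(function (res) { console.log('ping answered with status', res.status); })
      .catch(function (err) { console.log('blocked by CORS or failed', err); });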
1. You are probably receiving this due to broken redirects.
2. No.
3. Yes.
4. Check the redirects. You can use cURL or wget (console apps). You can also use Search Console's "Fetch & Render".
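For example, cURL can dump just the headers of every hop in a redirect chain (replace the URL with your own):

    # -s silent, -I headers only, -L follow redirects; prints the status line and Location of each hop
    curl -sIL http://www.example.com/some-old-page/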
If you're worried, PM me with the link and I'll run some tests.
Hotlinking can't really be stopped with .htaccess, because users will see a "hotlink warning" image but bots will still see the link.
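To be clear about what I mean: the classic referer-based rule below (Apache mod_rewrite, placeholder domain) only blocks or swaps the image for visitors coming from foreign pages; the scraper's HTML page, and the link in it that bots crawl, stays exactly where it was:

    RewriteEngine On
    # allow empty referers and my own site
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    # everyone else gets a 403 for image files
    RewriteRule \.(jpe?g|png|gif)$ - [F,NC]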
Since I have exactly the same issue with my own images, I disavow them without any doubt. This is a low-quality directory with all its images gathered around some keyword like "aztec": they scrape the image SERPs, take the top 100 results and build a "website" out of them. No, no, please, no.
That's why: disavow them ASAP.
TL;DR - YES
Long story: I'm the author of a similar desktop tool called SEOPingler:
http://www.mobiliodevelopment.com/seopingler/
so anyone can use it to ping anything, and the bots come within a second or two. This works perfectly.
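I won't go into SEOPingler's internals here, but a generic blog ping is simply an XML-RPC POST in the weblogUpdates.ping format, sent as text/xml to the ping service's endpoint (the site name and URL below are placeholders):

    <?xml version="1.0"?>
    <methodCall>
      <methodName>weblogUpdates.ping</methodName>
      <params>
        <param><value><string>My Site Name</string></value></param>
        <param><value><string>http://www.example.com/new-page/</string></value></param>
      </params>
    </methodCall>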
The problem is when you use it to ping many URLs (like 10k-20k). At some point it stops working: the ping API endpoint receives your requests, but I can't see the bots coming. This means there is some per-IP threshold, and once you pass it you're temporarily blacklisted. I also heard (but I can't confirm this) that how long the temporary block lasts may vary with previous usage. For me this isn't a problem, because users can only blacklist their own IPs, and they can use hotspot wifi or a VPN to keep pinging.
But on a server this would be a HUGE problem, because you can't switch IPs on the fly, and no one can guarantee how long your IP will stay blacklisted.
Well, IMHO we are too deep into the technology and know the differences between multi-regional and multi-lingual sites. But when talking to a "newbie", he can easily be confused by the ton of terms. That's why they give an example with something easy to understand.
Well, there are 4 types of multi-lingual or multi-regional site setups, described here:
https://support.google.com/webmasters/answer/182192?hl=en
Summary: best is a ccTLD, next is a subdomain on a gTLD, third is a subdirectory on a gTLD, and URL parameters come last.
So Moz is trying to give expert advice to webmasters NOT to use subdomains, citing the famous blog example. I fell into this trap with a subdomain almost 15 years ago; I wish someone had told me about it back then... Other examples can be catalogs and different products (catalog.example.com vs. example.com/catalog; product.example.com vs. example.com/product/). Following this advice and staying off the subdomain keeps the link juice inside, but you probably know this.
GSC allows setting specific geo-targeting for different directories, that's true. But you can't change the server location with subdirectories; with subdomains you can. Example: one company with a site in Spanish and German. With subdirectories I have one server with /de and /es folders, but then there is one and only one server location, and the server IP is one of the signals for geo-targeting, so you have a tough choice about where to host. With subdomains you can make de.example.com and place it on a German server, and es.example.com and place it on a Spanish server.
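In DNS terms that split is trivial; a sketch with documentation IPs (the real addresses would be those of your German and Spanish servers):

    ; hypothetical records in the example.com zone
    de.example.com.  IN  A  203.0.113.10   ; box hosted in Germany
    es.example.com.  IN  A  198.51.100.20  ; box hosted in Spain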
That's why subdomains for multilingual sites are a notable exception to the golden rule "do not use subdomains".
So in my second example (PR9) I showed how others do link building directly with a dofollow link on other sites. Later they remove this link during page rendering anyway, so bots see it but users can't.
Anyway, your implementation is pure and looks clean. But you know, "Life begins at the end of your comfort zone", so you can inspect other embedding implementations like Disqus and get some ideas.
This has been discussed many times, and the answer is YES. You can see here what Josh Bachynski says:
http://themoralconcept.net/pandalist.html
Look at #5 in the "Low quality factors" section.
That's why you can see, here and in the blog sections, that moderators often remove links to sites, because some people comment just to get a link to their site. You should do this too, or at least make such links "nofollow".
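If you keep the links, marking them up is just a rel attribute on the anchor, e.g.:

    <a href="http://commenter-site.example/" rel="nofollow">commenter's site</a>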
WordPress themes are just themes, without any schema markup. Adding such a feature can even lead to rejection from the WPORG themes repository.
All you need is to look at plugins for this, or mark up your content with schema using JSON-LD. For example, you can use the Raven Schema Creator:
http://schema-creator.org/wordpress.php
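Whatever generator you use, the output you paste into the page is a small script block like this (the values are placeholders, and the type depends on what you're marking up):

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Organization",
      "name": "Example Company",
      "url": "http://www.example.com/",
      "logo": "http://www.example.com/logo.png"
    }
    </script>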
A quick way to remove the staging URL is to return HTTP error 410 for it.
Another is to use the Remove URLs function in Search Console: https://www.google.com/webmasters/tools/url-removal
About duplicate content: you must check the actual canonical. If the staging URL has a canonical pointing to the normal site, then you shouldn't worry. But if staging and normal point to different URLs, then you may see some algo filter.
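A sketch of both options, assuming the staging site runs on Apache (domains and paths are placeholders):

    # in the staging site's .htaccess - answer every request with 410 Gone
    RewriteEngine On
    RewriteRule ^ - [G]

    <!-- or, on every staging page, point the canonical at the matching live URL -->
    <link rel="canonical" href="http://www.example.com/the-live-page/" />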