No.
Corporate contacts and the corporate logo are both part of schema.org/Organization, so they can be marked up together in a single Organization block or split into separate ones. All of these JSON-LD variants validate without problems.
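For instance, here is a minimal sketch of an Organization with a logo and a contact point, along the lines of Google's documented examples (the URL, logo path, and phone number are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "url": "http://www.example.com",
  "logo": "http://www.example.com/logo.png",
  "contactPoint": [{
    "@type": "ContactPoint",
    "telephone": "+1-800-555-1212",
    "contactType": "customer service"
  }]
}
</script>
```

You can keep logo and contactPoint together as above, or publish them as separate Organization items; the validators accept both.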
If your domain is on a ccTLD (.co.uk), then the geolocation of the server doesn't matter. Of course, it still needs to be fast for the country where your users will be.
So far we have Microdata, RDFa and JSON-LD.
I'll cover Microdata and RDFa together because they're similar. Both are extra HTML attributes indicating which Schema.org field names correspond to which user-visible text on the page. It works, but it needs a lot of developer work and designer changes, because both the backend (admin interface) and the frontend (HTML) must be changed. And there are many things that can get messed up: incorrect implementation, "rich snippet spam", software bugs, etc. It looks as easy as 1-2-3, but in reality it's a pain to implement and support. For example, a Product implementation alone requires at least 10 edit boxes in the backend if they are filled in manually.
JSON-LD is a relatively new format for carrying Schema.org data. The main benefit is that you split the presentation layer (HTML) from the semantic layer (JSON-LD). In the previous formats they are one and the same, linked to each other; now they are split. This gives you much more freedom than before: you can lay out the HTML however you wish and just add a hidden JSON block in the head or in the content to supply the semantic markup. This is the future (for now).
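To make the split concrete, here is the same (hypothetical) product marked up both ways - in Microdata the attributes are woven into the visible HTML, while the JSON-LD block carries the semantics separately:

```html
<!-- Microdata: semantic attributes live inside the visible HTML -->
<div itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">ACME Widget</h1>
  <img itemprop="image" src="/img/widget.jpg" alt="ACME Widget">
</div>

<!-- JSON-LD: the same semantics in one hidden block; the HTML above
     could be laid out any way you like -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "ACME Widget",
  "image": "/img/widget.jpg"
}
</script>
```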
If you wish to read more about the creation of JSON-LD, then this article is for you:
http://manu.sporny.org/2014/json-ld-origins-2/
1. If you read the SEO blogs and follow industry news, then you already know what SEO is and how it's done. Basically it's on-page + off-page + content + outreach + a few more things. Some companies specialize in on-page optimization, others in off-page (links), and a third group are content creators.
2. True - links + outreach + social media sharing. You also need to update your content to keep it current.
3. You can find Moz recommendations here: https://moz.com//article/recommended
So, this is normal. Any site can have a few "bad" links, or links that it didn't create. They can come from scrapers, competitors, previous SEO agencies, and others.
But you should constantly watch them in Search Console and keep your disavow file updated to keep them out, because in the long term they can hurt your rankings (if they haven't already).
I'm also experiencing some issues with OSE at the moment.
It's probably related to the forthcoming Moz index refresh and will be fixed within a few hours.
Well, these probably explain it better:
https://support.google.com/webmasters/answer/66356?hl=en
http://searchengineland.com/google-links-in-a-press-release-should-be-nofollowed-like-advertisements-168339
http://searchengineland.com/google-adds-large-scale-guest-posting-advertorials-optimized-anchor-text-to-list-of-link-schemes-168082
http://www.wordstream.com/blog/ws/2013/07/24/follow-nofollow-links
Please note two things. First, they talk about "optimized anchor text". This means linking to your domain with keyword anchors - there are many examples like "wedding dresses", "wedding photographer", "wedding cakes", etc. If you link only with a "naked URL" (a URL without keywords in the anchor), then it's OK. Second, press releases can mess up your site's link profile. There are many articles all over the internet about a holy grail of link building, the "perfect link profile". I won't share links, because it's something of a myth and in reality hard to prove. But why is this important? Let's say you have a 70/30 or 60/40 split of external links pointing to your home page vs. internal pages. Pushing out press releases (most of which link to your home page) will shift this to 90/10 or 85/15, and that could raise a red flag for Penguin.
As you can see, it's tricky for you.
Now, it's also tricky for the sites publishing the press releases. In general each PR is text + a link, and building too much content with outgoing links isn't good - it could trigger Panda. Add that the same content appears on many sites at once - that could trigger Panda too. Now also add the "link schemes" from the first link above. The result is a headache for them; that's why they use "nofollow" to minimize their own risk.
As you can see, it's tricky for everyone, and the only safe way of linking is "nofollow".
+1
This reminds me of talking about websites with a movie director once. His idea was: "All sites should have some kind of cover page; when you visit for the first time you should see it - you should even see this page as the entry point on every visit." I was like, WTF?
I think the answers here can help you:
https://moz.com/community/q/blocking-certain-countries-via-ip-address-location
The question is similar. Adding PHP code for geolocation checks (MaxMind GeoIP) is just a few lines, as sketched below, and with it you can ignore submissions from other countries.
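A minimal sketch with the legacy geoip-api-php library; the database path and the allowed-country list are assumptions for illustration:

```php
<?php
// Sketch using MaxMind's legacy geoip-api-php library and the free
// GeoIP.dat country database; adjust the path to where yours lives.
require_once 'geoip.inc';

$gi = geoip_open('/usr/share/GeoIP/GeoIP.dat', GEOIP_STANDARD);
$country = geoip_country_code_by_addr($gi, $_SERVER['REMOTE_ADDR']);
geoip_close($gi);

// Silently ignore form submissions from outside the countries you serve.
if (!in_array($country, array('US', 'CA'))) {
    exit;
}
```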
I also get notifications.
On the first site there was an HTML file in wp-content/uploads with an old version string in its header, so the detection works almost perfectly; the file had simply been downloaded from some other author.
On the second site, Joomla was identified as version 1.5 or less, and this is correct. Yet it hasn't been hacked since its creation some 5-6 years ago.
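The telltale in both cases is presumably the generator meta tag, something along these lines (the exact version strings are my guess):

```html
<meta name="generator" content="WordPress 3.2.1" />
<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
```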
I think this is part of their notifications about updates, pushing the internet's CMSes toward the latest versions. This isn't their first such mail, nor will it be the last. Do you remember the wp-timthumb notification? The Fancybox notification? The Revolution Slider notification? What do all these cases have in common? One vulnerability, and over 100k sites are at risk. The bad guys know this and use such vulnerabilities for black-hat SEO.
There are many SEO plugins for WordPress:
But only two have a huge market share - Yoast and AIO. And many authors compare them side by side:
http://www.evolvingseo.com/2014/02/14/all-in-one-vs-yoast-seo-test-drive/
https://www.elegantthemes.com/blog/tips-tricks/wordpress-seo-vs-all-in-one-seo-pack-which-is-the-best-seo-plugin
http://wpscoop.com/comparisons/wordpress-seo-plugins-comparison/wordpress-seo-by-yoast-vs-all-in-one-seo-pack-comparison/
There is even a Lynda course covering both of them:
http://www.lynda.com/WordPress-tutorials/WordPress-Plugins-SEO/140779-2.html
Of course, there are also many videos on YouTube.
Probably a bug with RTL languages. I viewed the source, and this phrase occurred only 23 times: "שמלות כלה נפוחות" (Hebrew for "puffy wedding dresses").
Can you mail this to help@moz.com for assistance?
Just as Ian says - this is definitely it:
As an example, try to get statistics for this page:
https://moz.com/page-not-exists/
and check the results. If you get the same results, then this isn't an API fault.
Blocking IPs by geolocation can be dangerous. But you can use the MaxMind GeoIP database:
https://github.com/maxmind/geoip-api-php
Or you can also implement a GeoIP check at the "add to cart" or "new user" step as an additional safeguard, so when a user is outside the US/CA you can require them to fill in a captcha or just ignore their requests, as sketched below.
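A sketch of that extra check at the "add to cart" step; the country list and the captcha helpers are assumptions:

```php
<?php
// At the "add to cart" / "new user" step: same pages for everyone,
// but visitors outside US/CA must pass a captcha first.
require_once 'geoip.inc';

$gi = geoip_open('/usr/share/GeoIP/GeoIP.dat', GEOIP_STANDARD);
$country = geoip_country_code_by_addr($gi, $_SERVER['REMOTE_ADDR']);
geoip_close($gi);

if (!in_array($country, array('US', 'CA')) && !captcha_is_solved()) {
    // captcha_is_solved() and show_captcha_form() are hypothetical
    // helpers for whatever captcha service you use.
    show_captcha_form();
    exit;
}
// ...proceed with adding the item to the cart...
```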
Now, from a bot's point of view: if a bot visits with a US IP and with a UK IP (for example), it will see the same pages. It's just that from the UK it can't create a new user or add to cart. The HTML code will be 100% the same.
PS: I forgot... VPNs and proxies are cheap these days. I have a few EC2 instances with everything on them, just for my own needs. The bad guys can use them too, so think twice about any possible "protection". Note the quotes.
FYI, the link here is nofollow.
Regarding the last element:
https://developers.google.com/structured-data/breadcrumbs?hl=en
"The breadcrumb trail may include or omit a breadcrumb for the page on which it appears.".
So both work. I have seen breadcrumbs with a last element pointing to the current page, and without one.
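For reference, a minimal BreadcrumbList in the format that page documents (URLs and names are placeholders); to include the current page, just append one more ListItem for it:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [{
    "@type": "ListItem",
    "position": 1,
    "item": { "@id": "http://www.example.com/", "name": "Home" }
  }, {
    "@type": "ListItem",
    "position": 2,
    "item": { "@id": "http://www.example.com/dresses/", "name": "Dresses" }
  }]
}
</script>
```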
Have you read this article:
https://moz.com/blog/10-illustrations-on-search-engines-valuation-of-links
especially #5 and #6.
So, as EGOL says, use CrazyEgg + Riveted (an Analytics plugin) + Enhanced Link Attribution (https://developers.google.com/analytics/devguides/collection/analyticsjs/enhanced-link-attribution) to see where visitors click.
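Enabling Enhanced Link Attribution is one extra line in the standard analytics.js snippet (UA-XXXXX-Y is the usual placeholder for your own property ID):

```html
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXX-Y', 'auto');
ga('require', 'linkid');  // <- the Enhanced Link Attribution plugin
ga('send', 'pageview');
</script>
```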
The usual .htaccess mess... WHY? Because of the flags:
https://httpd.apache.org/docs/2.4/rewrite/flags.html
As you can see, there are several flags - L, R, NC, and others. But we will focus on L and R only:
R - redirect. By default this makes a 302 redirect, but you can specify any other response code between 301 and 399, e.g. R=301.
L - last. This flag causes mod_rewrite to stop processing the rule set.
Let's go back to your file. Here is its structure:
W3TC cache set
W3TC compression set
W3TC CDN
W3TC page cache
WordPress handler - with L flag!
Redirects
Force non-www
What is the problem? The problem is that after an L flag, everything stops. This means that both www and non-www work, with no redirect between them. You need to rearrange your file like this (a sketch follows the list):
Force non-www
Redirects
W3TC
WordPress handler
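For reference, a sketch of what that order looks like in mod_rewrite terms (rule bodies trimmed to the essentials; the W3TC sections are whatever the plugin wrote):

```apache
# 1. Force non-www FIRST, before any L-flagged rule can stop processing
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [R=301,L]

# 2. Your individual redirects go here

# 3. W3TC cache / compression / CDN / page cache blocks

# 4. The standard WordPress handler - note the L flags at the end
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
```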
Then check and recheck everything one more time, including the redirects.
True, you can't track them, since they pass through a redirector, and the redirector hides the referral traffic.
But there is a workaround. You need to create a brand-new GA property and put it here:
https://www.youtube.com/advanced_settings
https://www.optimizesmart.com/google-analytics-and-youtube-integration-guide/ <- this is a much better explanation, with lots of screenshots
Please note that I haven't linked them yet, so I'm not 100% sure that you can track them all. I can see clicks in the notes/annotations in YouTube Analytics:
https://www.youtube.com/analytics?o=U
And if you can't track them all, you can still tag your links with the URL Builder:
https://support.google.com/analytics/answer/1033867?hl=en
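A tagged link just carries the campaign parameters in the query string; the values here are made up:

```
http://www.example.com/?utm_source=youtube&utm_medium=video&utm_campaign=spring_promo
```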