Best posts made by Joe_Stoffel
-
RE: Google Analytics Tagging
Hello Amanda,
Speaking to issue #2, it sounds like a cross-domain tracking issue. These resources should help you resolve it:
Self-Referrals - https://support.google.com/analytics/answer/6350128?hl=en
How to Fix Self Referrals in Google Analytics - https://threeventures.com/how-to-fix-self-referrals-in-google-analytics/
Hope these help!
Best,
Joe
-
RE: Updating a Business Blog Article - Old or New Published Date
Hi there,
You should include both the original publish date and the last modified/last updated date. Be sure the <lastmod> dates are updated in the XML sitemap as well.
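To illustrate the sitemap side of this, a sitemap entry with an updated lastmod date might look like the sketch below (the URL and date are placeholders, not from the original post):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/your-updated-article/</loc>
    <!-- Update <lastmod> to the date the article was last substantively changed -->
    <lastmod>2018-06-15</lastmod>
  </url>
</urlset>
```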
Hope that helps!
-
RE: International targeting
If you are only targeting different languages (not targeting a specific region), you should not use the International Targeting feature in Google Search Console. You also should not include country codes in the hreflang tags; you will want to remove those.
Also, it is best practice to use either hreflang tags or an hreflang XML sitemap, not both; using both is redundant and not recommended [source].
"There is no need to use multiple methods for hreflang implementation. Google recommends against it, since it would be redundant. You certainly can use both methods, and there is no clear advantage of one method over the other. Here are some considerations for when you are deciding whether to use the xml sitemaps or page tagging methods:
- Hreflang xml sitemaps can be difficult to create and update. You can use online tools or create it in Excel, but it is difficult to automate the process. If you have xml sitemaps that your CMS updates for you automatically, it would be better to continue to use those rather than create separate, static hreflang xml sitemaps.
- Page tagging leads to code bloat, especially when you are targeting several countries/languages. That can mean an additional 10+ lines of code to each geo-targeted page.
- Some content management systems, such as WordPress and Drupal, offer automatic hreflang page tagging solutions."
https://www.semrush.com/blog/7-common-hreflang-mistakes-and-how-to-fix-them/
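For what language-only page tagging looks like in practice, here is a sketch of hreflang tags in the <head> using bare ISO 639-1 language codes with no country codes (URLs are placeholders):

```html
<!-- Language-only targeting: bare language codes, no country suffixes like en-US -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
```

Each language version of the page should carry the full set of tags, including a self-referencing one.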
-
RE: Specific pages won't index
I agree with Martijn here; the XML sitemap is certainly important, but you can also request that Google index this URL specifically through the 'Fetch as Google' tool. Just fetch that URL and select "Request indexing" once the fetch has completed.
As for why Google has not indexed this page before now, I'm not seeing too many reasons other than what has been mentioned. When did this page go live?
-
RE: Site appears then disappears from Google
Hi Jill,
Certainly sounds frustrating. I would recommend finding a way to add text-content to the homepage. With the way this page is set up (all images and no text), Google most likely views this as a blank or thin content page. Here is how Google views the site's homepage.
I did see this snippet of text at the bottom of the 'text-only' version of this page which seems to indicate there is an error; you may want to have the dev team take a look. Also, given the very limited amount of crawlable text on the page, this is not going to help optimize this page for 'photography'-related terms.
Text I am seeing on this page: "Something is wrong. Response takes too long or there is JS error. Press Ctrl+Shift+J or Cmd+Shift+J on a Mac."
Hope this helps, feel free to follow up with any questions!
-
RE: Updating a Business Blog Article - Old or New Published Date
Yep! Here is an article that can help you out with this: http://www.wpbeginner.com/wp-tutorials/display-the-last-updated-date-of-your-posts-in-wordpress/
-
RE: International targeting
You're welcome. You do not want to include hreflang="x-default" tags for the specific language pages (whether in the sitemap or the code). You only want to include the x-default tag on non-language specific 'default' pages (for example, a default page that prompts the user to select a language or country).
Here is a description from Google on their use of this particular tag: "The new x-default hreflang attribute value signals to our algorithms that this page doesn’t target any specific language or locale and is the default page when no other page is better suited." [Source]
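As a sketch of that setup (URLs are placeholders), the x-default annotation sits alongside the language-specific tags but points only at the non-language-specific selector page:

```html
<!-- One tag per language version... -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<!-- ...plus x-default pointing at the language/country selector page only -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```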
-
RE: My backlinks are not showing in webmaster tools? Why
You are welcome.
Are you sure you are looking in the correct Google Search Console property? (http:// vs https:// vs http://www. vs https://www) - If you are looking in the https://www. property but these links point to the http://www. version of a URL, they will not register in this report.
I assume your links are among the ones shown in the left-hand sidebar? (screenshot) If so, Google should be crawling your links whether they display in Search Console or not. I can see them in the cached version of a page, and I can see parisnews.ir show up in an Open Site Explorer backlink report for tizland.ir.
That being said, I would caution you about these types of backlinks. Sitewide, anchor-text-heavy backlinks that appear to have been paid for, such as these, can attract unwanted attention from Google and can lead to a penalized website. I'd recommend a more white-hat approach to your link acquisition; see this Whiteboard Friday for more on the subject.
-
RE: How to check the competition value of a Keyword.
Hello,
If you are looking into competition levels for a specific keyword, you can use the "Difficulty" rating in Moz's Keyword Explorer. If you are using this tool, also be mindful of the "Priority" rating, which takes into account search volume, competition (difficulty), and opportunity for each term. If you are looking for low-competition keywords, I suggest diving into keyword research for long-tail queries relevant to your head terms. These are less frequently searched terms, but they tend to be less competitive and to drive more qualified visitors.
In terms of the Difficulty, Opportunity, and Priority metrics in Moz's Keyword Explorer, Rand provided great write-ups explaining these metrics in detail and offering insight into 'sweet spots' for each.
Difficulty Score - https://moz.com/community/q/what-is-a-good-keyword-difficulty-score
Opportunity Score - https://moz.com/community/q/what-is-a-good-keyword-opportunity-score
Priority Score - https://moz.com/community/q/what-is-a-good-keyword-potential-score
-
Have Your Thoughts Changed Regarding Canonical Tag Best Practice for Pagination? - Google Ignoring rel= Next/Prev Tagging
Hi there,
We have a good-sized eCommerce client that is gearing up for a relaunch. At this point, the staging site follows the previous best practice for pagination (self-referencing canonical tags on each page; rel=next & prev tags referencing the last and next page within the category).
Knowing that Google does not support rel=next/prev tags, does that change your thinking on how to set up canonical tags within a paginated product category? Some of our categories have 500-600 products, so creating and canonicalizing to a 'view all' page is not ideal for us. That leaves us with the following options (I feel it is worth noting that we are leaving the rel=next/prev tags in place):
- Leave canonical tags as-is, page 2 of the product category will have a canonical tag referencing ?page=2 URL
- Reference Page 1 of product category on all pages within the category series, page 2 of product category would have canonical tag referencing page 1 (/category/) - this is admittedly what I am leaning toward.
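To make the two options concrete, here is a sketch of the canonical tag as it would appear on page 2 of the category in each case (example.com and /category/ are placeholder values):

```html
<!-- Option 1: self-referencing canonical on each paginated page -->
<!-- On https://example.com/category/?page=2 -->
<link rel="canonical" href="https://example.com/category/?page=2" />

<!-- Option 2: every page in the series canonicalizes to page 1 -->
<!-- On https://example.com/category/?page=2 -->
<link rel="canonical" href="https://example.com/category/" />
```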
Any and all thoughts are appreciated! If this were an existing website that is not experiencing indexing issues, I wouldn't worry about this. But given that we are launching a new site, now is the time to make such a change.
Thank you!
Joe
-
RE: Which Version Url to Use for Canonical Tags and in General for Homepage.
Hi Ruchy,
Honestly, it shouldn't make a difference; Google treats homepage URLs with and without the trailing slash as equivalent. For my clients, I typically include the slash (" / ") at the end of the homepage canonical tag, but again, this shouldn't matter.
I actually commented on a similar post that discusses how search engines view these two different variations of the homepage, you can see that here.
Google looks at root domain URLs with and without trailing slashes as equivalent, see the resources below.
-
"Should we always add the final / or avoid it? Does it make a difference?" Answer: "There's no difference between them. (As opposed to not putting a slash on links into a directory, for example.)" (source)
-
"Rest assured that for your root URL specifically, http://example.com is equivalent to http://example.com/ and can’t be redirected even if you’re Chuck Norris." (Source - Google Webmasters Blog)
-
RE: Faceted Navigation URLs Best Practices
Hi there,
Generally, faceted URLs are not a best practice for SEO. I suggest you review Google's best and worst practice guidelines for faceted navigation, you can find that here. According to Google, this is often not search-friendly "since it creates many combinations of URLs with duplicative content. With duplicative URLs, search engines may not crawl new or updated unique content as quickly, and/or they may not index a page accurately because indexing signals are diluted between the duplicate versions."
However, if your canonical tags are in place to handle the different parameter/filter URLs being created (which will resolve the duplicate content issues), you should be fine. Hope this helps!
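As a sketch of that setup, a canonical tag on a faceted/filtered URL pointing back to the clean category URL would look like this (the URLs and parameters are hypothetical examples):

```html
<!-- On the faceted URL https://example.com/shoes/?color=red&sort=price -->
<!-- Canonicalize to the clean category page to consolidate indexing signals -->
<link rel="canonical" href="https://example.com/shoes/" />
```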
-
RE: Has your site ever been targeted by malicious link building executed by a competitor?
Hi Justin,
I wouldn't be very concerned about a manual action, to be honest. If these links do come under manual review, I am confident that the Webspam team will be able to identify them as a spam tactic used against your website. I believe the manual action my old client experienced was more of an uncommon occurrence.
Normally, you would respond to the Webspam team via a reconsideration request after they have applied a manual action penalty (you will receive a message through Google Search Console). In terms of proactive options, you can report these links as webspam to the Webspam team through Google Search Console here. Unfortunately, doing so will not prevent new links like these from being created.
Hope this helps!
-
RE: Is there a sweet spot for compressing images for load speed?
I cannot speak to a 'sweet spot' specifically, but I would recommend Google's PageSpeed Insights tool for this; you can export optimized/compressed images (as well as CSS and JavaScript files) for the page you are analyzing.
I've used this method for many clients and have seen improvements in page load time and have never noticed a difference in image quality.
Hope this helps, let me know if you have any follow up questions!
-
RE: Internal linking to own domain root
Hi James,
Are you seeing signs that this is slowing the impact of your link building? My thought is that you do not need to change anything here, I would leave it as is. Google looks at root domain URLs with and without trailing slashes as equivalent, see the resources below.
-
"Should we always add the final / or avoid it? Does it make a difference?" Answer: "There's no difference between them. (As opposed to not putting a slash on links into a directory, for example.)" (source)
-
"Rest assured that for your root URL specifically, http://example.com is equivalent to http://example.com/ and can’t be redirected even if you’re Chuck Norris." (Source - Google Webmasters Blog)
Hope this helps!
-
RE: Should I worry that thousands of spam sites are linking to me?
Hi Steven,
This does sound like it could be a negative SEO campaign, are you seeing spammy anchor text with these links?
Google is effective at eliminating any negative impact caused by such campaigns, but it is not always perfect. Are you seeing evidence that this is actually harming your SEO? If you believe it is, Google suggests you report it in the webmaster forums or by contacting John Mueller (https://plus.google.com/+JohnMueller/posts), and they will look into the situation. Check out this Moz blog post, 'preparing for negative SEO', for more information and tips on how to address the situation.
As a precaution, I recommend submitting these spammy links to Google's disavow tool.
-
RE: How can I find all broken links pointing to my site?
Hi Steven,
I assume many of these backlinks are broken because pages were removed from your site without being properly redirected. If that is the case, Open Site Explorer's Link Opportunities (Link Reclamation) tool should be a big help. It will show all 404 URLs with inbound links that you can recapture by 301 redirecting. Additionally, you can look up the backlinks to each of these 404 pages and reach out to each webmaster requesting that they update the URL of their link.
I've also had success exporting Top Pages reports (Moz or Majestic are my preferred tools for this), running any URL with a backlink to it through Screaming Frog and pulling 404 pages/broken links (or even 302 redirects) that way. I usually find additional opportunities that do not show up in the Link Reclamation report.
Hope this helps!
-
RE: Manual action due to hack
Hello,
Did you review the Security Issues Report in Google Search Console? If you have a security issue/have been hacked, this is where you will submit a review once the issue has been cleaned up. This Google Webmasters post on hacked sites/requesting a review should help.
Malware or Spam
- Open the Security Issues report in Search Console. The report will probably still show the warnings and sample infected URLs you saw before.
- If you believe that the sample URLs listed are all clean, select Request a review. In order to submit a review, we ask that you provide more information showing that the site has been cleaned of the hacker's damage. For example, for each category within Security Issues, you can write a sentence explaining how the site was cleaned (for example, "For Content injection hacked URLs, I removed the spammy content and corrected the vulnerability: updating an out-of-date plugin.").