Thanks EGOL!
Posts made by BlueprintMarketing
-
RE: Can building quality links on internal pages help us to improve DA?
Sorry about misunderstanding what you said. Yes: by building links to your entire site, you will increase your Domain Authority. It is not wise to intentionally build all the links to one page, even if it is your homepage, and that would not help your Domain Authority as much as building them across your site.
Sorry for the misunderstanding,
sincerely
Tom
-
RE: Can building quality links on internal pages help us to improve DA?
I think everyone is giving you a good idea about Domain Authority, and I am going to try to answer your question as best I can. What you're asking is whether building internal links is going to help drive traffic to your site or increase Page Authority/Domain Authority, which are different things (just call them PageRank for right now).
Majestic and Link Research Tools have some amazing information out there, but I don't want to bog you down; I want to help you strengthen your internal site structure.
So does Bill Slawski https://www.seobythesea.com/2020/04/pagerank-2020/
- https://www.hobo-web.co.uk/optimize-website-navigation/
- https://ahrefs.com/blog/internal-links-for-seo/
- https://www.oncrawl.com/oncrawl-seo-thoughts/top-20-seo-link-metrics-that-influence-a-links-quality/
- https://www.oncrawl.com/oncrawl-seo-thoughts/how-to-optimize-your-internal-linking/
- https://www.oncrawl.com/oncrawl-seo-thoughts/alternatives-to-google-pagerank-2/
- http://help.oncrawl.com/en/articles/404228-inrank
I hope this is of help to you.
Tom
-
RE: Is CloudFlare bad for SEO?
It should not negatively impact your rankings in any way, shape, or form. If you are seeing a speed increase from it, it will most likely have a positive effect on your rankings. Google no longer uses IPs, so it's fine to have a reverse proxy like Cloudflare in front of your site. If you like it and think it will help your site, by all means try it out.
It would only have a negative impact on your rankings if you did something that's almost impossible: blocking Googlebot with your firewall, which is not going to happen unless you really start getting technical with the WAF. In other words, don't worry about it.
sincerely,
Tom
-
RE: How to use rel=alternate and hreflang=es to help with International SEO?
If you have any doubts, I agree with what Martijn has said.
Also, you can validate that the tags are in the header using the tools below.
- For more up-to-date references (if needed):
- https://moz.com/learn/seo/hreflang-tag
- https://ahrefs.com/blog/hreflang-tags/
- tools https://www.aleydasolis.com/english/international-seo-tools/hreflang-tags-generator/
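As a minimal illustration (hypothetical URLs, assuming an English and a Spanish version of the same page), the reciprocal hreflang annotations in the `<head>` look like this:

```html
<!-- Place on BOTH the English and Spanish pages; hreflang must be reciprocal -->
<link rel="alternate" hreflang="en" href="https://example.com/page/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
<!-- Optional: default for users whose language you don't target -->
<link rel="alternate" hreflang="x-default" href="https://example.com/page/" />
```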
Hope this has been of some help,
Tom
-
RE: Woocommerce add-to-cart causing increase in temporary redirect
From an SEO perspective, using 302 redirects for WooCommerce add-to-cart makes complete sense; I would keep it that way.
It sounds like you have an add-to-cart button on every page and are seeing this more often than just once?
Can you share your domain with me, or a screenshot of a report showing the temporary redirects? It is fairly simple to figure out where they are coming from and then simply change them if needed, or confirm that they are appropriate. I would not bother with the nofollow if it's a 302; without seeing it, it's very hard to tell if it would make that much of a difference. Remember, a 302 is a noindex, but it could become a 301 in Google's eyes in the future.
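If you want to confirm what those add-to-cart links actually return, a quick header check works (hypothetical URL; WooCommerce's add-to-cart query parameter shown for illustration):

```shell
# -s silences progress, -I fetches headers only.
# Look for the status line (e.g. "HTTP/1.1 302 Found") and the Location header.
curl -sI "https://example.com/?add-to-cart=123" | head -n 5
```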
Could you run your site through Screaming Frog? Or would you be willing to share your domain?
If I could see what you are talking about, I would be able to give you much more valuable feedback.
Respectfully,
Tom
-
RE: Redirect Management on Headless Wordpress w/ React Front End
PS may I see your GitHub?
-
RE: Redirect Management on Headless Wordpress w/ React Front End
I love to see people using headless WordPress. Below I've given a few ways to do this; a reverse proxy is a win-win. However, if you want to do it with "react-router-dom", see the first two URLs listed below for the WordPress settings; you can't miss them.
For headless, I have always used Pantheon.io or Pagely.
One very simple method, which I personally believe saves you a lot of time, is to use a reverse proxy to do any redirects.
My personal recommendation is Fastly.com. If you want something a little more generic, and not as fast or as developer-friendly, you can use page rules (redirects, if you want) very inexpensively on Cloudflare.com (around five dollars a month for 30). For tools like this, which are very good no matter what code you are running: https://www.easyredir.com/blog/how-to-redirect-a-url/
**I would still use Fastly (they no longer charge for SSL certs)**
I have done this with Node.js; I just have not done it with React. You're going to have to read through these two URLs and decide which end you want to do the redirections from. You can redirect from PHP, where you have to change your settings.php and your functions.php files, or, for setup number two, install react-router to create routing for React.
```shell
npm install react-router-dom
```

The two URLs are listed below:
- https://www.efficiencyofmovement.com/posts/redirect-headless-wordpress/
- https://hybrit.org/blog/combine-headless-wordpress-with-a-react-spa-part-1
You're going to have to make WordPress changes to settings.php as well as functions.php.
https://github.com/joshsmith01/headless-wp
- https://medium.com/moij/going-headless-with-wordpress-graphql-and-react-b939263a6f3d
- https://discourse.roots.io/t/detach-front-end-and-use-wp-headless/11419/2
- https://github.com/postlight/headless-wp-starter/issues/131
Install react-router to create routing for React.

```shell
npm install react-router-dom
```

React Router allows us to create individual routes for our pages and adjust our URL. In order to dynamically create the routing, we can add the following code. Don't forget to import react-router-dom. Let's change the code inside the render function to:
```jsx
// App.js
// Import Router at the top of the page
import {
  BrowserRouter as Router,
  Link,
  Route
} from "react-router-dom";

render() {
  const { pages } = this.state;

  return (
    <Router>
      <React.Fragment>
        {/* Links */}
        {pages.map((page, index) => {
          return (
            <Link to={`/${page.slug}`} key={index}>
              {page.slug}
            </Link>
          );
        })}

        {/* Routing */}
        {pages.map((page, index) => {
          return (
            <Route
              exact
              key={index}
              path={`/${page.slug}`}
              render={props => (
                <ExamplePage {...props} page={page} />
              )}
            />
          );
        })}
      </React.Fragment>
    </Router>
  );
}
```
Create a folder inside the src folder called components and add ExamplePage.js to it. Inside ExamplePage.js we will add the following code:

```jsx
// ExamplePage.js
import React from 'react';

export default function ExamplePage(props) {
  const { page } = props;
  return (
    <h1>{page.title.rendered}</h1>
  );
}
```
I hope this has been of some help. I like that you are doing headless; Pantheon is a great resource for this as well. Let me know if you run into any obstacles or if I can be of more help.
All best,
Tom
-
RE: What can be done to regain backlinks dropped (20%) in May 4th Core update?
"have seen two large drops in backlinks (10% drop twice) after the May 4th Google Core update"
It is possible to lose backlink equity because of an update, but the physical backlinks would not be touched by one. Meaning, Google might find some links it considers worth nothing and demote them, but they would still physically point to your website.
You are going to want to run a crawl of your website, making sure you have no broken links, 404s, etc.
You can use Moz, Screaming Frog, DeepCrawl, OnCrawl, Ahrefs, etc.
- https://moz.com/blog/googles-may-2020-core-update-winners
- https://www.mariehaynes.com/may-2020-core-google-update/
The best way to identify the missing backlinks is to use a tool like Moz Link Explorer.
However, I think it's very unlikely that you physically lost backlinks to your website because of the May 4th Google core update.
- https://analytics.moz.com/pro/link-explorer/home
- https://majestic.com/
- https://ahrefs.com/
- http://www.linkresearchtools.com/
"get them back"
You'll need to line up the lost backlinks with the URLs they pointed to, and either keep the old URL live or create a 301 redirect to an extremely similar page. When I say extremely similar, I mean an almost identical page.
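For the 301s themselves, a single-URL redirect in Apache's .htaccess looks like this (hypothetical paths; other servers have equivalents):

```apache
# Send the URL that lost its backlinks to its near-identical replacement
Redirect 301 /old-guide/ https://example.com/new-guide/
```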
What I would also consider doing is outreach. If you lost a lot of links to your site, the tools listed above will show you exactly what was lost, and it sounds like it's a technical error.
If it is something more than that, use pitchbox.com; they are by far the best method of reaching out to people who have moved or removed your links.
If you want to share the domain, I'd be happy to take a look for you.
I'm guessing this is just a coincidence regarding the update unless you have other information?
respectfully,
Tom
-
RE: Huge organic traffic drom after a perfect domain migration. What to do?
I apologize: I just spent approximately 60 minutes writing this out and looking at your issues, but my laptop was unplugged, the battery died, and I'm back to square one. What I will tell you is that if you want me to help with this, you need to give me access to Search Console and allow crawlers on your site.
Your current robots.txt file does not list both your new and old sitemaps; I found your new sitemap, but I cannot find your old one. Check the file for a bad byte order mark (BOM).
- https://www.deepcrawl.com/blog/best-practice/common-robots-txt-mistakes/
- https://www.distilled.net/resources/free-web-based-robotstxt-parser/
- https://opensource.googleblog.com/2019/07/googles-robotstxt-parser-is-now-open.html
```
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
## the top sitemap should be the old vihara XML
Sitemap: https://meditatieinstituut.nl/vihara-sitemap_index.xml
Sitemap: https://meditatieinstituut.nl/sitemap_index.xml
```

In addition, you are currently not using Cloudflare; you are using Apache, which shows me the site is clearly not going through Cloudflare, a reverse proxy that modifies the headers. They would have been rewritten like this: https://support.cloudflare.com/hc/en-us/articles/200170986-How-does-Cloudflare-handle-HTTP-Request-headers-
```
HTTP/1.1 200 OK
Date: Thu, 21 May 2020 12:25:15 GMT
Server: Apache/2
Vary: Accept-Encoding,Cookie,User-Agent
Upgrade: h2,h2c
Connection: Upgrade
Last-Modified: Thu, 21 May 2020 11:33:56 GMT
Accept-Ranges: none
Cache-Control: max-age=521, public, public
Expires: Thu, 21 May 2020 12:33:56 GMT
Referrer-Policy:
X-Pingback: https://meditatieinstituut.nl/xmlrpc.php
X-Powered-By: W3 Total Cache/0.13.3
Pragma: public
Content-Length: 137745
Content-Type: text/html; charset=UTF-8
```
- http://bseo.dev/WXRAtI
- http://bseo.dev/cM1gCX
- http://bseo.dev/iOr2qC
- http://bseo.dev/8XLDtp
- drop this into Google "site:https://meditatieinstituut.nl" 440 indexed URLs
- drop this into Google "site:https://vihara.nl" 406 indexed URLs
In addition, if you would like to set up the old domain to redirect to the new domain, and none of the URLs have changed, just follow these simple instructions; it will work like a one-to-one match. I would also like to see your .htaccess code.
https://medium.com/@natterstefan/redirecting-one-domain-to-another-with-cloudflare-66bd1c88bfae
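If you handle it at the web server instead of Cloudflare, a common Apache .htaccess sketch for a one-to-one domain move (using the two domains discussed above) is:

```apache
# Redirect every path on the old domain to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?vihara\.nl$ [NC]
RewriteRule ^(.*)$ https://meditatieinstituut.nl/$1 [R=301,L]
```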
I would like to know if you are working with a good hosting company, as well as what you used to do the search and replace of your URLs.
Check your URLs using WP-CLI or Better Search Replace; better yet, do a complete check on both sites. The migration could have happened perfectly using WP Migrate DB Pro, because it now migrates themes, images, and files. It is an expensive tool, but the best in the industry at what it does.
https://deliciousbrains.com/wp-migrate-db-pro/
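As a sketch of that check with WP-CLI (run from the WordPress root; the two domains from this thread shown for illustration):

```shell
# Preview what would change, without writing anything
wp search-replace 'https://vihara.nl' 'https://meditatieinstituut.nl' --dry-run
# Then run it for real; --all-tables includes tables outside the default prefix
wp search-replace 'https://vihara.nl' 'https://meditatieinstituut.nl' --all-tables
```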
If you would be kind enough to send me screenshots of your website's configuration, or if you just want to speak with me, click on my photo and give me a ring; I am more than happy to tell you a lot more than I have time to write down for a second time here. I'm sorry that this is a very compact version of what I want to talk to you about, but it took me a long time to write everything out the first time before the power failed.
Please look at the URLs above: two of them are PDFs and two of them are CSVs. I would like to import your site into Pantheon, Servebolt, or Pagely, or, if you're looking to spend under $30 a month, https://pressidium.com/
You should also get your backlinks updated with an outreach tool called https://pitchbox.com; it is superb for link outreach and would help with your site migration.
sincerely,
Thomas
-
RE: 4xx errors
Okay, I'm a Shopify Plus partner, so I wish I could have been more help, but according to the Wayback Machine (archive.org) I could only find that it has been a 404 since almost a year ago.
https://web.archive.org/web/20180816071312/https://cracklefireplaces.com/
The page that you're referencing has no backlinks.
I would strongly consider looking at the Liquid code inside of your theme, as well as simply rebuilding the page. Also, when you're doing Shopify search engine optimization, you don't want a duplicate of your collection content inside your product content.
I don't think it should be too hard to replace the missing page and use the same URL. You may also redirect that URL to your new product page. To set up the redirects, simply go to Navigation, look in the upper left-hand corner, and you will see URL Redirects.
I see you have a collection and a product in the same URL; normally that is not a good thing, and it does mean there is a duplicate. Unfortunately, checking, I could not find a standalone product page for that product.
One thing I would strongly recommend, no matter what version of Shopify you are running, is using a backup tool like Rewind.
There are other backup tools; this is just the one I prefer. If you type "backups" into the Shopify app store, you will see what I mean.
https://apps.shopify.com/backup
Having a backup can be a real lifesaver; it should be built in, but unfortunately it's not.
Lastly, this is a great article on Shopify SEO from my good friends at Go Fish; I would recommend reading it, as it goes deeper than I have into the issues with having collections and products inside the same URL.
https://moz.com/blog/shopify-seo
Let me know if I can be of any help,
Tom
-
RE: My articles aren't ranking for keywords
Give me a little bit of time to inspect everything. It looks like you did a good job with the content, but all of it is orphaned, as far as I've seen, meaning there is no internal link pointing to it. This makes it difficult for Google to find. Also, the main website it's on does not have a lot of backlinks; it has very few. This can affect your ability to be indexed quickly and to rank.
Domain metrics: LRT Power 11, LRT Trust 1, Backlinks 84, Referring Domains 13.
Give me a little bit of time and I will give you a better answer.
Sincerely,
Tom
-
RE: Client suffered a malware attack. Removed links not being crawled by Google!
I'm sorry to hear that that happened to you.
I'm assuming that you do not have a backup?
If you do have a backup, you could obviously return your site to normal, but let's base this question on the idea that you do not have a backup.
What I would do is make sure that the site is cleaned up professionally. If you do it through Sucuri, I can tell you from first-hand experience that they do an excellent job and will back the work for one year, and it sounds like you have a pretty nasty bit of malware on your site. Go to the link below and you can have the site scanned; it will tell you a little more about what has infected the site, and then you can purchase one of their plans to have your site cleaned.
https://sitecheck.sucuri.net/ is a free check of the site; I am sure you'll see that you had some sort of malware injection.
This is where you go to purchase the plan that best fits your budget and your needs. All of the plans will remove the hack from your site and clear any blacklists:
https://sucuri.net/website-security-platform/signup/
This is 100% platform-independent and will give you a very powerful firewall for free with the service for the next year as well. This will prevent your site from being attacked in the same manner it was before.
I wish I had a free solution to offer you, but this is the closest thing, and it's definitely cheaper than hiring a developer to try to figure out whether it's a really bad attack or just something minor. You want to make sure all those back doors are closed.
I hope this is of help to you,
Tom
-
RE: Do keywords within a dropdown menu add any SEO value?
I would have to see the website in order to tell you whether or not I think having the drop-down would be beneficial. Also, whether your menu is built with JavaScript or HTML, in my opinion, still matters.
Sometimes having a drop-down menu can be beneficial if it's siloed correctly. Would you be opposed to posting the site?
Like Martijn said, it depends a lot on the site's on-page structure.
Hope I have been somewhat helpful,
Tom
-
RE: Can Schema handle two sets of business hours?
Hello,
I wish you and your client safe days ahead during COVID.
https://developers.google.com/search/docs/data-types/special-announcements
https://www.schemaapp.com/how-to/your-guide-to-covid-19-structured-data/
https://schema.org/CovidTestingFacility
https://support.google.com/business/answer/3039617?co=GENIE.Platform%3DiOS&hl=en
For structured data, I would use it like pharmacies do, but name it what you like.
This tool will help you make your own:
https://technicalseo.com/tools/schema-markup-generator/
Here are some great examples
<title>Dave's Department Store</title>
<script type="application/ld+json">{
"@context":"https://schema.org",
"@type":"Store",
"image":[
"https://example.com/photos/1x1/photo.jpg",
"https://example.com/photos/4x3/photo.jpg",
"https://example.com/photos/16x9/photo.jpg"
],
"@id":"http://davesdeptstore.example.com",
"name":"Dave's Department Store",
"address":{
"@type":"PostalAddress",
"streetAddress":"1600 Saratoga Ave",
"addressLocality":"San Jose",
"addressRegion":"CA",
"postalCode":"95129",
"addressCountry":"US"
},
"geo":{
"@type":"GeoCoordinates",
"latitude":37.293058,
"longitude":-121.988331
},
"url":"http://www.example.com/store-locator/sl/San-Jose-Westgate-Store/1427",
"priceRange":"$$",
"telephone":"+14088717984",
"openingHoursSpecification":[
{
"@type":"OpeningHoursSpecification",
"dayOfWeek":[
"Monday",
"Tuesday",
"Wednesday",
"Thursday",
"Friday",
"Saturday"
],
"opens":"08:00",
"closes":"23:59"
},
{
"@type":"OpeningHoursSpecification",
"dayOfWeek":"Sunday",
"opens":"08:00",
"closes":"23:00"
}
],
"department":[
{
"@type":"Pharmacy",
"image":[
"https://example.com/photos/1x1/photo.jpg",
"https://example.com/photos/4x3/photo.jpg",
"https://example.com/photos/16x9/photo.jpg"
],
"name":"Dave's Pharmacy",
"telephone":"+14088719385",
"openingHoursSpecification":[
{
"@type":"OpeningHoursSpecification",
"dayOfWeek":[
"Monday",
"Tuesday",
"Wednesday",
"Thursday",
"Friday"
],
"opens":"09:00",
"closes":"19:00"
},
{
"@type":"OpeningHoursSpecification",
"dayOfWeek":"Saturday",
"opens":"09:00",
"closes":"17:00"
},
{
"@type":"OpeningHoursSpecification",
"dayOfWeek":"Sunday",
"opens":"11:00",
"closes":"17:00"
}
]
}
]
}</script>
Remember to submit the changes to Google here: https://search.google.com/search-console/special-announcement
I didn't finish the last one, but the tool will help make it work.
Sincerely,
Tom
-
RE: Any idea how to build back links for YouTube Channel?
I am very happy I could be of help!
Sincerely,
Tom
-
RE: Any idea how to build back links for YouTube Channel?
Happy to be of help. One of the best ways to get links to a video is to use something like Wistia, or even YouTube, on the page, and link to the actual page containing the video. That is where you will see big results.
With Wistia now adding JSON-LD to the website page, it's perfect for anything on the site. You can also use YouTube, which is great off-site.
If you want the best outreach tool, check out https://pitchbox.com; it will make everything so much faster.
All the best,
Tom
-
RE: Changing Links to Spans with Robots.txt Blocked Redirects using Linkify/jQuery
Regarding 301 or 302 redirects: if you're going to do one, you should do a 302, but it's not going to help you much unless you're going to send the URL to a 410. When something is nofollowed, it does the same thing as what robots.txt will do.
https://support.google.com/webmasters/forum/AAAA2Jdx3sUEbHp0yjgT6c?hl=sv
Do you have a report from Google showing that you have been penalized?
Is there any way you could run Screaming Frog and share some of these URLs that you're talking about?
respectfully,
Tom
-
RE: Website Redesign & Ensuring Minimal Traffic/Rankings Lost
Having performed maybe upwards of 80 migrations without any real traffic loss lasting more than a week, I can say it is because I follow the rules very thoroughly. When you get to the bottom of this, please use one of the crawlers mentioned.
Use a complete search and replace when necessary across the entire site, just to make sure everything is in place.
I don’t know what type of website you’re running however if it is WordPress or if you I want toget some extra traffic I would make sure that the blog is a sub folder. if it is WordPress you can do this on a managed managed host platform like Pagely , ServeBolt or Kinsta for just $50 a month.
Redirect mapping process
If you are lucky enough to work on a migration that doesn’t involve URL changes, you could skip this section. Otherwise, read on to find out why any legacy pages that won’t be available on the same URL after the migration should be redirected.
The redirect mapping file is a spreadsheet that includes the following two columns:
- Legacy site URL –> a page’s URL on the old site.
- New site URL –> a page’s URL on the new site.
When mapping (redirecting) a page from the old to the new site, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn’t exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that redirecting pages “en masse” to irrelevant pages will be treated as soft 404s and because of this won’t be passing any SEO value. If you can’t find an equivalent page on the new site, try mapping it to its parent category page.
Once the mapping is complete, the file will need to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part in the site migration cycle where things can often go wrong.
Increasing efficiencies during the redirect mapping process
Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. The URL mapping on small sites could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible and automation needs to be introduced. Relying on certain common attributes between the legacy and new site can be a massive time-saver. Such attributes may include the page titles, H1 headings, or other unique page identifiers such as product codes, SKUs etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.
Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping.
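The automated mapping described above can be sketched in a few lines. This is a minimal illustration (all titles and URLs are hypothetical) that pairs legacy and new pages by their shared page title, the kind of unique attribute the guide recommends:

```javascript
// Hypothetical crawl exports: one entry per page, title + URL.
// The page title is the shared attribute used to pair legacy and new pages.
const legacyPages = [
  { title: "Blue Widgets", url: "https://example.com/products/blue-widgets" },
  { title: "About Us",     url: "https://example.com/about.html" },
  { title: "Old Promo",    url: "https://example.com/promo-2019" },
];

const newPages = [
  { title: "Blue Widgets", url: "https://example.com/shop/blue-widgets" },
  { title: "About Us",     url: "https://example.com/about" },
];

function buildRedirectMap(legacy, fresh) {
  // Index new pages by title. Duplicate titles would produce wrong mappings,
  // which is why the shared attribute must be unique across pages.
  const byTitle = new Map(fresh.map(p => [p.title, p.url]));
  const mapped = [];
  const unmatched = [];
  for (const page of legacy) {
    const target = byTitle.get(page.title);
    if (target) {
      mapped.push({ from: page.url, to: target });
    } else {
      // No equivalent page: review manually and map to the parent category,
      // never blindly to the homepage (Google treats that as a soft 404).
      unmatched.push(page.url);
    }
  }
  return { mapped, unmatched };
}

const { mapped, unmatched } = buildRedirectMap(legacyPages, newPages);
console.log(mapped);    // the one-to-one redirects found automatically
console.log(unmatched); // legacy URLs that still need manual mapping
```

The output columns correspond directly to the two-column spreadsheet described above: `from` is the legacy URL, `to` is the new URL.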
https://moz.com/blog/website-migration-guide
Appendix: Useful tools
Crawlers
- Screaming Frog: The SEO Swiss army knife, ideal for crawling small- and medium-sized websites.
- Sitebulb: Very intuitive crawler application with a neat user interface, nicely organized reports, and many useful data visualizations.
- Deep Crawl: Cloud-based crawler with the ability to crawl staging sites and make crawl comparisons. Allows for comparisons between different crawls and copes well with large websites.
- Botify: Another powerful cloud-based crawler supported by exceptional server log file analysis capabilities that can be very insightful in terms of understanding how search engines crawl the site.
- On-Crawl: Crawler and server log analyzer for enterprise SEO audits with many handy features to identify crawl budget, content quality, and performance issues.
Handy Chrome add-ons
- Web developer: A collection of developer tools including easy ways to enable/disable JavaScript, CSS, images, etc.
- User agent switcher: Switch between different user agents including Googlebot, mobile, and other agents.
- Ayima Redirect Path: A great header and redirect checker.
- SEO Meta in 1 click: An on-page meta attributes, headers, and links inspector.
- Scraper: An easy way to scrape website data into a spreadsheet.
Site monitoring tools
- Uptime Robot: Free website uptime monitoring.
- Robotto: Free robots.txt monitoring tool.
- Pingdom tools: Monitors site uptime and page speed from real users (RUM service)
- SEO Radar: Monitors all critical SEO elements and fires alerts when these change.
- UltraDNS tools: tools for checking DNS changes.
Site performance tools
- New Relic: by far the most comprehensive site performance and measurement tool listed; however, the price is very steep. It's my favorite tool, but that doesn't mean it's required.
- PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks to see if a page has applied common performance best practices and provides a score, which ranges from 0 to 100 points.
- Lighthouse: Handy Chrome extension for performance, accessibility, Progressive Web Apps audits. Can also be run from the command line, or as a Node module.
- Webpagetest.org: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.
- DareBoost: very helpful and accurate as well; it finds everything you need to know.
Structured data testing tools
- Google’s structured data testing tool & Google’s structured data testing tool Chrome extension
- Bing’s markup validator
- Yandex structured data testing tool
- Google’s rich results testing tool
Mobile testing tools
Backlink data sources
I hope this helps,
Tom