Posts made by GlobeRunner
-
RE: Http://newsite.intercallsystems.com/vista-series/sales@intercallsystems.com
I'm not sure where you're seeing these crawl errors. Most likely, though, they are links on your website where you list your contact information (your email address). So, the pages that list sales@intercallsystems.com might have the link to your email coded incorrectly.
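One common cause is an email address written as a plain link without the mailto: scheme, which crawlers then resolve as a relative URL under the current page's path (the markup below is illustrative):

```html
<!-- Broken: a crawler resolves this as a relative URL, producing
     /vista-series/sales@intercallsystems.com -->
<a href="sales@intercallsystems.com">Email Sales</a>

<!-- Correct: the mailto: scheme marks it as an email link -->
<a href="mailto:sales@intercallsystems.com">Email Sales</a>
```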
-
RE: 404 crawl errors ending with your domain name??
In order to understand your question, can you give me more information? What crawler are you using? What do the URLs look like? You can give an example, just remove the domain name if you like.
-
RE: What Google Analytics Data to Share with Potential Website Buyer
First, I would get the competitor to sign an NDA so that there aren't any issues later. It really doesn't matter how much you're going to sell the website and domain name for; you want to protect yourself in the future.
Next, I would point them to SEMrush.com for data, as that typically has as much as someone would need, and it's a third party offering the data. As for Google Analytics, I wouldn't give them access at first; I would ask them what data they are looking for. You can typically give them a PDF that shows the past year of page views, using the "monthly" view. I would also share the referral sources with them, so that they understand the site isn't gaming the page view numbers.
-
RE: How do I set up 2 businesses that work together but are run separately with two separate websites but similar content?
It is definitely possible to maintain two separate websites. It sounds as if they don't do the same thing, so technically the content will be different. The websites use different templates, and from what I can tell they don't have any duplicate content issues.
Since the sites are related, it would be natural for the two websites to link to each other. But you don't want to run into any search engine penalties from having them link to each other. To avoid that, you'll need to make sure that the link profiles of the two websites are completely different.
What I would focus on is the links to each website, and have a plan to acquire good, trusted links to each of them. One is product related, so you'll want to focus on where you can get your products listed. The other is a service type business, so getting the site links that are appropriate for that site would be helpful.
-
RE: Whitehat site suffering from drastic & negative Keyword/Phrase Shifts out of the blue!
I know this is frustrating. There are a few areas that I would look into that could be causing this: duplicate content issues and links. First, look to see if you have any duplicate content issues on the site. There could be a duplicate copy of the site (perhaps a dev version that should not be indexed) or even certain content on your site that's causing issues. You might try Siteliner's crawler to identify if there are any issues you can fix.
Another possible reason is the links to the site. The site could have been hit by negative SEO, and a lot of "low quality" links or off-topic links could be pointing to your site. I've seen this in the past, and the only thing you can do is identify the links and disavow them. Sometimes you can get them removed, but disavowing them should work.
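If you do end up disavowing, Google's disavow file is a plain text list of URLs and domain: entries uploaded through Search Console. A minimal example (the domains here are made up for illustration):

```
# Spammy directory links found during the link audit
domain:spammy-directory.example
# A single off-topic page, rather than the whole domain
http://unrelated-blog.example/low-quality-post/
```

Lines starting with # are comments; a domain: entry disavows every link from that domain, while a bare URL disavows just that page.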
-
RE: HTTPS Campaign Settings
You definitely need to verify the https version of the site in Google Search Console. I would also make sure you verify all four versions: https://, https://www, http://, and http://www, which are all treated as different sites. If you have previously submitted a disavow file, you will need to upload a new disavow file for the new https version as well.
As for Moz, I would go ahead and set up a new campaign, as it's a different URL.
-
RE: Is pagerank still a ranking factor for Google?
Google may still be using something related to PageRank or may still be using PageRank internally. They may use it for determining which pages to crawl first--and which ones to crawl less often.
However, publicly there is no PageRank data available, and it's not going to be updated in the future. So, we can consider it dead at this time.
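For background, PageRank is defined by a simple iterative idea: a page's score is a damped sum of the scores of the pages linking to it, each divided by that page's outlink count. A minimal power-iteration sketch on a toy three-page graph (the graph itself is invented for illustration):

```python
# Toy PageRank via power iteration. The graph maps each page to the
# pages it links to; every page here has at least one outlink.
def pagerank(graph, damping=0.85, iterations=50):
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new_ranks = {page: (1.0 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            share = damping * ranks[page] / len(outlinks)
            for target in outlinks:
                new_ranks[target] += share
        ranks = new_ranks
    return ranks

graph = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home"],
}
print(pagerank(graph))  # "home" ends up with the highest score
```

Because every other page links to "home", it accumulates the most score, which is the intuition behind links being a ranking signal.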
-
RE: GWMT / Search Analytics VS OpenSiteExplorer
Whenever you deal with links, even though I really like OSE, typically we have to compile all of the link data from multiple sources. We typically use OSE, Majestic, ahrefs, Google Search Console, as well as others and compile all of the links into one spreadsheet and then look at them there. Different sites have different crawlers and no one source is the most accurate.
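That compile-and-deduplicate step can be sketched in a few lines. Each tool's export is modeled here as a simple list of linking URLs (real exports are CSVs with more columns, and the URLs below are placeholders):

```python
# Merge link exports from several tools and deduplicate by URL.
def merge_link_sources(*sources):
    seen = set()
    merged = []
    for source in sources:
        for url in source:
            # Light normalization so trivial variants count as one link
            normalized = url.strip().lower().rstrip("/")
            if normalized not in seen:
                seen.add(normalized)
                merged.append(normalized)
    return merged

ose = ["http://example.com/a", "http://example.com/b/"]
majestic = ["http://example.com/b", "http://example.com/c"]
print(merge_link_sources(ose, majestic))
# ['http://example.com/a', 'http://example.com/b', 'http://example.com/c']
```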
-
RE: Anybody have a SMX West 2016 Coupon Code?
I would check the social media sites, some sponsors may have coupon codes, and they typically share those codes on the social sites if you follow and connect with them.
-
RE: CcTLDs vs folders
There definitely is a benefit for keeping all of your content on one domain (using folders), and building up the overall Domain Authority of one domain/one site.
When it comes to deciding whether or not to go with a ccTLD, consider your users/visitors first. How will they interact with the site? Will they trust it more if it's on a ccTLD for their country? If so, consider that it will ultimately be better for your business if users like and trust the site more.
Another consideration is that you'll be creating an entirely new site on the ccTLD. You'll be starting fresh, and you'll need links and time to get it to rank and bring the traffic to where you need it to be. Then there's the issue of content: you'll need unique content for the new site. If you can afford the time and effort involved in creating a completely new site, and it makes sense for your users, then I would consider the ccTLD route.
-
RE: Paid Link/Doorway Disavow - disavowing the links between 2 sites in the same company.
I'm not sure if I totally understand your question. Can you explain what type of doorway/paid link activity you're referring to?
You need to evaluate each link, one by one. If the link is a good, natural link, then I wouldn't disavow it. If that link is a sponsored link or paid link, then you might consider adding a nofollow link attribute. If it's something that violates Google's Webmaster Guidelines for linking, then you should remove the links, not just disavow them.
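For the sponsored or paid link case, the nofollow attribute goes on the anchor tag itself (the URL here is a placeholder):

```html
<!-- A paid or sponsored link that should not pass PageRank -->
<a href="http://advertiser.example/" rel="nofollow">Advertiser</a>
```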
-
RE: Doing a re-design but worried about my new navigation affecting rankings
Generally speaking, the navigation change should be fine. I would be more worried if you were going to change the page URLs. I would take a closer look at the current internal linking structure and see which pages link to which. Then, consider whether your new navigation would add internal links to those pages or take internal links away from them.
You can crawl your own site and see how many links are pointing to certain pages, in order to see whether your new navigation will increase or decrease the internal links to those pages.
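Counting those internal links can be sketched with Python's standard html.parser over pages you've already fetched (the page HTML below is invented for illustration; a real audit would use a full crawler):

```python
from collections import Counter
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_link_counts(pages):
    """pages: dict of URL path -> HTML. Returns a Counter of how
    many internal links point at each path."""
    counts = Counter()
    for html in pages.values():
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            if href.startswith("/"):  # count internal links only
                counts[href] += 1
    return counts

pages = {
    "/": '<a href="/about/">About</a> <a href="/services/">Services</a>',
    "/about/": '<a href="/services/">Services</a>',
}
print(internal_link_counts(pages))  # /services/ has 2 internal links
```

Running the same count against the proposed navigation shows exactly which pages gain or lose internal links.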
-
RE: Strange 404s in GWT - "Linked From" pages that never existed
It's quite possible that at one point there was a link there and the page rendered for some reason. I would crawl the site yourself using a crawler (there are several available) to make sure the page isn't reachable from, say, a bad link on the site.
Check archive.org to see whether the page existed at one time.
I would also take a look at the page's server header to see whether the site is returning a 404 error or a "200 OK" along with a "page not found" message. It's possible that the page doesn't exist but the server delivers a "200 OK" header anyway. Another possibility is that the URL is in your sitemap.xml file.
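That "200 OK with a page-not-found body" case is usually called a soft 404. A small sketch of the check, operating on a status code and body you've already fetched (the marker phrases are just examples):

```python
# Flag likely soft 404s: the server says 200 OK, but the body
# looks like an error page. Marker phrases are illustrative.
NOT_FOUND_MARKERS = ("page not found", "does not exist", "404")

def is_soft_404(status_code, body):
    if status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    text = body.lower()
    return any(marker in text for marker in NOT_FOUND_MARKERS)

print(is_soft_404(200, "<h1>Page Not Found</h1>"))  # True
print(is_soft_404(404, "<h1>Page Not Found</h1>"))  # False
```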
When in doubt, if the page doesn't exist, I would mark it as fixed in Google Webmaster Tools and watch if it comes up again. If it doesn't come up again as an error, then I wouldn't worry too much about it.
-
RE: How to handle backlinks associated with website analyzers like whois.domaintools.com or alexa.com?
As previously mentioned, it's natural that sites get certain links without doing anything. These types of links are natural. Typically I see many of them have nofollow attributes on their outgoing links, but some do not. I wouldn't worry too much about them. It wouldn't hurt to disavow them, but again most likely they're not going to hurt your site's rankings.
-
RE: Can a homepage have a penalty but not the rest of the pages?
Daniel,
Yes, it is very possible (and that's a lot of the penalties we see lately). These penalties are applied only to certain pages on the site, usually because of over-optimized anchor text in the links pointing to the penalized pages.
So, I would do a thorough review of all the links pointing to the site and make sure that the anchor text isn't over-optimized. There should be more "brand" phrases and compound phrases rather than exact-match keyword anchor text links pointing to the page.
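That review can start with a simple anchor-text distribution over the (anchor text, target) pairs exported from a link tool. A sketch (the anchor data is invented, and any "red flag" threshold is a judgment call, not a published number):

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the total link count."""
    counts = Counter(a.lower() for a in anchors)
    total = len(anchors)
    return {text: count / total for text, count in counts.items()}

anchors = [
    "Acme Widgets",        # brand anchor
    "acme widgets",
    "buy blue widgets",    # exact-match keyword anchor
    "buy blue widgets",
    "buy blue widgets",
    "https://acme.example/",
]
dist = anchor_distribution(anchors)
# "buy blue widgets" makes up half the anchors here -- that kind of
# skew toward one exact-match phrase is what to look for.
print(sorted(dist.items(), key=lambda kv: -kv[1]))
```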
In Google Webmaster Tools, manual actions for link penalties are flagged as either site-wide or partial matches.
-
RE: Disabling a slider with content...is considered cloaking?
ACann, as long as you're not serving the search engines content that's different from what users see, there shouldn't be a problem. If you had disabled it only for users while still allowing search engine bots to crawl the content, that would be an issue. It sounds like you disabled it for all visitors, so I don't see any issues.
-
RE: Is there a way to set up a wordpress site so that the content is changed based on a location?
Ron,
There are WordPress plugins that allow you to serve content based on geolocation. For example, the "Custom Content by Country" plugin comes to mind: http://wordpress.org/plugins/custom-content-by-country/
Alternatively, outside of WordPress, you can use PHP code to determine the user's location and then serve them up separate content. There's a sample of that code here: http://www.adviceinteractivegroup.com/how-to-display-unique-content-based-on-geolocation/
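Either way, the approach boils down to: resolve the visitor's IP address to a country (real implementations use a geo-IP database or service), then pick a content variant. The idea in a minimal Python sketch (the lookup table and content strings are invented):

```python
# Pick a content variant by country code. In practice the country
# code comes from a geo-IP lookup of the visitor's IP address.
CONTENT_BY_COUNTRY = {
    "US": "Free shipping across the United States!",
    "GB": "Free shipping across the United Kingdom!",
}
DEFAULT_CONTENT = "We ship worldwide."

def content_for(country_code):
    # Always fall back to a default so unknown locations still work
    return CONTENT_BY_COUNTRY.get(country_code, DEFAULT_CONTENT)

print(content_for("US"))  # Free shipping across the United States!
print(content_for("FR"))  # We ship worldwide.
```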
-
RE: Wordpress themes causing google penalty(need experts to settle a debate)
The theme generally doesn't matter at all. As long as the content is unique on your site, then there's no problem whatsoever. Like previously mentioned, there are plenty of people using the same WordPress theme, and it's the content that's unique, not the theme.
-
RE: Site Redesign: 302 Query
Hemblem,
Although you're redesigning the site, I don't recommend using a 302 redirect during the redesign process, as it can have disastrous effects on search engine rankings. I would prefer that you keep the current website up and running, and then 301 redirect the appropriate pages when the new site is ready to go live.
I realize what you want to do, but I have seen too many websites have problems getting things straightened out with the search engines after doing what you're suggesting.
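When the new site does go live, the 301s can live in an Apache .htaccess file (the paths and domain below are placeholders):

```
# Permanent redirects from old URLs to their new equivalents
Redirect 301 /old-services/ https://www.example.com/services/
Redirect 301 /old-about/   https://www.example.com/about-us/
```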
-
RE: Brain Teaser - Dead Link Ranking in SERP's
Vanadium Interactive, I'm not sure what you're asking here. Obviously the first priority would be to fix the site. Most likely there's an issue with the .htaccess file, since the non-www version of the site is accessible and the www version is not. It should be an easy fix for any competent programmer familiar with .htaccess files.
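A typical .htaccess fix for a broken www/non-www setup uses mod_rewrite to send every request to one canonical host (example.com is a placeholder):

```
# Redirect non-www requests to the www host (or vice versa)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```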
The next priority would be to get rid of the iframes on the site entirely; there's just no need for them anymore, and they're not very search friendly. Each individual framed page can get indexed separately, so someone might land on the "top frame" page of the site and not be able to navigate the site very well.
Sites like this generally rank well because of the links pointing to them, not because of the actual content on the page. I can see that the page was cached and it was working recently, but just recently ended up with the current problem.