Questions
-
Bad for SEO to have two very similar websites on the same server?
Hi there, The first question, as others have stated, is whether content has been duplicated. If so, that's a problem no matter where the websites are located. However, if the two sites cover the same topic with totally different content, hosting them together should not be a big problem. The only issue would be Google understanding that the two resources are owned by the same entity, and therefore that ranking them together might not give the most diverse user experience for people searching on the topic. Google reads registration information as well, and it also understands that some servers / hosting services host hundreds of thousands of websites, so some topic cross-over is natural without the sites being owned by the same person or company. That said, if you had two sites selling pet food on the same dedicated server for the sole purpose of ranking both for the same pet food queries, I'm not 100% sure Google would ignore their hosting.
White Hat / Black Hat SEO | | JaneCopland0 -
Does showing the date published for an article in the SERPS help or hurt click-through rate?
It can hurt if the article is old, say a year or more. But it also depends on the article: some articles are evergreen, and even an old one is still valuable.
White Hat / Black Hat SEO | | vivekrathore1 -
Does Google Analytics Adjusted Bounce Rate Lead to Increase in Average Time per Visitor?
You are correct: adding code to a page to 'adjust' the bounce rate can affect your 'average time per visitor' statistic. This is because of how Google measures the time spent on a page.

Normally, if a user opens one page and then does not visit any more pages on your site, it counts as a bounce, even if the user remained on that page browsing for 10 minutes. That is because only one call is made to Google Analytics, when the page is opened; no call is made when the page is closed. 'Time on page' is calculated by taking the timestamp of when the current page is opened and comparing it to when the next page on your site is opened. The difference between the two is your time on the previous page.

So what happens when a user opens only one page on your site and leaves (bounces)? The visit is counted as 0 seconds, even if the user was on the site for 10 minutes, which drags down the average visit time across all visits.

When you add the 'adjusted bounce rate' code to your page, a second call is made to the Google server after x seconds, so Google knows the user has in fact remained on the page for an extended period of time. Now a whole batch of these 0-second (bounced) sessions will be converted to longer sessions based on the time between the two timestamps. The more one-page-only visits your site gets, the more this can skew your average session time.

On a side note, this also affects the last page visited in multi-page sessions, as Google normally would not know how much time was spent on the last page of the site either.
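The timestamp arithmetic above can be sketched in a few lines of Python. This is a toy model, not GA's actual implementation; the function name and the 30-second event delay are made up for illustration.

```python
from datetime import datetime, timedelta

def time_on_pages(hit_timestamps):
    """Toy model of how GA derives time-on-page: each hit's duration is
    the gap until the *next* hit. The last hit in a session has no next
    hit, so it always contributes 0 seconds."""
    durations = []
    for i, ts in enumerate(hit_timestamps):
        if i + 1 < len(hit_timestamps):
            durations.append((hit_timestamps[i + 1] - ts).total_seconds())
        else:
            durations.append(0.0)  # no closing hit -> counted as 0s
    return durations

start = datetime(2014, 1, 1, 12, 0, 0)

# Single-page visit: one hit only, so the session registers 0 seconds
# (a bounce), even if the reader actually stayed for 10 minutes.
print(time_on_pages([start]))  # [0.0]

# Same visit with an "adjusted bounce rate" event fired after 30s:
# the extra hit gives GA a second timestamp to subtract.
print(time_on_pages([start, start + timedelta(seconds=30)]))  # [30.0, 0.0]
```

This is why converting bounces into 30-second sessions necessarily pulls the average session time upward.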
Search Engine Trends | | ForForce0 -
Sitemap Question
I don't think it's a big deal not to have it, but for something that takes literally 10 seconds I'd rather be safe than sorry. To explain what I meant about duplicate home pages: you can often have multiple versions accessible to the public. For example:

- yourself.com
- www.yourself.com
- www.yourself.com/index.php

I haven't looked at how you have it set up, but all of these should resolve to only one version to prevent home page duplication, and the final version you decide on is the one you include in the sitemap. Hope that helps!
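Assuming an Apache server, collapsing the three variants onto one canonical version is usually done with 301 redirects in `.htaccess`. This is a sketch with a placeholder domain, not the poster's actual setup:

```apache
RewriteEngine On

# Redirect the bare domain to the www version (301)
RewriteCond %{HTTP_HOST} ^yourself\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourself.com/$1 [R=301,L]

# Collapse direct requests for /index.php onto the root URL.
# The THE_REQUEST condition prevents a redirect loop when Apache's
# DirectoryIndex internally serves index.php for "/".
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php [NC]
RewriteRule ^index\.php$ http://www.yourself.com/ [R=301,L]
```

With those rules in place, www.yourself.com is the version to list in the sitemap.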
Search Engine Trends | | David-Kley0 -
Wordpress Speed Optimization Inquiry
After some testing, we found WPEngine to be the best solution for Wordpress site speed. They are a hosting company that specializes in Wordpress and they use a lot of caching and CDNs without you needing to set anything up or worry if things are working properly. I'm sure similar results can be duplicated by building out your own caching and CDN, this is just another option for you.
Search Engine Trends | | WilliamKammer0 -
Dev Subdomain Pages Indexed - How to Remove
Go to WMT for whatever dev site you want to remove from the index. Use the URL removal tool, but in the box just enter /, nothing else. This will remove the entire site.
Intermediate & Advanced SEO | | Kingof50 -
Link Juice Inquiry
Hi Brandon, To the best of our knowledge, Google does not "notice" or count URL content after a hash, meaning that the link juice would stop with www.example.com. It is almost certain that the home page is receiving the benefit of these links. Cheers, Jane
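One way to see why link equity stops before the hash: the fragment is handled entirely client-side and is never sent in the HTTP request, so every fragment variant points at the same resource. A quick Python illustration (the URL is made up):

```python
from urllib.parse import urlsplit

# The fragment lives only in the browser: it is split off client-side
# and never transmitted to the server, so a crawler fetching this URL
# requests exactly the same document as the fragment-free version.
url = "http://www.example.com/#campaign-page"
parts = urlsplit(url)

print(parts.fragment)                        # "campaign-page" - client-side only
print(parts._replace(fragment="").geturl())  # "http://www.example.com/"
```

Since the server (and Googlebot) only ever sees the fragment-free URL, the home page is the document that accrues the benefit of those links.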
White Hat / Black Hat SEO | | JaneCopland0 -
Social Signals Inquiry Regarding Competitor
Since each page has shares and likes in matching numbers, it sounds like they probably bought a service from a site like Fiverr that delivers a set number of shares, follows, re-tweets, etc. They could also be using a program like Synnd to automate social shares. Either way, I wouldn't spend much time worrying about it.
Social Media | | Stellar_SEO0 -
Inbound Links Inquiry for a New Site
Hi

To answer your questions:

"For a site that is only one to two months old, what is considered a natural amount of inbound links if your site offers very valuable information, and you have done a marketing push to get the word out about your blog?"

This is really a "how long is a piece of string" question; it depends. If the site is a new launch for an established brand, inbound links during that time could escalate into the thousands or tens of thousands, and that wouldn't be unrealistic to expect. For an unknown, who knows? In one sense, it doesn't really matter. What matters is that those inbound links produce results, both from people clicking on them and from the SEO benefit to your site.

"Even if you are receiving backlinks from authority websites with high DA, does Google get suspicious if there are too many inbound links during the first few months of a site's existence?"

Again, it depends on whether the new site belongs to an established brand or an unknown, but suspicion isn't necessarily based on numbers alone, although it is fair to say that the higher the number, the more likely it is to flag an issue. Google's algorithms are sophisticated and detect link quality using a number of metrics, e.g. the social profile of a site. You could have just 10 links and still flag an issue.

"I know there are some sites that blow up very fast and receive thousands of backlinks very quickly, so I'm curious to know if Google puts these kinds of sites on a watchlist or something of that nature. Or is this simply a good problem to have?"

As I said above, the more links accrued in a short space of time, the more likely a yellow or red light starts flashing on Google's dashboard, but again it comes down to link quality, which is evaluated on a number of metrics that determine whether there is an issue.

I hope that helps,

Peter
White Hat / Black Hat SEO | | crackingmedia0 -
Seeking Top Notch Marketing Company with experience in growing sites post manual penalty
There is a list of recommended businesses via a link at the foot of the page. Most SEOs are now very experienced in dealing with penalty issues, as they have become so common.
White Hat / Black Hat SEO | | MickEdwards0 -
What is the point of having images clickable loading to their own page?
I could see linking to an image file itself as useful if the image were larger and you wanted to display it outside of a paragraph of text. Many infographics could qualify for a page of their own. The site would still benefit from traffic and from authority.
Intermediate & Advanced SEO | | Thos0030 -
Noindex : Do Follow or No Follow Tags?
I think the way you have your tags set up is right. While a robots noindex tag still lets Google read the page and pass PageRank, there's no point listing tag pages in search results, as they don't offer valuable content of their own: they link to content but don't contain it. That said, Google crawls your tag pages and passes PageRank through them, which ultimately sends more juice to your articles. Sometimes you can link to a tag page from within a post's content, so users can read more about certain topics; that passes more juice to the tag page, and although it isn't indexed, it passes PageRank back to the articles. Noindex has (almost) nothing to do with PageRank: a noindexed page can still be tremendously useful (you can even earn links to it from external sources), but tag pages are mainly valuable to users reading your content who are interested in a particular subject you mention within the post.
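For reference, the tag-page setup described above is usually expressed with a single meta tag in the `<head>` of each tag archive page (a sketch; your CMS or SEO plugin may generate this for you):

```html
<!-- On each tag archive page: keep it out of the index, but let Google
     crawl it and pass PageRank through its links to the articles -->
<meta name="robots" content="noindex,follow">
```

Note that "follow" is Google's default, so `content="noindex"` alone behaves the same way; spelling it out just makes the intent explicit.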
Intermediate & Advanced SEO | | FedeEinhorn0 -
NoFollow tag for external links: Good or bad?
I don't think nofollowing lots of links would be unnatural, as it doesn't strike me as a quick "hack" to boost your rankings. If nofollow was taken off lots of links then that might be something Google would pay attention to as PageRank would then be passed to these links. If you can't endorse / trust the links then nofollowing them is what Google recommends: https://support.google.com/webmasters/answer/96569?hl=en&ref_topic=2371375 If you want to be extremely cautious then you could add nofollow to the links on some of the websites, and see if anything bad happens before doing the rest. George
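For anyone unfamiliar with the markup, nofollow is applied per link with a `rel` attribute (the URL and anchor text here are made up):

```html
<!-- A link you can't vouch for: rel="nofollow" asks Google not to
     pass PageRank through it -->
<a href="http://example.com/untrusted-page" rel="nofollow">Partner site</a>
```

Adding or removing that attribute is all the cautious staged rollout George describes amounts to.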
White Hat / Black Hat SEO | | webmethod0 -
How are you supposed to manage your backlink profile?
We have had over 9k new backlinks showing up daily on Majestic as of late. I notice many sitewides, and then some that are relevant. I can't see Google expecting us to manually review them all!
Intermediate & Advanced SEO | | WebServiceConsulting.com0 -
Focusing on Multiple Niches for one site: good or bad?
Good response Patrick. I would agree with him. You can target multiple niches on one site however take due care with your on site efforts / taxonomy structure and get ready to add loads of good relevant content.
Intermediate & Advanced SEO | | SEO5Team0 -
Over-Optimization Inquiry
Hi, The question you should really be asking yourself is if the keywords in the image alt text are an accurate description of the image. The goal should be to get the image to rank for relevant queries in image results or universal results. If you're simply targeting the same keywords in the title and image alt text, but it's completely relevant, then I wouldn't worry about it so much. Google's over-optimization penalty (I assume you're referring to Penguin) usually affects sites with much more nefarious spam techniques than what you've described. Things like keyword stuffing accompanied by a ton of keyword-rich links to the page from low quality sites. If you want to check to see how the algorithm changes affect your site, you could look at the Penguin update dates on Moz's algorithm change page and compare it to your site traffic. If you've been hit by Penguin, you'll likely see a steep drop in traffic after one of the algorithm change dates. -Trung
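To make the distinction concrete, here is a sketch of relevant alt text versus the keyword stuffing the answer warns about (product name and keywords invented for illustration):

```html
<!-- Descriptive alt text: accurately describes the image, and it's fine
     if it shares keywords with the page title -->
<img src="blue-widget.jpg" alt="Blue widget with chrome trim">

<!-- Keyword stuffing: repeating target keywords that don't describe
     the image is the pattern spam filters target -->
<img src="blue-widget.jpg" alt="widgets buy widgets cheap widgets best widgets">
```

If your alt text reads like the first example, repeating a title keyword is not something to lose sleep over.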
White Hat / Black Hat SEO | | trung.ngo0 -
Google Manual Penalties:Different Types of Unnatural Link Penalties?
Hey There

I would download all of your link data from:

- Webmaster Tools
- OSE
- Majestic
- maybe Ahrefs too

Pull it together and comb through it for bad links. I think you'll really have to look through them to see what's going on; maybe something was missed. First you need to confirm whether there actually are any spammy/bad links. In a removal/disavow situation the goal is to remove or disavow ONLY the bad links, which could be just 10 out of hundreds, so you should sort through them all.

-Dan
Intermediate & Advanced SEO | | evolvingSEO0 -
Is Inter-linking websites together good or bad for SEO?
Google's advice on this is a bit vague, and the practical consequences can vary a lot. Linking together a couple of sites is usually fine - linking together dozens or hundreds could get you marked as a link network and get all of your sites penalized. Usually, as Richard and James said, it's more that Google will simply devalue the links, especially if those sites share ownership/hosting/etc. It's just too easy to cross-link your own properties. I don't think getting too fancy with hosting, C-blocks, etc. is the answer. That's a lot of work, and Google can still connect you on ownership and other cues. To erase all of those cues is a lot more time, effort, and money than links from a couple of sites are really worth. The best advice I can give is that, if you cross-link, do it in a way that's clearly of value to users. In other words, just linking these sites to each other in the footer is almost going to guarantee that Google ignores those links. If, however, you can link specific content to directly relevant content on another site, they're much more likely to let those links carry equity, and that's going to be valuable for your visitors and let them usefully traverse your sites. So, think of it more as a CRO task - how can you get visitors from one site to meaningfully engage in and convey on your other sites? If you can do that, and if you're only talking a handful of sites, you have some chance at making those links carry value.
White Hat / Black Hat SEO | | Dr-Pete0 -
Can you noindex a page, but still index an image on that page?
In theory, this shouldn't be a problem. When you use an image on a page, you are embedding that image from somewhere else: its own URL. Therefore, if the page itself is "noindex,follow", that does not apply to the images (or anything else, like a video or PDF) embedded in the page. Provided that the URL holding the image is allowed to be indexed on your root domain, which it is by default, the image can be indexed.

You can test this by uploading images to your root domain and then, in the next few days, performing this site search: site:yoursite.com filetype:jpg (or png/gif/whatever you used). That will show any of your images that are in the Google index. If that does not work, try this search: site:yoursite.com inurl:imagefilename. Replace those details with your real site and the image's filename, and it should show you whether or not the image is indexed.

If it isn't, you could try sharing the image on Google+; G+ is ridiculously effective at getting web pages and content indexed. Just because the page the image appears on is instructed to be noindexed does not mean the image inherits that quality; by default, it should be indexed properly. You can also create image sitemaps to help with indexing (I believe you're running WordPress; Yoast's SEO plugin can help with this).

Hope this helps.
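An image sitemap uses Google's image extension to the standard sitemap protocol. A minimal sketch (the URLs are placeholders; the noindexed page can still be the `<loc>` that carries the image entry):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://yoursite.com/noindexed-page/</loc>
    <image:image>
      <image:loc>http://yoursite.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Submitting this through Webmaster Tools gives Google an explicit path to the image URL even when it never indexes the host page.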
Technical SEO Issues | | TomRayner0