Questions
-
News site and duplicate/cannibalized/old content
Noindex wouldn't be the best practice, especially if there are links pointing to that page. I would 301 redirect them or let them 404. A 404 isn't always a bad thing; it's quite useful for getting rid of poor-performing content and unwanted material. You could always combine the old with the new and just 301 redirect the old URL. Hope this helps!
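As a sketch of the "301 the old URL or let it go away" options above, here are some hypothetical Apache `.htaccess` rules (all paths and domains are made up for illustration):

```apache
# Old article consolidated into a newer one: 301 it to the new URL.
Redirect 301 /news/2015/old-story https://example.com/news/2019/updated-story

# Content with no replacement: simply remove it and let it 404,
# or send an explicit 410 Gone to say it was removed on purpose.
Redirect gone /news/2012/defunct-story
```

This assumes Apache with mod_alias enabled; the equivalent is straightforward in Nginx or at the CMS level.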
On-Page / Site Optimization | | Colemckeon0 -
Spam Score and You
How does Spam Score work on a blog-type website? Some of my pages and blog posts have a higher Spam Score than the main domain.
Intermediate & Advanced SEO | | Sjani111 -
Google Site search operator showing different results than Search Console
First of all, Google Search Console can show you **Crawled Pages** and **Indexed Pages**. Google follows three basic steps to generate results from web pages: crawling, indexing, and serving (ranking). Crawling: the first step is finding out what pages exist on the web. There isn't a central registry of all web pages, so Google must constantly search for new pages and add them to its list of known pages. This process of discovery is called crawling. Indexing: after a page is discovered, Google tries to understand what the page is about. This process is called indexing. Google analyzes the content of the page, catalogs images and video files embedded on the page, and otherwise tries to understand the page. Serving: when a user types a query, Google tries to find the most relevant answer from its index based on many factors. Google tries to determine the highest-quality answers and factor in other considerations that will provide the best user experience and most appropriate answer, such as the user's location, language, and device. In summary, not all of your pages (at least not all the pages in your Search Console) will appear on SERPs.
Moz Pro | | Roman-Delcarmen1 -
Internal Linking issue
Thank you! I think the problem is what you mentioned in the previous post. So MOST of them are indexing, but I guess it's really a ranking problem. I have been banging my head against a wall and I cannot figure out why this site isn't ranking; it's driving me nuts!
Technical SEO Issues | | HashtagHustler0 -
At-scale way to check content in Google?
In Search Console (new version) go to Coverage > All Known Pages. Then click "Valid" above the chart, then click on "Submitted and Indexed" and download to a spreadsheet. Do the same for "Indexed, not Submitted in Sitemap". Combine these two downloaded lists into one list in a tab in Excel, and in another tab put the URLs of all the pages you think should be getting indexed (unless you think all of them are in your sitemap anyway). Then use VLOOKUP formulas to find whether each of the pages you know are on your site is actually indexed (by looking it up in the tab containing the downloaded lists).
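The Excel VLOOKUP step above can also be sketched in a few lines of Python. The URLs below are placeholders; in practice you'd load the two Search Console exports and your expected-URL list from their CSVs:

```python
# "indexed" combines the two downloaded Search Console lists
# ("Submitted and Indexed" + "Indexed, not Submitted in Sitemap").
indexed = {
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/post-1",
}

# "expected" is the list of URLs you believe should be indexed.
expected = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/post-1",
    "https://example.com/blog/post-2",
]

# The VLOOKUP equivalent: which expected URLs are not in the indexed lists?
missing = [url for url in expected if url not in indexed]
print(missing)
```

Using a set for the indexed list keeps the lookup fast even for tens of thousands of URLs.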
Intermediate & Advanced SEO | | seoelevated0 -
Internal Links Not Registering
Hi, This has happened to me too. Before the changes to domains, I had over 24,000 links showing on my internal links in the Links research. Now I have less than 3000. I have checked on Search Console and that is still showing as 24,000, but they aren't registering on Moz. What's changed? Thanks Lydia
International Issues | | lydiarosdesign0 -
Can't find source of redirect
A few thoughts: Install the browser extension Ayima, which will let you see whether this is actually the result of multiple redirects. There are other ways to see this same info, but the Ayima extension makes it really simple to see the multiple hops, when there are any. You might also try moving the existing redirect in your .htaccess file (A to B) all the way to the top, or all the way to the bottom, of the file. Redirects are processed in sequence (most specific to most general).
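To illustrate the multiple-hops idea, here's a minimal sketch; the rules and paths are hypothetical, standing in for whatever chain Ayima reveals in the browser:

```python
# Hypothetical redirect rules, as they might exist in an .htaccess file.
redirects = {
    "/old-page": "/interim-page",   # rule A -> B
    "/interim-page": "/new-page",   # rule B -> C, creating a second hop
}

def redirect_chain(url, rules, max_hops=10):
    """Return the list of hops from url to its final destination."""
    chain = [url]
    while url in rules and len(chain) <= max_hops:
        url = rules[url]
        chain.append(url)
    return chain

print(redirect_chain("/old-page", redirects))  # two hops instead of one
```

Collapsing the first rule to point straight at `/new-page` is what removes the extra hop.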
Technical SEO Issues | | seoelevated0 -
URL shows up in "inurl:" but not when using time parameters
There are several ways to do this; some are more accurate than others. 1. If you have access to the site containing the web page in Google Analytics, you could filter your view down to one page / landing page and see when the specified page first got traffic (sessions / users). Note that if a page existed for a long time before it saw much usage, this won't be very accurate. 2. If it's a WordPress site which you have access to, edit the page and check the published date and/or revision history. If it's a post of some kind, it may display its publishing date on the front end without you even having to log in. Note that if some content was migrated from a previous WordPress site and the publishing dates were not updated, this may not be wholly accurate either. 3. You can see when the Wayback Machine first archived the specified URL. The Wayback Machine uses a crawler which is always discovering new pages, not necessarily on the date(s) they were created (so this method can't be trusted 100% either). In reality, even using the "inurl:" and "&as_qdr=y15" operators will only tell you when Google first saw a web page; it won't tell you how old the page is. Web pages do not record their age in their coding, so in a way your quest is impossible (if you want to be 100% accurate).
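As a sketch of the Wayback Machine approach, the code below parses the JSON shape returned by the archive's availability endpoint (`https://archive.org/wayback/available?url=...`). The response shown is made up for illustration, and note this endpoint returns the *closest* snapshot rather than strictly the first capture, so it's an approximation:

```python
import json

# A made-up example of the availability endpoint's response shape.
sample_response = json.dumps({
    "archived_snapshots": {
        "closest": {"timestamp": "20150321000000", "available": True}
    }
})

def snapshot_date(raw):
    """Extract a YYYY-MM-DD date from an availability-API response."""
    snap = json.loads(raw).get("archived_snapshots", {}).get("closest")
    if not snap or not snap.get("available"):
        return None  # no archived copy of this URL
    ts = snap["timestamp"]  # format: YYYYMMDDhhmmss
    return f"{ts[0:4]}-{ts[4:6]}-{ts[6:8]}"

print(snapshot_date(sample_response))
```

In a real script you'd fetch the JSON over HTTP for each URL; here the response is inlined so the parsing logic is the focus.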
On-Page / Site Optimization | | effectdigital1 -
Competitor has same site with multiple languages
1. Good! 2. You are confused for good reason; there has not been clear direction here for some time. If you used hreflang between the two, then for the last number of years the content would not be seen as duplicative. You are telling Google that the content is the same but in different languages, inherently. 3. There is so much that goes into this, but I can tell you, with years of experience under my belt, that the numbers don't ever tell the whole story.
White Hat / Black Hat SEO | | katemorris0 -
Header/Menu Links
Hey! Great question. I would organize your main navigation to be most helpful to the people who are visiting the website, not necessarily Google. If it makes sense to add those links into the nav when you are taking that step, then do it. You can also signal the importance of links to Google through your overall internal linking structure (which can be viewed via Google Search Console). You can also work on link building to the most important inner pages on your site. This will help with ranking your pages. I hope this helps! John
On-Page / Site Optimization | | JohnSammon0 -
Hoth v Fiverr v general backlinking services
I agree with Will. I would stay away from these services, as they can hurt you more than help you, especially down the road. My recommendation would be to look at PR services to help you get backlinks through PR outreach (not just doing e-releases).
Link Building | | JohnSammon0 -
Content update on 24hr schedule
When you say 1300 landing pages are coming online every night, that doesn't mean 1300 new pages are being created, does it? Based on the rest of your comment I'm taking it to mean that 1300 pages, which were already live and accessible to Google, are being updated and the content is changing where appropriate. In terms of the specific situation I describe above, that should be fine: there shouldn't be a problem with having a system for keeping your site up to date. However, each of the following things, if true, would be a problem. 1. You are adding 1300 new pages to your site every night. This would be a huge increase for most sites, particularly if it were happening every night, but as I say above I don't think this is the case. 2. You are actually scraping key information to include on your site. You mention an API, so it may be that users are submitting this content to your site for you to use, but if you are scraping the descriptions from some sites and reviews from others, that is what would be viewed as spammy, and it seems like the biggest point of risk I've seen in this thread.
Intermediate & Advanced SEO | | R0bin_L0rd0 -
How to break the news to a client?
Hello there! Without knowing anything else about the website, these numbers do sound incredibly unrealistic. Though nothing is technically impossible, I feel that setting realistic expectations upfront is incredibly important. Did you perform an audit of the site and discuss goals and the scope of work prior to being hired by this client? Also, do you know why the client chose those numbers as goals or are they arbitrary? My recommendation would be to meet with the client and discuss your concerns, recommendations (including other methods of generating traffic), and realistic goals for the website as a whole and how that could correlate with increases in traffic over time. I feel that it is better to part ways in advance if you do not feel that you can meet the (albeit unrealistic) expectations of the client versus disappointing them and potentially tarnishing your reputation in the long run. I also personally feel that guaranteeing traffic or using only traffic as a KPI or success indicator is a slippery slope. What is the true end goal? In this client's case, would it be paid subscriptions? Consider educating the client on the value of the quality of traffic versus the pure quantity of traffic and incorporate CRO into your recommendations to define your conversion funnel and optimize your conversion events. Just a few thoughts off the top of my head - hope this helps!
On-Page / Site Optimization | | Grace-N0 -
Redirect Plugin: Redirecting or Rewriting?
Thank you! I just always felt like, why use the plugin when I could hard code it at the server level? The site wouldn't even need to be running for the redirects to still work!
Intermediate & Advanced SEO | | HashtagHustler0 -
On Page Optimization Management Issue
I was wondering if this fix has been set up in the meantime? Or which stage of development is it in? I still cannot delete those page + keyword combinations that make no sense, so I guess no solution has been developed yet. I'd still like to know where you stand with this.
Other Questions | | f_taleman0 -
New Spam Analysis Tool Results Questions
Sounds good! That's what I was hoping for. I was warned early on that the Disavow is a very, very powerful tool and to use it only as a final step, so I have been trying to be very strategic with my usage. Thanks!
Other Research Tools | | HashtagHustler0 -
Manual Links Vs. Smart Links
Personally, I wouldn't use it. In my opinion it looks incredibly spammy, unless it's a blog post referring to another post or page on the website.
Link Building | | Griffo0 -
301 vs. keeping identical URL
First situation: for both options, in theory, you will be maintaining the same link power; however, I would choose option two. You won't have to worry about Googlebot crawling the old URL to find the new one. Depending on the keywords you are targeting, you might lose some ground by changing the URL structure. If the old URL structure fits into your new web navigation model, I think you should stick with it to maintain the same results. Second situation: if you feel that the pages aren't valuable on their own, I would 301 them to the one URL. If they make sense to keep from a user-experience perspective, you would want to add "rel=canonical" tags pointing to the one URL instead. Hopefully that clears things up!
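For the second situation above, where the thin pages stay live for users, the canonical tag is just a line in each page's head; the URLs here are hypothetical:

```html
<!-- On each thin page that stays live for users, e.g. /widgets/red -->
<head>
  <link rel="canonical" href="https://example.com/widgets/" />
</head>
```

Google treats the canonical as a hint for consolidation, whereas a 301 removes the old page from the user experience entirely; that's the trade-off between the two.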
On-Page / Site Optimization | | RangeMarketing0 -
H Tags Vs "H Style" Tags?
Thanks guys! Just wanted to clarify whether that was actually an H1, or just styled to look like one: "make this look like an H1, but it isn't actually an H1".
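The distinction being asked about can be shown in two lines of markup (a minimal, hypothetical example):

```html
<!-- A real heading: carries semantic weight for parsers and crawlers -->
<h1>Page Title</h1>

<!-- "H1 style" only: can look identical via CSS, but is just a div to a parser -->
<div style="font-size: 2em; font-weight: bold; margin: 0.67em 0;">Page Title</div>
```

Checking the rendered DOM (e.g. via the browser inspector) rather than the visual styling is what settles which one a page is using.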
Intermediate & Advanced SEO | | HashtagHustler0