Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Suggestions on Link Auditing a 70,000 URL list?
Hi! I wrote this guide a few years ago on penalty recovery, which may help you as it contains a lot of methods for auditing links: https://moz.com/blog/ultimate-guide-to-google-penalty-removal If we were to approach a project with 70k URLs, we'd do the following steps: 1. Pull all the URLs into a spreadsheet. 2. Split the URLs into domains. 3. Filter the URLs and search for common spammy words, e.g. 'Link', 'Best', 'Free', 'Cheap', 'Dir', 'SEO', etc. (mark as spam accordingly). 4. Run contact finding across all URLs using a tool such as URL Profiler with Whois lookups. 5. Filter by contact name and find duplicates (mark as spam accordingly). 6. Filter by website type and mark as spam accordingly. 7. Manually check the remaining links. By working through by domain, you'll rule out thousands of spammy links very quickly, though 70k will ultimately take a few solid days of work. Hope this helps, Lewis
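Steps 2 and 3 above (split by domain, flag spammy words) can be sketched in a few lines of Python. This is a minimal illustration, not Lewis's actual workflow — the spam word list and example URLs are hypothetical:

```python
from urllib.parse import urlparse

# Illustrative spammy keywords, taken from the answer above
SPAM_WORDS = ["link", "best", "free", "cheap", "dir", "seo"]

def audit_urls(urls):
    """Group URLs by domain and flag domains whose name contains a spammy word."""
    results = {}
    for url in urls:
        domain = urlparse(url).netloc.lower()
        flagged = any(word in domain for word in SPAM_WORDS)
        results.setdefault(domain, {"urls": [], "spam": flagged})
        results[domain]["urls"].append(url)
    return results

audit = audit_urls([
    "http://cheap-seo-links.example/page1",
    "http://cheap-seo-links.example/page2",
    "http://nytimes.example/article",
])
for domain, info in audit.items():
    print(domain, "SPAM" if info["spam"] else "ok", len(info["urls"]))
```

Because every URL on a flagged domain is marked at once, one domain-level decision can clear hundreds of rows from the 70k list.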
| PinpointDesigns0 -
Site not showing up in search - was hacked - huge comment spam - cannot connect Webmaster tools
Hi Alistair, I don't think this is because Google is angry with you and refusing to let you verify your site in Search Console. There may be a technical issue or some other possible reason — for example, your server might not be responding to Googlebot, or Google may be unable to connect to your server, or the verification file download may be redirecting. There are other options available for verifying the site; I'm sure you're familiar with them, but I'd recommend verifying via your domain name provider. Moreover, before you resubmit the site, make sure it is fully secure. I hope this helps.
| Mustansar0 -
Easiest Way to Balance Links Across Site?
In addition to Joe's answer, I don't see any reason to buy cheap link building (pardon me if I've misunderstood). However, I would stress organic local citations, and niche links must be earned by delivering such services. Moreover, try to avoid over-optimisation of any keyword, page, etc.
| Mustansar0 -
Link Resolvers, Academic Publishing, and SEO Visibility
Hi Eric, As I say, I'm familiar with the general principles of redirects, having applied them before with other clients. Unfortunately, it's the industry-specific knowledge I need. Link resolvers are regularly used in academic publishing to point from, for example, library holdings pages to an ebook or a publisher's platform, where temporary redirects make sense for various reasons. My concern is that using them to point from our old site to our new site will cause us problems.
| BenjaminMorel0 -
Community Discussion - What's the ROI of "pruning" content from your ecommerce site?
I don't think there is a one-size-fits-all recommendation to make here, which is why that post has so much detail about how to do the research necessary to determine the best route for your business. I agree that improving content is better than simply noindexing it, but I also think noindexing it is better than leaving it up long-term unimproved. And the reality is that many businesses with tens of thousands or hundreds of thousands of product pages, and most blogs with thousands of posts, aren't going to be able to economically scale rewriting all of it. The best solution for them, in my opinion, is to get rid of the pages that are dragging them down - at least get them out of the index. They can always be reintroduced once they're improved.
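Getting low-value pages "out of the index" without deleting them is typically done with a robots meta tag; a minimal sketch:

```html
<!-- Placed in the <head> of each pruned page: removes it from the index
     while keeping the page live, so it can be reintroduced once improved -->
<meta name="robots" content="noindex, follow">
```

The `follow` directive keeps link equity flowing through the page even while it sits out of the index.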
| Everett2 -
How do I get the sub-domain traffic to count as sub-directory traffic without moving off of WordPress?
You're welcome. The canonical tag is used to prevent duplicate content concerns. Your traffic reporting will still register the visit on the sub-domain page.
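For reference, a canonical tag is a single line in the page's head; a minimal sketch with hypothetical URLs:

```html
<!-- On blog.example.com/post: tells search engines which URL is the preferred
     version, but does not change where analytics records the visit -->
<link rel="canonical" href="https://www.example.com/blog/post/">
```

This is why canonicalization alone won't make sub-domain traffic count as sub-directory traffic in reports — it only consolidates indexing signals.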
| Chris_Hickman0 -
New domain or subdirectory?
You're welcome. Take a look at the links that are pointing to your sites; there might be some to disavow. Also use complementary platforms: Ahrefs, Majestic, or others. A spam score of 2 is not a big deal. Monitor your backlinks and don't spam. Best of luck! GR.
| GastonRiera0 -
Wrong redirect used
Hi Vettyy & Michael, Thanks a million for your responses. I implemented both your suggestions; Screaming Frog showed the previous 302s are now 301s, so everything seems OK on that end. I have also updated my Google My Places listing to reflect the change. Using site:mydomain.com shows a mixture of https vs http, so I am guessing I just need to wait and monitor - cross my fingers and hope for the best.
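For anyone finding this thread later: on an Apache server, the 302→301 HTTPS redirect discussed above is often set in .htaccess. A minimal sketch, assuming Apache with mod_rewrite enabled:

```apache
# Permanently (301) redirect all HTTP traffic to HTTPS.
# Without [R=301], RewriteRule redirects default to a temporary 302.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

The mixed https/http results in site: searches should consolidate as Google recrawls the 301s.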
| Patrick_5560 -
Is this the correct way of using rel canonical, next and prev for paginated content?
Fantastic, thank you Paul! Those links are very useful, and I might have already read them when I set up those canonicals (I just forgot, having worked on that a few years ago!). I'll check them out carefully again. Appreciate your help and prompt reply. All the best, Fabrizio
| fablau0 -
WordPress posts Title field inserts title into blog posts like a headline but doesn't ad H1 tag how to change?
This is what I found in view source — does this equal an h1 tag?

```html
<div class="head1">
  <a href="http://klenklaw.com/theft-from-estate-disqualifies-beneficiary/" rel="bookmark" title="Permanent Link to Isn't It True That A Theft From The Estate Disqualifies Beneficiary From Inheriting?">Isn't It True That A Theft From The Estate Disqualifies Beneficiary From Inheriting?</a>
</div>
<div class="entry">
```
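To answer the question directly: no — a `div class="head1"` is only styled to look like a headline and carries no heading semantics. For comparison, a true level-1 heading would look like this (hypothetical markup, since the theme's actual template may differ):

```html
<!-- A real h1 element, which search engines treat as the page's main heading -->
<h1 class="entry-title">
  <a href="http://klenklaw.com/theft-from-estate-disqualifies-beneficiary/" rel="bookmark">
    Isn't It True That A Theft From The Estate Disqualifies Beneficiary From Inheriting?
  </a>
</h1>
```

Changing this usually means editing the theme's post template (or a child theme) to output `h1` instead of the styled div.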
| SEO4leagalPA0 -
How best to deindex tens of thousands of pages?
I suppose you could also try creating a page that had links to all of the pages you want to noindex, then from Search Console use the Fetch as Google tool to fetch that page and choose the "Crawl this URL and its direct links" option to help Google see the noindex meta tags faster.
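The helper page described above can be generated with a short script; a minimal sketch, assuming you already have the URL list in hand (the example URLs are hypothetical):

```python
# Build a simple HTML page linking to every URL we want Google to recrawl,
# so "Crawl this URL and its direct links" reaches each noindexed page.
def build_links_page(urls, title="Pages to recrawl"):
    items = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in urls)
    return (
        "<!doctype html>\n"
        f"<html><head><title>{title}</title>\n"
        # The helper page itself should not end up in the index either
        '<meta name="robots" content="noindex"></head>\n'
        f"<body><ul>\n{items}\n</ul></body></html>"
    )

page = build_links_page([
    "https://example.com/old-page-1",
    "https://example.com/old-page-2",
])
print(page)
```

With tens of thousands of URLs you would split this into several pages, since the fetch tool only follows direct links from the submitted page.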
| vcj0 -
Penalty for adding too much content too quickly?
Thanks for replying. The site is getinspired365 dot com. We saw a spike of 11,000 indexed pages, then 29,000, then back down to a steady ~1,500. Yes, we have structured our sitemaps so that there are 7: one for authors (15,000 URLs), 5 for our quotes (40,000 each), and one for our topics (2,000). Looking at it, around 90% has successfully been indexed. This was done around 2 months ago, and as I say it has pretty much all been indexed, but it is not ranking - at all. However, our first batch of content is ranking, and ranking really well. It is as though this new content has some sort of penalty and is therefore not ranking in Google, but I am not sure 1. what the penalty is and 2. how to fix it. I want to deindex the entire site and start again, adding the content in much smaller batches, but I am not sure how best to do that. Thanks
| SteveW19870 -
Where does Movie Theater schema markup code live?
I'm not sure about the 3rd request, but for the first one I would suggest implementing the schema via a JSON-LD script on the pages that have the information about what is showing. The JSON for each movie would look something like the examples at https://schema.org/MovieTheater. I would also suggest using this testing tool to check whether your implementation is working: https://search.google.com/structured-data/testing-tool
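A minimal JSON-LD sketch of a single showing, with hypothetical theater and movie names (the field names follow schema.org's ScreeningEvent and MovieTheater types):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ScreeningEvent",
  "name": "Example Movie - 7pm Showing",
  "startDate": "2017-04-01T19:00",
  "workPresented": { "@type": "Movie", "name": "Example Movie" },
  "location": {
    "@type": "MovieTheater",
    "name": "Example Cinema",
    "address": "123 Main St, Anytown"
  }
}
</script>
```

Because it's a standalone script block, the markup lives on the showtimes page itself without touching the visible HTML.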
| VERBInteractive2 -
Site Structure - Is it ok to Keep current flat architecture of existing site pages and use silo structure on two new categories only?
Hi Chris, Thanks so much for taking the time to answer my question, I find it helpful. Cheers Sean
| servetea0 -
Indexed Answer Box Result Leads to a 404 page?
Google does not update these features very often. We moved some content to another website in February and some of these features are still showing in the SERPs for the old location. However, we placed 301 redirects on the original pages so that this traffic from Google, existing links from other websites, bookmarks from past customers and many other types of traffic will be redirected to the new destination. We always put up 301 redirects anytime we move anything and they will be there forever - even after my funeral - and my business continuity plan instructs my heirs to keep these 301s in place. I believe that holding these 301s in place beyond my lifetime is best practice (I am a lot older than most Moz fans).
| EGOL0 -
Pagination & duplicate meta
Regarding the links which point to pages but include the hash: if Google is only seeing this page, http://www.key.co.uk/en/key/workbenches, will it see these as pages which have duplicate content?
| BeckyKey0 -
Url title and then category vice versa
I recently came across a great article that might help you out. Ideally, you want to organize individual content pieces within their appropriate categories to help Google and other search engines easily decide what each is about. I would recommend updating your site structure. Bruce Clay talks in depth about how to silo content, and I think it will answer a lot of your questions.
| JordanLowry0 -
Link juice through URL parameters
Shockingly, when asked point-blank whether affiliate programs that employed juice-passing links (those not using nofollow) were against guidelines or would be discounted, the engineers all agreed with the position taken by Sean Suchter of Yahoo!. He said, in no uncertain terms, that if affiliate links came from valuable, relevant, trustworthy sources - bloggers endorsing a product, affiliates of high quality, etc. - they would be counted in link algorithms. Aaron from Google and Nathan from Microsoft both agreed that good affiliate links would be counted by their engines and that it was not necessary to mark them with nofollow or another method of blocking link value. But note that they did not mention what they would do with low-quality links. From the above points it is clear that Google will pass link juice through affiliate links. Still, many of us in the affiliate industry use parameters and redirects in affiliate URLs. The reason is simple: not all affiliates are as genuine or reputable as Amazon. If your links appear on 50 sites, maybe 40 of them are sites Google does not like, so links from those sites may harm your site. So, as I said above, it's always good to protect your website's reputation even while giving up some link juice.
| Mustansar0 -
Moving multiple Sites to One Site and SEO Impact/Ideas
Nothing is guaranteed when you move domains. I see the following risks.... "we are launching all the pages again with new content, redesign" Yikes! New content might not be received as enthusiastically by the search engines. It might not be received as enthusiastically by visitors, and they will bounce - sending a bad signal to search engines. If this were my site, I would keep the content and the internal linking among the pages exactly the same as they are now. If the rewrite is not done in a way that preserves the visitor's pleasure, your rankings might fall. These original sites... if they are popular and have a lot of type-in traffic, a lot of branded traffic, a lot of brand mentions, a lot of domain mentions, a lot of good will, a lot of navigational search, then you will be walking away from that. Rankings can be supported more by these signals than by links in many cases. No guarantees. "We are in the process of moving 2 sites with higher page authority to another site we own" I believe that (generally) best SEO practice is for the strongest domain to become the host of the content. It then has the ability to pass its authority and reputation into the content being moved there. **What is the health of these other domains?** If you can't say that these other sites have clean link profiles, healthy content profiles, etc., then merging them carries some risk. It might pay to check these things out before merging. You know the story about "one bad apple."
| EGOL0 -
Multi URL treated as one?
Hi TapGoods, In Search Console, next to the red 'ADD A PROPERTY' button, there is a grey 'Create a set' button. Click it to group your accounts and view combined data across them. You can read more about this here: https://support.google.com/webmasters/answer/6338828 With the Moz data, once you've created your property set, give Moz access to the set instead of the single account it would've had access to. About the sitemap: assuming the four URLs resolve to a single URL (which they should), you only need to submit your sitemap under the account for the URL your site actually uses. Cheers, David
| davebuts0