Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • You can use Screaming Frog to pinpoint where your 404s are coming from. Here's a great write-up with a few different ways to use SF for this: https://www.screamingfrog.co.uk/broken-link-checker/ Another option is Google Analytics. First, navigate to your All Pages report and set the primary dimension to Page Title. Next, go to your site and trigger a 404, then take note of the page title; it should be something like 'Page Not Found'. Whatever that page title is on your 404 page, enter it in the inline filter and it'll narrow the report down to just 404 pages. Then drill down into that result to see a full list of URLs that are throwing a 404. Set the secondary dimension to Previous Page Path to see the page that linked to the broken one. Hope that's helpful!
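    If you have raw server logs handy, the same idea (pair each 404 with the page that referred it) can be sketched in a few lines. This is a minimal illustration, not part of the original answer; the combined log format and sample lines are assumptions:

    ```python
    import re
    from collections import defaultdict

    # Matches the request, status, and referrer fields of an assumed
    # Apache/Nginx "combined" log line.
    LOG_PATTERN = re.compile(
        r'"(?:GET|POST|HEAD) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)"'
    )

    def broken_links(log_lines):
        """Return {broken_url: set_of_referring_pages} for 404 responses."""
        hits = defaultdict(set)
        for line in log_lines:
            m = LOG_PATTERN.search(line)
            if m and m.group("status") == "404":
                hits[m.group("url")].add(m.group("referrer"))
        return dict(hits)

    sample = [
        '1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET /old-page HTTP/1.1" 404 512 "https://example.com/blog" "Mozilla/5.0"',
        '1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /home HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
    ]
    print(broken_links(sample))  # {'/old-page': {'https://example.com/blog'}}
    ```

    This gives you the same broken-URL-plus-source pairing as the GA Previous Page Path trick, straight from the logs.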

    | LoganRay
    0

  • I would say make sure to include a meta noindex tag and block search engines from crawling it with robots.txt. Then you can think about creating a placeholder with information about the rebrand. No sense in having it indexed just yet. Hope that helps some.
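    For anyone following along, the tag being suggested looks like this (placed in the `<head>` of each page you want kept out of the index):

    ```html
    <!-- keep this page out of search engine indexes -->
    <meta name="robots" content="noindex">
    ```

    The crawl block would be a `User-agent: *` / `Disallow: /` pair in robots.txt at the site root. One caveat worth knowing: if robots.txt blocks crawling entirely, crawlers may never fetch the page and see the noindex tag, so many people use one mechanism or the other rather than both.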

    | JordanLowry
    0

  • Apparently this didn't work. According to the Moz tool I have the same amount of duplicate title tags as before.

    | moon-boots
    0

  • That's great to hear, Marcel! -Andy

    | Andy.Drinkwater
    1

  • Hi there, It's a misconception that 30x redirects can lead to PageRank loss. This has now been confirmed multiple times by Gary Illyes and John Mueller. As long as you properly implement a 301 redirect, there shouldn't be any issues.

    | solvid
    0

  • Are you referring to having a page like this -- http://www.selectequipment.net/brands/Cutler-Hammer and simply adding a facet/filter for product type? The individual product pages should be optimized for Brand + Product + Model. The faceted navigation within the site suggests that you can "drill down" into a specific brand, but clicking the link takes you to the general brand page rather than a page filtered for that brand, or one that lets you filter by product type when viewing a brand. In this case, I would create a navigable hierarchy within the site to view all of the brands and filter by product type within the brand pages, essentially creating all of these "combos" you are talking about. Further, I would set limits so pages with fewer than, say, 5 unique items (you determine the appropriate number) are noindex,follow. Lastly, I would make sure all product pages are also optimized to include the brand, product type, and part number as part of the optimization goals for each page. Hope this helps! Cheers, Jake Bohall
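    The "noindex thin facet pages" rule above is easy to automate when rendering facet pages. A minimal sketch (the 5-item threshold and the function name are illustrative assumptions, not from the site in question):

    ```python
    # Facet pages with too few products get noindex,follow so they stay out
    # of the index but still pass link equity. Threshold is an assumption.
    MIN_ITEMS_TO_INDEX = 5

    def robots_meta(product_count: int) -> str:
        """Pick a robots directive for a brand/product-type facet page."""
        if product_count < MIN_ITEMS_TO_INDEX:
            # Thin page: keep it out of the index but let crawlers follow links.
            return "noindex,follow"
        return "index,follow"

    print(robots_meta(3))   # noindex,follow
    print(robots_meta(12))  # index,follow
    ```

    Your template would then emit `<meta name="robots" content="...">` with whatever this returns.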

    | HiveDigitalInc
    0

  • It's hard to say how much traffic you'll lose from the merge. Like Logan said, you'll definitely lose a bit when you first move, but long term, you'll need to look at your competition to figure out if it's better to keep the pages separate or combine them. I don't recommend keeping pages A, B, and C if you're going to hide them from the main structure of your site. Pages get most of their Page Authority from internal links (unless they're link bait), so they won't be able to rank anyway. That said, here's how I'd estimate the loss of traffic from the move:

    1. Use Google Search Console to determine the primary keyword(s) for pages A, B, and C.
    2. Use a tool like Open Site Explorer to determine the number of links A, B, and C have. (Bonus: look at the websites linking to A, B, or C. If those are resource pages, there's a good chance their webmaster will update their links to page D, which will help with the traffic dip. If they're from news articles, you'll probably have to rely on 301s.)
    3. Search for each of those top keywords and look at your competition. Does the competition closely target the term? Will page D seem as relevant to the keyword as A, B, or C did?
    4. Now, look at the Page Authorities of the competition for each keyword. Will page D, which will have a combination of links from A, B, and C, blow your competition out of the water? About match it? Still be a bit behind?
    5. Here's the part that's really tough: for each keyword, estimate where page D would rank, given how well it targets the keyword and how many inbound links it has.
    6. Estimate the percent increase or drop in traffic based on adjusted click-through rate. You can find this by playing around in Google Search Console to find a time when your site ranked in a different position, or by using average click-through rates, like here.
    7. Once you're done, put together your estimated percent increases or drops in traffic to estimate how the new page will perform.

    (I recommend you look at a percent change because adding up totals only for top keywords won't take long-tail keywords into account, and you'll almost definitely come up with a much lower count than you're currently getting.) Not the easiest process in the world, and your estimate will almost definitely be wrong, since you make a lot of assumptions along the way. But it should give you an idea of whether you'll eventually gain or lose traffic from the move, once that initial Googlebot confusion wears off. Hope this makes sense! Let me know if you have any questions! Kristina
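    The arithmetic in the last steps can be sketched in a few lines. Everything below is made up for illustration: the keywords, current clicks and positions, the positions assumed for merged page D, and the average CTR curve.

    ```python
    # Assumed average click-through rate by ranking position.
    AVG_CTR = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

    keywords = [
        # (keyword, current monthly clicks, current position, estimated position for page D)
        ("forklift parts", 400, 3, 2),
        ("used forklifts", 250, 2, 4),
        ("forklift brands", 150, 5, 3),
    ]

    def estimated_change(kws, ctr):
        """Fractional traffic change if each keyword moves to page D's position."""
        current = new = 0.0
        for _, clicks, pos_now, pos_d in kws:
            current += clicks
            # Scale today's clicks by the ratio of CTRs at the two positions.
            new += clicks * ctr[pos_d] / ctr[pos_now]
        return (new - current) / current

    print(f"{estimated_change(keywords, AVG_CTR):+.1%}")  # prints +27.1%
    ```

    As Kristina says, treat the result as a rough percent-change signal, not a forecast; it inherits every assumption you fed in.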

    | KristinaKledzik
    0

  • Hi Arnold, Very interesting case, and congratulations on getting a viral video without even knowing about it. You've asked a very interesting question: how do you find out that one of your (or someone else's) videos has gone viral when they haven't tagged you or linked directly to you? Taking your scenario as an example, if I were you, I would first analyze the traffic patterns and add a question to the registration form: "Where did you hear about us?" I would also call customers who sign up to understand where they found out about the website. Other than that, there is no direct technical way to identify the video content if it isn't connected to your website via a backlink or a hashtag. I hope this helps; please feel free to respond and ask further questions. Regards, Vijay

    | Vijay-Gaur
    0

  • I got there in the end.  They have a Wistia video loading on the homepage, but Wistia's robots.txt blocks this resource.  When the resource is blocked, the CSS loads a holding image.  However, this is configured to fill the whole page, so when Googlebot crawls it, it cannot render anything beyond this image or the area defined in the CSS.  Dev is fixing.

    | MickEdwards
    0

  • As a rule of thumb, the fewer 3xx redirects you use the better, not just to limit loss of link equity but for easier management of your website. You definitely do not want chains of two or three 301s unless they're really unavoidable given how complicated your website is. Now, your best bet depends on what you want to accomplish. In the past I always tried to be conservative and not lose too much of my hard-earned traffic, but after a while you see the consequences of that, as you end up with a mixed composition of legacy URLs on your website. I would say: test in a relatively small section and see what happens. If your loss of traffic/rankings is too significant, roll the changes back (don't forget to 301 back) and use your preferred method, but take into account that in the long run you want a manageable website with as few exceptions as possible. On a side note, people normally look at a 301 as a loss of value no matter what, but that's not always the case. The big deal with 301s is the loss of value accrued from other pages, so, after you 301: change all the internal links so you don't have unnecessary internal 301s, and contact external websites to get the URL changed. Once you do that, the 301 won't matter much at all, as the resources sending value to that page are now linking to the new one. Hope that helped.
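    That internal-link cleanup step (finding links that still point at 301'd URLs) is easy to script. A small sketch; the redirect map, URLs, and sample HTML are hypothetical:

    ```python
    import re

    # Hypothetical map of old paths to their 301 destinations.
    REDIRECTS = {"/old-pricing": "/pricing", "/blog/old-post": "/blog/new-post"}

    def stale_internal_links(html: str, redirects: dict) -> list:
        """Return (old_url, new_url) pairs for internal links that should be updated."""
        hrefs = re.findall(r'href="([^"]+)"', html)
        return [(h, redirects[h]) for h in hrefs if h in redirects]

    page = '<a href="/old-pricing">Plans</a> <a href="/about">About</a>'
    print(stale_internal_links(page, REDIRECTS))  # [('/old-pricing', '/pricing')]
    ```

    Run something like this over your crawled pages (a real crawl would fetch and parse each page properly) and every hit is an internal 301 you can eliminate by updating the link.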

    | mememax
    0

  • Thank you for all your answers. EGOL, your link is great and recent. I am removing redirections, and inactive product pages are starting to be indexed. I marked your answer as the "Good Answer." Moosa, your idea is great; I will propose it to my team. Thomas, thank you for the links. Yes, the inactive products post is mine too. The other is mainly about activating many pages at once, though; I also replied to you in there. Cheers,

    | viatrading1
    0

  • Thank you, Thomas. We activated them, and we will try to improve our "Related Products" and "Availability Notification" sections for inactive products. Cheers,

    | viatrading1
    1

  • You are most welcome, feel free to ask further questions.

    | Vijay-Gaur
    0

  • Hi Tertiary Education, From the way you describe it, your pages are being considered duplicate content because they are so very similar; that is to say, they are near-duplicates if not actual duplicates. It sounds like the content on those pages is somewhat thin. Is there anything else on the page apart from the map? If the answer is "no," or "not much," then consider padding them out with (quality) content that serves your audience. I think this is what Patrick is referring to with his suggestions (forgive me if I am misinterpreting you, Patrick!), and I think he is absolutely right. Another thing to consider is creating unique titles and descriptions for each page (I mention it in case you have overlooked it). By adding valuable, differentiated content to those pages you will be differentiating one from the other as well as providing value to your users. As Moz sees these changes it should stop flagging them as duplicates, and as Google re-indexes the pages it will begin to appreciate the differences between them and gradually index them as non-duplicate content. Hope that helps

    | unirmk
    0

  • Hi Andrew, Re. your comment: My main concern is that in the eyes of Google I'd be stripping a lot of content off the domain all at once, and then replacing it with these shell pages containing nothing (in terms of SEO) but meta, a headline, navigation links, and an iFrame. ... you are right to be concerned about this. It is going to look, as you say, like a very thin page with no value-add for anyone visiting, or Google. If the pages served no real purpose to Google, you could always robots them out, but what is being suggested isn't a particularly good idea if you are still wanting them to rank in the SERPs. -Andy

    | Andy.Drinkwater
    0

  • Chris, I really appreciate the time and effort you put into this thorough answer! Truly proud to be part of such a great community! You made some really great points and put some of my worries to ease. I will continue link building and making a few corrections. Thanks again for your helpful advice! Rachel

    | Rachel_J
    0

  • Thank you for this. Google Tag Manager seems to be the most effective tool, but will the Moz Tools pick up on this or will it keep flagging my pages as Duplicate Content and so on?

    | moon-boots
    0

  • Hi there, We had a similar situation with a client who had used a separate website name for his company (with a DA of 20), http://www.divvymaster.com/, and wanted to offer his product separately on another website, https://www.fairsplit.com/ We built the new website and 301 redirected all the old website's pages to their equivalent/parallel pages on the new website. As for the main domain, we used explanatory links with text to explain the migration to the user. This had the following impact: the old website's (divvymaster.com) DA fell from 20 to 9, while the new website's authority grew from 1 to around 20 in 2.5 months. We are getting traffic for almost all the keywords the earlier website ranked for, plus more visibility on search engines for new keywords for the new website. So, we consider that we were able to pass the authority of the website. Having said that, we also did some good organic link-building work for the new website. But I would recommend passing the authority to the right pages with 301s and informing users about the redirect on the right pages. Regards, Vijay
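    For reference, the page-to-page 301 setup described above might look like this on an Apache server. This is a hypothetical sketch; the domains, paths, and mapping are examples only, not the actual client's configuration:

    ```apache
    # .htaccess on the old domain: send each old page to its equivalent
    # on the new site, with a catch-all for anything unmapped.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
    RewriteRule ^services/valuation$ https://www.newdomain.com/how-it-works [R=301,L]
    RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]
    ```

    The specific rule before the catch-all is what gives you the "right page to right page" mapping rather than dumping everything on the new homepage.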

    | Vijay-Gaur
    0

  • So interestingly enough, and without trying to identify the people involved, I have found a local SEO company that has somehow acquired a completely unrelated domain, which supposedly used to have something to do with students' and kids' lives and was a resource of sorts back in the day. This site has FOLLOWED backlinks (with the student-resource context) from the United Nations, from cbs.com, and from a wide range of websites mostly referencing it as a student and kids' life resource. Given the look and feel of their site, they seem like a shabby SEO company with rather grey or dark SEO tactics. This company ranks on page 1 for "City + SEO" keyword combinations for three to four MAJOR cities in the US. I want to find out what you all think of this... Is this ok? How is this ok? Is this going to cause them to be flagged eventually, automatically, for a penalty or purge? Does Google have anything against this sort of thing at all? Thanks

    | TheSymmetran
    1

  • I don't believe that it "hurts" to put up a temporary site.  However, if it takes away from the time you'd spend getting your real site up, then that is a sacrifice. I would rather put up the first few pages of my real site a day or two early than get a temporary site up a week before. The only exception is that I would want something up ASAP if I knew that important people were looking for me.

    | EGOL
    0