Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi Steven, It's very hard to find definitive information about this update since, to my understanding, Google has yet to even confirm that it happened, but from what I can gather it was another revision of on-site quality. Search Engine Land has about the most helpful article I've managed to find on it yet. To save you some legwork, one of Search Engine Land's previous posts on the topic (linked to in the one above) also provides links to two Google resources: "More guidance on building high-quality sites" and "Search Console Help: Create valuable content". Note that they're old links more related to Panda, but they do discuss what Google views as "quality", which is what this Phantom update is focused on. There's a bit of reading there, but hopefully it's helpful.

    Intermediate & Advanced SEO | | ChrisAshton
    0

  • Hi Scott, 1. That looks good to me! An additional factor you'd want to consider is how you're treating visits from the subdomains to the main site, and vice versa, in analytics - do you want those to be treated as referrals, or as part of the same session? - and configure accordingly. 2. If you've marked the subdomains "noindex, follow", Google will likely pass some link juice, but as usual in Moz Q&A the answer is "it depends". In this case, it depends on whether or not Google crawls the pages on the subdomain in the first place, and how closely related Google perceives the subdomains to be to the main domain. So the answer is "some, probably, but probably not as much as links from unrelated sites that aren't noindexed." 3. From your question, it sounds like you're pretty familiar with the subdomains-vs-subfolders conversation in SEO, so I won't go into it here. Again, you're going to want to be really intentional when it comes to tracking on these sites to make sure you're properly tracking traffic between them. This sounds like it could make a really interesting blog post once you've got it all set up!

    Intermediate & Advanced SEO | | RuthBurrReedy
    0
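To make point 2 above concrete, here's a rough Python sketch (a hypothetical helper, not a Moz or Google tool) that parses a page's robots meta tag so you can confirm a subdomain page is set to "noindex, follow" rather than "noindex, nofollow":

```python
import re

def robots_directives(html):
    """Return the directives from a <meta name="robots"> tag, lowercased."""
    match = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )
    if not match:
        return set()  # no robots meta: the default is index, follow
    return {d.strip().lower() for d in match.group(1).split(",")}

page = '<head><meta name="robots" content="noindex, follow"></head>'
directives = robots_directives(page)
# Link equity can still flow as long as "nofollow" is absent.
print("noindex" in directives and "nofollow" not in directives)  # True
```

Note the regex assumes the `name` attribute comes before `content`; for a real crawl you'd want a proper HTML parser, and you'd also check the `X-Robots-Tag` response header.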

  • Are you using a CMS, or an in-house solution? If it's a CMS, in many cases you should be able to configure it so that the two links are generated but the page itself isn't generated twice. Another option, if two pages must exist, would be to set a canonical on both pages pointing to the one main location for the content, while using pushState on the URL to move the browser onto the main path. Although, the more I think about that one, it may not be a 100% viable option.

    Technical SEO Issues | | RosemarieReed
    0
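If it helps to see the canonical idea concretely: both duplicate pages should declare the same canonical URL. A small Python sketch (the helper name and example markup are made up) that verifies this:

```python
import re

def canonical_url(html):
    """Pull the href out of a <link rel="canonical"> tag, if present."""
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )
    return match.group(1) if match else None

# Two generated pages serving the same content (hypothetical markup):
page_a = '<link rel="canonical" href="https://example.com/main-page">'
page_b = '<link rel="canonical" href="https://example.com/main-page">'
print(canonical_url(page_a) == canonical_url(page_b))  # True
```

The regex assumes `rel` comes before `href` in the tag; a production check should use a real HTML parser.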

  • If you use SEMrush you can find out what queries their site appears in the SERPs and at what ranking.

    Other Research Tools | | EGOL
    0

  • Yes, I use the "Related Keywords" tool frequently. Clicking on a related keyword to obtain "related keywords of related keywords" is also helpful. The most valuable keywords are obtained from the names that customers use when they call on the phone or when they write to us. That "language of the customer" often does not appear in keyword tools.

    Online Marketing Tools | | EGOL
    0

  • I see two possible problems here. Both have been pointed out already, but I'll address each. First, duplicate content from the same owner. If they have another site with the same intent ... same topic ... same reason for the site, you might want to merge them. I can't tell, as I don't speak the language, but if they are competing sites, you don't want to pour resources into two places. That's my recommendation to the site owner. Second, the bad links. It is possible that even links that have worked in the past for other sites might be hurting this site. Again, it's hard for me to tell, but if a link brings no value other than being a link, and no one ever uses it to get to the site, you run the risk of hurting the site with that link. Free directories are prime targets for Google. I am not sure how stringent Google Greece is on this yet, but this might be an issue. With both of those things said, if the links are hurting the site and you merge the two sites, you might hurt both by merging them. I'm sorry I can't be more explicit about the cause, but with some poor links in the past, some less-than-stellar links now, and a possible duplicate-intent site from the same owner, either or both of those could be the cause. If it were me, I'd start by merging the two sites. If the rankings drop for the well-ranking site, clean up all poor links to either domain. If it stays down, unmerge and just let the poorer-ranking domain die. Your other option is to tell the client to focus on the site that is doing well and drop this one. But be careful with the link building; you don't want the same thing to happen to the other site, and that's possible.

    Moz Tools | | katemorris
    0

  • Hi there. It's more of a technical problem, so rather than looking for an answer on a forum, please email help@moz.com with an explanation of your issue and your account credentials - they'll look into it for you.

    Other Research Tools | | DmitriiK
    0

  • Oh, it's 2015 and you're still using "timthumb.php"? https://blog.sucuri.net/2014/06/timthumb-webshot-code-execution-exploit-0-day.html https://www.binarymoon.co.uk/2014/07/dont-use-timthumb-instead/ https://www.binarymoon.co.uk/2014/09/timthumb-end-life/ Also, I think the lack of alt text for your images can be terrible for your page. If you're still looking for a modern gallery, you can use the PhotoSwipe script; it's SEO- and semantics-friendly.

    Web Design | | Mobilio
    0

  • Hi Jay! Did you ever get your data? Or did these fine folks help?

    Intermediate & Advanced SEO | | MattRoney
    0

  • I do a lot of link audits and most of the sites that I am working on do not have a penalty. Some have been suppressed by Penguin and some are sites that are trying to avoid a future Penguin hit because they've had low-quality link building done for them in the past. When a site is affected by Penguin, the algorithm can act like an anchor that pulls the site down and keeps it from ranking at its full potential. But it's often difficult to know whether Penguin is affecting you or not, because Google doesn't give you any notification or warning of the fact. As such, if there are possibly low-quality links present then yes, by all means, disavow! I want to caution you, though, not to rely blindly on Link Detox data. I have reviewed many disavow files that have been created after using this tool and they are horrendously inaccurate. I've seen automated link-auditing tools recommend disavowing fantastic, naturally earned press links from highly authoritative sites. And, what usually happens more often is that the tool classifies many unnatural links as good ones. I think that these tools can be helpful when it comes to putting your links together in a manageable form, but you absolutely must look at each link individually. You can probably go straight to disavow for some links, such as ones that come from sites that obviously exist just for links, like freelinkdirectory.com or something like that. But in most cases a critical human eye needs to look at each link and determine whether it exists just for SEO purposes or has a legitimate purpose outside of SEO. So, to answer your questions: 1) Yes, go ahead and disavow even if there is no penalty. But as mentioned above, manually check these links first before disavowing. And don't worry that filing a disavow is going to put you on Google's radar; that's not true. 2) Yes, a 301 redirect passes link signals to the redirected site.
So, if you purchased a site that has spammy links pointing to it and redirected that site to your main site, you will have spammy links pointing to the main site. You'll need to file a disavow on the main site that contains the domains that are linking to the site you purchased and redirected. With that said, if there are a LOT of spammy links, you may want to assess risk vs. reward. It's possible you won't be able to find and disavow all of the unnatural links and could invite Penguin issues on the new site. If the domain is one that could get a lot of type-in traffic and you really want to redirect it, there are ways you can redirect without passing on any link equity, such as redirecting through an intermediary page that is blocked by robots.txt. If the links you are asking about are now pointing at 404 pages then they are essentially removed and you don't need to disavow them. Hope that helps! Marie

    Link Building | | MarieHaynes
    2
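Once you've manually reviewed the links, assembling the disavow file itself is mechanical. A minimal Python sketch (the helper name and example domains are mine, but the `# comment` / `domain:` / bare-URL line format is what Google's disavow tool expects):

```python
def build_disavow(bad_domains, bad_urls=()):
    """Build disavow-file text: '#' comment lines, 'domain:' lines for
    whole domains, and plain URLs for individual links."""
    lines = ["# Disavow file - every entry reviewed manually before inclusion"]
    lines += [f"domain:{d}" for d in sorted(set(bad_domains))]
    lines += sorted(set(bad_urls))
    return "\n".join(lines) + "\n"

print(build_disavow(
    ["freelinkdirectory.com"],
    ["http://spam.example/page-with-one-bad-link"],
))
```

Generating the file is the easy part; the human review Marie describes is what keeps good links out of it.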

  • Keyword research will be key, and that's really where you want to focus. Find related terms, long-tail terms, and opportunities you may not have considered in the past. Don't get too held up on the head terms to start; optimize for long tails (with head terms in them), and once those get traction you'll see progress on the head terms. You really need a process/strategy around keyword research. It's not as simple as going to Keyword Planner and picking a few; competitive analysis and proper research are key to the whole process. When I talked about page depth: there are only so many clicks a user will take on a site before they move on. Basically, what I meant is to organize the architecture of the site (navigation) so that the user needs as few clicks as possible to reach the products. Fewer clicks (hops) for the user also means fewer hops for Googlebot. Fewer hops between products means more quality pages indexed, more quality pages indexed means a wider spread of keywords to be found on, and more keyword rankings = more traffic. Optimize for the customer first, since that's how you make money. Make the site easy to navigate, and you'll see a lot of benefit from that.

    Intermediate & Advanced SEO | | Eric_Rohrback
    1
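The click-depth idea above is easy to measure. Here's a rough Python sketch (the site structure is invented) that walks the internal-link graph breadth-first from the homepage, so each page's depth is the fewest clicks needed to reach it:

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS over an internal-link graph: {page: [pages it links to]}.
    Returns the minimum number of clicks from the homepage to each page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, ()):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/": ["/category", "/about"],
    "/category": ["/product-a", "/product-b"],
}
print(click_depths(site))  # products end up 2 clicks from the homepage
```

Pages that never show up in the result are unreachable from the homepage, which is exactly the architecture problem described above.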

  • Hi Laurean, are you able to share the site that you're looking at? It would be much easier to give you an answer if I could take a look. Alternatively, if you know any similar sites that have the same issue, I can look at that example to begin with. Craig

    Intermediate & Advanced SEO | | CraigBradford
    1

  • You could launch it on the .com.au extension, but I fear that the geotargeting resulting from the ccTLD is a much stronger signal for Google than the hreflang tag. Not sure if this link is still valid, but it states: Q: Does "rel alternate hreflang" replace geotargeting? A: No. This link element provides a connection between individual URLs, and only allows Google to "swap out" the URLs from your site currently shown in the search results with ones that are more relevant to the user. It does not affect ranking, as geotargeting would. Check this video from Matt Cutts (2013) on how Google deals with ccTLDs: https://www.youtube.com/watch?v=yJqZIH_0Ars If your .com domain is not available, see whether one of these generic TLDs is available for your domain, and redirect to the .com as soon as you can. It's not optimal, but I honestly think that the .com.au/us/ solution is not going to work. Dirk

    Intermediate & Advanced SEO | | DirkC
    0
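For what it's worth, if you do go the hreflang route, the annotations must be reciprocal (if page A lists B as an alternate, B must list A back), and missing return tags are a common silent failure. A quick Python sketch with made-up URLs to check that:

```python
def missing_return_tags(hreflang_map):
    """hreflang_map: {url: {lang_code: alternate_url}}. Returns (url,
    alternate) pairs where the alternate doesn't list the url back."""
    problems = []
    for url, alternates in hreflang_map.items():
        for alt in alternates.values():
            if url not in hreflang_map.get(alt, {}).values():
                problems.append((url, alt))
    return problems

pages = {
    "https://example.com/": {
        "en-us": "https://example.com/",
        "en-au": "https://example.com.au/",
    },
    "https://example.com.au/": {
        "en-au": "https://example.com.au/",
        "en-us": "https://example.com/",
    },
}
print(missing_return_tags(pages))  # [] means every alternate links back
```

Remember, though, per the quote above: even perfect hreflang only swaps URLs in the SERP; it doesn't override the ccTLD's geotargeting.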

  • Hi Gayane, We have a long recent thread on this same topic here, https://moz.com/community/q/does-multiple-sites-that-relate-to-one-company-hurt-seo , which you might like to check out.

    Local Website Optimization | | MiriamEllis
    0

  • Thanks for your input Gianluca, The ecommerce site ( http://www.fiberscope.net/ ) doesn't seem big compared to the bigger fish. What do you think, are we on the right path with the current structure?

    Intermediate & Advanced SEO | | Meditinc.com
    0

  • Ah, much clearer answer.  Thank you.  Saves me some work.

    Other Questions | | mike_sif
    0

  • Yes, you absolutely should add unique text to each of these pages, not only so that they aren't flagged as duplicates, but because it's always an SEO benefit to have more good content. If you don't have the capacity to write such content, however, you may want to remove them from indexation. The reason these pages are being flagged as duplicates is that Google isn't parsing these PDFs, which means that all Google and other search engines see is a page with no content and an iframe. It's also pertinent to note that Moz will flag anything with more than 90% overlap as a duplicate. I hope this helps!

    On-Page / Site Optimization | | Lumina
    0

  • Also, for the future, I recommend using two plugins for WordPress: Redirection by John Godley: This plugin lets you set up redirects within the WP dashboard, which means you can keep track of what's being redirected where within your WP administration. Yoast SEO by Team Yoast: This plugin lets you add and edit a lot of different SEO-related items, gives you tips regarding keywords you want to focus on, and a lot more. It's incredibly useful. To be clear, I don't work for or with either of these developers/teams, but they're great plugins that sound as though they'd be especially helpful for you.

    Content & Blogging | | Lumina
    0

  • In theory, yes, but here's how it looks in practice: http://www.stateofdigital.com/google-broken/ Just check the examples with "bandwagon" or "carro". I'm not sure how this happens, but I often see similar results since our Bulgarian language is close to Russian, so in Bulgarian SERPs I sometimes see Russian results. Of course the result is relevant, just from a different country. I also have a site related to the Greek language, but that site gets visits from Cyprus and other worldwide locations because of its keywords. I can attach Search Console and Analytics screenshots if you don't believe it.

    On-Page / Site Optimization | | Mobilio
    0