Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Yup, you can take the same approach with Screaming Frog. Run it in List Mode, which lets you upload the list of URLs to crawl, and set up a custom extraction to pull out the h3s (Configuration > Custom > Extraction). Paul

    Intermediate & Advanced SEO | | ThompsonPaul
    0
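The extraction described above can also be reproduced with a short script for quick spot checks. The sketch below uses only the Python standard library; `extract_h3s` and the parser class are illustrative names, not part of any Moz or Screaming Frog API:

```python
from html.parser import HTMLParser

class H3Extractor(HTMLParser):
    """Collects the text content of every <h3> element on a page."""
    def __init__(self):
        super().__init__()
        self.in_h3 = False
        self.h3s = []

    def handle_starttag(self, tag, attrs):
        if tag == "h3":
            self.in_h3 = True
            self.h3s.append("")

    def handle_endtag(self, tag):
        if tag == "h3":
            self.in_h3 = False

    def handle_data(self, data):
        if self.in_h3:
            # Accumulate text chunks belonging to the current h3
            self.h3s[-1] += data

def extract_h3s(html):
    """Return the stripped text of each h3 in an HTML document."""
    parser = H3Extractor()
    parser.feed(html)
    return [h.strip() for h in parser.h3s]
```

You would loop over your URL list, fetch each page, and pass the HTML to `extract_h3s`; Screaming Frog's built-in extraction remains the more convenient option for large crawls.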

  • My concern is that page 1 and page 2 are redundant. They are only redundant if they are filled with identical content. Content about brands can be sales information or information of many other types: how to select among their models, how to use them, how to repair them, what accessories are available, and many, many other types of information, each different, each targeting a different searcher, each capable of stimulating conversion, each a candidate for producing links. Entire websites can be made from one brand, one product line, one model... potential customers, new buyers, long-time owners, service providers, journalists, and many other types of people love this type of website.

    Technical SEO Issues | | EGOL
    1

  • Thanks for your thoughtful response. As you point out, there are many different business models, and what might be best "general" practice might not quite fit every model. Moz Q&A is great for this type of discussion.

    Link Building | | EGOL
    0

  • Targeting the same keywords with an additional site could certainly impact your existing rankings. You'll essentially be competing against yourself. But it won't make any difference whether the second site is on an addon domain in your primary hosting or on a different host/IP address. Google has so many ways of knowing that two sites are related that go far beyond what IP they use. There can be instances where a second site for the same topics can be necessary/effective, but you'll want to be really sure that's the best approach as opposed to adjusting your existing site to accommodate whatever it is you're trying to accomplish. You're literally doubling your workload and competing against yourself in the process. What's the purpose of the second site? To go after a completely different market segment? Paul

    Technical SEO Issues | | ThompsonPaul
    0

  • Hi, I'm dealing with ReactJS sites on a daily basis and luckily haven't seen any big issues. The most common issue I do come across is that Google isn't able to view the content on the page. The usual starting point is the Fetch and Render tool in Google Search Console; from there, figure out if the page is being rendered correctly. If not, you'll likely need to fix that first before moving on to other areas. Martijn.

    Intermediate & Advanced SEO | | Martijn_Scheijbeler
    0

  • Definitely wise in that case to be specific and use "Washington D.C." and "the city of Washington" in your content to specify to users (and Google) that your page is about the city, not the state.

    Keyword Research | | randfish
    0

  • Hi there! Sorry for any confusion about this. The score within your listing report is representative of what our system observes at this exact moment. So, it is a snapshot in time of how your listing distribution looks. This means that if, for example, one of our partners had an API outage at the time which caused them to be unable to report your listing's status to us, then that listing would appear as blank within your listing report and be reflected as such in the snapshot score. If you are managing a location through a Moz Local subscription, our ability to detect a given listing does not affect our listing submission service. We're constantly listening to our partner networks to detect those best matches, and as soon as we are able to find it, the listing report will reflect that. Generally, if you know you have a listing on a particular site, you don't need to worry about what is detectable. If this is not a managed listing through Moz Local, you will want to make sure that your Hotfrog listing exactly matches what appears in the listing report. That will give you the best chance of being found! Please let us know if you have any other questions! You can also reach out to us at help@moz.com if you would like help digging further into your account.

    Moz Local | | moz_support
    1

  • Hi @ThompsonPaul, I've just been reading into this, as just about everyone in my field uses third-party reviews and marks them up with snippets. I'm under the impression that Google is fine with the use of third-party reviews. In the Google guidelines, it says: "Only include critic reviews that have been directly produced by your site, not reviews from third-party sites or syndicated reviews." I believe that if you are not using a critic review, you are safe to use third-party reviews. Refer to the bottom of the guidelines for the Review Snippet Guidelines rather than the Critic Review Guidelines, especially because our team has seen third-party reviews approved by Google. When we originally put in the markup, we got a penalty for another reason (a developer incorrectly released it on all pages). Google then reviewed our markup, approved the use of third-party reviews, and removed our penalty. I'd love to know your thoughts! Let me know

    Reviews and Ratings | | Sally94
    0
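For anyone comparing the two sets of guidelines, a first-party review snippet is ordinarily expressed as JSON-LD along these lines; the product name and rating values below are placeholders, not taken from the discussion above:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
```

Whether third-party review data may be fed into markup like this is exactly the point debated above, so treat this only as a shape reference, not a policy endorsement.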

  • @Dave, We are having the same issue. We are getting our HTTP and HTTPS crawled but nothing else. Google is able to crawl our site and I see HTML links in the source code. Any idea why we'd be experiencing this?

    Other Research Tools | | mvnmarketing
    0

  • I know this is old, but I was also wondering this same thing. Our specific use case is that we would love to be able to run an on-page grader via Moz on our staging environment before going live, to make sure we are not degrading on-page SEO for any specific keywords when changing things around to new designs.

    Feature Requests | | hcaruso4
    0

  • Hi Brooks, thanks for your reply; yes, it was helpful. I'm not sure if I made it clear above, but this is a client's website; they are based in the Lake District and operate from there. Their primary customers could come from anywhere in the UK, searching for their service in the Lake District, as well as local people searching for that service. So local searchers would, I guess, just use "tipi camping", and everybody else, I'd hope, would search for "Lake District tipi camping". When optimising the site content, I guess I would need to target "Lake District tipi camping" in the content and title tags etc., presuming this would automatically take care of the local searches too. It was just the low search volume that was somewhat concerning me. I have also been focusing my efforts on ensuring a good local presence: citations, reviews, Google My Business etc.

    Intermediate & Advanced SEO | | Bengo-99
    0

  • Actually, geo-based IP redirects are still a very bad idea from a user and bot perspective. While Google has said it is testing crawling from other areas, they still primarily crawl from the US. If you do geo-based redirects, they will only ever see the US content. People travel. Assuming a user should only see a certain set of content based on their physical location is assuming too much. Use case in the consumer field: while attending a friend's wedding in London, I could not get to the US version of a site where I wanted to buy furniture to be delivered in a few weeks. Use case in business: users travel for business all the time. If they are visiting a headquarters in another country but researching a topic for use in their home country, they might be seeing the "wrong" content. Rather than assuming, use IP detection to ask the user to set their location: "We see you are in the UK, do you want to set that as your preferred location?" Once they choose their location, a cookie is set and that is all that user sees from then on out, until they change that setting in the footer or in their account.

    Intermediate & Advanced SEO | | katemorris
    0
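The ask-then-remember pattern described above boils down to a small piece of decision logic. This is a hedged sketch; the function name, return values, and cookie handling are illustrative only, not from any specific framework:

```python
def location_banner_action(detected_country, preferred_cookie):
    """Decide how to treat a visitor: never hard-redirect on IP,
    suggest a location once, then honor the stored preference.

    detected_country: ISO country code from IP geolocation, or None.
    preferred_cookie: the site's stored location preference, or None.
    """
    if preferred_cookie:
        # The user already chose: serve that locale, show no banner.
        return ("serve", preferred_cookie)
    if detected_country:
        # First visit with a geo signal: ask, don't force.
        return ("ask", detected_country)
    # No signal at all (e.g. a crawler from an unknown IP): serve default.
    return ("serve", "default")
```

Because the logic only ever suggests, a traveling user (or Googlebot crawling from US IPs) still sees the page they requested, plus at most a banner.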

  • Hi Andrew Here are a few things to check or rule out: Are those pages accessible to be crawled (not blocked with robots.txt etc.)? Are they also internally linked? (i.e., crawl with Screaming Frog, starting at the homepage, and see if they turn up.) Is the page actually indexed (search the URL in Google) but just not showing up in Search Console? How long are you waiting before resubmitting? Also, does it literally get halfway down the list, or do you mean 50% are not indexed? Overall, I would just submit the sitemap once; you don't need to keep resubmitting. I would rather do some cross-checks to make sure the URL is accessible (crawlable) and maybe even indexed already, just not showing in the report. Usually, there's some other issue with the URL besides a sitemap issue, and like you mentioned, I'm not sure how long you're waiting, but it can indeed take weeks for them to show up.

    Search Engine Trends | | evolvingSEO
    0
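The first item on the checklist above (robots.txt blocking) is easy to verify programmatically. Here's a minimal sketch using Python's stdlib `urllib.robotparser`; the function name and sample rules are illustrative:

```python
from urllib import robotparser

def is_crawlable(robots_txt, url, user_agent="Googlebot"):
    """Return True if robots_txt (the file's raw text) allows
    user_agent to fetch url."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, url)
```

Running this over the sitemap URLs before resubmitting flags any URL that can't be crawled, which would explain why it never shows up as indexed via the sitemap.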

  • Here are a few things that many people do not understand. The date of posting does not indicate who owns the content. The date that Google finds the content does not indicate who posted it first or who owns it. Ownership is independent of date of posting and date of Google discovery. Google does not always grant best rankings to "who they discovered first". The rankings often go to "who is the most powerful". If you file a DMCA against someone who has documented permission to post the content and they decide to sue you for having that content taken down, you are probably going to lose, and you might have to pay more than you expect in damages and more than you expect in attorney fees. The good news is that legal advice on copyright often costs a lot less than you expect and a Hell of a lot less than getting sued. Know what you are doing and the potential consequences before filing a DMCA. There are two ways to get the canonical applied. A) The webmaster of the website that is publishing the content must insert a canonical tag into the <head> of the html of the page. It should read like this: <link rel="canonical" href="http://www.example.com/original-page/" /> B) The webmaster of the website that is publishing the content can apply rel=canonical using .htaccess.

    Technical SEO Issues | | EGOL
    0
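For option B above, the usual mechanism is an HTTP `Link` header added via .htaccess (Apache's mod_headers). The file name and target URL below are placeholders; adjust them to the actual original page:

```apache
# Send a rel="canonical" HTTP header for a non-HTML file that
# republishes content whose original lives at another URL.
# Requires mod_headers to be enabled.
<Files "syndicated-article.pdf">
  Header add Link '<https://www.example.com/original-page/>; rel="canonical"'
</Files>
```

This is equivalent to the in-page tag from option A, but it works for resources like PDFs where you can't edit any HTML.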