Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi Christy, I'll watch it now - thanks for the response!

    Moz News | | Jacksons_Fencing
    1

  • That is correct, Robert. My question is about content. An example would be: I love golf | My girlfriend and I play golf every day | In the evening we have dinner at home |  etc....

    Intermediate & Advanced SEO | | seoanalytics
    0

  • Few battles are won by comparing your page against their page. The value of your website weighs very heavily in deciding who will win. If you want to win, attack by building one of the great websites in your industry.

    Intermediate & Advanced SEO | | EGOL
    0

  • Well, one thing I do know is that to make objects more accessible, you often need to re-code them (like the video player itself). This post is quite interesting, even though it's old (circa 2010): https://dev.opera.com/articles/more-accessible-html5-video-player/. In particular, the author considers WAI-ARIA, which makes web content more accessible for disabled people. You can see where the author starts getting into WAI-ARIA here. He's just doing very basic things, like making sure the player itself (the buttons and stuff for playback) can be operated. I'm sure, though, that there are ways to take this kind of stuff further.

    The problem is, that's a basic HTML5 video player. If your videos are embedded using an external video player (like the YouTube one, or this Brightcove thing), then often you have no opportunity to re-code the player. If you can't do that, then you can't make it more accessible than it is out of the box.

    If you're happy to supply extra content alongside the video to help search engines read it, then you'd be looking for the transcript property, which is part of the VideoObject schema type. Google does use video schema, but it doesn't explicitly state that it looks for the transcript in its primary documentation. On the official SEMrush blog, they seem to think transcript schema is 'fantastic' for SEO (Ctrl+F for "transcript" on the post). This relatively recent study seems to find that transcript schema doesn't really boost rankings all that much (though I have my suspicions that it could heighten relevance, if nothing else!).

    This page from the University of Washington is quite detailed in terms of creating more accessible videos. It seems that some support, at least, would be required from Brightcove: even if you managed to caption your video, on the Brightcove end their system would have to be capable of adding your captions to their web-live video files. The post also contains information on generating accessible transcripts.

    From the sounds of it, the best implementation for the transcript would be to have it visible on the page (in a place where a blind user's screen reader would hook into it and read it, as if it were standard content). You could then wrap it in transcript schema; for this one I'd probably use a microdata implementation rather than JSON-LD. After that you could do more research and see how much you could push into it from accessibility initiatives like WAI-ARIA captioning. I've tried to provide you with some resources, but I'm still not quite sure myself. Hopefully this will at least get you pointed in the right direction.
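    To make the microdata idea concrete, here is a minimal sketch of a VideoObject with an on-page transcript. All names, URLs and text are hypothetical placeholders; the transcript property itself is part of schema.org's VideoObject type:

    ```html
    <!-- Hypothetical sketch: VideoObject marked up with microdata.
         The visible transcript is wrapped in the transcript property,
         so screen readers and crawlers can both read it. -->
    <div itemscope itemtype="https://schema.org/VideoObject">
      <h2 itemprop="name">Example product demo</h2>
      <meta itemprop="uploadDate" content="2019-01-15">
      <meta itemprop="thumbnailUrl" content="https://example.com/thumb.jpg">
      <video src="https://example.com/demo.mp4" controls></video>
      <div itemprop="transcript">
        Hello and welcome to this demo. In this video we will look at...
      </div>
    </div>
    ```

    Keeping the transcript as ordinary visible content (rather than hidden metadata) is what serves both accessibility and search engines at once.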

    Intermediate & Advanced SEO | | effectdigital
    1

  • Thanks for the info! So what this means is that, although Analytics (where people typically analyse all their funnels, by porting their AdWords data into GA) can track that users are visiting pages, and (if you have set it up right) can tell where users came from (e.g. AdWords / Ads), Analytics will not be able to determine that open chats (leads) are occurring, and it will not be able to determine the value of any successful conversion.

    For that to happen, your chat plugin (and your AdWords account) would actually have to talk to Google Analytics. For example, if your chat plugin were coded to fire a confirmation message from the operator to the chat user (containing the amount they had paid and the fact they had converted), that information could be wrapped into events fired to Google Analytics via JavaScript. From there you could easily filter down to users from paid search (PPC / Ads) only and then just view the number of conversions, and the value ascribed to them.

    What we have identified is that your weak point is one of these three:

    • Your AdWords / Ads is not talking to GA properly (maybe you don't have Google Analytics? In which case, that would be your centre point where all the data needs to go, and you need to get it)
    • Your chat plugin is not sending data to GA, which could then be married (within Google Analytics) to your AdWords / PPC data
    • Both of the above at the same time

    So the steps I can see are:

    1. Set up Google Analytics (if you don't already have it)
    2. Have it properly integrated with your AdWords, so both talk to each other
    3. Make your chat plugin also talk to Google Analytics
    4. Finally, derive all your wonderful insights in the Google Analytics back-end

    For example, ZenDesk properly integrates with Google Analytics (see this post). Luckily your chat plugin (Chatra) also has this functionality (see here).

    One concession with my answer here: I haven't told you how to get AdWords data into Chatra, or Chatra data into AdWords. Sorry about that, but trust me when I say an Analytics integration (GA is free) will be better for you. Sorry I'm not exactly answering the question here, still doing my best :')

    You must confirm with Chatra exactly what data their Analytics integration will send to Google Analytics. For example, maybe the Chatra / GA integration only sends the number and length of chats, but not chat-based conversions (as you specified earlier: users convert on the chat, so that conversion data MUST come from Chatra or similar). If that's the case, you'd have a problem and would have to seriously consider other alternatives like ZenDesk or something else. For each plugin you think of trying, you'd have to email them the same question: what data can flow from their service to your Google Analytics? If you don't like what you hear, it's the WRONG plugin for you.

    Hope that helps
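    As a rough sketch of the kind of event described above (assuming the gtag.js flavour of Google Analytics; the event name and the buildChatConversionEvent helper are hypothetical, and the real values would have to come from your chat plugin's callback):

    ```javascript
    // Hypothetical helper: build the parameters for a chat-conversion event.
    // The values would come from the chat plugin when the operator confirms
    // that the user converted, and for what amount.
    function buildChatConversionEvent(amountPaid, currency) {
      return {
        event_category: 'chat',
        event_label: 'chat_conversion',
        value: amountPaid,
        currency: currency
      };
    }

    // In the browser, the chat plugin's "conversion confirmed" callback
    // could then fire the event to Google Analytics:
    // gtag('event', 'chat_conversion', buildChatConversionEvent(49.99, 'USD'));
    ```

    Once events like this arrive in GA, segmenting them by paid-search traffic is just a matter of filtering in the Analytics back-end.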

    Paid Search Marketing | | effectdigital
    0

  • Thanks for the suggestion. I should be able to do this, but if we need further help I will be sure to reach out to you.

    Behavior & Demographics | | Globalgraphics
    0

  • I think there is a good chance of an increase in total sales by combining the sites.  I vote for this because the two sites have diverse link profiles, and combining them will make a big gain in the link diversity and domain authority of the site that remains.  I would merge them with optimism rather than with fear.

    Intermediate & Advanced SEO | | EGOL
    0

  • The erroneous advice you got was that Google handles reconsideration requests within a few days. Usually it takes a few weeks, and in extreme situations it can take months. Usually when it takes months, it's because you have repeat-offended over the same issue. If you get a penalty for link spam and then do more link spam, each time you submit to be reconsidered they leave it longer and longer.

    It also comes down to Google's internal resources. The sad fact of the matter is that although losing Google does heavy damage to your site, losing your site doesn't do heavy damage to Google. If there are other matters Google is pulling focus to internally, it can take quite some time to 'be seen', as it were.

    It would be strange of Google to mis-apply a penalty of some kind. Usually if you have hacked content, either those pages get nerfed or your whole site gets nerfed. Having one page get nerfed which was not part of the assault is extremely unusual.

    One thing you can do is create a free account to query Google's Safe Browsing API: https://developers.google.com/safe-browsing/

    My next step would be to ascertain the URL of every page on your site that exists now or has existed within the past 12 months (just to be sure). You can get historic URLs out of the Wayback Machine, Google Analytics (by making a table that combines host-name and page / landing page; unfortunately they won't give you protocol, so hope that hasn't changed for you in the past year!) or Google Search Console. The live URLs, just crawl with Screaming Frog or similar. Once you have a complete list, get a developer to build a rough script that queries all your URLs against Google's Safe Browsing API. That would tell you whether Google still sees a problem, where it sees the problem, and whether it sees the problem on live or dead pages.

    When you have that to hand, you'll be in a much better position to know whether you still have an issue or whether you just haven't been seen by a Google rep yet. When they decline a reconsideration request, they usually do tell you.

    I think the Safe Browsing API, whilst free to access, is limited to 10k queries per day. Don't try to get clever and get around it if you are already having Google problems (you don't want MORE!).

    Another thing: it never hurts to link to an 'open' (link access required only) Google Doc (their version of Word) within your reconsideration note. In the Google Doc you can explain much more fully, even with screenshots, what the heck is goin' on!

    You need more information right now. Sorry I'm not giving you an instant solution, but I am telling you exactly what I'd do in your position.
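    As a starting point for that "rough script", here is a sketch that builds a request body for the Safe Browsing v4 Lookup API (threatMatches:find). The clientId value and the URL list are placeholders; the actual network call (commented out) requires your own free API key:

    ```javascript
    // Hypothetical sketch: build a Safe Browsing v4 lookup request body
    // for a list of URLs gathered from your crawl / Wayback / GSC audit.
    function buildLookupRequest(urls) {
      return {
        client: { clientId: 'my-site-audit', clientVersion: '1.0' },
        threatInfo: {
          threatTypes: ['MALWARE', 'SOCIAL_ENGINEERING', 'UNWANTED_SOFTWARE'],
          platformTypes: ['ANY_PLATFORM'],
          threatEntryTypes: ['URL'],
          threatEntries: urls.map((url) => ({ url }))
        }
      };
    }

    // To actually query (stay well under the daily quota):
    // const res = await fetch(
    //   'https://safebrowsing.googleapis.com/v4/threatMatches:find?key=' + API_KEY,
    //   { method: 'POST', body: JSON.stringify(buildLookupRequest(myUrls)) }
    // );
    // An empty response body means no threats were found for those URLs.
    ```

    Batching your URL list through something like this tells you, per URL, whether Google's safe-browsing systems still flag a problem.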

    Technical SEO Issues | | effectdigital
    0

  • A follow-up question - when should I be adding keywords to "Lists" vs. "Campaigns"? What are the pros and cons of each feature? The guide doesn't quite address these questions.

    Other Research Tools | | hppcseo
    0

  • It may do, or it may not. It may or may not impact duplicate content; it always impacts crawl allowance. I'm going to use trailing-slash URLs (a more common issue and consolidation feature) in my example, but it's equally applicable to stripping .html or non-resource (PDF, JPG, JS etc.) file extensions.

    Quite a lot of sites, even if they refuse to clean this up, will at least 'canonical' one URL to the other. That lets Google know that one version of the page is canonical and should receive relevant SEO traffic, and it avoids content-duplication-related penalties or algorithmic devaluations. There are two things it doesn't help Google out with:

    • It doesn't tell Google not to crawl both URLs (you might say the canonical tag does that, but keep in mind Google has to have already loaded both URLs to read both canonical tags, so... no)
    • It doesn't consolidate SEO authority to the same degree that 301 redirects do. Say one page has some nice backlinks and the other one does too; that ranking benefit won't all be consolidated onto one page. The canonical tag will make sure only one page ranks, but it won't gain the optimal benefit of the backlinks for both web pages (301s do a better job of that, generally)

    So as you can see, even if you avoid content duplication issues, there are other problems that could potentially arise. This being the case, it's best to consolidate your URL architecture at any and all levels. My preference is this logic in the .htaccess (via 301s):

    • Always force a trailing slash for pages (as they may have sub-pages, and can also be directories)
    • EXCEPT if the active URL is a file (e.g. somesite.com/some-folder/some-image.jpg), in which case do not force a trailing slash (files are never folders / directories)
    • But if the file extension is page-based rather than resource-based (e.g. .html), then strip the extension and finish with a trailing slash

    SEO is about avoiding risk. If there is conflicting information on a subject, pick the tried and tested (safe) method. Note that if you are on an MS / IIS server (rather than Linux / Apache) you may have to modify web.config instead of .htaccess.
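    For illustration only, a rough .htaccess sketch of the three rules above (assuming Apache mod_rewrite; a real deployment would also need query-string handling and site-specific exclusions, so test before using):

    ```apache
    RewriteEngine On

    # 1. Page-based extensions: strip .html and finish with a trailing slash
    RewriteRule ^(.+)\.html$ /$1/ [R=301,L]

    # 2. Real files (images, PDFs, scripts) are skipped by the condition below,
    #    so they never get a trailing slash forced onto them
    RewriteCond %{REQUEST_FILENAME} !-f

    # 3. Everything else that doesn't already end in a slash gets one
    RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
    ```

    Because these are 301s rather than canonical tags, they consolidate both the crawl and the link authority onto the single surviving URL.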

    Intermediate & Advanced SEO | | effectdigital
    0

  • There are several ways to do this; some are more accurate than others.

    • If you have access, on Google Analytics, to the site which contains the web page, you could filter your view down to one page / landing page and see when the specified page first got traffic (sessions / users). Note that if a page existed for a long time before it saw much usage, this wouldn't be very accurate.
    • If it's a WordPress site which you have access to, edit the page and check the published date and / or revision history. If it's a post of some kind then it may display its publishing date on the front-end without you even having to log in. Note that if some content has been migrated from a previous WordPress site and the publishing dates have not been updated, this may not be wholly accurate either.
    • You can see when the Wayback Machine first archived the specified URL. The Wayback Machine uses a crawler which is always discovering new pages, not necessarily on the date(s) they were created, so this method can't be trusted 100% either.

    In reality, even using the "inurl:" and "&as_qdr=y15" operators will only tell you when Google first saw a web page; it won't tell you how old the page is. Web pages do not record their age in their coding, so in a way your quest is impossible (if you want to be 100% accurate).
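    The Wayback Machine route can be scripted against the Internet Archive's public CDX server. A small sketch (the firstCaptureUrl helper name and the example page are my own; the endpoint and parameters are the CDX API's):

    ```javascript
    // Hypothetical sketch: build a CDX query asking the Wayback Machine
    // for only the earliest capture of a given URL.
    function firstCaptureUrl(pageUrl) {
      const params = new URLSearchParams({
        url: pageUrl,
        output: 'json',
        limit: '1'   // only the earliest capture
      });
      return 'https://web.archive.org/cdx/search/cdx?' + params.toString();
    }

    // Usage (network call left to you):
    // const rows = await (await fetch(firstCaptureUrl('example.com/about'))).json();
    // After the header row, the timestamp field (e.g. '20100315...') tells you
    // the page existed by that date at the latest - not when it was created.
    ```

    As the answer notes, this only bounds the page's age from one side: the first crawl date, not the true creation date.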

    On-Page / Site Optimization | | effectdigital
    1

  • Moz has a pretty in-depth article about it. https://moz.com/blog/an-introduction-to-google-tag-manager

    Inbound Marketing Industry | | HashtagHustler
    1