Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Best Practices to Design Site Mock Up Using WordPress Rather than Wireframes?
If your developer is asking you to select a theme, it sounds like he wants to work directly on a theme because he's going to create a child theme. By selecting the parent theme first, the structure is already there and he can make some easier CSS tweaks. If I were you, I would step back and determine whether you are paying him to create a custom theme for you, or just to create a child theme. There are pros and cons to both.

The main pro of a child theme is that the parent theme should continue to receive updates, so your theme is not frozen in time; as accessibility, design, and SEO enhancements go into the parent theme, those will continue to apply to your site. The main advantage of a custom theme is the lightweight code base - your site should load faster because it only contains the elements you need, not a bunch of extras you may never use. Depending on how it's coded, a custom theme may also be better for SEO - again, because there's simply less code weighing things down.

If your contract with the developer does not state whether he'll create a brand-new custom theme or just a child theme, I'd take the time to have a quick conversation and get both of your expectations on the table so you can determine what will work best in this situation. If it turns out that a child theme is the best option for you at this time (perhaps due to budget, or a desire to have continual updates without having to pay someone to continually update a custom theme), then I'd have no qualms allowing the developer to work directly in WP on the child theme. It's easier to adjust things on the fly and show a client an actual prototype where they can resize the browser and see all the responsive sizes, and it will save time all around. However, if a custom theme is the way to go, I would ask whether he is giving you a truly custom theme or working from a predetermined framework or parent theme and only making tweaks to a child theme.
That really is the only reason I can think of that a dev would insist on designing in the code itself rather than mocking things up.
| WebElaine0 -
Trailing Slashes on URLs
Have you seen this related question? You can edit your .htaccess file to remove the trailing slash everywhere. Or, if you want to enforce the trailing slash and you're using Yoast, you can do that with one check of a checkbox. Either should be fine, but I think you have the right idea planning to remove all trailing slashes since that's how external links reference your site.
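If you go the .htaccess route, a minimal sketch of the remove-the-trailing-slash rule might look like the following (this assumes Apache with mod_rewrite enabled; the directory check keeps URLs for real folders working):

```apache
# Hypothetical sketch: 301-redirect any URL ending in a trailing
# slash to the slash-less version, but leave real directories alone.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [R=301,L]
```

Test it on a staging copy first - a mistake in a sitewide rewrite rule can redirect pages you didn't intend to touch.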
| WebElaine0 -
Posting same content to different high authority websites
Hi Jonny, Are you talking about posting content with small differences to two different domains? If so, why not use hreflang? Granted, it doesn't make sense for every website, but if they're essentially the same site (I don't know whether they are or not), it would help you quite a bit - allowing you to post the content in, say, English on one and German on the other. Is that what you're trying to do? When you say the backlinks are localized, do you mean a different TLD, or the same domain and TLD? If you're talking about the same domain and doing something like example.com/Kentucky/shoes while wanting content on another domain like example.com/Annapolis/shoes, you would most likely find that Google picks one of the two and ranks it. However, if they are outside of the United States, please look into using hreflang. If you could give me some example URLs (just put "example" instead of the actual domain, or "example1" and "example2" if they're different domains), I can give you a much clearer answer. Also, if you could describe the content differences you used to localize the content, that would make things a lot clearer as well. Hope this was of help to you, Thomas
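For reference, a minimal hreflang sketch for the two-domain case (the domains and paths here are placeholders, not your real URLs) - each page variant annotates every language version, including itself, in its head section:

```html
<!-- Hypothetical example: English page on domain 1, German page on domain 2.
     The same set of tags goes on BOTH pages. -->
<link rel="alternate" hreflang="en" href="https://example1.com/shoes/" />
<link rel="alternate" hreflang="de" href="https://example2.com/schuhe/" />
<link rel="alternate" hreflang="x-default" href="https://example1.com/shoes/" />
```

Note that cross-domain hreflang only works when both sides carry matching return annotations; if one page omits the other, Google ignores the pair.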
| BlueprintMarketing0 -
Hyphens in Keyword
Hi, John Mueller said it depends: "sometimes they're seen as reciprocal synonyms, sometimes not." So for some keywords Google will show the same results with and without a hyphen, and for others it won't. Also read this: https://www.seroundtable.com/google-hyphens-in-search-queries-23965.html Hope this helps. Thanks
| Alick3000 -
Will link juice still be passed if you have the same links in multiple outreach articles?
OK, you said link scheme and that the text varies slightly - I'm sorry if I misunderstood your question. Do the pages linked in B or C have a link to A? If that's the case, then you are in trouble. There is no way to guarantee there will be no harm, because Google keeps the whole history of every site, so they can link things internally from that history and somehow penalize sites. From my point of view it can look a little suspicious to Google. I'd space the blog posts a few weeks apart so it looks less unnatural. And also, remember not to use exact-match anchor text in every article, and don't reuse the same anchor text. Hope it helps. GR
| GastonRiera0 -
AMP for Online Forums/Communities
Hi Patrick, In my opinion, implementing AMP should be considered if some (or all) of these scenarios apply: mobile traffic from Google is a really big percentage of your traffic; site speed is not good and AMP delivers better improvements for less effort; or revenue is being hurt by slow page speed. Don't make the effort just because other sites like yours have AMP. Also, remember the last part of Eric's article: make the AMP page correct, with the right user experience and no errors. I don't know if you've read it, but a few weeks ago there was some news about the AMP project saying it will improve how its URLs are shown: Improving URLs for AMP pages - AMPproject.org. Hope it helps. Best of luck. GR.
| GastonRiera0 -
Is anyone seeing excessive ranking fluctuations?
SEMrush Position Tracking is showing a lot of false flux lately, for a few reasons:

1) Sometimes it fails to identify your position in the SERPs and reports it as "not in top 100." If you click to see the cached page it used to obtain the rankings, your domain is there.

2) Google has been inserting and removing a link to the image results ("Images for [your query]") at the top of many SERPs. When present, SEMrush counts this link as position #1, so every result below it is pushed down one position; when absent, everything moves back up. You don't notice this much in your visibility score if you are on the second page, but if you have keywords ranking in the top three or four, it can make a huge change in your visibility score.

3) Google does seem to be experimenting a bit and even losing track of a few pages for the keywords they typically rank for.

So yes, there is a little more flux in Google, but the number changes in many of the tools are the result of contaminated data and bad data collection.
| EGOL1 -
Why is our noindex tag not working?
Hi Eddy, Edit: this was already answered before I could post my reply, but I've left the example. The issue with the meta robots tag is that you are using curly quotation marks around "robots" and "noindex". You have: <meta name=“robots” content=“noindex”/> Instead of: <meta name="robots" content="noindex"/> Swapping the curly quotes for straight quotes will fix your issue. Cheers, David
| davebuts0 -
Volatile SERPs?
We're tracking 5 phrases. Two have improved and two are static, but this free-falling one is our most important. They are very deficient in backlinks. The competition started years ago, and even the previous SEO guy, according to Ahrefs, got only 36 links in a whole year. We can't beat the competition on total links, but I can beat them on quality. I'm part of the local blogging community and have done a ton of work with local non-profits. Even so, it seems more volatile than it should be. One competitor jumped 8 positions even though their top link is from a parked-domain page advertising porn sites - not a single quality link according to Ahrefs. Then I saw this thread on WebmasterWorld discussing some glitchiness and wondered if anyone else had seen it. https://www.webmasterworld.com/google/4880306.htm
| julie-getonthemap0 -
Navigation Menu - What's Too Much?
Reviewing analytics and running usability tests are the two best ways to decide how much to put in each navigation system. There are several forms of usability testing you could use to determine the best way to organize your website and label each page. TreeJack is a service that lets you try out different navigation menus to make sure people can find what they're looking for. Card sorting gives people a set number of cards to group, physically or virtually, into categories, which you can then use in your navigation. You can even create prototypes with a tool like Axure or Balsamiq and have people try a few different options to see which works best.

If you don't have the time or budget for usability testing, looking at analytics is second best. Things to look for: What content is the most visited on your website? Are people getting there by navigating through your website, or are most of them coming directly from organic search to those key pages? How long do people spend on particular pages? If some pages have very low time on page, it's a good idea to shorten the navigation path - you can either deep-link to those pages in sitewide navigation, or add smaller nav menus within, say, a sidebar or a CTA button in that page's content, which gets people from that page to a deeper page in one click instead of drilling down through several links one at a time.

Another great place to look: if you're tracking site search, see what people are searching for and which pages they're searching from the most. If 75% of people who visit the homepage search for one of three terms, then put prominent featured sections for those three terms front and center to help them get there.

Also take note of the specific keywords people search with and use those as your navigational labels - that can be even more helpful than simplifying the hierarchy, if you name things the way people use them naturally. In my personal experience it's best for SEO as well as for users to stick to the old no-more-than-100-links-per-page rule. If you provide too many options, people just get overwhelmed and don't know what to pick. So my own rule of thumb is to link to about 5 top-level pages in my sitewide header navigation, with no more than 4 to 5 sublinks under each, and leave it at that. But I always make it very, very easy to drill down deeper - if the site is 4 or 5 levels deep, those 4th and 5th levels are accessible directly from the 2nd and 3rd, so visitors don't have to click 5 times to get down 5 levels; they can hit the homepage, go to a 2nd-level page, and from there go straight to a 5th-level page if that's what they're looking for.
| WebElaine0 -
Domain authority a better metric than referring domain count?
DA is better. 1. The number of referring domains is one of the metrics used to calculate Domain Authority. 2. My website can have 3 times more referring domains, but will that make my website better if those domains are trash with 0 trust?
| Igor.Go0 -
User Intent - Office Chairs & Content Writing
Yup, you can take the same approach with SF. You can run it in List Mode, which lets you upload the list of URLs to crawl, and you can set up an Extraction to separate out the h3s (Configuration > Custom > Extraction). Paul
| ThompsonPaul0 -
I've screwed up. Domain pointers I forgot about. Think I am getting dinged by Google.
Hi Doug, If you have duplicate content, you could add a cross-domain canonical on all the pages from site 2 pointing to site 1. Then, when site 2 has dropped away, just 301 everything. That means you'd still get direct traffic to site 2, but Google would rank the main site 1 and drop site 2, because all the canonicals reference site 1. You just put the 301s in the .htaccess file. Personally, I wouldn't do it this way - I'd just make sure all the content was on site 1 and then 301 - but I understand you might be nervous. Regards, Nigel
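When the time comes for the 301 step, a hypothetical .htaccess sketch for site 2 might look like this (assuming Apache with mod_rewrite; the domains are placeholders for your actual sites):

```apache
# Hypothetical sketch: permanently redirect every URL on site 2
# to the same path on site 1, preserving the path and query string.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?site2\.example$ [NC]
RewriteRule ^(.*)$ https://site1.example/$1 [R=301,L]
```

A path-preserving rule like this is usually preferable to redirecting everything to the homepage, since each old URL passes its signals to its direct equivalent.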
| Nigel_Carr0 -
Same server for different client sites?
Thinking about this further: Wix, for example, would have multiple sites on one server. The same underlying code runs a Wix website, but the content is different. This is kind of like the scenario we have, although obviously we're not as big as Wix and we'll have fewer sites on the same server. But that's the scenario - the same underlying CMS that clients use to 'build' their site, so in that sense some of the code/framework is the same, but each client adds their own content. There's no way around the fact that the code is the same, but the content is different - so that should be OK, right?!
| Go-Auction0 -
React.js Single Page Application Not Indexing
Hi, I'm dealing with React.js sites on a daily basis and luckily haven't seen any big issues. The most common issue I come across is that Google isn't able to view the content on the page. The usual starting point is the Fetch and Render tool in Google Search Console: use it to figure out whether the page is being rendered correctly. If not, the next step is likely fixing that first before moving on to other areas. Martijn.
| Martijn_Scheijbeler0 -
Does this matter? spammy image links eg: sites loading our images on their spammy domains
I also just realized that these are not just images being hotlinked. It seems that on each page with the hotlinked images there is also an actual link, with an anchor text of ".", being used.
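For background, hotlinking of this kind is often blocked with referrer-based rules in .htaccess - a hypothetical sketch (assuming Apache with mod_rewrite; the domain is a placeholder), which returns a 403 for image requests referred from other sites while still allowing empty referrers:

```apache
# Hypothetical hotlink-protection sketch: forbid image requests whose
# Referer is set but does not match your own domain.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yoursite\.example/ [NC]
RewriteRule \.(jpe?g|png|gif|webp)$ - [F,NC]
```

This stops the images from loading on the spammy pages; it does nothing about the "." anchor links, which would need a disavow or removal request if they're a concern.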
| plahpoy0 -
Keywords and keyword traffic
Hi Brooks, Thanks for your reply - yes, helpful, thanks. Not sure if I made it clear above, but this is a client's website; they are based in the Lake District and operate from there. Their primary customers could be anywhere in the whole of the UK searching for their service in the Lake District, as well as local people searching for that service. So local searchers would, I guess, just use "tipi camping", and everybody else, I'd hope, would search for "Lake District tipi camping". When optimising the site content, I guess I would need to target "Lake District tipi camping" in the content and title tags etc., presuming this would automatically take care of the local searches too. It was just the low search volume that was somewhat concerning me. I have also been focusing my efforts on ensuring a good local presence: citations, reviews, Google My Business etc.
| Bengo-990 -
Allowing correct crawlers for GeoIP Redirect
Actually, geo-based IP redirects are still a very bad idea from both a user and a bot perspective. While Google has said it is testing crawling from other areas, they still primarily crawl from the US. If you do geo-based redirects, Googlebot will only ever see the US content. And people travel. Assuming a user should only see a certain set of content based on their physical location is assuming too much. A use case in the consumer field: while attending a friend's wedding in London, I could not get to the US version of a site where I wanted to buy furniture to be delivered a few weeks later. A use case in business: users travel for business all the time. If they are visiting a headquarters in another country but researching a topic for use in their home country, they might be seeing the "wrong" content. Rather than assuming, use IP detection to ask the user to set their location: "We see you are in the UK - do you want to set that as your preferred location?" Once they choose their location, a cookie is set and that is all that user sees from then on, until they change that setting in the footer or in their account.
| katemorris0