Happy to have been of help, please do let us know how the test goes 
Posts made by mememax
-
RE: Implementing nofollow tag affect ranking
-
RE: Internal anchor text
There are places where variation is not only allowed but also recommended, e.g. external backlinks, internal links from other pages, etc. You may want to use calls to action or LSI keywords. You can use this tool to find interesting LSI terms: https://lsigraph.com/
About the breadcrumb, I would keep it always the same. The purpose of breadcrumbs is not to optimize anchors but to give users navigational support, so if I were you I would always keep them descriptive but never over-optimized.
-
RE: Internal anchor text
You have to be natural. That is the key. If the page you're linking is about "red leather boots" then the anchor should reflect that. What you have to avoid is forcing things.
Example of a good breadcrumb
Home > Shoes > Red Leather Boots

Now an over-optimized one:

Home > Boots > Red > Red Leather Boots

You are repeating words that do not add any value. The same thing happens with URLs: many people repeat their product name in the folder and then again in the URI of the page.
My recommendation would be to keep it optimized but without forcing anything. If you are in doubt whether Google will like it or not, think about your users: will they like this link showing that text, or will it be too redundant? There you have your answer.
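If you also mark the breadcrumb up with structured data, the same restraint applies there. A minimal sketch of the good example above as schema.org BreadcrumbList JSON-LD (the domain and paths are just placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Shoes",
      "item": "https://www.example.com/shoes/" },
    { "@type": "ListItem", "position": 3, "name": "Red Leather Boots" }
  ]
}
</script>
```

The last item can omit "item" because it represents the current page; note the names are as plain and descriptive as the visible breadcrumb, with no extra keywords stuffed in.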

-
RE: How can another country domain appear in a page title?
What IP are they hosted on? Maybe Google is serving the same site because their IPs are identical?
I see that the .de version is returning a security certificate issue, so I wonder if Google may be demoting the site.
Moreover, where are you guys googling from? Maybe Google is forcing the UK version because you are searching from the UK.
-
RE: Implementing nofollow tag affect ranking
I think that having outbound links is a key part of a healthy site (unless you're accepting a guest post for each post you create). I think there are also some good benefits from linking outside of your site:
- you may get links in return by using those outbound links as part of an ego-bait campaign
- people you're linking to may want to give visibility to that mention, so they may share that article on their social media and other channels they control
- there are some articles about outbound link co-citation, which works similarly to inbound link co-citation but considers outbound links instead of inbound ones. In other words, sites linking to the same sites are somewhat similar
As you can see there are different approaches, but natural outbound links are a natural part of the web, which is a "web" because of its networking system. Cyrus covered the topic in a previous Whiteboard Friday https://moz.com/blog/external-linking-good-for-seo-whiteboard-friday and you may get some nice insights from there.
There are many things you may do before considering adding the nofollow tag; maybe you can apply other optimizations to improve your rankings. Anyway, if you think there is some good opportunity, you can always test it. Instead of testing the entire site at once, why don't you just modify a couple of pages and see if there is a significant ranking change? Data is far superior to any opinion I can give you.

-
RE: Redirects Being Removed...
Hey Becky,
quick note here, I may not be understanding correctly but I see one thing omitted. While I agree that in the long run you'll only need the 301s on pages which receive links from outside in order to keep the SEO juice flowing into your website, I think you should also apply 301s to pages which don't, simply because Google needs to understand, when a page returns a 404, which page you want indexed instead of it (if you have one). This will make the transition process easier for Google to digest.
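For reference, a one-hop 301 in Apache's .htaccess looks roughly like this (mod_alias syntax; the paths are just placeholders):

```apache
# Send the retired page, and any 404-prone legacy URL, to its closest live equivalent
Redirect 301 /old-category/old-page https://www.example.com/new-category/new-page
```

Nginx and most CMSs have an equivalent one-line rule; the point is that every retired URL answers with a 301 to its replacement rather than a 404.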
-
RE: Two companies merging into a new website. How to merge two existing websites into a brand new website and preserve search rankings.
Well, I think you highlighted all the steps you had to take, and moving to the brand which has more trust seems the right move. But if you're moving now to Brand C, I think you should just follow the same process, using Brand A's stronger structure on Brand C and replicating it for Brand B. Am I missing something?
-
RE: Blacklisted website no longer blacklisted, but will not appear on Google's search engine.
First of all, let's take a step back.
1. When you get hit by a manual penalty (which I assume is the case, as you filed a reconsideration request in Search Console), you never recover to 100%. You made a dirty play and got a yellow card for it; at least that's how I would handle it in my personal life, so it makes complete sense that Google marks people who tried to game its system at any point.
2. Page 1 has just 5 free spots, as the other 5 (from what I see) are YouTube/Vimeo videos, so it's harder than you may think to get there. I'm in fact seeing you rank with https://www.verdictvideos.com/services/day-in-the-life-legal-video/ within the Vimeo results, which means you may have done a good job with your multimedia by building sitemaps and sharing them around.
3. Your backlink profile is really poor. You just have 4 links pointing to the homepage, and your domain has a trust flow lower than 20. I personally find that low, and speaking about legal stuff, I would want a trustworthy website being featured at the top. Half of your backlinks are Black Friday stuff built on low-quality websites... I don't know if you have removed them or not, but they seem to be a huge part of your backlinks, and the other part are highly optimized keyword-rich backlinks.
I would try to vary them more and build more relevant links on more authoritative websites, which is what you need. Your competitor doesn't seem to have a very solid link building profile either (coltoncreative.com), so with the right links and good optimization (I haven't checked your on-site SEO) you may achieve good results.
-
RE: Does content revealed by a 'show more' button get crawled by Google?
This is one of the few things where Google has a pretty clear statement:
"If you think content is relevant to your users you should always make it clearly visible"
If you think about it, it makes complete sense: if someone searches for content and clicks on a result, they expect to see that text; if it is hidden somewhere, they won't consider that result relevant to their search, and that's what Google does not want to happen.
I have to agree that the 500-word content still works for the long tail, so I would say: keep your important content at the top of the page and reference other supplementary content at the bottom or at the side, but always try to make it visible.
You can see Google's stance in Barry Schwartz's latest article on Google discounting tabbed content.
As an additional thing, it's totally safe to hide some content on your mobile version, if you have a responsive website, to improve user experience, as long as that content is clearly shown in your desktop version.
Hope this helps
-
RE: International SEO - Domains or Folders?
Boom, that's a huge question Lee. And the answer is also huge. I will try to summarize as much as I can. Some points which are common to all solutions are:
-
You can easily achieve the localization by setting the folders/subdomains/domains in search console and the language focus via hreflang.
-
Normally you shouldn't be facing duplication issues if you deliver the same content for the US and the UK, because duplication happens only within the same market. So having two identical pages focusing on two markets (US and UK) shouldn't be an issue for Google.
Starting from here, there is NO best solution, but there is a better solution depending on your dev resources and your goal. For tackling internationalization you have 3 options:
-
Folders (domain.com/en - domain.com/es). This is the easiest solution technically speaking, as you can easily recreate what you have in different folders and also take advantage of the strength of your domain by passing the juice from the main domain to its subfolders. The "con" I see with this approach is scalability. This is probably the best solution for smaller websites with less dev support which won't be growing too much. You don't have too much room for customization, and you're also not getting the benefit of a localized TLD and domain name, which could be a deterrent in some cases.
-
Subdomains (en.domain.com - es.domain.com). While still not getting the value of full localization, you have some differentiation you could apply, as the subdomains could be assigned to different servers/hosting, and you can also build two different sites while using the same domain. The technical challenge is a bit higher, but you won't be getting as much value from your main domain as if you were using folders. Folders vs subdomains is the eternal debate, but I think there is a clear perspective in the SEO community that folders preserve more value of the original domain than subdomains, which are essentially seen as different sites by Google. BTW, you can find a nice whiteboard from Rand about the value of subfolders vs subdomains here.
-
Dedicated TLDs/domains. This is the most dev- and SEO-heavy solution, as you're creating something from scratch, which initially has 100% cost and 0 value. This is the solution that pays off best in the long run though, as it is the one that leaves the most room for scalability. The main advantage I see here is on the link building side: all the links will be highly relevant to the local site, as you will have all US links to your US site and all UK links to your UK site. Here's another post from Rand about the three solutions.
So in the end, you decide, but if I were you I would try to take the most advantage of what you have today using the folders solution. In this way you can enter a market with some value that would help you rank higher than if you were starting from scratch, and if you see room for improvement you can always get serious about it and create a new domain where you can 301 all your existing content. This is a delicate process, but in this way you'll be safe at the beginning when things are harder, and once you decide to invest more you can swap to the domain approach with more dedicated effort, i.e. heavy link building on the new domain.
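Whichever structure you pick, the hreflang annotation mentioned at the start looks roughly like this, sketched here for the folders option (domain and paths are placeholders; each language version carries the full set of alternates, itself included):

```html
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
<link rel="alternate" hreflang="es" href="https://www.example.com/es/pagina/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/page/" />
```

For the subdomain or dedicated-domain options, only the href values change; the reciprocal tagging requirement stays the same.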
I can keep speaking about this for hours, but I will stop here as you may want to ask more questions and get a better view of your options and doubts about each approach. I can link to a ton of resources, but I would say that your best readings are from Gianluca Fiorelli, who is not only a Moz associate but also an expert in international SEO.
I hope this helps.
-
RE: AdWords Editor auto-correcting keywords
Not sure about this:
Over the Summer, Google updated the Keyword Planner to combine more terms and it made it less accurate. You can no longer search for exact matches, misspellings, or plurals using the tool and have to take the aggregate data. Hopefully Google will build a new tool to fulfill this need that many marketers have.
I'm actually seeing the old keyword information; in order to do that you need to have an active AdWords account. Not sure if there is any spend threshold, though.
Or if you are trying to do keyword discovery, you can easily use keywordtool.io, which is a combination of the AdWords Keyword Planner and Ubersuggest.
If you have actively running AdWords campaigns, I wouldn't use Search Console data but run a search terms report, so you can see the real user queries reaching your site.
Hope this helps.
-
RE: Any benefit of Geo Targeting in Organic Positions?
I'm unsure if I get your request. Are you asking about the advantages of setting your site to a given region in Search Console, or about applying geo targeting like hreflang?
-
RE: Facebook Like to Download and Page SEO
Getting to the root of your question, I don't think this would hurt your SEO directly.
The only consequences I see (then you'll determine whether you already took them into account or not) are these:
- if you're not allowing users to see the link, it means you're blocking Google too, meaning you're not getting your PDF indexed
- it would be harder for tracking purposes, as users will have to double click; you can still track downloads, but again I think things get a bit trickier
- as Ajay correctly says, you may have issues with the policy team at FB, as you're forcing likes to get downloads. This was already discouraged by FB for getting people to follow a page in order to participate in contests. I don't think it will be huge, but you have to take this possibility into account.
My recommendation would be that if you want people to like your site, ensure you provide them a reason for it, not just force them to do so. You want engaged likes who may return to your site, not stand-alone likes that will not match your real user pool. Hope that helped.
-
RE: I'VE DONE EVERYTHING RIGHT BUT STILL GET LOW GOOGLE RANKING
I'm glad it helped. It will be good to have a look at your website in a while and see how you are doing after receiving so much feedback.

-
RE: I'VE DONE EVERYTHING RIGHT BUT STILL GET LOW GOOGLE RANKING
There are several things you could do on your site in order to improve it. SEO is a matter of details, so start working on them:

- Internal links. Many of them have a blank anchor; make sure to put your keywords in there.
- Content. Add relevant content together with images; show 2 or 3 projects in each area and describe them with a nice, attractive text.
- Internal architecture. Ensure you're not just linking from the homepage down but also interlinking your internal pages among themselves. Example: http://www.pokudesign.com/services/ - why not link to those services? Give each a two-line snippet and then link to a page for each of them.
- Backlinks. This is painful but rewarding. Get them; a bunch of quality links is fine, don't overdo this and be clean. I don't know your competitors' backlink profiles, but you want to ensure that you're going high quality.
- Blogs. I assume those pages are yours: http://www.pokudesign.com/blogs/ - why don't you integrate them into your site? Again, add content, be catchy and descriptive. I'm sure you can say a lot of things about the nice work you've done; show people you care and Google will reward you.
that would be a starting point

-
RE: Goals for Sub Domain
Hi Anshul.S, sorry to say that, but there is no way you'll ever get the exact same numbers in Salesforce vs GA. That's because GA tracks goals based on JS, which can be triggered several times even without a purchase.
To give you some examples:
- a user could refresh the thank-you page and you'll be recording multiple conversions
- you may record a purchase of $100, but then the person doesn't pay or cancels the order, so it won't be real money.
You are correct about integrating GA and Salesforce, but you have to use Salesforce to validate GA data, which is mostly interesting for detailed analytical purposes, not revenue or business ones, where the most reliable source is your own DB.
Regarding the source, I did this in the past; you want to ensure that you create your cookie the same way Google does it.
- Consider that Google changes the source of traffic in case the session times out (more than 30 mins) or if there is a change of source, e.g. you search on Google, enter the website, leave, then enter again via direct.
- Also ensure you're using the same attribution model Google uses: last click, first click, etc.
- Ensure that you're correctly carrying over the source in your cookie, by properly QAing it.
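To make the first rule concrete, here is a rough Python sketch (my own simplification for QA purposes, not Google's actual code) of hits splitting into sessions on a 30-minute timeout or a change of source:

```python
from datetime import timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def split_sessions(hits):
    """Group (timestamp, source) hits into sessions: following the rules
    above, a new session starts after 30+ minutes of inactivity or when
    the traffic source changes. Purely illustrative."""
    sessions = []
    last_time = None
    last_source = None
    for time, source in hits:
        timed_out = last_time is None or time - last_time > SESSION_TIMEOUT
        if timed_out or source != last_source:
            # a new session begins; record its start and attributed source
            sessions.append({"start": time, "source": source})
        last_time, last_source = time, source
    return sessions
```

Replaying your raw hit log through something like this and comparing the session counts against GA is a quick way to confirm your cookie logic matches Google's rules.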
Hope this helps!
-
RE: Will it upset Google if I aggregate product page reviews up into a product category page?
Sorry to say that, but you're kind of gaming the aggregated reviews schema.
Google states that for aggregating reviews, you're good as long as you're referring to a specific product, but you can't use schema markup to refer to a category.
I mean, you can, but Google simply won't show your markup, as that's not the way aggregate ratings are supposed to work. No one will get angry, but you'll get no reward for your work.
Detailed info about review snippets here: https://developers.google.com/search/docs/data-types/reviews#review-snippet-guidelines
As an alternative you may want to consider other options like aggregate price.
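For reference, the supported shape ties the aggregate rating to one specific product, roughly like this (the name and numbers are made up):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```

The rating lives on the product page itself; rolling the same block up onto a category listing is exactly the pattern Google declines to show.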
-
RE: Splitting a strong page - SEO
As a rule of thumb, 3xx redirects work no matter what changes you apply, but the less you use them the better, not just to avoid juice loss but for easier management of your website. You definitely do not want chains of two or three 301s unless it's really unavoidable given how complicated your website is.
Now, your best bet depends on what you want to accomplish. In the past I always tried to be conservative and not lose too much of my so hardly earned traffic, but after a while you see the consequences of that, as you start having a mixed composition of legacy URLs on your website.
I would say: test in a relatively small section and see what happens. If your loss of traffic/rankings is too significant, roll the changes back (don't forget the 301 back) and use your preferred method, but take into account that in the long run you want to have a manageable website, limiting exceptions as much as possible.
On a side note, people normally look at 301s as a loss of value no matter what, but that's not always the case. The big deal with 301s is the loss of value accrued from other pages, so, after you 301:
- change all the internal links so you don't have unnecessary internal 301s
- contact external websites to get the url changed.
Once you do that, the 301 won't matter at all, as the resources sending value to that page are now linking to the new one.
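As a quick illustration of avoiding chained 301s: if you keep your redirects as an old-URL to new-URL map, you can flatten every chain down to a single hop before deploying it. A rough Python sketch (the map format is my own assumption):

```python
def flatten_redirects(redirects):
    """Resolve each old URL in a {old: new} redirect map to its final
    destination, so every 301 is a single hop instead of a chain."""
    flat = {}
    for old in redirects:
        target = redirects[old]
        seen = {old}
        # follow the chain until we reach a URL that is not itself redirected
        while target in redirects and target not in seen:
            seen.add(target)  # guard against redirect loops
            target = redirects[target]
        flat[old] = target
    return flat
```

For example, a map like {"/a": "/b", "/b": "/c"} flattens so that /a points straight at /c, keeping every redirect a single 301.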
Hope that helped.
-
RE: Site Not Indexing After 2 Weeks - PA at 1
Being a standard Drupal setting doesn't mean it's search engine friendly.
Anyway, I have no backup for that theory; I'm just assuming that new sites should be as simple as possible so that Google gets a better grasp of them and starts trusting them before you complicate things unnecessarily. Unless the node URL is adding any value for you, I would get rid of it and 301 it to its canonical.
-
RE: Site Not Indexing After 2 Weeks - PA at 1
I would not focus too much on GSC if the site has been recently built; the place where you want your pages to be is Google's index.

The place I would check is GA, to see how many of your pages are getting at least one visit and which ones aren't. You can then look at the pages which are not getting traffic and figure out whether they are not being indexed or just not ranking high.

On this page, for example, https://www.northshoreymca.org/programs/creative-arts, I've noticed that you're using rel="shortlink", which I don't think is adding much value; it's probably making things messier, as you have two different versions of your page. In fact, if you check the /node/ folder you find pages you don't want (e.g. https://www.northshoreymca.org/node/145 instead of https://www.northshoreymca.org/content/ymca-north-shore-annual-gala). I would set up a 301 at least, so you ensure that Google is not the one deciding which is the best page to index and serve to users. I know you have a canonical, but it could be something you could test to help Google avoid making too many decisions, especially because the canonical is just a recommendation to bots, and one which Google normally doesn't follow strictly for new sites.
Hope this helps, let me know how it goes!