Hi,
How do I "force" Google to change the Parent Organization in the Knowledge Graph box? I've gone ahead and suggested changes a few times, but the Parent Organisation value is still being displayed.
That's good to know! Thanks!
Hi everyone,
We are changing a website's domain name. The site architecture will stay the same, but we are renaming some pages. How do we treat redirects? I read this on Search Engine Land:
The ideal way to set up your redirects is with a regex expression in the .htaccess file of your old site. The regex expression should simply swap out your domain name, or swap out HTTP for HTTPS if you are doing an SSL migration.
For any pages where this isn’t possible, you will need to set up an individual redirect. Make sure this doesn’t create any conflicts with your regex and that it doesn’t produce any redirect chains.
Does the above mean we can set up a blanket regex redirect at the domain level for the pages we are not renaming, and individual 1:1 redirects for the renamed pages, both in the same .htaccess file? Will the individual redirects conflict with the regex rule?
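The setup the quoted advice describes might look like this in the old site's .htaccess (a sketch only, assuming Apache with mod_rewrite; all domains and paths are placeholders):

```apache
RewriteEngine On

# Individual 1:1 redirects for renamed pages go FIRST, pointing
# straight at the final new URL so no redirect chain is created.
RewriteRule ^old-page/?$ https://www.newdomain.com/new-page/ [R=301,L]
RewriteRule ^services/old-name/?$ https://www.newdomain.com/services/new-name/ [R=301,L]

# Catch-all regex: every other path keeps its slug on the new domain.
RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]
```

The [L] flag stops rule processing after a match, so a renamed page hits its specific rule and never reaches the catch-all; that ordering is what prevents conflicts and chains.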
Thanks so much for taking the time to respond. Our website still has a small amount of SEO authority, and I think too many internal links are spreading our equity thin. Looking at our pages, the blog and product categories are inflating our internal link count. I'll see if I can remove these.
Hi Miriam,
Yes, these are very helpful! Thanks for the advice on doing a competitive audit. I will certainly do this for both clients! 
Hi everyone,
Too much of anything is not good. In terms of internal linking, how many links are too many? I've read that the recommended maximum is about 100 internal links per page, beyond which the page's link equity gets diluted. I have a concern about one of our websites: according to Search Console, the homepage has 923 internal links. Every page has a corresponding /feed URL appended to it, which is really weird (is this caused by a plugin?). The site also has an e-commerce feature, but it isn't used, as the site is essentially a brochure and customers are encouraged to visit the shop. I assume the e-commerce feature also inflates this number.
On the other hand, one of the competitors we are tracking has 1 internal link site-wide, while ours is at 45,000. How is it possible to have only 1 internal link? Is this a Moz bug?
I know we badly need to reduce our internal links, but I'm not sure where to start. I don't know how these internal links are connected - some aren't in the copy or the navigation menu. When I scan the homepage using 'Check My Links', it identifies only 170 links.
Hi Miriam,
Thank you for your response and sorry for the delay in responding!
1 - No, there are no other businesses in the same category at this address. I have seen instances where Google filtered out same-category businesses in the same building - one of our clients is actually experiencing this! They are lawyers sharing an office with 3 other law firms, and our advice was to try to build the strongest ranking power among the listings they're competing against.
2 - Camp Hill is a suburb of Brisbane. Customers usually search for "pool builders brisbane" or "pool renovations brisbane" when looking for local pool service providers. We want to rank highly for these terms and also appear in the local map pack, but we are getting filtered out for some reason. I understand that results also vary with the searcher's proximity; however, from where I'm searching (Brisbane's central business district), the business doesn't readily appear on Maps. The business is only about 4km away, yet competitors as far as 19km away appear on Maps with no issues.
Any help would be appreciated. Thanks Miriam!
Hi everyone,
We are having an issue with this local business. The Google listing doesn't appear on the map immediately. You have to move the map or zoom in and out for the listing to appear. I find this really odd, as our competitors - with no reviews and much further away - appear with no issues.
The listing is only about 4km from where I'm searching, while competitors with no reviews are about 20km away. We rank in the top 5 organically for the search term I used (pool renovations brisbane), but nowhere locally unless the map is moved. When the listing does appear, its pin sometimes looks grey instead of red, while others are red (if that makes sense).
On top of this, their organic rankings have also been on a downward trend since June. I'm currently doing a backlink audit to see if it's contributing to the issue. If anyone also has other ideas, could you please let me know? Thanks.
Yes, thank you very much!
Great article, thanks for sharing!
When Google rolled out Panda and Penguin, I assume these links were discounted and any site penalty was already applied. The website is performing well; however, these comment backlinks have recently started popping up again as "new" backlinks in Ahrefs. Being "newly found", will they have a negative impact on the website? Should we be worried and start disavowing, or is it OK to ignore them? Thanks.
Thanks! We probably should have combined JS with CSS and not built a site fully reliant on JS. This looks like what our competitors have done.
Thanks so much for the very helpful insights and for running our website through those tests, I appreciate it. I'll try running the site through Lighthouse. I agree we have speed issues we need to solve. Our page is also not showing up at all in GSC's Fetch and Render.
Also, I tried Googling our brand plus content inside the expanding tabs, and some of it did not show up in the SERPs. All content not in expanding tabs showed up. I know Google still reads and indexes tabbed content but treats it as less important - though I guess not all of it gets indexed.
Thanks again! 
Hi Brett, thanks for your response. I've read a couple of recently published articles, but this is the one that stood out - https://www.elephate.com/blog/ultimate-guide-javascript-seo/ - and it kinda alarmed me.
There is a part there that says: there is virtually no real life case of a client rendered JS website/brand/store ranking high. So I can’t guarantee that your JavaScript-rich website will rank as high as its HTML equivalent.
Our site was built on WordPress, but it is predominantly JavaScript. We have been working hard on on-page content and link building for the past 6 months, but we could not beat the top-3 competitors for the keyword 'seo brisbane'. The closest we've gotten is #6. We've been monitoring their sites as well, and it looks like only 1 is doing active link building. The others seem to just be cemented there.
We're looking at other reasons why we're not moving up, and JavaScript is one of them.
We have other sites we manage that are also progressing slowly. So you are right: my question centres on how JavaScript affects a site's SEO, how to know if it's the culprit, and how to fix it.
Thanks!
I've done a bit of reading and I'm having difficulty grasping it. Can someone explain it to me in simple language?
What I've gotten so far:
JavaScript can block search engine bots from fully rendering your website.
If bots can't render your website, they may not see important content and may leave that content out of their index.
To know if bots can render your site, check the following:
Google Search Console Fetch and Render
Turn off JavaScript in your browser and see whether any site elements disappear
Use an online tool such as Technical SEO's Fetch and Render
Screaming Frog's Rendered Page
GTmetrix results: if 'Defer parsing of JavaScript' appears as a recommendation, does that mean some elements are being blocked from rendering? (I'm not sure about this one.)
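The "browser with JavaScript off" check above can also be approximated from the command line: fetch the raw HTML and grep for a phrase you know should appear on the page. This sketch uses a hard-coded sample document standing in for a client-rendered JS site; in practice you would pipe `curl -s https://your-site/` into the same grep (URL and phrase are placeholders).

```shell
# Sample raw HTML as a client-rendered JS app might serve it:
# just an empty mount point and a script tag, no visible content.
RAW='<html><body><div id="app"></div><script src="/app.js"></script></body></html>'

# Does a phrase that should be on the rendered page exist in the raw HTML?
if printf '%s' "$RAW" | grep -q 'Pool Renovations'; then
  echo "phrase present in raw HTML - crawlers see it without rendering"
else
  echo "phrase missing from raw HTML - it only exists after JS rendering"
fi
```

If the phrase is missing from the raw response, the content depends on client-side rendering, which is exactly the situation the tools above are trying to detect.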
Using our own site as an example, I ran our site through all the tests listed above. Results:
From all these results across all the tools I used, how do I know what needs fixing? Some tests rendered our site fully and some didn't. With such varying results, I'm not sure where to go from here.
One of our Shopify sites suffered an extreme rankings drop. Recent Google algorithm updates include mobile-first indexing, so I tested the site, and our team got inconsistent mobile-friendly test results. Search Console is also flagging pages as not mobile friendly. So while we end-users see the site as OK on mobile, this may not be the case for Google?
I researched inconsistent mobile test results and found answers saying it may be due to robots.txt blocking stylesheets.
Do you recognise any blocked directory that might be affecting Google's rendering? Unfortunately, we can't edit the Shopify robots.txt. Our dev said the only thing that stands out to him is Disallow: /design_theme_id; the rest shouldn't be hindering Google's bots.
Here are some of the files blocked:
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /9103034/checkouts
Disallow: /9103034/orders
Disallow: /carts
Disallow: /account
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
Disallow: /design_theme_id
Disallow: /preview_theme_id
Disallow: /preview_script_id
Disallow: /discount/*
Disallow: /gift_cards/*
Disallow: /apple-app-site-association
Check the websites of the competitors that outrank you. What's their on-page like? Do they have a local presence in the US? Who is linking back to them? Do their sites have a better link profile than yours? Check their estimated monthly traffic - is it higher than yours? You can get the link and traffic data from Ahrefs.
Try adding the areas you serve to your copy. Also, in the footer and/or on the contact page, add your business name, suburb and postcode, hours, and phone number. Even if you don't display your full street address on your website, Google should be able to tell which areas you serve based on your website content and rank you for local searches accordingly.
Yes, if your meta descriptions are duplicates, Google won't use them and will instead pull a snippet from the page. If you want Google to use your assigned meta description, write original, quality descriptions. If there's absolutely no way for you to assign individual descriptions to pages (I've seen some page structures built to share 1 meta title & description - it's ridiculous), make sure the first few sentences of each page are acceptable as your description, as Google will likely pull that text.
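For reference, a unique description is just the per-page meta tag; the text below is purely illustrative:

```html
<!-- In the <head>; write a different, page-specific description for each page -->
<meta name="description" content="Custom pool renovations in Brisbane: resurfacing, tiling and equipment upgrades for residential pools.">
```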
Whether to use a canonical depends on how much duplicate content the page will carry. If it's a sentence or two, an attribution at the end of the article should be enough. If it's more than that, you may have to add a canonical tag to the page pointing at the original source to avoid the duplication issue (canonicals apply to the whole page, not to a section of it). If a page carries a canonical, Google will likely index the original source instead of the page itself.
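As a sketch of what that page-level tag looks like (the URL is a placeholder), the duplicate page would carry this in its head:

```html
<!-- In the <head> of the page that republishes the content -->
<link rel="canonical" href="https://www.example.com/original-article/">
```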