Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
PDF in search results?
Tons of good suggestions here. Profiles that can easily rank high in the SERPs for a non-competitive name search include LinkedIn, Twitter, Quora, About.me, Product Hunt, Medium, and Pinterest. Create a few of these and link to them from the exact-match domains mentioned by Russ.
| anthonydnelson0 -
Local Landing Pages struggling with rankings although I've done most things needed. Any idea?
Content needs to be vastly different, not just slightly varied. This can be painful, and take a lot of time and creativity to figure out how to write the same 300-500 words in a different way. I wrote a personal blog post not too long ago on ways to write content for location pages: http://doyledigital.com.au/content-for-location-pages/
| DoyleDigital0 -
Referral Spam
According to Google, they don't use any Google Analytics data for rankings - https://www.youtube.com/watch?v=CgBw9tbAQhU Is it a type of commercial warfare by competitors? No, it's spam targeted at your curiosity - "oh, I got a visit from this website, let me check it out". The only negative effect is that it pollutes your GA reports and data. Fortunately, there is a way to set up filters and block all ghost and referral spam. Here is an article - https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter As for experience with it - yes, I have been fighting it for quite a long time now, and what I see is that the amount of referral and ghost spam hitting our website grows like crazy, yet rankings are not affected at all. At least, they keep growing in line with our SEO work.
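The filter in that Moz article is a "valid hostname" include filter: ghost spam never actually hits your server, so its hostname field is junk. A minimal sketch of the idea in Python (the hostnames and `example.com` pattern are placeholders - use your own domains):

```python
import re

# Include-filter pattern listing only hostnames you actually serve.
# example.com is a placeholder; a real filter would list your domains.
VALID_HOSTNAME = re.compile(r"^(www\.)?example\.com$")

hits = [
    {"hostname": "www.example.com", "source": "google"},
    {"hostname": "example.com", "source": "twitter.com"},
    {"hostname": "(not set)", "source": "free-share-buttons.com"},  # ghost spam
]

# Ghost hits never touched your server, so their hostname fails the match.
clean = [h for h in hits if VALID_HOSTNAME.match(h["hostname"])]
print(len(clean))  # prints 2
```

In GA itself this is a single view-level include filter on the Hostname field, which is why it catches ghost spam that a referrer blacklist never will.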
| DmitriiK0 -
WordPress login attempt spam?
You can install something like https://wordpress.org/plugins/better-wp-security/ There are settings to auto-block people after a certain number of failed logins (or blacklist them). I also recommend changing some WordPress defaults (like the login/register page URL), which will help deter lots of bots.
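The auto-block behaviour those plugins implement is essentially a per-IP failure counter. A minimal sketch of the logic (the threshold of 5 is an assumption - plugins make it configurable):

```python
from collections import defaultdict

MAX_ATTEMPTS = 5  # assumed threshold; security plugins let you tune this

failed = defaultdict(int)  # failed-login count per IP
blocked = set()            # IPs locked out

def record_failed_login(ip):
    """Count a failed login; block the IP once it crosses the threshold.

    Returns True while the IP is still allowed to retry, False once blocked.
    """
    if ip in blocked:
        return False
    failed[ip] += 1
    if failed[ip] >= MAX_ATTEMPTS:
        blocked.add(ip)
    return ip not in blocked

# A bot hammering wp-login.php gets locked out after 5 failures:
for _ in range(6):
    allowed = record_failed_login("203.0.113.9")
print("203.0.113.9" in blocked)  # prints True
```

Real plugins add expiry windows and whitelists on top of this, but the counter-plus-blacklist core is the same.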
| OlegKorneitchouk0 -
The W3C Markup Validation Service - Good, Bad or Impartial?
I am my own client, so I can be as picky as I want, and I take care of the details that I feel are important. I pay close attention to how the site is responding and rendering when I pretend that I am a visitor. I pay even more attention when a customer or visitor writes to me with a complaint. In my opinion, if the site is working great then all is good. W3C validation seems to be of jugular importance to W3C evangelists. They will tell you that you will burn in Hell if you don't achieve it with flying colors. People who want to sell you their services will point at any fault that can be detected. Practical people have a different opinion. I try to be as practical as possible.
| EGOL0 -
How to avoid the "Showing results for..." suggestion in Google search results?
Yes, some of our partners have websites. We have updated the links to those websites on www.zotey.com.
| segistics0 -
Duplicate Page Content for www and non-www. Help!
Ahh great stuff. Glad you managed to track it down -Andy
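For anyone else landing on this thread with www/non-www duplication: the usual fix is a site-wide 301 to one canonical hostname. A minimal sketch assuming Apache with mod_rewrite (`example.com` is a placeholder; swap the condition around to prefer non-www instead):

```apache
# Force the www hostname sitewide with a 301 redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Pair this with consistent internal links and canonical tags so both signals point at the same hostname.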
| Andy.Drinkwater0 -
SEO trending down after adding content to website
Thanks all! Happy to report that today we started trending up again. @matt - they were new keywords @moosa - yeah I am confident in the web copy, it was written for users, not bots but still gave some keyword rich text organically. Again, thank you all for your great answers!
| swat18270 -
Product meta tags are not updating on my Magento website!
I'm trying to update the title tags and meta descriptions; I'll have to ask my web developer about that. But thanks, Everett.
| One2OneDigital0 -
This question has been removed
We find with the vast majority of our clients' websites that it takes a little while for Google to notice you. Give it a bit of time and I'm sure you'll start seeing your brand name up there. And as pointed out above, get some high-quality links pointing at your site and you'll start to see some improvements.
| phil9070 -
Automate XML Sitemaps
Hi there. On a request to sitemap.xml, run a sitemap.php or .js (or whatever), which reads the directory for new files and then updates sitemap.xml. If you're using a CMS, it should generate a sitemap automatically by default. If not, the same idea applies - just read the database instead of scanning the directory for physical files. Cheers.
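The "scan the directory, rebuild the sitemap" idea can be sketched in a few lines. This example is in Python rather than PHP, and `https://www.example.com` is a placeholder base URL:

```python
import os
from xml.sax.saxutils import escape

BASE_URL = "https://www.example.com"  # placeholder; use your own domain

def build_sitemap(root_dir):
    """Walk root_dir for .html files and return a sitemap.xml string."""
    urls = []
    for dirpath, _, filenames in os.walk(root_dir):
        for name in sorted(filenames):
            if name.endswith(".html"):
                rel = os.path.relpath(os.path.join(dirpath, name), root_dir)
                urls.append(f"{BASE_URL}/{rel.replace(os.sep, '/')}")
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )
```

Run it on a cron job (or on each request, cached) and write the result to sitemap.xml; the database-driven variant just swaps the `os.walk` loop for a query over your pages table.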
| DmitriiK0 -
Apple has recently disabled all third-party cookies in Safari on iPad, iPhone, and desktop
Hi, I still only have version 8 so I can't test it, but my GA is showing data for version 9 with sales. I have also checked that these aren't PayPal orders. Once I get version 9 I will test, but our visitors are currently able to check out. Thanks, Andy
| Andy-Halliday0 -
Removing extension
Hey Paul, What is the main reason for removing the extension? If you already have an SEO- and user-friendly link structure and are getting good organic traffic, I don't think you should mess with anything - a clean .php extension is still far better than scary dynamic parameters in URLs. The thing is, this change can cause a massive traffic and ranking drop. If you can take that risk, all you have to do is set up a 301 redirect for every URL and update your sitemaps. And while you're doing this, make sure to launch aggressive campaigns to maintain the flow of traffic. Hope this helps! Umar
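If you do go ahead, the redirects don't have to be listed one by one. A sketch assuming Apache with mod_rewrite (test on staging first - a bad rule here can loop):

```apache
RewriteEngine On

# External 301: /page.php -> /page (matches the original client request
# so the internal rewrite below doesn't re-trigger it).
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.php[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]

# Internal rewrite: /page -> page.php, only if that file actually exists.
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*?)/?$ $1.php [L]
```

Keying the redirect off `THE_REQUEST` rather than the rewritten URL is what prevents an infinite redirect loop between the two rules.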
| UmarKhan0 -
Site hacked in Jan. Redeveloped new site. Still not ranking. Should we change domain?
Hi Kate, Thanks for your message.

Spam was identified in October 2014, so we asked our hosting provider to re-install a clean backup of the site. We didn't have a clean offline version, so we were using automatic backups stored on the server. We changed the FTP and CMS passwords. We did this on 3 separate occasions in October, but the issue remained. In November, we subscribed to Sucuri.net and had them scan and clean the site. They did this twice, but the spam remained. By January 2015, all the content keywords for the site were showing as Chinese, so we took the decision to re-develop the site on a new CMS and move hosting, as our hosting provider had offered no assistance whatsoever.

The new site launched in March 2015 - we did all the 301s etc. that are required when re-developing a site. The manual spam action was also applied in March. There is no user-generated content on the site. We submitted the reconsideration request on the 4th of April and it was accepted on April 17th. I started disavowing the links and working with parameters to ask Google to no-index the parameterised URLs at the start of May, and removed the old files from the old server on the 3rd of June. At the end of August, I went through all the remaining spam URLs returning 404s and amended the robots.txt to disallow the remaining spam directories. I also ran a site:ultimatefloorsanding.co.uk search on Google and added all the remaining spam URLs to the robots.txt and the Google URL removal tool.

I'm really not sure what else I can do, so we are thinking of starting again, but are reluctant to do so. Is there anything else you can think of? All advice would be gratefully received. Thanks, Sue
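For reference, the robots.txt change Sue describes would look something like this (the directory names here are placeholders for the actual spam paths):

```
# robots.txt - Disallow the injected spam directories
User-agent: *
Disallow: /cheap-bags/
Disallow: /replica-watches/
```

One caveat worth knowing: Disallow stops Googlebot from recrawling those URLs, so it can actually slow down already-indexed 404s dropping out of the index; the URL removal tool covers the gap in the meantime.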
| galwaygirl0 -
What is the best way to correct GWT telling me I have mobile usability errors in Image directories
Roberto, I suggest you try out this method; I'm fairly sure it will work. Do let us know if it works out.
| UmarKhan0 -
Is my knowledge graph code wrong?
Thank you, IssueTrak. Although we have been a member for a long time, we just started being more active on the Q&A - more stars will come. Let us know how it works out.
| VERBInteractive0 -
Strange URLs for client's site
Thank you for the great advice, Dirk! I will likely have to get one of my more technical co-workers to help with this, but now I can at least adequately describe the problem and the solution. Three separate URLs for the home page alone is definitely a priority fix. Thank you again!
| everestagency0 -
Multiple H1 Tags on Page
Hello Chad, Absolutely, depending on how you structure your tags. The biggest problem with multiple H1s is that using different keywords can confuse Google and users about what a page is attempting to rank for. If you are attempting to rank for multiple keywords, you are better off creating separate pages, each with a single H1 optimized for its respective keyword. This will help Google contextualize each page. This comes up frequently on Moz Q&A, and opinion has shifted over the years, but it is now considered best practice to use a single keyword/H1 tag per page for the best ranking potential. Hope this helps, Rob
| Toddfoster0