Subdomains can be seen as a separate site altogether from the main domain, so I would recommend keeping the blog in a directory within the main site.
Hope this helps some.
This is a very vague, open-ended question. I would suggest reading the Beginner's Guide to SEO from Moz.
I would turn and run if someone offered to build backlinks for $5. Instead, try leveraging the Fiverr community to illustrate some graphics for your content piece, depending on what you are writing about.
Also, take a look at Thumbtack and Upwork; you can hire higher-quality freelancers on those platforms to help create high-quality content. I would invest heavily in creating quality content and look for ways to leverage and amplify it to build backlinks, rather than paying someone on Fiverr to do it.
Think about researching influencers within your target market who can amplify your content and build backlinks. Hope this helps you out some.
The first thing I would do is work on driving organic traffic and look at conversion rate optimization along with some user testing. Depending on your budget, I would also look at doing some paid advertising, assuming you have already done keyword research. Moz has a good article on conversion rate optimization. You could also try Peek user testing to get free feedback on the usability of your website, and look at building out unique content around specific travel destinations.
Hope this helps some.
If possible, you want to redirect each page to its closest equivalent where applicable. It is considered best practice not to do a blanket redirect to the homepage; however, if there is no other relevant page, then the homepage works just fine. Think about what is best for the user. Moz does a wonderful job explaining redirects in this article.
Also, are you asking about a rise in PageRank for the remaining pages?
A good rule of thumb for SEO is to write your pages for users, not search engines. If it seems logical and beneficial to the reader to link to second- and third-level pages, it shouldn't be an issue. If it makes sense and improves usability, I would not worry about it.
You could very easily run into a duplicate content issue if you are putting content on your website that is identical to other websites'. I would highly recommend reading Moz's duplicate content post. You are basically paying to have someone ruin your SEO.
I would recommend finding another company that will provide you with unique content so you do not incur any duplicate content issues.
Depending on what CMS you're using, you should be able to add a meta noindex tag sitewide fairly easily. With WordPress, there's an option in the back end under Settings → Reading, "Discourage search engines from indexing this site," that should apply a meta noindex tag sitewide.
If you're not on WordPress, you should be able to edit your code and apply a noindex sitewide. You might need a tool like FileZilla to set up FTP access and edit your header file directly.
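If you are editing templates by hand, the tag itself is a one-liner placed in the `<head>` of every page (a minimal sketch):

```html
<!-- Tells compliant crawlers not to index this page -->
<meta name="robots" content="noindex">
```

Since it sits in a shared header template, one edit covers the whole site.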
Cheers 
Google is a great tool for this. Example queries: blogs AND intext:"marketing" or blogs AND inurl:"marketing".
Hope that helps some
First off, I would not recommend spinning any of the articles at all; that will appear deceptive and very spammy. Instead, look into repurposing existing content for your SEO clients rather than spinning it, for example by adding something unique to a previous blog post or article.
Second, as for the archived e-newsletter: if it appears on multiple sites, you are going to run into a duplicate content issue. If you're curious, you can add your hosting client's website as a Moz campaign and crawl it, then see how many instances of duplicate content you have.
Hope that helps some.
One thing I would check is that you have correctly claimed your local citations. You can use the Moz Local tool to run a quick scan, check for accuracy, and make sure everything has been claimed. I would also recommend running a Moz crawl, because when I ran a quick crawl a lot of pages were showing a "429" status code (Too Many Requests). This could be a server issue. Here is a previous Moz thread on this issue: https://moz.com/community/q/429-errors.
Hope that helps some.
I would block those pages in the robots.txt file or with a meta noindex tag. Keep in mind that if robots.txt blocks a page, crawlers can't fetch it to see a noindex tag, so pick one approach per page. Moz has a good article explaining the robots.txt file and how to block access to certain pages.
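For the robots.txt route, the rule is just a couple of lines (a sketch, assuming the pages live under a hypothetical /private/ directory):

```
User-agent: *
Disallow: /private/
```

This stops compliant crawlers from fetching anything under that path.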
I would recommend checking their PageRank and spam score for starters. However, whatever links are duplicated, I would nofollow.
Are these clients going in a portfolio section of your client's site? I would consider the user experience and think about building content around each client, with a link and a description of services rendered, so it isn't just a bunch of links.
Changing the stylesheet wouldn't have an effect on SEO in itself, but it could affect user engagement, which could reduce ranking positions over time. I would recommend some heavy user testing to make sure there are no negative consequences to switching up the site's aesthetics. You can try Peek user testing; they offer a free service.
As for the second issue: if condensing the menu items makes logical sense for the end user and helps the user experience, it shouldn't have any negative effect on SEO. Just remember to design with the end user in mind.
You could see a drop in rankings. It is difficult for a single web page to be relevant for a large set of keywords.
Did you submit a nested XML sitemap? You could segment your sitemaps by category, then check whether there are noindex tags, robots.txt rules blocking a section, or pages that are simply too deep in the site architecture.
For now, I would check whether anything has a noindex tag and review your robots.txt file.
Hope that helps some.
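A nested sitemap is just a sitemap index pointing at per-category sitemaps, which makes it easy to see which section isn't getting indexed (a sketch; the URLs are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-blog.xml</loc></sitemap>
</sitemapindex>
```

Search Console then reports indexation per child sitemap, so a blocked or noindexed section stands out.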
Matt Cutts has said "Facebook and Twitter posts are treated like any other web pages for search." As of now, Google does not use social signals such as Facebook or Twitter followers for search ranking.
On the other hand, if you are getting a lot of related questions around your products, it could be worth building content around some of those recurring questions and adding it to your website in an FAQ style. Then you could engage with people on your social platforms and refer them back to your site.
Yes, each location should have its own Google My Business page, Bing Places for Business listing, etc. However, I would probably keep just one Facebook page unless you have the time to manage a social media page for each location. The address needs to match up for each location as well. Moz's local search center has some good resources to help get you started.
Hope that helps some.
I would recommend using Screaming Frog to crawl only product-level pages and export them to a CSV or Excel doc, then copy and paste your XML sitemap into an Excel sheet. From there, clean up the sitemap, sort it by product-level pages, and compare the two side by side to see what is missing.
The other option is to go into Google Webmaster Tools (Search Console), look at Google Index → Index Status, click the Advanced tab, and see what is indexed and what is being blocked by robots.txt.
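Instead of eyeballing the two lists in Excel, the comparison can be scripted. A minimal sketch (the file names and the "Address" column are assumptions based on a typical Screaming Frog export):

```python
# Sketch: diff a Screaming Frog crawl export against an XML sitemap to find
# product URLs missing from the sitemap. File names below are hypothetical.
import csv
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_crawl(path):
    """Read the 'Address' column from a Screaming Frog CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Address"].strip() for row in csv.DictReader(f)}

def urls_from_sitemap(path):
    """Collect every <loc> URL from an XML sitemap file."""
    root = ET.parse(path).getroot()
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def missing_from_sitemap(crawled, listed):
    """Return crawled URLs that the sitemap does not list, sorted."""
    return sorted(crawled - listed)

if __name__ == "__main__":
    crawled = urls_from_crawl("internal_html.csv")      # hypothetical export
    listed = urls_from_sitemap("sitemap-products.xml")  # hypothetical sitemap
    for url in missing_from_sitemap(crawled, listed):
        print(url)
```

Each URL it prints is a crawlable product page your sitemap doesn't mention.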
I would recommend using Screaming Frog to crawl your site, pull a report of all your internal links into Excel, and keep tabs on them. It is fairly easy to use, and you get information on anchor text, destination page, etc.
Hope this helps some.