Questions
Is it possible to predict the future DA of a site?
Hi there, Donna nails it here. You can't predict DA in the future. The best you can do is make sure your on-site and off-site SEO is on point and that you are following the best practices put forth by Google. There is a great resource from Moz on Domain Authority and what you can be doing to make sure you are taking the proper steps toward a higher DA in the future. Best practices + time = higher DA. That's the best prediction you'll get, but at least the future is bright! Hope this helps in addition to Donna's answer - good luck! Patrick
Intermediate & Advanced SEO | | PatrickDelehanty0 -
What are the benefits of getting 200+ links from DA 40+ non-relevant websites?
Sounds really spammy. In my opinion, a single link from the prestigious national organisation may serve them better.
Link Building | | TheZenAgency2 -
Google Search Console - Indexed Pages
Yes, I've seen the same thing. I've got a client with a small site of only 6 pages. The Sitemaps report shows all 6 have been indexed. Then I look at the Index Status report and find 46 pages indexed. 46 pages indexed on a site with only 6 pages! So this seems to confirm your comments, Logan. The question I have is: how can I get a list of all the indexed pages?
Intermediate & Advanced SEO | | muzzmoz0 -
CPC of Adwords Remarketing for Search (RLSA)
It is not priced exactly the same. They’ll charge based on the bid adjustment applied to the RLSA list (e.g., increase by 10% on the RLSA list) or based on the bids you apply for a specific RLSA campaign.
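As a rough illustration of how a percentage bid adjustment on an RLSA list plays out (the numbers here are made up, and this is not Google's actual billing logic):

```python
def adjusted_bid(base_bid: float, adjustment_pct: float) -> float:
    """Apply a percentage bid adjustment (e.g. +10%) to a base CPC bid."""
    return round(base_bid * (1 + adjustment_pct / 100), 2)

# A $1.00 base bid with a +10% adjustment for users on the RLSA list
# effectively bids $1.10 for that audience:
print(adjusted_bid(1.00, 10))  # 1.1
```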
Paid Search Marketing | | erikabarbosa0 -
Is there a tool that will automatically post tweets to a list of users
It does exist, but it is often considered spamming! Be careful what you are doing!
Online Marketing Tools | | Stramark0 -
Is there a tool that will find a string from the source code from a list of URLs?
Cheers Dirk. I didn't know Screaming Frog could do that. Works perfectly! Screaming Frog is just the gift that keeps on giving.
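If you'd rather script it than use Screaming Frog, a minimal Python sketch could look like this (the URLs and search string below are placeholders, not from the question):

```python
import urllib.request

def pages_containing(urls, needle):
    """Fetch each URL and return those whose source contains `needle`."""
    hits = []
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                source = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable pages
        if needle in source:
            hits.append(url)
    return hits

# Example (placeholder values):
# pages_containing(["https://example.com/"], "UA-12345")
```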
Behavior & Demographics | | richdan0 -
Would merging a site with strong DA with one that has weak DA be a smart move?
Since DA and PA are rather logarithmic, if there is a small difference in DA or PA between the sites, then merging them will be a nice benefit. If there is a big difference in DA and PA, the merger will probably not make the stronger site more competitive.

An important thing to look at is the number of links that these two sites share. If they have very similar linking domains and linking pages, then not much will be gained by the merger. If they have diverse and very different linking domains and linking pages, that is when the most will be gained, based upon link metrics only.

Another important thing to consider is the traffic-producing assets. Does the site being redirected have unique, substantive, well-written and well-ranking articles? If it has lots and they are supreme quality, then it will be a good asset gain for the site that receives them.

Finally, will the site being 301ed contribute new products, new keyword reach, improved rankings? These are what might be improved on the weaker site if they are placed on the stronger site. Also, will the merger give current shoppers a greater selection of products? Greater selection usually means larger average shopping carts.
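To put a rough number on the "shared links" point, you could export each site's linking root domains from a tool like Open Site Explorer and compute their overlap. A minimal sketch, with made-up domains:

```python
def linking_domain_overlap(domains_a, domains_b):
    """Jaccard overlap of two sets of linking root domains:
    0.0 = completely distinct link profiles (most to gain from a merge),
    1.0 = identical profiles (little to gain)."""
    a, b = set(domains_a), set(domains_b)
    if not (a | b):
        return 0.0
    return len(a & b) / len(a | b)

site_a = ["example.org", "news.example", "blog.example"]
site_b = ["news.example", "shop.example"]
print(linking_domain_overlap(site_a, site_b))  # 0.25
```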
Technical SEO Issues | | EGOL0 -
Why would a business want to cap their Adwords budget?
Thanks Jasmine. Great answer, and one to which I can completely relate. I have been consulting now for approximately five years. One of the things I have found interesting is the degree to which so many businesses hamper their own growth through pointless bureaucracy, internal politics and employees who put their personal career progression above the needs of their employer.
Paid Search Marketing | | richdan3 -
What are the negative implications of listing URLs in a sitemap that are then blocked in the robots.txt?
I highly doubt it would affect rankings due to low-quality issues, but it will show sitemap error warnings in your GWT console. That issue is technically classified as 'Warnings' and not 'Errors'. The right thing to do in that scenario is to take the robots.txt block off and just use a 'noindex' tag on the pages. That way they can stay in the sitemap, but they won't show up in the index. Otherwise, you should remove them from the sitemap if you don't want the warnings in GWT.
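If you want to audit this yourself, you can check each sitemap URL against the site's robots.txt with Python's standard library. A sketch with made-up rules and URLs:

```python
from urllib.robotparser import RobotFileParser

def blocked_sitemap_urls(robots_txt: str, sitemap_urls):
    """Return the sitemap URLs that robots.txt disallows for all crawlers."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in sitemap_urls if not parser.can_fetch("*", u)]

robots = """User-agent: *
Disallow: /private/
"""
urls = ["https://example.com/", "https://example.com/private/page.html"]
print(blocked_sitemap_urls(robots, urls))
# ['https://example.com/private/page.html']
```

Any URL this returns is one you'd either unblock (and noindex) or drop from the sitemap, per the advice above.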
Technical SEO Issues | | Dezzign0 -
How do I redirect the Author archive page in Wordpress?
I agree with Ryan. That's great advice. One more thing I would add is to go ahead and proactively add some high-authority backlinks to the "about me" page. You want to ensure its Page Authority is higher than that of any other pages on the site that may outrank your author archive page, just to make sure the pages you want ranking first are where they should be in terms of your website's Page Authority. Hope that helps a little further, Joe
Technical SEO Issues | | jlane90 -
Why can no tool crawl this site?
I would look into finding a method to redirect via your server rather than with JavaScript. This will ensure that bots can properly crawl your site. I would also add hreflang tags, which should help Google with the multiple language versions of the site. Also, in the short term, you may want to do something like add a link or a delayed meta refresh, just in case someone either has JavaScript disabled or is using a script-blocking extension. This will make sure they at least see something instead of a blank page.
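To illustrate the server-side approach (as opposed to the JavaScript redirect), here is a hedged sketch of the decision logic: pick a language version from the request's Accept-Language header and answer with an HTTP redirect, so crawlers and script-less browsers both get a real response. The paths and language codes are assumptions, not from the site in question:

```python
SUPPORTED = {"en": "/en/", "fr": "/fr/", "de": "/de/"}
DEFAULT = "/en/"

def language_redirect(accept_language: str):
    """Return (status, Location) for a server-side language redirect.
    302 is used because the target varies per request header."""
    for part in (accept_language or "").split(","):
        code = part.split(";")[0].strip().lower()[:2]
        if code in SUPPORTED:
            return 302, SUPPORTED[code]
    return 302, DEFAULT

print(language_redirect("fr-CA,fr;q=0.9,en;q=0.8"))  # (302, '/fr/')
```

The same logic would slot into whatever server framework the site runs; the point is that the redirect happens in the response headers, not in client-side script.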
Technical SEO Issues | | spencerhjustice0 -
Where can I get a list of broken links to my client's website?
Hello, If I understand correctly, you want to get a list of websites linking to internal pages of your client's website (which are now broken 404s), so you can benefit from that link juice by doing some internal 301 redirects. I have gone through such issues before, and there is more than one way to get those links:

1. Google Webmaster Tools: This is the easiest one, I think. Go to Google WMT ---> Traffic ---> Links to your site ---> You will find a list of top linked pages at top left (screenshot here: http://i.imgur.com/fFoLEJC.png). Click on "More" (I have it in French here: "plus"). You will get a list of the most popular pages on your site. Click on those pages one by one, and you will get the external inbound links to each page.

2. Open Site Explorer from Moz.com: Fetch the inbound links to that domain, then click on the "Top Pages" tab. You will have a list of internal pages that received links and social shares. Export this to a CSV file. Then take those URLs one by one and fetch their inbound links using Open Site Explorer again. You will get a list of domains/pages linking to those internal pages.

3. Use the cognitiveSEO tools. They have a filter by URL, so you can exclude the homepage and look for inbound links to the remaining pages. But their index is limited (my own opinion).

4. Majestic SEO tools: great index, easy-to-use filters, so you can exclude the homepage and look up the rest.

Hope it helped. Regards
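Once you have the list of linked internal URLs from any of the tools above, a quick script can tell you which ones now return 404 and need a 301. A sketch with the status lookup injectable so the example runs without network access (the helper names here are assumptions, not part of any tool's API):

```python
import urllib.error
import urllib.request

def http_status(url: str) -> int:
    """Return the HTTP status code for a URL."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def broken_urls(urls, status_fn=http_status):
    """Return the URLs responding 404 -- candidates for 301 redirects."""
    return [u for u in urls if status_fn(u) == 404]

# Example with a stubbed status function (no network needed):
statuses = {"/old-page": 404, "/current-page": 200}
print(broken_urls(statuses, status_fn=statuses.get))
# ['/old-page']
```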
Link Building | | rikano0