Those are not the bots I was referring to. Search engine bots identify themselves as such in the user-agent string, so they can easily be filtered out by any statistics software.
The bots that don't identify themselves make up the vast majority of visitors to any site that accepts user-submitted content, which means just about any site with a web form: a comments form, contact form or order form. I am of course referring to spam bots. These bots pretend to be legitimate users and can't be filtered out by server-side stats. That is why you get inflated figures in server-side statistics software but not in client-side stats software.
I also said "all sorts of bots". Here is a list of bots that my server-side stats filtered out just yesterday, including a bunch of UNKNOWN BOTS and SPAM BOTS.
Unknown robot (identified by 'bot*')
BaiDuSpider
Googlebot
Unknown robot (identified by 'crawl')
Unknown robot (identified by 'robot')
Yandex bot
MSNBot
Unknown robot (identified by empty user agent string)
Unknown robot (identified by '*bot')
Unknown robot (identified by hit on 'robots.txt')
BSpider
Unknown robot (identified by 'spider')
SeznamBot
Mail.ru bot
WordPress
Turn It In
Java (Often spam bot)
Nutch
MJ12bot
Sogou Spider
legs
ichiro
Alexa (IA Archiver)
FaceBook bot
MSNBot-media
Python-urllib
Exabot
Perl tool
Yahoo Slurp
BlogPulse ISSpider intelliseek.com
Phantom
NG 1.x
SurveyBot
Netcraft
WGet tools
larbin
Powermarks
W3C Validator
CFNetwork
As you can see, AWStats and other server-side stats most certainly DO filter out all sorts of robots.
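The kind of user-agent filtering behind that list can be sketched in a few lines. This is a simplified illustration only: the patterns below mirror the identification rules shown above ('bot', 'crawl', 'spider', empty user-agent string), not AWStats' actual robots database.

```python
import re

# Illustrative bot patterns based on the identification rules in the list
# above; AWStats' real rule set is much larger.
BOT_PATTERNS = re.compile(r"bot|crawl|spider|slurp", re.IGNORECASE)

def is_bot(user_agent: str) -> bool:
    """Flag a hit as a robot if the UA is empty or matches a known pattern."""
    if not user_agent or user_agent == "-":
        return True  # empty user-agent string: treated as an unknown robot
    return bool(BOT_PATTERNS.search(user_agent))

hits = [
    "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "-",
    "Sogou web spider/4.0",
]
human_hits = [ua for ua in hits if not is_bot(ua)]
print(len(human_hits))  # prints 1: only the real browser survives
```

Note that this is exactly why spam bots slip through: they send a user-agent string that looks like a real browser, so no pattern match catches them.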
The only bot that I know of that bothers to execute JS is Googlebot. As I pointed out, Googlebot and other search engine bots are easily identified and filtered out by all analytics software.
Then you haven't looked very hard.
Here is an article from May 2008 (4 years old!) about bots executing JavaScript and screwing up analytics numbers.
Here is another article from June 2012 about bots showing up in analytics.
If you think your analytics are accurate because no bot executes JavaScript, you are mistaken. Lots of bots execute and parse JavaScript.
I guarantee that you have bots being counted as visitors.
There's also a ton of services out there that run as "real browser instances" (Keynote, Gomez, AlertSite, Pingdom, etc.). These will also be counted by both analytics and AWStats unless filtered out.
Finally, people like you who block analytics are in the minority: less than 0.1% of users. Most users are not very technical; if you've ever run a successful website, you would know that. For example, many visitors end up on a site by googling the URL! They don't even know how to use the address/location bar of their web browser.
Of course I realize this, but NoScript and AdBlock Plus (and similar extensions) are a lot more common than you think; it's more like 3% than 0.1%.
Sites like mine that cater to a technical audience have a higher percentage of people blocking JavaScript, which makes analytics even less accurate for me. I would rather have ALL the data than MISS data because analytics is blocked or doesn't load for various reasons.
AWStats is based on log files and is intended to be a log-analysis tool, while Google Analytics is intended to measure website performance. AWStats doesn't analyze visitors as such, while Analytics tries very hard to show you real visitor numbers. They have similar but slightly different uses and purposes: AWStats is meant more for system admins and is more technical; Google Analytics is meant more for marketers and business analysts.
Google Analytics will never be as accurate as AWStats because of how it works (JavaScript reporting to a third-party server). It simply doesn't have ALL THE DATA.
Neither of them will be 100% accurate, and both will have bots counted as visitors. If you're that anal about your stats, you really should USE BOTH, or run an analytics-type program against your server logs, like Piwik or Mint.
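The log-based approach those tools take can be sketched like this. It's a hypothetical, stripped-down example assuming the standard Apache/nginx "combined" log format, with made-up sample lines and a deliberately crude bot filter (real tools maintain a full robots database):

```python
import re

# Regex for the Apache/nginx "combined" log format:
# ip ident user [time] "request" status bytes "referrer" "user-agent"
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def unique_visitors(log_lines):
    """Count distinct client IPs, skipping obvious bots by user agent."""
    visitors = set()
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # malformed line, skip
        agent = m.group("agent")
        if not agent or "bot" in agent.lower() or "spider" in agent.lower():
            continue  # crude bot filter; real tools use a robots database
        visitors.add(m.group("ip"))
    return len(visitors)

# Made-up sample log lines for illustration.
sample = [
    '1.2.3.4 - - [01/Jun/2012:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [01/Jun/2012:10:00:01 +0000] "GET /robots.txt HTTP/1.1" 200 68 "-" "Googlebot/2.1"',
]
print(unique_visitors(sample))  # prints 1: the Googlebot hit is excluded
```

Because it reads the raw logs, this approach sees every request, including hits from clients that block or fail to load a JavaScript tracker, which is the whole point of running both kinds of tools.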