What runelord999 should have added was this section found in one of the links:
QUOTE:
Did your favorite antivirus perform poorly in this test? There are a lot of factors that could have caused this:
Each AV program uses a different virus database, some containing more malware signatures than others, which means some AV programs will have higher detection rates than others.
Each AV company has its own interpretation of what constitutes malware. Some companies want their product to target primarily viruses and worms, Trojans and exploits to a lesser degree, and spyware, hijackers, and adware to an even lesser degree (or not at all). For example, if you look at the attached chart, you may notice that several of the tested AV programs miss a significant number of the Trojans.
Considering there are roughly 100,000 (or more) unique infections in the wild, a sample of 758 infected files may not accurately represent the true detection rates of AV programs.
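To put that sampling caveat in rough numbers: a detection rate measured on 758 files carries statistical uncertainty of a couple of percentage points either way, so closely ranked products may not be meaningfully different. A minimal sketch, using the standard Wilson score interval for a binomial proportion; the detection count of 700 is purely hypothetical, not from the test:

```python
import math

def wilson_interval(detected, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion.

    detected: number of samples the AV product flagged
    n: total sample size (758 in the test discussed above)
    z: normal quantile, 1.96 for a ~95% interval
    """
    p = detected / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical product detecting 700 of 758 samples (~92.3%):
low, high = wilson_interval(700, 758)
print(f"measured rate {700/758:.1%}, 95% CI {low:.1%} to {high:.1%}")
```

With 758 samples the interval is roughly two percentage points wide on each side, which is why small gaps between products in such a test don't prove one is better.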
Could poor detection of certain AV programs be due to ‘zoo’ viruses in this test sample?
Not likely. First of all, many AV programs do detect zoo viruses. Second, all of these test files were obtained from the 'wild', meaning they exist outside of laboratories and have (unfortunately) been released into the real world. 'Zoo' viruses are proof-of-concept or otherwise unreleased viruses and generally do not exist outside of controlled laboratories. There are no known zoo viruses in these tests.
These tests are NOT meant to determine which AV software is superior; this is just a test on 758 POSSIBLE Trojan-, backdoor-, and virus-infected files. END QUOTE
Let's be fair when posting such AV evaluations in the future.