Author Topic: Av-Comparative Retrospective/Proactive Test May 2010  (Read 31047 times)


Offline Dch48

  • Massive Poster
  • ****
  • Posts: 3150
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #30 on: June 09, 2010, 07:40:25 AM »
It is just as important for an AV to keep up with new malware by updating its signatures as it is to detect new things. They say in the test that they used the same signature base from February that they used back then for the other test. Also notice that the version of Avast! used is not the latest one, and they admit that behavioral analysis was not used. The second fact pretty much nullifies the test results for me. If they had tested the latest version with the latest signatures, I'm sure every sample would have been detected. Even the latest version with the February signatures would probably do better. I don't think we can give these results much credibility when they used a version of Avast! that is at least two updates behind.
Avatar FX6327X desktop, FX-6300 CPU, RX 470 GPU, 8GB RAM, Windows 10 Home 64 bit
HP dv6-6140us laptop, A8-3500M APU, 8GB RAM, Windows 7 Home Premium 64 bit
RCA W101 v2 10" tablet, Intel Atom Bay Trail Z3735F processor, 2GB RAM, Windows 10 Home 32 bit

Offline sg09

  • Full Member
  • ***
  • Posts: 175
    • Current Technology Discounts
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #31 on: June 09, 2010, 08:06:52 AM »
It is just as important for an AV to keep up with new malware by updating its signatures as it is to detect new things. They say in the test that they used the same signature base from February that they used back then for the other test. Also notice that the version of Avast! used is not the latest one, and they admit that behavioral analysis was not used. The second fact pretty much nullifies the test results for me. If they had tested the latest version with the latest signatures, I'm sure every sample would have been detected. Even the latest version with the February signatures would probably do better. I don't think we can give these results much credibility when they used a version of Avast! that is at least two updates behind.
Please, I think you guys are not understanding the problem. If they had tested with the latest Avast engines, they would also have had to gather all new virus samples; otherwise the test would have no significance.
These tests show how the proactive defenses of avast and all the other AVs stood in February 2010, not now. Also, in these 3 months all the AVs have improved.
Anyone who knows how to lose can certainly learn how to win.

Offline SpeedyPC

  • Avast Evangelist
  • Massive Poster
  • ***
  • Posts: 3309
  • Avast shall conquer the whole world
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #32 on: June 09, 2010, 08:10:05 AM »
Has anybody noticed that in the Retrospective/Proactive Test May 2010 they were using the old Avast version 5.0.396 ???

I'll bet ;) the latest Avast v5.0.545 might have improved the Retrospective/Proactive result. We have to understand that not all AV testers out there keep the software up to date before testing starts. I have no doubt Avast will keep getting better each day, and Vlk may release the next Avast version soon that may cover the problem, who knows ???

The real question for everybody in here is: who do you TRUST?

My vote is Avast ;) ;D
ASUS G75VX-T4153H - Avast Premium v21.1.2449 - Avast SecureLine VPN - Avast Secure Browser - Avast Driver Updater - W8.1 64bit - Firefox 64bit - Thunderbird 64bit - MBAM Premium - Adguard Premium - CryptoPrevent Premium - Privacy Eraser - MCShield - WinPatrol PLUS - Macrium Reflect Home Edition

Offline gery

  • Full Member
  • ***
  • Posts: 139
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #33 on: June 09, 2010, 08:49:57 AM »
I think Avast is focusing on program versions, fixing plenty of bugs and adding languages, and has forgotten to focus on the very basic functions of an AV: detecting, removing and blocking.

I'm very sad to see the result.

Avast beaten by AVG, which is crap!!!
Says who? The results do not say so! Unfortunately Avast did badly, but that is not the end of the world. We should not bash other products because avast did badly. Grow up, people; stop murmuring.
AvastInternet Security- MBAM PRO

Offline coper

  • Jr. Member
  • **
  • Posts: 74
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #34 on: June 09, 2010, 10:36:54 AM »
Sorry for my English skills, but I need to write my opinion.
Avast is a good antivirus, but one big problem for me is that avast support is very slow. I often send about one virus sample per day to the avast lab, but the research takes about 1-2 days, which is too long. I send all samples to virus@avast.com. Avast support must be quicker: half a day to release an update for a sample. Avast has big problems with pdfka and fake AV detections. Nowadays very many infections focus on fake AV products and pdfka trojans. Avast must be more proactive. I am waiting for changes.

Good job to the Microsoft developers.
Microsoft Security Essentials is a good free alternative product.
« Last Edit: June 09, 2010, 10:39:01 AM by coper »

Offline kubecj

  • Avast team
  • Advanced Poster
  • *
  • Posts: 1123
    • ALWIL Software
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #35 on: June 09, 2010, 11:05:23 AM »
Testing is getting more and more problematic, and at each AV conference there are multiple papers about how to do proper testing (not that I think all of them make sense  8))

I have objections against all the AV-Comparatives tests performed, and also the AV-Test ones, but those are less 'documented', so it's hard to tell where the deficiencies lie.

The usual points about static testing are:
a) the tests are carried out long after the real infection took place, so they're kind of useless from today's point of view
b) the tests are carried out without any contextual state information. Such information - e.g. if there is a file named "document.doc   .exe" in an email, that alone is enough to ban the execution
c) the tests are carried out only with the signature engines - they don't test the other generic protection engines the products may have
d) the tests don't know anything about the relationship between the samples. If you detect the dropper, you don't have to detect the dropped binary.
e) the tests are too binary-centric and include only a small amount of script/pdf/flash malware, although these are among the main vectors of getting through to your computer.
f) there is little or no info on how the testbeds are created. All these 99.1% and similar scores are complete nonsense from my point of view. The overlap of the products' detections is not as great as the Clementi/Marx tests suggest.
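Point b) can be made concrete with a toy sketch. This is purely illustrative; the function name and the extension lists are my own assumptions, not anything an actual product uses. The idea is that a "document" extension padded with spaces in front of a real executable extension is context that justifies blocking on its own:

```python
import os

# Illustrative lists only - real products use far richer context than this.
EXECUTABLE_EXTS = {".exe", ".scr", ".com", ".pif", ".bat"}
DOCUMENT_EXTS = {".doc", ".pdf", ".txt", ".jpg", ".xls"}

def looks_deceptive(filename: str) -> bool:
    """Flag names like 'document.doc   .exe', where a benign-looking
    extension plus whitespace padding hides a real executable extension."""
    stem, ext = os.path.splitext(filename)
    if ext.lower() not in EXECUTABLE_EXTS:
        return False
    # Strip the whitespace padding used to push the real extension out of view,
    # then look at the extension that the user actually sees.
    _, visible_ext = os.path.splitext(stem.rstrip())
    return visible_ext.lower() in DOCUMENT_EXTS

print(looks_deceptive("document.doc   .exe"))  # True
print(looks_deceptive("setup.exe"))            # False
```

A plain static file scan never sees this signal, because the heuristic works on the attachment name, not the file contents.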

This is not an excuse; it's an explanation of what you really should read from the static tests. Yep, it's nice to be in the first places, but the world does not end if you're not there.
Regarding the proactive test, this is the most flawed test of them all. It does _NOT_ test the ability of the product to protect you from unknown malware. It tests the ability of the signature engines to detect the samples AV-Comparatives got in the test's timeframe. For example, what if the engine authors already had the samples and had written the detections before AV-Comparatives added them? We're back again at the 'testbed construction' problem.
Jindrich Kubec

Offline Maxx_original

  • Avast team
  • Super Poster
  • *
  • Posts: 1479
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #36 on: June 09, 2010, 11:14:05 AM »
coper: pdfka samples should be well covered, according to our internal stats... fake AV detections need to improve, that's right (but they're difficult to detect proactively - the Mystic compressor, e.g., used to wrap some rogues, brings new anti-emu tricks in each generation)..

Offline coper

  • Jr. Member
  • **
  • Posts: 74
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #37 on: June 09, 2010, 11:25:19 AM »
I think it's about the solidarity of people who want to help the avast community with virus samples.

Offline kubecj

  • Avast team
  • Advanced Poster
  • *
  • Posts: 1123
    • ALWIL Software
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #38 on: June 09, 2010, 01:21:17 PM »
Coper, regarding the pdf detections... I dug out all the pdf samples we have received since the beginning of January.

There were 10104 unique samples, scanned with the latest signatures of the command-line scanners.
If we assume that there are no false alarms, and say that a detection by any one AV means the file is real malware (I know this is an oversimplification), then 9226 files are detected.

AV            Detections    Percentage
Avast         8441          91.5%
Kaspersky     6908          74.9%
Bitdefender   6340          68.7%
NOD32         4772          51.7%
Symantec      4136          44.8%
Microsoft     4044          43.8%
Avira         2946          31.9%
AVG           2167          23.5%

Now tell me, what is your comment about our pdfka problems based on?

And this is completely 'honest', without any deliberate messing with the testbed, and using the latest signatures. Again - there is a nonzero possibility that the AVs with bad scores may have some generic anti-exploit protection techniques in their full-fledged scanners and are able to protect their customers despite the 'bad score' in my 'test'.
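The counting rule described here - a file counts as real malware if at least one scanner flags it, and each AV's percentage is then taken over that consolidated set - can be sketched with toy data. The file names, AV names, and verdicts below are made up for illustration; they are not the actual scan results:

```python
# Toy per-file results: file -> set of AVs that flagged it (illustrative only).
scans = {
    "a.pdf": {"Avast", "Kaspersky"},
    "b.pdf": {"Avast"},
    "c.pdf": set(),               # no scanner flags it -> excluded from the testbed
    "d.pdf": {"Kaspersky"},
}

# Rule: a file counts as real malware if at least one scanner detects it.
malware = [name for name, avs in scans.items() if avs]
total = len(malware)

# Per-AV detection rate over that consolidated set.
rates = {}
for av in ("Avast", "Kaspersky"):
    hits = sum(1 for name in malware if av in scans[name])
    rates[av] = hits / total

for av, rate in rates.items():
    print(f"{av}: {rate:.1%}")
```

Note how the denominator depends on which scanners participate: adding or removing one AV can change which files count as "malware" at all, which is exactly the testbed-construction problem raised above.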
Jindrich Kubec

Offline sanjose123

  • Newbie
  • *
  • Posts: 2
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #39 on: June 09, 2010, 01:28:17 PM »
I always read about Avast and Avira.

Avast is good at proactive protection but not excellent at on-demand scanning; I mean, its detection rates are not very good.
Avira is good at on-demand scanning but not good at proactive protection, which causes plenty of FPs.

I hope in the next proactive test Avast comes in the top 5 at AV-Comparatives.

Since I know Avast is not the best at detection rates, I always back it up with an anti-malware tool.

I always use Avast and MBAM.
Avast for protection.
MBAM for detecting and removing the malware which avast missed.

Offline coper

  • Jr. Member
  • **
  • Posts: 74
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #40 on: June 09, 2010, 01:42:01 PM »
It's not only based on pdfka files.
Fingers crossed for the future.

Offline Dch48

  • Massive Poster
  • ****
  • Posts: 3150
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #41 on: June 09, 2010, 09:17:13 PM »
All the results I read say Avira has a slightly higher detection rate than Avast!, but with far more FPs. I'll take the very slightly lower detection rate without the FPs any day.
Avatar FX6327X desktop, FX-6300 CPU, RX 470 GPU, 8GB RAM, Windows 10 Home 64 bit
HP dv6-6140us laptop, A8-3500M APU, 8GB RAM, Windows 7 Home Premium 64 bit
RCA W101 v2 10" tablet, Intel Atom Bay Trail Z3735F processor, 2GB RAM, Windows 10 Home 32 bit

Offline Henrique - RJ

  • Sr. Member
  • ****
  • Posts: 247
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #42 on: June 10, 2010, 12:33:30 AM »
No more sending samples to Alwil.

The samples are underutilized or even ignored.

I'm tired.

 :(

Offline DavidR

  • Avast Überevangelist
  • Certainly Bot
  • *****
  • Posts: 84586
  • No support PMs thanks
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #43 on: June 10, 2010, 02:09:18 AM »
One thing is for sure: not sending them at all will ensure there is zero possibility of anything being done.
Windows 10 Home 2004 64bit/ Acer Aspire F15/ Intel Core i5 7200U 2.5GHz, 8GB DDR4 memory, 256GB SSD, 1TB HDD/ avast! free 21.1.2449 (build 21.1.5968.561) UI-1.0.597/ WinPatrol+/ Firefox, uBlock Origin, uMatrix/ MailWasher Pro/ Avast! Mobile Security

Offline Lisandro

  • Avast team
  • Certainly Bot
  • *
  • Posts: 67255
Re: Av-Comparative Retrospective/Proactive Test May 2010
« Reply #44 on: June 10, 2010, 02:55:51 AM »
One thing is for sure: not sending them at all will ensure there is zero possibility of anything being done.
+1
Henrique, don't give up, please. I'm also interested in avast protection - as are a lot of other Brazilians - and we're seeing that Avira is ahead of avast. But I still have hope...
The best things in life are free.