Avast WEBforum

Title: AV-comparatives we are back in the game!!! [October 2012]
Post by: true indian on November 09, 2012, 03:41:23 PM
http://chart.av-comparatives.org/chart2.php

 8)

Go avast!!! If this result remains constant or gets better for the next 2 months, we will come out ahead towards the top  ;D
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: RejZoR on November 09, 2012, 04:16:33 PM
Can someone from the avast! team post what the user-dependent detections were? I mean, what kind were they? File Reputation warnings? Heuristic/behavioral detections, or detections inside the Auto Sandbox? I'm really curious to know what type they were...
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: Charyb-0 on November 09, 2012, 08:09:43 PM
Does it strike anyone else as strange that Symantec will allow PC Tools to be tested but won't agree to testing with Norton? They don't want to see their flagship consumer product get beaten in these tests.

Avast is getting better. I would also like to know what the user-dependent decisions were.
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: DavidR on November 09, 2012, 09:58:19 PM
No sign of MSE either in the tests
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: Dwarden on November 09, 2012, 11:22:42 PM
While it's nice to see improvement, I would like to point out that Trend has already scored 100% twice and Bitdefender now has 100% too

so I guess these companies are starting to leap ahead
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: Aventador on November 10, 2012, 12:19:20 AM
It is against AV-C policy to post the screenshot.
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: Lisandro on November 10, 2012, 12:42:02 AM
Avast is getting better. I would also like to know what the user-dependent decisions were.
What is BitDefender doing better?... It's always at the top... Can't we learn anything? Is it "only" hard work?
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: Aventador on November 10, 2012, 01:42:12 AM
I used BD for a couple of months. I got a license for only $6. It's a great AV but can be heavy at times. What Avast needs to work on is the user interaction. BD does EVERYTHING on its own.
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: true indian on November 10, 2012, 06:42:26 AM
I used BD for a couple of months. I got a license for only $6. It's a great AV but can be heavy at times. What Avast needs to work on is the user interaction. BD does EVERYTHING on its own.

Avast is improving... we have come up since last month, when we had a lot of user-dependent results; this month we lowered the user dependency, as the AV-C report shows. As for my clients, nowadays they hardly even worry: they just click the close button on the Auto Sandbox pop-up, and they have no problem hitting "Abort connection" for low-reputation downloads. ;D

Also, I have noticed that the sandbox analysis has begun to work well lately  ;)

What is BitDefender doing better?... It's always at the top... Can't we learn anything? Is it "only" hard work?

I've tested BD many times... it's all their cloud blocking my malware links... it can't detect sh*t without its cloud, in my experience  :o
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: ajey on November 10, 2012, 08:27:17 AM
Yeah, it's nice to hear that avast! can go beat the others and protect us better :)  ;)
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: true indian on November 10, 2012, 09:25:49 AM
Yeah, it's nice to hear that avast! can go beat the others and protect us better :)  ;)

AV-C and other testing organizations cannot replicate real-life usage, where an infection follows a chain: malicious JS/PDF >> malicious website >> malicious files... In such cases, avast is A+ at detecting malicious JS/PDFs, malicious sites, etc.  8)

Moreover, AV-C offers no proof of what they actually tested... and hence we don't know what happens in the background  ;)

It's also funny how AV-C and the other testing organizations always show the big dogs as top performers... malware changes daily and every AV has its bad days, so even the big dogs fall down... yet in these tests they are always pushed forward... something fishy indeed.  8)

Don't forget these statements given earlier by the avast! team:

The testing is getting more and more problematic. And at each AV conference there are multiple papers about how to do proper testing (not that I think all of them make sense  8))

I have objections to all the AV-Comparatives tests performed, and also to AV-Test's, but those are less 'documented', so it's hard to tell where the deficiencies lie.

The usual points about static testing are:
a) the tests are carried out long after the real infection took place, so they're kind of useless from today's point of view
b) the tests are carried out without any context/state information. Such information matters: if a file named "document.doc   .exe" arrives in an email, that alone is enough to ban its execution (see the sketch after this quote)
c) the tests are carried out using only the signature engines - they don't test the other, generic protection engines the products may have
d) the tests don't know anything about the relationships between the samples. If you detect the dropper, you don't have to detect the dropped binary.
e) the tests are too binary-centric and include only a small amount of script/PDF/Flash malware, although these are among the main vectors for getting through to your computer.
f) there is little or no info on how the testbeds are created. All these 99.1%-style scores are complete nonsense from my point of view. The overlap of the products' detections is not as great as the Clementi/Marx tests suggest.

This is not an excuse; it's an explanation of what you really should read from the static tests. Yep, it's nice to be in first place, but the world does not end if you're not there.
Regarding the pro-active test, this is the most flawed test of them all. It does _NOT_ test the ability of the product to protect you from unknown malware. It tests the ability of the signature engines to detect the samples AV-Comparatives got in the test's timeframe. For example, what if the engine authors already had the samples and wrote the detections, and AV-Comparatives added them later? We're back again at the 'testbed construction' problem.
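
To illustrate point (b) above: a minimal sketch in Python (a toy example, assuming nothing about avast's actual engine; all names here are made up) of how a context heuristic could flag a padded double extension:

import re

# Toy heuristic: a filename that hides an executable extension behind a
# document extension padded with whitespace (e.g. "document.doc   .exe")
# is suspicious on its own, before any signature is consulted.
DISGUISED_EXE = re.compile(
    r"\.(doc|docx|pdf|txt|jpg|xls)\s+\.(exe|scr|com|bat|pif)$",
    re.IGNORECASE,
)

def looks_disguised(filename: str) -> bool:
    """Return True if the name pads a document extension in front of a
    real executable extension."""
    return DISGUISED_EXE.search(filename) is not None

# An email-attachment check could ban execution on this alone:
for name in ("report.pdf", "document.doc   .exe", "photo.jpg  .scr"):
    print(name, "->", "BLOCK" if looks_disguised(name) else "allow")

The point is that this kind of context (where the file came from, how its name is constructed) never shows up when a static test just scans a folder of samples.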
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: Lisandro on November 10, 2012, 11:57:43 AM
Cloud detection? So what? If it gives 100 per cent...
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: true indian on November 10, 2012, 12:03:53 PM
Cloud detection? So what? If it gives 100 per cent...

What about the many users who hardly ever come online?? Like many people here in India  ???
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: Lisandro on November 10, 2012, 12:40:12 PM
If you're not online, you most probably won't get infected.
By the way, HOW THE HELL DID THEY GET USER INTERACTION?
I can never get any...
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: DavidR on November 10, 2012, 12:52:03 PM
If you're not online, you most probably won't get infected.
<snip>

How do you think those who haven't got Internet get programs? USB/CD/DVD, etc. Those still need to be covered, as do the programs as they are installed.

However, the important part of true indian's post is "what about many users who hardly come online", so they still require full protection.
Title: Re: AV-comparatives we are back in the game!!! [October 2012]
Post by: iroc9555 on November 10, 2012, 02:36:06 PM
A Dell forum member pointed out that MSE is not in the test because the AV-C Real-World Protection test is done with full security suites.

Also, Norton has refused to participate since a year ago because of some impasse with AV-C over how Norton should be configured for the test.