AV Engine Detection Rating Comparisons?

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
Are there any good, reliable sources that publish detection rating comparisons on a regular basis?

I've seen a few detection rating comparisons in the past, but I can't find any at the moment.

I was hoping places like West Coast Labs, ICSA, or Virus Bulletin would have something like this.
 

mechBgon

Super Moderator / Elite Member
Oct 31, 1999
30,699
1
0
Originally posted by: DasFox
Originally posted by: Schadenfroh
Please see AntiVirus (malware) Detection Rate Thread

Thanks, I've seen all these places on the Web already. I'm trying to find some new detection ratings for KAV7 and NOD32 3.0; there don't seem to be any yet.

If you wish, you can make your own detection ratings:

Set up a test computer, thoroughly infect it by running Trojans and visiting malicious websites, collect all the malware files (and the exploits which got them onto the system), upload them to VirusTotal.com as quickly as practical, and see what the detection rates really are. If you think NOD32 3.0 might have better detection rates, take the samples that VirusTotal's 2.0 engine missed and scan them with NOD32 3.0 afterwards.
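If you want to automate the tallying part, here's a rough Python sketch of that VirusTotal step. Caveat: it assumes VirusTotal's v2 file/report web API, the requests library, and an API key, none of which is anything from this thread, so treat it as illustrative only:

```python
# Rough sketch only: assumes VirusTotal's public API v2 and an API key.
# Endpoint and field names are from the v2 docs as I remember them;
# verify before relying on this.
import hashlib
import time
import requests

API_KEY = "YOUR-API-KEY"  # hypothetical placeholder
REPORT_URL = "https://www.virustotal.com/vtapi/v2/file/report"

def detection_rates(sample_paths):
    """Return {path: (engines_detecting, engines_total)} for each sample."""
    results = {}
    for path in sample_paths:
        # Query by hash so already-known samples don't need re-uploading.
        with open(path, "rb") as f:
            sha256 = hashlib.sha256(f.read()).hexdigest()
        resp = requests.get(REPORT_URL,
                            params={"apikey": API_KEY, "resource": sha256})
        report = resp.json()
        if report.get("response_code") == 1:  # 1 = VT has a report on file
            results[path] = (report["positives"], report["total"])
        time.sleep(15)  # stay under the public-API rate limit
    return results
```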

Regarding signatures and heuristics, it might not make much difference. Until recently, the Kaspersky engine at VirusTotal was a 4.something engine, not even 5 or 6, for example. They went straight to 7 recently. I haven't noticed any major improvement in detection rate since then.

 

lusher

Member
Aug 17, 2007
86
0
0
Originally posted by: mechBgon


Set up a test computer, thoroughly infect it by running Trojans and visiting malicious websites, collect all the malware files (and the exploits which got them onto the system), upload them to VirusTotal.com as quickly as practical, and see what the detection rates really are. If you think NOD32 3.0 might have better detection rates, take the samples that VirusTotal's 2.0 engine missed and scan them with NOD32 3.0 afterwards.

Heh.

Regarding signatures and heuristics, it might not make much difference. Until recently, the Kaspersky engine at VirusTotal was a 4.something engine, not even 5 or 6, for example. They went straight to 7 recently. I haven't noticed any major improvement in detection rate since then.

Everyone knows that Kaspersky's conventional heuristics are not that strong. They rely on fast updates, a good static unpacker, and the Proactive Defense module (not the same as the usual heuristics people talk about), which can't be tested by merely scanning on demand.

 

mechBgon

Super Moderator / Elite Member
Oct 31, 1999
30,699
1
0
Everyone knows that Kaspersky's conventional heuristics are not that strong. They rely on fast updates, a good static unpacker, and the Proactive Defense module (not the same as the usual heuristics people talk about), which can't be tested by merely scanning on demand.

I generally agree; I don't see the heuristics come into play very often on KAV7. Then again, that could be partly because their stronger-than-average signature-based detection leaves fewer opportunities. Maybe tonight I'll take a few months' worth of malware, roll back KAV7's databases as far as possible, and run a scan on those files to see how the heuristics do when the signatures are out-of-date.

 

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
Originally posted by: John
Read this page first. Now look at the chart.

So how good is the quality of information for this chart? Is it possibly one of the better detection rating charts out there?

Personally, all the AV apps that are rated higher than KAV and NOD32 I've never seen before, nor would I ever consider them to have a better detection rating.

Hmm
 

lusher

Member
Aug 17, 2007
86
0
0
Originally posted by: mechBgon
Everyone knows that Kaspersky's conventional heuristics are not that strong. They rely on fast updates, a good static unpacker, and the Proactive Defense module (not the same as the usual heuristics people talk about), which can't be tested by merely scanning on demand.

I generally agree; I don't see the heuristics come into play very often on KAV7. Then again, that could be partly because their stronger-than-average signature-based detection leaves fewer opportunities. Maybe tonight I'll take a few months' worth of malware, roll back KAV7's databases as far as possible, and run a scan on those files to see how the heuristics do when the signatures are out-of-date.

A retrospective test is done by AV-Comparatives, as you already know. For your original test, you were carrying out a different test from AV-Comparatives', based on a smaller sample of "new" malware (as far as I can make out from your scanty methodology details, anyway), so comparisons are not possible.

But if you are going to roll back signatures, you are actually duplicating the retrospective tests, just with a smaller sample. So to some degree the results you get from this might be comparable.
 

lusher

Member
Aug 17, 2007
86
0
0
Originally posted by: DasFox
Originally posted by: John
Read this page first. Now look at the chart.

So how good is the quality of information for this chart? Is it possibly one of the better detection rating charts out there?

Personally, all the AV apps that are rated higher than KAV and NOD32 I've never seen before, nor would I ever consider them to have a better detection rating.

Hmm

What you don't consider is that the higher-rated ones are also known to generate more false positives (FPs) due to aggressive heuristics... I could score 100% if I detected every file as malware.

In those charts, we are not talking about run-of-the-mill malware randomly selected from the set of all malware. The sample instead consists of malware that has actually been reported as having infected users' computers, a number of which are likely to be protected by AVs. (CastleCops will then upload it to OTC.)

This means we are talking about a self-selected sample of malware that has evaded at least one AV, so it is clearly a tough test where most of the samples are likely to be new (something like what mechBgon is doing in his test, but even harder, since in his tests he is using infections found on unprotected machines/honeypots, I think).

This kind of test of new, unknown malware favours antiviruses that have very speculative heuristics... but such heuristics usually come with more FPs.

WebWasher is basically Antivir plus one of their own engines, and because it is deployed at the server level, it can be more aggressive.

Antivir has very aggressive heuristics, particularly if you turn them on. A lot of its detections are packer-based, e.g. files are flagged regardless of content as long as they are packed with a specific packer plus some other generic feature. This solves a lot of problems but creates new ones with FPs.

Fortinet and Ikarus tend to detect *everything*, including clean files.

Panda has a poor reputation, but my experience with them is that they are pretty underrated; of all the AVs, they have the best blend of technologies.

In short, all the tests have different methodologies, different conditions, etc., so they are not directly comparable. And which test is most useful depends on what you are looking for.

IMHO, the most important thing is to look at the test set. How did they find/choose the samples? What is the sample size?

Most test results vary because these conditions differ so much between tests.

What steps did they take to ensure the sample consists of real, functioning malware as opposed to junk files?

What types of malware were chosen? Trojans, worms, rootkits, adware, keyloggers, etc.? Polymorphic malware?

How was the AV tested? What settings were used?

Generally, when I look at most of the test results, they "make sense" to me, based on my personal experience, my reading about the antiviruses, and my understanding of the strengths and weaknesses of each AV.

An understanding of statistics is critical; people obsess over minor differences that are well within statistical sampling error.
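To make the sampling-error point concrete, here's a minimal Python sketch using a Wilson score interval. The sample counts are invented for illustration, not from any real test:

```python
# Minimal sketch: two detection rates that look different can easily have
# overlapping confidence intervals. Wilson score interval for a proportion.
import math

def wilson_interval(detected, total, z=1.96):  # z=1.96 -> ~95% confidence
    p = detected / total
    denom = 1 + z**2 / total
    center = (p + z**2 / (2 * total)) / denom
    margin = z * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
    return center - margin, center + margin

# Hypothetical: product A detects 460/500 (92%), product B detects 470/500 (94%).
print(wilson_interval(460, 500))  # about (0.893, 0.941)
print(wilson_interval(470, 500))  # about (0.916, 0.958) -- overlaps A's interval
```

With 500 samples, 92% vs 94% falls well inside overlapping 95% intervals, so that "difference" tells you nothing.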

 

mechBgon

Super Moderator / Elite Member
Oct 31, 1999
30,699
1
0
Originally posted by: lusher
Originally posted by: mechBgon
Everyone knows that Kaspersky's conventional heuristics are not that strong. They rely on fast updates, a good static unpacker, and the Proactive Defense module (not the same as the usual heuristics people talk about), which can't be tested by merely scanning on demand.

I generally agree; I don't see the heuristics come into play very often on KAV7. Then again, that could be partly because their stronger-than-average signature-based detection leaves fewer opportunities. Maybe tonight I'll take a few months' worth of malware, roll back KAV7's databases as far as possible, and run a scan on those files to see how the heuristics do when the signatures are out-of-date.

A retrospective test is done by AV-Comparatives, as you already know. For your original test, you were carrying out a different test from AV-Comparatives', based on a smaller sample of "new" malware (as far as I can make out from your scanty methodology details, anyway), so comparisons are not possible.

I know my original test resembles a detection-rate test, but as I said several times, that was not the point of the thread. To restate the actual point of that test:

Originally posted by: mechBgon
My suggested layered defense works against 95 of the 95 samples

Basically, it's a pass-or-fail test where every product failed, leading to my point: antivirus software should be supplemented with other best practices (which many people already follow, but perhaps without realizing how important they've been, or knowing all the options available to them). The statistics were given mainly to open people's eyes that their own AV's detection rate on fresh malware might be a lot lower than they thought.

But if you are going to roll back signatures, you are actually duplicating the retrospective tests, just with a smaller sample. So to some degree the results you get from this might be comparable.

I did my rollback test using definitions from June 27th, with about 5000 samples captured between June and November. After scanning with the virus definitions to weed out anything that would be identified by definitions, I had about 4500 left. Re-scanning these with heuristics at the Detail (maximum) level got about 250 more detections. That's approximately what I expected for KAV7.
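(To put a number on that: heuristics alone caught roughly 250 of the ~4500 samples the June 27th definitions had missed, or about a 5.6% heuristics-only catch rate.)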
 

lusher

Member
Aug 17, 2007
86
0
0
Originally posted by: mechBgon
Originally posted by: lusher
Originally posted by: mechBgon
Everyone knows that Kaspersky's conventional heuristics are not that strong. They rely on fast updates, a good static unpacker, and the Proactive Defense module (not the same as the usual heuristics people talk about), which can't be tested by merely scanning on demand.

I generally agree; I don't see the heuristics come into play very often on KAV7. Then again, that could be partly because their stronger-than-average signature-based detection leaves fewer opportunities. Maybe tonight I'll take a few months' worth of malware, roll back KAV7's databases as far as possible, and run a scan on those files to see how the heuristics do when the signatures are out-of-date.

A retrospective test is done by AV-Comparatives, as you already know. For your original test, you were carrying out a different test from AV-Comparatives', based on a smaller sample of "new" malware (as far as I can make out from your scanty methodology details, anyway), so comparisons are not possible.

I know my original test resembles a detection-rate test, but as I said several times, that was not the point of the thread

You know what, it looks to me like your "test" is not really a "test", particularly since you already decided you wanted to make a point... Sure looks like you decided on a conclusion before you even did the test... Plus the fact that your recommendations just happen to catch 95 out of 95 makes me wonder...

Nevertheless, your test is indeed an on-demand test, using a sample of malware that has been selected in a particular way...

The fact that you follow it up by showing that a combination of several products would produce better results is an added bonus.

BTW, one of Gizmo's past tests did a similar thing, trying to figure out what combination of products would score 100% over some test set he was using. And a guy called firefighter also does similar tests.

Basically, it's a pass-or-fail test where every product failed, leading to my point: antivirus software should be supplemented with other best practices (which many people already follow, but perhaps without realizing how important they've been, or knowing all the options available to them).

You want 100% detection? Obviously doesn't exist (not even for your layer, you just got lucky).

A pointless test, if you ask me. Obviously, combining more AVs will get higher detection rates, but at the cost of a higher rate of FPs as well.

And of course, the smaller the sample size used, the greater the variance in the results.

Equally obviously, testing an antivirus using only brand-new malware will yield poorer results.

The statistics were given mainly to open people's eyes that their own AV's detection rate on fresh malware might be a lot lower than they thought.

Unfortunately, you replace it with the misconception that *your* recommended layers are foolproof...


I did my rollback test using definitions from June 27th, with about 5000 samples captured between June and November. After scanning with the virus definitions to weed out anything that would be identified by definitions, I had about 4500 left. Re-scanning these with heuristics at the Detail (maximum) level got about 250 more detections. That's approximately what I expected for KAV7.

Do you notice a big flaw in your methodology?

 

mechBgon

Super Moderator / Elite Member
Oct 31, 1999
30,699
1
0
You know what, it looks to me like your "test" is not really a "test", particularly since you already decided you wanted to make a point... Sure looks like you decided on a conclusion before you even did the test... Plus the fact that your recommendations just happen to catch 95 out of 95 makes me wonder...

The test is simply to show people why not to expect antivirus to make them invincible, and to point out that other best practices can have a profound impact on security. Raising awareness is the main goal. In retrospect (haha), I probably should've just stated a combined average detection rate by all antivirus programs, so people didn't get caught up in which brand was better.

You want 100% detection? Obviously doesn't exist (not even for your layer, you just got lucky).

I'm probably more interested in mitigation than detection per se, being a prevention-oriented guy. This is also why I am more interested in detection of new malware than old malware. However, I'm always interested in remote security risks that could work against the combined restrictions of a low-rights user account plus a disallowed-by-default Software Restriction Policy. If you have some nominees, let's hear them (real or hypothetical).

Do you notice a big flaw in your methodology?

There are several shortcomings to that approach, but it serves its intended goal, which is simply to illustrate what you said before: Kaspersky doesn't rely much on heuristics at this time. Don't read too much into it.
 

lusher

Member
Aug 17, 2007
86
0
0
Originally posted by: mechBgon
You know what, it looks to me like your "test" is not really a "test", particularly since you already decided you wanted to make a point... Sure looks like you decided on a conclusion before you even did the test... Plus the fact that your recommendations just happen to catch 95 out of 95 makes me wonder...

The test is simply to show people why not to expect antivirus to make them invincible, and to point out that other best practices can have a profound impact on security. Raising awareness is the main goal. In retrospect (haha), I probably should've just stated a combined average detection rate by all antivirus programs, so people didn't get caught up in which brand was better.

And you shouldn't have been so proud that your recommended setup (which, BTW, includes rubbish like SpywareBlaster) would have caught 100% of them.

I always find it strange that people can say in one breath that there is no security panacea (or that there is no 100% security), and then in the next breath imply that if you use the x, y, z they recommend, you do indeed achieve it!

You want 100% detection? Obviously doesn't exist (not even for your layer, you just got lucky).

I'm probably more interested in mitigation than detection per se, being a prevention-oriented guy.

An empty position. I mean, you are for "prevention". Who could be against that? lol. I never met a guy who was against prevention, have you? It's like saying you're for motherhood or apple pie (if American).

But in the context of this thread (antiviruses), detection is indeed the same as prevention.

This is also why I am more interested in detection of new malware than old malware.

I thought you were the prevention guy?

However, I'm always interested in remote security risks that could work against the combined restrictions of a low-rights user account plus a disallowed-by-default Software Restriction Policy. If you have some nominees, let's hear them (real or hypothetical).

Depends on what you mean by low-rights; a lot of malware now works well enough without administrative rights. Software Restriction Policies can be worked around in the usual ways, of course.

That's not to say that your setup is weak, of course. Obviously, using Software Restriction Policies (the HIPS that many use are pretty much equivalent, with more options) is safer than using just antiviruses; that is not even an argument.

But it's always a balancing act between convenience and security...
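For anyone wondering what "disallowed-by-default" actually means on a machine, here's a rough Python sketch that reads the SRP default level from the registry. The Safer key path and the DefaultLevel values (0x0 = Disallowed, 0x40000 = Unrestricted) are my understanding of how Windows stores the policy, so check it against your own box:

```python
# Rough sketch only: reads the Software Restriction Policy default rule.
# Key path and value meanings are assumptions based on how I understand
# Windows stores SRP -- verify on your own system before relying on this.
import winreg

SAFER_KEY = r"SOFTWARE\Policies\Microsoft\Windows\Safer\CodeIdentifiers"

def srp_default_level():
    """Return a human-readable description of the SRP default rule."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, SAFER_KEY) as key:
        level, _ = winreg.QueryValueEx(key, "DefaultLevel")
    if level == 0x0:
        return "Disallowed by default (the setup discussed above)"
    if level == 0x40000:
        return "Unrestricted (SRP present, but not locking anything down)"
    return "Other level: " + hex(level)

print(srp_default_level())
```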


There are several shortcomings to that approach, but it serves its intended goal, which is simply to illustrate what you said before: Kaspersky doesn't rely much on heuristics at this time. Don't read too much into it.

Well, given that you don't seem to care about the accuracy of your results, you might as well have faked them all and saved some effort. After all, as you say, you aren't really interested in the results, but were just trying to make an obvious point.

 

mechBgon

Super Moderator / Elite Member
Oct 31, 1999
30,699
1
0
Depends on what you mean by low-rights; a lot of malware now works well enough without administrative rights. Software Restriction Policies can be worked around in the usual ways, of course.

Do you have any specific examples?

After all, as you say, you aren't really interested in the results, but were just trying to make an obvious point.

As I have been doing here for years, yes. Just ask anyone.

But in the context of this thread (antiviruses), detection is indeed the same as prevention.

Only if the detection occurs before infection happens. I recently had a Forum member send me a sample which successfully disabled two different self-protecting AVs that didn't recognize it yet, and also installed a rootkit to hide (when executed with an Administrator account, that is). The antiviruses in question do detect the malware now, but it's too late for that to help the victim.

This is also why I am more interested in detection of new malware than old malware.

I thought you were the prevention guy?

Yes. And the bad guys seem to use new malware in the wild, because they want to keep detection rates low in order to infect as many people as practical, and to make as much money as possible. So detecting new malware is what I logically think is most relevant, from a prevention standpoint.

Mitigating undetected malware by other methods goes along with that, and I prefer starting at the foundation with a low-rights user account if possible, as does US-CERT:

Consider using an account with only 'user' privileges instead of an 'administrator' or 'root' level account for everyday tasks. Depending on the OS, you only need to use administrator level access when installing new software, changing system configurations, and the like. Many vulnerability exploits (e.g., viruses, Trojan horses) are executed with the privileges of the user that runs them, making it far more risky to be logged in as an administrator all the time.

That advice is echoed by others, including Microsoft themselves.
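And if you want a quick way to see which kind of account a program runs under, here's a tiny Python sketch. IsUserAnAdmin() is a real shell32 call (though deprecated in newer Windows versions), so treat this as illustrative only:

```python
# Tiny sketch of the low-rights idea: shell32's IsUserAnAdmin() reports
# whether the current process has administrator rights. Deprecated in
# newer Windows versions, so this is illustrative rather than definitive.
import ctypes

if ctypes.windll.shell32.IsUserAnAdmin():
    print("Admin rights: anything you run gets them too.")
else:
    print("Limited user: malware you run is limited along with you.")
```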
 

lusher

Member
Aug 17, 2007
86
0
0
Originally posted by: mechBgon
Depends on what you mean by low-rights; a lot of malware now works well enough without administrative rights. Software Restriction Policies can be worked around in the usual ways, of course.

Do you have any specific examples?

Yes.

After all, as you say, you aren't really interested in the results, but were just trying to make an obvious point.

As I have been doing here for years, yes. Just ask anyone.

Ah, so you admit to making up results to fit your point?

But in the context of this thread (antiviruses), detection is indeed the same as prevention.

Only if the detection occurs before infection happens. I recently had a Forum member send me a sample which successfully disabled two different self-protecting AVs that didn't recognize it yet, and also installed a rootkit to hide (when executed with an Administrator account, that is). The antiviruses in question do detect the malware now, but it's too late for that to help the victim.

True. But we are talking about *prevention*, remember? Being infected and then having the AV detect it after updates is hardly *prevention* of infection, is it? AV prevention effectiveness is indeed directly related to how much it detects (though maybe you can exclude the behavioural stuff).


This is also why I am more interested in detection of new malware than old malware.

I thought you were the prevention guy?

Yes.

I was just having some fun with you. You recognise, as I do, that for AV, prevention is ultimately the same as detection...

That advice is echoed by others, including Microsoft themselves.

If you say so. I'm no expert... lol.
 

mechBgon

Super Moderator / Elite Member
Oct 31, 1999
30,699
1
0
Originally posted by: lusher
Originally posted by: mechBgon
Depends on what you mean by low-rights; a lot of malware now works well enough without administrative rights. Software Restriction Policies can be worked around in the usual ways, of course.

Do you have any specific examples?

Yes.

Why don't you go ahead and name some, or describe how they work and what they can be used for.

After all, as you say, you aren't really interested in the results, but were just trying to make an obvious point.

As I have been doing here for years, yes. Just ask anyone.

Ah, so you admit to making up results to fit your point?

I believe my results were corroborated (that means "verified by a second person") by John's testing with several AV and AS products using the same set of samples that I did. Maybe you should ease back on the trolling, and try to make yourself useful around here instead.
 

lusher

Member
Aug 17, 2007
86
0
0
I believe my results were corroborated (that means "verified by a second person") by John's testing with several AV and AS products using the same set of samples that I did. Maybe you should ease back on the trolling, and try to make yourself useful around here instead.

You are obviously pissed that I know more than you about antiviruses, and you have been poking at me throughout this thread. I make valid points, but you ignore them and try to bait me instead.

Okay, I will leave you to play "expert" when you clearly have a lot more to learn. Just because you know a bit about one area of computers does not mean you know jack about another (antiviruses). Typical false-authority syndrome, where an "expert" in one area oversteps his boundaries and confidently makes claims about a domain he knows nothing about.




 

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
Ok KIDDIES, LOL.

Now that you two seem to be done beating on each other, can we PLEASE get back to my question?

Is there a fairly reliable detection rating chart out there that you can look at to base a decision on?

I'm a PC geek and I know a bit, but do I want to take the time to personally evaluate every AV out there and run my own tests? --> NO

I'm more interested in finding reliable sources, from places like AV-Comparatives, CastleCops, ICSA Labs, West Coast Labs, Virus Bulletin, etc., with good information to help me make a decision.

So what do you say?

THANKS
 