Serious Discussion: Best AVs and Worst AVs in Behavioral Health

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,881
F-Secure no longer has DeepGuard. Beginning with version 25.2, it uses Avira as the basis for its core protections, so it now relies on Avira Sentry.

DeepGuard has been replaced with "Behavioral Detection."

People who participated in this thread posted about DeepGuard.
If someone knows what the current meaning of F-Secure "Behavioral Detection" is, I can add it to my previous post.
 

LahiruRajinda

Level 4
Verified
Well-known
Jul 6, 2015
154
Being banned by the US government is not a deciding factor for me.
Therefore, I would gladly pay for Kaspersky (I live outside the US) as it consistently performs better.
Same goes for Huawei/Honor devices, which are banned in the US yet provide the most cutting-edge technology. Plus, they tend to last longer than the new Samsungs, Apples, and Pixels.
 

Andy Ful

It is possible that on the MT and Wilders Security forums the most popular understanding of "Behavioral Protection" is as follows:
  1. It works offline (no need for real-time cloud backend).
  2. It is independent of malware signatures.
  3. It blocks suspicious/malicious actions of running executables (ignores files on the pre-launch stage).
That is how "Behavioral Protection" worked 10 years ago.
The modern version (in the wide meaning) can include the cloud backend, Big Data, and:
  • Heuristic Analysis (local + cloud)
  • Threat Intelligence Integration (cloud)
  • Machine Learning and AI (local + cloud)
Some protection layers may use Behavioral Protection as an integral part. For example, Avast's CyberCapture can be considered a mix of local "Behavioral Protection", Cloud Sandbox, and Human Expert Analysis.
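As an illustration only (the tier functions and sample traits below are invented for this sketch, not Avast's actual implementation), such a layered design can be pictured as a pipeline in which each tier either reaches a verdict or escalates the sample to the next, more expensive tier:

```python
# Hypothetical sketch of a tiered analysis pipeline (local behavioral check
# -> cloud sandbox -> human analyst). Samples are modeled as sets of traits;
# each tier returns a verdict or None (inconclusive -> escalate).

def local_behavior_check(sample):
    # Fast, cheap local heuristics.
    if "known_bad_pattern" in sample:
        return "malicious"
    if "signed_and_common" in sample:
        return "clean"
    return None  # inconclusive

def cloud_sandbox(sample):
    # Detonate in a controlled cloud environment; may still be inconclusive.
    if "drops_payload_when_run" in sample:
        return "malicious"
    return None

def human_analysis(sample):
    # Final tier: an analyst always reaches a verdict.
    return "malicious" if "obfuscated" in sample else "clean"

def classify(sample):
    """Run the tiers in order of cost until one produces a verdict."""
    for tier in (local_behavior_check, cloud_sandbox, human_analysis):
        verdict = tier(sample)
        if verdict is not None:
            return verdict

print(classify({"signed_and_common"}))       # -> clean (local tier)
print(classify({"drops_payload_when_run"}))  # -> malicious (sandbox tier)
print(classify({"obfuscated"}))              # -> malicious (human tier)
```

The escalation order mirrors the cost trade-off: cheap local checks resolve most samples, and only rare, ambiguous ones reach the expensive cloud and human tiers.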
 

Jonny Quest

Level 24
Verified
Top Poster
Well-known
Mar 2, 2023
1,316
People who participated in this thread posted about DeepGuard.
If someone knows what the current meaning of F-Secure "Behavioral Detection" is, I can add it to my previous post.
It really hasn't been explained on the forum how it's going to help us or improve things (or how it has improved). So far it's just a name change, plus what @bazang posted here.
 

Andy Ful

So let's summarize what we have for now.

Behavioral protection:
Very good - Bitdefender, Kaspersky
Good (could be better) - Norton, AVG, Avast (same family), Eset, F-Secure (at least up to version 25.2)
Average - Avira, the new F-Secure
Weak - Panda

Where should we put the rest? I mean McAfee, G Data, etc.

If someone disagrees, comments are welcome.

Your conclusion about Behavioral Protection (in the meaning from 10-15 years ago) is partially supported by AV-Test:

However, in the modern meaning, there is no reason to think that Bitdefender's Behavioral Protection is better than Avast's. Both analyze the behavior of a given program or process in a cloud virtual environment before it is executed. Furthermore, Avast has gotten top scores in all Real-World tests for about two years.
https://support.avast.com/en-us/article/antivirus-cybercapture-faq/#pc
https://www.bitdefender.com/en-bz/oem/sandbox-service
 

Andy Ful

I am not sure, but F-Secure may use the Avira Cloud Sandbox API:

If so, then there is no reason to think that Bitdefender or Avast can provide better Behavioral Protection (in a modern meaning) than F-Secure.
 

Andy Ful

There are some hints that Avira may currently use a Cloud Sandbox API:

Dynamic Analysis: Unmasking Zero-Day Threats

Avira's behaviour-based detection extends to dynamic analysis, a process that involves the execution of files in a controlled environment to observe their behaviour. This approach is particularly effective in unmasking zero-day threats – previously unknown and unclassified malware. By subjecting files to dynamic analysis, Avira can uncover malicious behaviour that may not be evident through static inspection alone.

If so, then there is no reason to think that Bitdefender, Avast, or F-Secure can provide better Behavioral Protection (in a modern meaning) than Avira.
Currently, all those AVs get top scores in Real-World tests.
Anyway, more information about "the execution of files in a controlled environment" would be required to reach a definite conclusion. Dynamic Analysis may be unavailable in Avira's home and business versions (it may be an EDR-only feature).
 

SumTingWong

Level 28
Verified
Top Poster
Well-known
Apr 2, 2018
1,792
Same goes for Huawei/Honor devices, which are banned in the US yet provide the most cutting-edge technology. Plus, they tend to last longer than the new Samsungs, Apples, and Pixels.
Xiaomi phones don't have all the bands to support the US mobile carriers 😭

I have a OnePlus 12 and it is much better than the Google Pixel and Samsung Galaxy. I would like to see Xiaomi, Huawei, Honor, and Poco phones get US bands and fully support US mobile carriers. I'm getting tired of Samsung, Apple, and Pixel phones.
 

IceMan7

Level 2
Thread author
Mar 19, 2025
90
Your conclusion about Behavioral Protection (in the meaning from 10-15 years ago) is partially supported by AV-Test:
No, it is not. I base this on watching tests, on comments, and on what you can read on this forum.
Everyone brags about their protection and how wonderful they are; they get certificates because they took part in a test on their own terms (AV-Test) (if they do not agree with the terms, they do not participate), and then in practice (in the tests I mentioned above) it looks completely different.
Avast is not at the level of Kaspersky and Bitdefender today, and F-Secure's Sentry from Avira is not at the level of the Finnish DeepGuard.
Of course, I am not an oracle and I do not test AVs, but that much can be concluded from what you read on the web.

The topic was created so that someone looking for the really best behavioral solution would find one that is the best not only on paper but also in real life. And there are users here who know this very well; if they have the knowledge, let them add their observations and conclusions ;)
 

Andy Ful

@IceMan7,

I did not say anything about what convinced you to make your conclusion (it may be true or not depending on what version of Behavioral Protection you choose). I had in mind only the conclusion itself and provided the link to old tests that could support this conclusion.
Do you know which version of Behavioral Protection we are talking about here? I asked some questions that could help:
https://malwaretips.com/threads/best-avs-and-worst-avs-in-behavioral-health.135292/post-1121256

In two other posts, I presented the sources of possible misunderstandings:
https://malwaretips.com/threads/best-avs-and-worst-avs-in-behavioral-health.135292/post-1121255
https://malwaretips.com/threads/best-avs-and-worst-avs-in-behavioral-health.135292/post-1121291
 

Andy Ful

Avast is not at the level of Kaspersky and Bitdefender today, and F-Secure's Sentry from Avira is not at the level of the Finnish DeepGuard.

This contradicts two years (one year in the Avira case) of Real-World tests made by AV-Test, AV-Comparatives, SE Labs, and AVLab. :confused:
I think some explanation of how that is possible would be required.
For 10 years I have participated in most discussions about behavior blockers, behavioral features, behavioral protection, etc. Such topics have always been full of misconceptions and misunderstandings. Many people just repeat what they hear, without caring about the facts and reliable sources. Let's do it better this time. (y)
 

IceMan7

AV-Test and the other labs' tests are not good tests. Each of these labs has guidelines, or agrees with the participants on what tests are performed. If a solution is not confident and does not want to come out badly, it does not participate. It has been like this for years.
AV-Test used to (I do not know if this has changed) give a better score to an AV that let in at least one sample than to one that detected 100% but had more false positives. In addition, a better score was given to a solution that did most of the work itself, without user intervention - for example, confirming the removal of a sample.

On the web (mainly on YT) you can see many tests where Avast does not reach the level of Kaspersky or Bitdefender in behavioral protection. You only need to look at the Shadowra tests here, where Norton (a clone of Avast and AVG, and probably soon Avira, because it is the same family) does not reach this level either. The same conclusions can be drawn about Avira. This can even be seen in the tests of F-Secure, which is now a clone of Avira and does not achieve results as good as when it had DeepGuard.
In its marketing, every AV flexes its muscles and shows what great solutions it has.

I would sooner have written that Eset is closer to BD and Kaspersky, because it took part in Shadowra's tests, and you can also see on YT that Eset is neck and neck with BD and Kaspersky (I even saw a test where it beat Kaspersky).
But this is also just a conclusion. Eset seems promising; I'm waiting for more tests before I can write that it is at, or close to, BD's level. For now, Eset boasts in its advertising leaflets about how good it is. The tests will show.

Especially since 9 out of 10 testers on YT reveal in the comments that, despite the great results of Kaspersky and Bitdefender, they use Eset on their own computers. Shadowra uses Eset too :D So for some reason they use Eset? But why :D?
:unsure:

Behavioral protection is not just marketing. Bitdefender (BD) and Kaspersky have shown for years how strong they are when tested by running malicious samples. Now the others have to prove themselves in this area before we can write/say that they are equally or similarly strong.
 

Andy Ful

AV-Test and the other labs' tests are not good tests. Each of these labs has guidelines, or agrees with the participants on what tests are performed. If a solution is not confident and does not want to come out badly, it does not participate. It has been like this for years.

Surprisingly, Kaspersky and Avast participated in all the tests, and for the last two years, Avast has been on the same level. A few years ago, it was not. If your supposition were true, Avast should have resigned two years ago or more.
Webroot has participated in SE Labs tests for years with poor results and does not complain.
Microsoft has participated in all tests for many years. In the beginning, the results were poor (Windows 8), but Microsoft did not resign. Microsoft could easily destroy any AV testing lab in court if your conspiracy theory were true.
The vendors who resigned did it for a reason. Some products were not at the level of the popular AVs; others used security layers that did not fit the testing methodology.

AV-Test used to (I do not know if this has changed) give a better score to an AV that let in at least one sample than to one that detected 100% but had more false positives. In addition, a better score was given to a solution that did most of the work itself, without user intervention - for example, confirming the removal of a sample.

I did not notice any of the above. An AV that missed 0 samples in a test and AVs that missed up to 4 samples are often treated as equally effective. This is the correct methodology, which accounts for the statistical significance of the results. It is also explained in the testing methodology (if one bothers to read it).

On the web (mainly on YT) you can see many tests where Avast does not reach the level of Kaspersky or Bitdefender in behavioral protection.

We still do not know what exactly you have in mind by "behavioral protection". The situation would be clearer if you posted some examples of such tests.

You only need to look at the Shadowra tests here, where Norton (a clone of Avast and AVG, and probably soon Avira, because it is the same family) does not reach this level either.

You misinterpret the Shadowra tests. Furthermore, why did you not put Microsoft Defender on the same (or a higher) level as Kaspersky and Bitdefender?
Did Microsoft Defender score worse in the Shadowra tests? You cannot pick only the tests and AVs that fit your expectations.

Some words about tests.
Each test is like a football match (2 AVs tested) or a distance run (more AVs tested). If Real Madrid beats Barcelona 2-0, it doesn't mean it has already won the league. It also does not necessarily mean that the Real Madrid players played better football.

Edit.
You may look here for more information:
https://malwaretips.com/threads/randomness-in-the-av-labs-testing.104104/post-905376
https://malwaretips.com/threads/the-best-home-av-protection-2019-2020.106485/post-927440
https://malwaretips.com/threads/the-best-home-av-protection-2021-2022.120387/post-1021873
https://malwaretips.com/threads/the...4-av-test-av-comparatives.134865/post-1117751
https://malwaretips.com/threads/how-antivirus-works-dynamic-and-behavioral-detection.134888/
https://malwaretips.com/threads/homemade-av-testing-a-suggestion.134815/post-1117709
 

IceMan7

Surprisingly, Kaspersky and Avast participated in all the tests, and for the last two years, Avast has been on the same level. A few years ago, it was not. If your supposition were true, Avast should have resigned two years ago or more.
Webroot has participated in SE Labs tests for years with poor results and does not complain.
What don't you understand? If there are guidelines - known boundary conditions - and the AV vendor knows that participation will not hurt it, it participates in such tests, especially since it pays off. If the vendor believes the results may be embarrassing, it does not participate.
Once, in the comments on AVLab, Adrian wrote that not all AVs take part in every test, because not everyone likes the test method, which may ultimately be the reason for a poor result for their solution. If a solution is still "in its infancy", or is not at a high level and its sales are not at risk, it will participate in order to develop the software itself.

What kind of AV-Test tests are these, when at least 70% of solutions get 3x6 in the assessment?

I haven't been excited about these tests for a long time, because you supposedly have a document on how they test - a described method - but you never actually see these tests (a video would be useful at least once), and you don't know what samples they use. Can you vouch that half of the AVs taking part don't already know these samples or have signatures for them? They pay for these tests and don't want to come off badly.

The tests are as credible as the tests in the German motoring press, where German cars, German tires, and everything German always win. Have you ever seen a revolt by the Italian and French car brands, or a protest by Goodyear, Pirelli, or Michelin when Continental always wins?

I prefer Shadowra's tests or those on YT channels, because the tests are recorded and you can see how a given AV behaves during the test: how it reacts to a threat, which processes fight it, and where it gives up.
And among the table-based tests, I only trust AVLab.

Shadowra's tests are not an oracle. The samples differ; one time an AV will detect all of them, and another time it will miss some.
The point is to see how the technologies and solutions used in a given AV deal with threats.

Behavioral protection, for me, is protection against 0-day samples: new ones the AV doesn't know, which it must fight with its own solutions. And what matters is how it ends up.
And that's why I write that, when it comes to this, Kaspersky and Bitdefender have been doing the best for years and have not embarrassed themselves in such tests. Norton and its clones are worse. And I would bet on Eset sooner than on Avast, Avira, or Norton. It's just a shame that F-Secure is now Avira :(

This is my opinion. You have a different one. But I respect it.
 

Andy Ful

What kind of AV-Test tests are these, when at least 70% of solutions get 3x6 in the assessment?
(...)
And among the table-based tests, I only trust AVLab.

Over 90% of AVs are awarded in AVLab tests, and apparently, it is not a problem for you. :)
The tables do not make AVLab more trustworthy. They are published after a month, so no one can check what the real detection in the tests was. The trustworthiness of all professional AV testing labs (including AVLab) follows from the fact that AV vendors have full access to the samples and can participate in discussions about problematic detections.

Why do you believe in the results of video tests? Such videos can easily be biased by removing the samples that compromised the promoted AV. Furthermore, the AV vendors cannot do anything about it. The trustworthiness of testers follows from other things. The video is only a nice form of test demonstration, and this causes many shortcomings that are avoided in professional tests. If one wanted to make a video demonstrating a professional test, the video would last a few days and you could watch it only on a monitor bigger than your room. Furthermore, most people could not understand it.

I prefer Shadowra's tests or those on YT channels, because the tests are recorded and you can see how a given AV behaves during the test: how it reacts to a threat, which processes fight it, and where it gives up.

So look at the Shadowra tests about Avast and AVG, starting from the year 2023:
App Review - Shadowra's Big Comparative - Episode 1 : Free Antivirus
https://malwaretips.com/threads/avg-internet-security-2024.131848/
App Review - Avast Free Antivirus 2024
https://malwaretips.com/threads/avast-one-free-antivirus-2023.122817/

The results are fully compatible with the results of AV-Test, AV-Comparatives, AVLab, and SE Labs.

Behavioral protection, for me, is protection against 0-day samples: new ones the AV doesn't know, which it must fight with its own solutions.

Many 0-day samples are detected by heuristics without using Behavior Monitoring, so using the term "Behavioral protection" would be misleading.
But thanks for clarifying what this thread is for.

And that's why I write that, when it comes to this, Kaspersky and Bitdefender have been doing the best for years and have not embarrassed themselves in such tests.

There are no tests that include only 0-day samples. Any available test with 100+ in-the-wild samples uses more than 2/3 non-0-day samples. Kaspersky and Bitdefender have been compromised many times in the available tests with fresh samples. Here is an example of Bitdefender being compromised in the tests of your favorite AVLab:

1 fail: Recent Results In March 2024 » AVLab Cybersecurity Foundation
5 fails: Results May 2023 » AVLab Cybersecurity Foundation
5 fails: Results In May 2023 » AVLab Cybersecurity Foundation

By the way, Avast did not miss a single sample in all tests.

You may believe in what you want, but please do it consistently and objectively based on available data. Of course, there is nothing wrong with thinking that Kaspersky, Bitdefender, or Eset can provide top protection against 0-day malware at home or SMBs. There are many convincing data on it. However, the same data shows that Avast, AVG, and Norton are not inferior at all.

From the above AVs, I prefer Kaspersky. It can be tweaked with @harlan4096's settings to make the setup stronger than anything available in Bitdefender, Eset, Avast, AVG, and Norton. (y)
 

Andy Ful

If there are guidelines - known boundary conditions - and the AV vendor knows that participation will not hurt it, it participates in such tests, especially since it pays off. If the vendor believes the results may be embarrassing, it does not participate.
Once, in the comments on AVLab, Adrian wrote that not all AVs take part in every test, because not everyone likes the test method, which may ultimately be the reason for a poor result for their solution. If a solution is still "in its infancy", or is not at a high level and its sales are not at risk, it will participate in order to develop the software itself.

That is right. From this, it follows that the results could be biased for the AVs that resigned. However, the test results I posted are for the AVs that did not resign!
The vendors of those AVs do not complain, do not accuse the AV testing labs of anything, and do not make up conspiracy theories (even if their results are average). Why should anyone do all of this on behalf of the AV vendors? Are the vendors stupid or helpless?
Of course, you have the right to believe your theory. One should not blindly believe the AV testing labs either. I will not try to change your mind; I post here to show the full context.
 

IceMan7

Why do you believe in the results of video tests? Such videos can easily be biased by removing the samples that compromised the promoted AV. Furthermore, the AV vendors cannot do anything about it. The trustworthiness of testers follows from other things. The video is only a nice form of test demonstration, and this causes many shortcomings that are avoided in professional tests. If one wanted to make a video demonstrating a professional test, the video would last a few days and you could watch it only on a monitor bigger than your room. Furthermore, most people could not understand it.
That's why I trust these videos and the tests here: they are done by enthusiasts. It's their hobby, and nobody pays them for it. These are not sponsored bloggers who recommend NordVPN or Norton all the time. The tests you refer to are all paid for by someone. Yes, I believe AVLab, because I know how they do it.
Adrian is from my country, and before this forum knew about him, I already knew his work on AVLab well. Thanks to his work, I personally changed a lot of things in my own computer security.
And the AVLab results you cite are, from what I remember, mainly run through the Firefox browser, with the AV supposed to protect against the sample being saved to disk (if I remember correctly).

Many 0-day samples are detected by heuristics without using Behavior Monitoring, so using the term "Behavioral protection" would be misleading.
But thanks for clarifying what this thread is for.
If you perform an operation on a file, behavioral protection usually comes into play.
We are not talking about scanning files. Note that some testers, such as Shadowra, first scan the files and only then execute them, while in many tests there is no scanning at all and the samples are executed immediately. Then all the technologies the AV has must defend the system against infection.
You focused on the entire AV package, on test results, etc. But in this topic we are focusing on behavioral protection, such as BD's Advanced Threat Defense (ATD). Not all AVs are good at this, despite their solutions, patents, etc. And this is where the biggest difference between current AVs lies.
We've known about heuristics for years. Eset was one of the first to use it, back when it was NOD32.

Behavioral analysis, also known as behavior analysis, focuses on observing and interpreting a program's behavior in its operating environment. Instead of focusing on how something looks (as with signatures), behavioral analysis looks at how something works.
And then the AV reacts like a cop to a killer.
The key to the effectiveness of behavioral analysis is its ability to respond quickly to new threats.
The AV is designed to respond to malicious behavior when signatures and heuristics fail.
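As a minimal sketch of that idea (the action names and weights below are invented for illustration; no real AV engine works from a table this simple), behavioral detection can be pictured as scoring a process's observed runtime actions and blocking once a suspicion threshold is crossed, regardless of whether any signature matched:

```python
# Illustrative sketch of rule-based behavioral scoring -- NOT any vendor's
# real engine. Each runtime action a process performs adds to a suspicion
# score; crossing the threshold triggers a block, even if the file matched
# no signature beforehand.

# Hypothetical weights for observed actions (names invented for this sketch).
ACTION_WEIGHTS = {
    "write_startup_key": 30,      # persistence via autorun registry key
    "mass_file_encrypt": 60,      # ransomware-like bulk encryption
    "inject_into_process": 40,    # code injection into another process
    "delete_shadow_copies": 50,   # destroying backups
    "open_document": 0,           # benign action, no score
}

BLOCK_THRESHOLD = 70

def behavioral_verdict(observed_actions):
    """Return ('block', score) as soon as the cumulative score crosses the
    threshold, else ('allow', score) after all actions have been seen."""
    score = 0
    for action in observed_actions:
        score += ACTION_WEIGHTS.get(action, 0)
        if score >= BLOCK_THRESHOLD:
            return "block", score
    return "allow", score

# A benign editor: opens documents, never crosses the threshold.
print(behavioral_verdict(["open_document", "open_document"]))   # -> ('allow', 0)
# Ransomware-like behavior: blocked once the score crosses 70.
print(behavioral_verdict(["write_startup_key", "mass_file_encrypt"]))  # -> ('block', 90)
```

The point of the sketch is the contrast with signatures: the verdict depends only on what the process does at runtime, which is why this kind of layer can catch samples the scanner has never seen.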

There are no tests that include only 0-day samples. Any available test with 100+ in-the-wild samples uses more than 2/3 non-0-day samples. Kaspersky and Bitdefender have been compromised many times in the available tests with fresh samples. Here is an example of Bitdefender being compromised in the tests of your favorite AVLab:

Avast was "#####" for many years. Disgrace after disgrace. If someone had a problem on their computer, Avast was usually installed. Yes, it has improved a lot in recent years; I don't deny it. BD and Kaspersky have been at the forefront for years. They usually don't embarrass themselves compared to the competition. They will fail occasionally, like everyone else, but when they fail, they usually still come out better at the finish line than the rest.
At the forefront for years. Avast looks good now, but can it be placed next to BD or Kaspersky? I doubt it.
Eset also had worse years, and it is working well again.
The question is how long Eset and Avast will stay at the forefront. Because BD and Kaspersky have been at the forefront for years.

I understand that you use Avast, and that's why you defend it so much :) Great. No one is stopping you.
You will always find some explanation for your theory that you chose a good AV, to confirm that you chose well :)
If you think that Avast is currently just as good as the rest, then why was Eset the AV-Comparatives product of the year?
Here you do not dispute where I placed it, but I placed it in the same place as Avast, AVG, and Norton :) I also put F-Secure there :)
And on this forum, every second post about Eset says that it is weak, has weak behavioral protection, relies on signatures, and so on... and the best are always the same two - Kaspersky and Bitdefender.
That is why I wrote it out in points and invited you to the discussion. I summarized what you can read on the forum, and that is where this whole topic came from.


You may believe in what you want, but please do it consistently and objectively based on available data. Of course, there is nothing wrong with thinking that Kaspersky, Bitdefender, or Eset can provide top protection against 0-day malware at home or SMBs. There are many convincing data on it. However, the same data shows that Avast, AVG, and Norton are not inferior at all.

From the above AVs, I prefer Kaspersky. It can be tweaked with @harlan4096's settings to make the setup stronger than anything available in Bitdefender, Eset, Avast, AVG, and Norton. (y)
Laboratory tests, performed at the expense of others, have their own specifics. If you blindly believe in them, I won't take off your rose-tinted glasses ;) How many times can I write that in these tests few AVs perform poorly :) This has been visible for years and will continue to be visible for years.
I'm not even mentioning that most also get marks for speed of operation on the system, and we know very well that some run even on a museum computer, others on an average one, and others need a "monster".
I didn't write that there is a huge overall gap between them. It's not a difference like between opposite poles. But they are weaker. An AV solution is not only about effectiveness but also about its impact on the system and ease of use. Browser protection, banking protection. Like it or not.
I focused solely on distinguishing AVs in the matter of behavioral protection, not the entire package.

BTW
And as you write, you prefer Kaspersky because you know that most solutions don't stand a chance against it. Not all, but most.

Personally, I preferred F-Secure for years. It wasn't perfect, I knew its weaknesses, but it was top of the line.

That is right. From this, it follows that the results could be biased for the AVs that resigned. However, the test results I posted are for the AVs that did not resign!
The vendors of those AVs do not complain, do not accuse the AV testing labs of anything, and do not make up conspiracy theories (even if their results are average). Why should anyone do all of this on behalf of the AV vendors? Are the vendors stupid or helpless?
Of course, you have the right to believe your theory. One should not blindly believe the AV testing labs either. I will not try to change your mind; I post here to show the full context.
I've already explained it to you using the example of the automotive press. Do you see car-brand revolts because of it?
They joined because it suited them. Because some are good at signatures, others are good at phishing, for example.
It's clear that you don't understand the mechanisms of this world. I don't believe in any theories; it just seems that I know better how this world works ;) And if you believe in these tests, that's your business. You have the right to do so ;)


Regards
 

bazang

Level 13
Jul 3, 2024
621
Laboratory tests, performed at the expense of others, have their own specifics.
AV test labs perform primarily a marketing function.

The only true measure of any security software's protection capabilities is to have it fully pentested by a team of highly skilled and experienced pentesters. Such tests are very expensive and, more importantly, AV vendors don't commission them, because all of them - all AVs - get destroyed in the course of such testing.

You would be very disappointed in Bitdefender, Kaspersky, and the others you favor. They would fail miserably under the full weight of truly capable pentesters.

Security is not software. Security is a process. There is far, far too much emphasis on software on security forums.
 

Andy Ful

I understand that you use Avast, and that's why you defend it so much :) Great. No one is stopping you.

Don't jump to quick conclusions. I have mainly defended and criticized Microsoft Defender, and I have also criticized and defended other AVs like Avast, Comodo, and Eset. I have been on the MT forum for over 10 years and have defended/criticized many things. :)

If you think that Avast is currently just as good as the rest, then why was Eset the AV-Comparatives product of the year?

We talked in this thread about protection, which mainly depends on the results of Real-World tests (tests with many 0-day samples).
Eset took only a Bronze Award in the Real-World tests, while Avast took the Gold Award.

That is why I wrote it out in points and invited you to the discussion. I summarized what you can read on the forum, and that is where this whole topic came from.

Thanks.:)

BTW
And as you write, you prefer Kaspersky because you know that most solutions don't stand a chance against it. Not all, but most.

I would prefer Kaspersky's specific protection if I had to use only Kaspersky. However, I mainly use Windows built-in protection with Application Control features enabled (SRP, WDAC, Windows policies).

Personally, I preferred F-Secure for years. It wasn't perfect, I knew its weaknesses, but it was top of the line.

Good choice.


The same to you. Say hello to Adrian; we know each other well from my critical posts about the AVLab tests. :)
 
