App Review McAfee: how bad is the worst antivirus?

Take all reviews with a grain of salt; in extreme cases, some reviews use dramatization for entertainment purposes.
I have tried Trellix Endpoint Security, but not McAfee before; TES was very bad compared to SEP: larger install size, slower, more false positives and false negatives, and enigmatic settings.
To be fair, the McAfee WebAdvisor extension is excellent, better than Norton Safe Web and Symantec browser protection.
 
Some of McAfee's bad reputation is deserved, but it has definitely improved in the last couple of years. I tried the basic McAfee antivirus product on the suggestion of a Reddit user and was pretty impressed. It is very light, one of the lightest AVs I have used, and pretty cheap too, around $12 CAD for 1 PC; I bought it from a store near me. I have not tested its effectiveness, but @Shadowra tested McAfee Total Protection last year and found it fairly effective: App Review - McAfee Total Protection 2024

I can only assume it has improved since then.

I don't think the way Leo from TPSC tested McAfee was the best approach. It probably can't detect malware on network locations as easily as it can on the main PC.
 
The author wrongly assumes that an AV with poor detection on network shares and no post-launch behavior protection should be unrecommended for all users. In fact, that is true mainly for Enterprise users.
McAfee Total Protection is a home antivirus, so the video is pretty much useless for home users. The author's comments do not help and can be misleading. We can see here a kind of trivial testing loop: a home AV was tested, and the results showed that it should not be used in Enterprises, which was known from the beginning.

However, the video can be helpful for MT readers interested in post-launch behavior protection (the test should also be repeated with malware stored on hard drives and flash drives).

The testing methodology of Leo's videos depends on running malware (mainly payloads) from network shares. This attack vector is rare in the home environment (most users do not have network shares). From the video, it appears that McAfee AV does not properly check files from network shares, and its post-launch protection for payloads is inadequate. Such an AV should not be used in Enterprises. In other tests that take into account attacks on home users, McAfee AV scores among the top AVs, due to its strong web and pre-launch protection (also behavior-based).
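To make the network-share point concrete, here is a toy sketch. This is purely hypothetical logic, not McAfee's actual code; every name here is invented. It shows how an on-access hook that only scans local paths would let a payload launched from a UNC share run unscanned, which is exactly the gap this kind of test exercises:

```python
import ntpath

def is_network_path(path: str) -> bool:
    """Treat UNC paths (\\\\server\\share\\...) as network locations."""
    drive, _ = ntpath.splitdrive(path)
    return drive.startswith("\\\\")

def on_execute(path: str, scan) -> str:
    """Hypothetical on-access hook that only scans local files pre-launch."""
    if is_network_path(path):
        return "allowed-unscanned"   # the gap the video's test exploits
    return "blocked" if scan(path) else "allowed"

# Toy "signature" check: flag anything whose name contains "payload".
scan = lambda p: "payload" in ntpath.basename(p).lower()

print(on_execute(r"C:\Users\leo\payload.exe", scan))   # blocked
print(on_execute(r"\\NAS\share\payload.exe", scan))    # allowed-unscanned
```

The same payload gets two different outcomes depending only on where it is launched from, which is why a network-share-only methodology says little about the home use case.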
 
The author wrongly assumes that an AV with poor detection on network shares and no post-launch behavior protection should be unrecommended for all users. In fact, that is true mainly for Enterprise users.
McAfee Total Protection is a home antivirus, so the video is pretty much useless for home users. The author's comments do not help and can be misleading. We can see here a kind of trivial testing loop: a home AV was tested, and the results showed that it should not be used in Enterprises, which was known from the beginning.

However, the video can be helpful for MT readers interested in post-launch behavior protection (the test should also be repeated with malware stored on hard drives and flash drives).

The testing methodology of Leo's videos depends on running malware (mainly payloads) from network shares. This attack vector is rare in the home environment (most users do not have network shares). From the video, it appears that McAfee AV does not properly check files from network shares, and its post-launch protection for payloads is inadequate. Such an AV should not be used in Enterprises. In other tests that take into account attacks on home users, McAfee AV scores among the top AVs, due to its strong web and pre-launch protection (also behavior-based).
By this logic, a seatbelt that only works if you crash in a sedan and not in an SUV should still get a 5-star safety rating – just don’t drive the wrong kind of car.
 
The author wrongly assumes that an AV with poor detection on network shares and no post-launch behavior protection should be unrecommended for all users. In fact, that is true mainly for Enterprise users.
McAfee Total Protection is a home antivirus, so the video is pretty much useless for home users. The author's comments do not help and can be misleading. We can see here a kind of trivial testing loop: a home AV was tested, and the results showed that it should not be used in Enterprises, which was known from the beginning.

However, the video can be helpful for MT readers interested in post-launch behavior protection (the test should also be repeated with malware stored on hard drives and flash drives).

The testing methodology of Leo's videos depends on running malware (mainly payloads) from network shares. This attack vector is rare in the home environment (most users do not have network shares). From the video, it appears that McAfee AV does not properly check files from network shares, and its post-launch protection for payloads is inadequate. Such an AV should not be used in Enterprises. In other tests that take into account attacks on home users, McAfee AV scores among the top AVs, due to its strong web and pre-launch protection (also behavior-based).
K and B home versions have stronger post-launch behavior protection; in fact B, in spite of its inflated signature database, relies more on post-launch behavior protection than K does.
 
I have tried Trellix Endpoint Security, but not McAfee before; TES was very bad compared to SEP: larger install size, slower, more false positives and false negatives, and enigmatic settings.
To be fair, the McAfee WebAdvisor extension is excellent, better than Norton Safe Web and Symantec browser protection.
McAfee had website rating and blocking features long before Norton Safe Web was offered as an add-on for Norton 2009 (16.5), which required a separate installation and a restart.

So yeah, McAfee overall still relies heavily on website-blocking features to protect a machine. The pre-launch protection known as Artemis (which initially was just reputation analysis) was later integrated with Real Protect (codename Raptor). It now uses reputation and pre-execution static analysis/heuristics to detect malware.

Artemis/Real Protect use a large set of rules (not surprising, given that any heuristic system will be rule-based). For example, rule 196612 detects files with a suspicious reputation located in user directories, and there are additional rules for executables with modified entropy, and so on. It looks like even the file's location influences what aggressiveness level McAfee will use to scan it.
There used to be a JSON file describing all the rules; I think it has since been removed.

The downside is that the home version has no settings whatsoever and is more about reducing false positives than anything else. The business versions have more settings. McAfee also scans only on execution.
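The rule-based setup described above can be sketched in a few lines. The rule ID 196612 is from the post; everything else here (the rule structure, the entropy threshold, the second rule ID, the aggressiveness tiers) is invented for illustration and is not McAfee's actual format:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte; packed/encrypted executables tend toward 8.0."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

# Hypothetical rule table in the spirit of the JSON file mentioned above.
RULES = [
    {"id": 196612, "when": lambda f: f["reputation"] == "suspicious"
                                     and f["path"].lower().startswith(r"c:\users")},
    {"id": 900001, "when": lambda f: f["entropy"] > 7.2},  # invented high-entropy rule
]

def aggressiveness(path: str) -> str:
    """File location influences how aggressively the file gets scanned."""
    return "high" if path.lower().startswith((r"c:\users", r"c:\temp")) else "normal"

def evaluate(f: dict) -> list[int]:
    return [r["id"] for r in RULES if r["when"](f)]

sample = {"path": r"C:\Users\bob\Downloads\setup.exe",
          "reputation": "suspicious",
          "entropy": shannon_entropy(bytes(range(256)) * 16)}  # uniform bytes: 8.0
print(aggressiveness(sample["path"]), evaluate(sample))   # high [196612, 900001]
```

The point of the sketch is just that "heuristics" here means a flat list of cheap predicates over metadata (path, reputation, entropy), not deep analysis of the file.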
 
to reduce system impact at the expense of protection.
I wouldn’t say protection is reduced, as the file is blocked until the analysis takes place. I am more unimpressed with the detection levels, where accuracy is prioritised over aggressive detection. Also, McAfee is now fully cloud-based, with absolutely no local definitions whatsoever.

Trend Micro executes both better: for easy settings, you just switch between two modes, and each mode controls a host of settings that could individually bamboozle users; and its cloud is mixed with local patterns and definition files.
 
I wouldn’t say protection is reduced, as the file is blocked until the analysis takes place. I am more unimpressed with the detection levels, where accuracy is prioritised over aggressive detection. Also, McAfee is now fully cloud-based, with absolutely no local definitions whatsoever.

Trend Micro executes both better: for easy settings, you just switch between two modes, and each mode controls a host of settings that could individually bamboozle users; and its cloud is mixed with local patterns and definition files.
A cloud-based AV is not a bad one; K is partially cloud-based and its detection is better than B and ESET, which are famous for relying on signatures (in addition to behavioral protection in B's case).
It is not about being cloud-based; it is about the algorithm used for cloud-based protection.
 
A cloud-based AV is not a bad one; K is partially cloud-based and its detection is better than B and ESET, which are famous for relying on signatures (in addition to behavioral protection in B's case).
It is not about being cloud-based; it is about the algorithm used for cloud-based protection.
They are all partially cloud-based, but I personally don’t like a fully cloud-based solution with zero protection capability offline. It’s just a personal preference.
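The "blocked until analysis" point and the offline concern can both be shown in one toy sketch. This is a generic model of a fully cloud-based gate, not any vendor's actual design; all names and verdict strings are invented:

```python
def cloud_verdict(sha256: str, online: bool, cloud_db: dict) -> str:
    """Hypothetical cloud lookup: no local definitions to fall back on."""
    if not online:
        return "unknown"              # fully cloud-based: nothing to consult offline
    return cloud_db.get(sha256, "clean")

def gate_execution(sha256: str, online: bool, cloud_db: dict) -> str:
    """Execution is held until a verdict arrives, so online protection isn't
    'reduced' by the lookup, but an offline machine never gets a verdict."""
    verdict = cloud_verdict(sha256, online, cloud_db)
    if verdict == "malicious":
        return "blocked"
    if verdict == "unknown":
        return "allowed-without-verdict"   # the offline gap
    return "allowed"

db = {"deadbeef": "malicious"}
print(gate_execution("deadbeef", online=True, cloud_db=db))    # blocked
print(gate_execution("deadbeef", online=False, cloud_db=db))   # allowed-without-verdict
```

The same hash is blocked online and waved through offline, which is exactly the trade-off being debated: the algorithm behind the cloud matters, but so does what happens when the cloud is unreachable.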
 
By this logic, a seatbelt that only works if you crash in a sedan and not in an SUV should still get a 5-star safety rating – just don’t drive the wrong kind of car.

The seatbelts intended by the manufacturer for use in a sedan should be used in a sedan.
The author took the seatbelts from a sedan and tried to use them in his truck. The seatbelt latches did not fit, and the seatbelts did not work.
The wrong conclusion: The seatbelts are not recommended.
The right conclusion: The seatbelts should be used as intended by the manufacturer.

The useful information from the test: Those seatbelts are not universal (which was expected, because the manufacturer did not mention universality in the instructions). However, it does not hurt to check whether the manufacturer tells the truth. :)
 
The seatbelts intended by the manufacturer for use in a sedan should be used in a sedan.
The author took the seatbelts from a sedan and tried to use them in a truck. The seatbelt latches did not fit, and the seatbelts did not work.
The wrong conclusion: The seatbelts are not recommended.
The right conclusion: The seatbelts should be used as intended by the manufacturer.

The useful information from the test: Those seatbelts are not universal.
This happens very often when people try to evaluate the effectiveness of solutions without any knowledge of their internals and the design choices behind them.

Any non-standardised test should be seen as entertainment and nothing else.

Calling it “the worst antivirus” is also a little far-fetched. Perhaps Leo needs to try Protegent.
 
K and B home versions have stronger post-launch behavior protection; in fact B, in spite of its inflated signature database, relies more on post-launch behavior protection than K does.
Yes. Different vendors use different security approaches. Some prefer a preventive, pre-launch approach; others prefer a more universal approach that includes post-launch protection. The first approach can increase the rate of false positives.
 
Yes. Different vendors use different security approaches. Some vendors like a preventive and pre-launch approach. Some others like a more universal approach that includes post-launch protection. The first approach usually increases the rate of false positives.
There is pretty much no vendor nowadays that hasn’t utilised all possible approaches one way or another. It’s just the execution that is different.
 
Yes. Different vendors use different security approaches. Some vendors like a preventive and pre-launch approach. Some others like a more universal approach that includes post-launch protection. The first approach can increase the rate of false positives.
And the second approach, if not prompt, can increase the risk of data exfiltration and/or encryption.
 
And the second approach, if not prompt, can increase the risk of data exfiltration and/or encryption.
Yes, relying too much on behavioural blocking à la Sophos and Webroot is not amazing. By the time behavioural blocking is triggered and remediation completes, damage may already have been done.
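That latency can be put in concrete terms with a toy simulation. The trigger threshold here is invented, and no real product works this crudely, but it shows why a purely reactive behaviour blocker always "loses" whatever was touched before it fires:

```python
def simulate_ransomware(files: list[str], trigger_after: int) -> dict:
    """Toy model: a behaviour blocker that only reacts after N rapid file
    modifications. Everything touched before the threshold is already lost."""
    encrypted = []
    for f in files:
        if len(encrypted) >= trigger_after:
            return {"stopped": True, "lost": encrypted}   # detected, but late
        encrypted.append(f)                               # damage done first
    return {"stopped": False, "lost": encrypted}

docs = [f"doc{i}.txt" for i in range(100)]
result = simulate_ransomware(docs, trigger_after=20)
print(result["stopped"], len(result["lost"]))   # True 20
```

Even when the blocker "works", twenty files are gone before it does, which is the whole argument for pairing behavioural blocking with strong pre-launch protection rather than relying on it alone.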