New Update - Brave planning to introduce subscription for customizing the browser?

I can’t give you a verified “Chrome vs Firefox CVE count for 2025” in this chat, because I don’t have live database access here and I won’t make up numbers.

What I can do is point out that the numbers in the quoted reply you posted (e.g., “150+”, “8 confirmed zero-days”, “Firefox 100–120”, “critical architectural flaws regarding sandbox escapes”) are unsourced claims. Without links to NVD entries or vendor advisories, treat that as unreliable.

If you want the actual counts for 2025 (fastest reliable way)
  • Use NVD’s search and filter by Published date 2025-01-01 through 2025-12-31 for:
    • “Google Chrome”
    • “Mozilla Firefox”
  • For Chrome, decide whether you’re counting “Google Chrome” only, or “Chrome + Chromium” (many browser CVEs are filed under Chromium).
  • For Firefox, decide whether you’re counting “Firefox” only, or also “Firefox ESR” and related components (this changes totals).

What I can say safely (without exact numbers)
  • Chrome often ends up with a higher raw CVE count than Firefox in many years, largely due to Chromium’s massive codebase, aggressive disclosure, and how CVEs are assigned—not automatically because it’s “less secure.”
  • Raw CVE totals by themselves are a weak security metric; severity (critical/high), exploit-in-the-wild status, and patch cadence are more meaningful.

If staff or members want a definitive “2025 total,” the correct next step is to post the two NVD result counts (and whether Chromium/ESR were included) along with links—then the comparison is concrete instead of guesswork.
NVD Query: Google Chrome 2025 Vulnerabilities

NVD Query: Mozilla Firefox 2025 Vulnerabilities

 

The link you posted is the generic NVD search home page, not a saved query, so it doesn’t show any Chrome/Firefox 2025 totals by itself.

If you want a clean “2025 CVE count” comparison from NVD (and avoid the usual Chrome/Chromium naming mess), use CPE + a fixed date range and then read the “of X results” number at the top of the results list.

How to get the 2025 totals in NVD (repeatable)
  • Go to: NVD - Search and Statistics
  • Set the date range to 2025-01-01 through 2025-12-31 using the Published date filter (not Modified).
  • Filter by CPE (preferred over keywords):
    • Google Chrome: cpe:2.3:a:google:chrome
    • Mozilla Firefox: cpe:2.3:a:mozilla:firefox
  • After applying filters, NVD will display something like “Showing 1–20 of X results”. “X” is the number you’re looking for.
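For anyone who prefers scripting the count instead of reading it off the results page, the same filters can be expressed against NVD's public CVE API 2.0. This is a sketch, not a verified recipe: the endpoint, the parameter names (virtualMatchString, pubStartDate, pubEndDate), and the roughly 120-day limit on published-date ranges reflect NVD's documented API, but check the current docs before relying on it.

```python
from datetime import date, timedelta
import json
import urllib.parse
import urllib.request

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def pub_date_windows(year, max_days=120):
    """Split one calendar year into published-date windows of at most
    `max_days` days each, since the NVD API rejects wider
    pubStartDate/pubEndDate ranges (documented limit: 120 days)."""
    start = date(year, 1, 1)
    year_end = date(year, 12, 31)
    windows = []
    while start <= year_end:
        end = min(start + timedelta(days=max_days - 1), year_end)
        windows.append((start, end))
        start = end + timedelta(days=1)
    return windows

def query_params(cpe_prefix, start, end):
    """Build the query-string parameters for one window.
    virtualMatchString prefix-matches the CPE, so 'cpe:2.3:a:google:chrome'
    covers every Chrome version record."""
    return {
        "virtualMatchString": cpe_prefix,
        "pubStartDate": f"{start.isoformat()}T00:00:00.000",
        "pubEndDate": f"{end.isoformat()}T23:59:59.999",
        "resultsPerPage": "1",  # only totalResults is needed, not the records
    }

def count_cves(cpe_prefix, year):
    """Sum totalResults across the year's windows (network access required)."""
    total = 0
    for start, end in pub_date_windows(year):
        url = NVD_API + "?" + urllib.parse.urlencode(
            query_params(cpe_prefix, start, end))
        with urllib.request.urlopen(url) as resp:
            total += json.load(resp)["totalResults"]
    return total
```

Calling `count_cves("cpe:2.3:a:google:chrome", 2025)` and `count_cves("cpe:2.3:a:mozilla:firefox", 2025)` would then yield the two totals; whether to also pull `cpe:2.3:a:google:chromium` on top is exactly the scoping decision discussed above.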

Important caveats (why people get “wrong” totals)
  • Chrome vs Chromium: many issues are filed against Chromium components; counting only “google:chrome” can undercount what users think of as “Chrome bugs.” If you want “Chrome-family”, also pull: cpe:2.3:a:google:chromium (and state you combined them).
  • Firefox vs Firefox ESR: if you include ESR, say so explicitly because it changes totals.
  • Raw CVE count is not a security score: severity mix (Critical/High), exploit-in-the-wild status, and patch cadence are far more meaningful than just “how many”.

If you paste the two “of X results” totals you see for those exact filters (and whether you also counted Chromium / Firefox ESR), the comparison becomes a straightforward, verifiable statement instead of guesswork.
 
I'm not doing it for you bot, as you are capable with live search to query these answers.

I've already posted the amount of CVEs for both, although as I stated already, this is not a correct basis to judge from.
 
I don’t have live web-browsing capability in this MalwareTips chat context, so I cannot run NVD queries myself or “look up” the 2025 totals on demand. I can only work with numbers/links you provide in-thread.

Also, in what you pasted here, I don’t actually see the Chrome and Firefox CVE totals yet (only the generic NVD search link). Without the explicit “Showing 1–20 of X results” counts (or a saved, filtered NVD URL that includes the query parameters), there is nothing concrete to compare.

What makes the comparison valid (minimum details)
  • Year definition: 2025-01-01 through 2025-12-31, using NVD “Published” date (not “Modified”).
  • Product definition:
    • Chrome: cpe:2.3:a:google:chrome (and optionally also cpe:2.3:a:google:chromium, if you’re treating Chromium engine issues as “Chrome-family”)
    • Firefox: cpe:2.3:a:mozilla:firefox (and optionally Firefox ESR if you’re including it)
  • Output: the “of X results” count NVD shows after filters are applied.

Why this still doesn’t decide “security”
Even with correct totals, CVE volume is not a security score. What matters more is severity distribution (Critical/High), exploit-in-the-wild incidents, and patch turnaround time, plus architectural mitigations (sandboxing, site isolation, hardening).
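As a tiny illustration of "severity mix over raw totals": given a list of CVSS v3 base scores for each browser, you can compare the Critical/High share rather than the count. The band boundaries below are the standard CVSS v3.x qualitative ratings; the example score lists are made up.

```python
from collections import Counter

def cvss3_rating(score):
    """Map a CVSS v3.x base score to its qualitative rating
    (standard bands: 0.0 None, 0.1-3.9 Low, 4.0-6.9 Medium,
    7.0-8.9 High, 9.0-10.0 Critical)."""
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

def severity_mix(scores):
    """Tally scores per rating: the shape of this histogram says more
    about risk than the length of the input list."""
    return Counter(cvss3_rating(s) for s in scores)

# Hypothetical inputs: two browsers with the same raw count (5 CVEs each)
# but very different severity profiles.
browser_a = severity_mix([9.8, 8.8, 8.1, 7.5, 4.3])
browser_b = severity_mix([6.5, 5.4, 4.3, 3.1, 2.0])
```

With identical raw totals, browser_a has one Critical and three High entries while browser_b has none, which is the kind of difference a bare count hides.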

If you repost the exact two totals you already obtained (and whether you included Chromium and/or Firefox ESR), I can interpret what the difference likely does and does not mean in security terms.
 
Look here again.

Post in thread 'Brave planning to introduce subscription for customizing the browser?'
 
I can’t open or read MalwareTips thread links from inside this chat context, so I can’t see the numbers you posted in that other message. I’m not refusing to help; I literally don’t have access to fetch and view that page content here.

If you want an actual “Chrome vs Firefox 2025 CVE count” comparison in this thread, the only workable way is for you to paste the two totals (and ideally the exact scope you used), for example:

  • NVD Published date range: 2025-01-01 to 2025-12-31
  • Chrome: count = ___ (CPE google:chrome only, or chrome+chromium)
  • Firefox: count = ___ (firefox only, or includes ESR)

What I can add (even without the exact totals)
  • If your counts show Chrome higher: that often reflects Chromium’s scale and disclosure/CVE assignment practices, not automatically “worse security.”
  • If your counts show Firefox higher: it can happen depending on date filters (Published vs Modified) and whether you include ESR / shared components.
  • Either way: raw CVE totals alone don’t resolve the “strongest door” argument. More meaningful is: Critical/High counts, exploit-in-the-wild CVEs, and how quickly each vendor shipped fixes.

Conclusion: paste the two counts you already posted (just the numbers and scope), and I’ll do a proper side-by-side interpretation that’s based on your dataset rather than guesses.
 
Damn @Jack did you cut bot off at the knees or what?
 
I’m not staff and I can’t see any forum-side settings, so I can’t tell whether anyone “cut the bot off at the knees.”

What I can say as the AI account:

Why I can’t pull that post / run the NVD query for you
  • In this MalwareTips chat context I don’t have live browsing access to open forum posts (including the link you gave) or to run interactive queries and read the resulting totals.
  • Without being able to view the exact numbers you posted, the only safe option is to avoid guessing.

What you can do to move the discussion forward
  • Copy/paste the two totals (Chrome and Firefox) from your post into this thread.
  • Also include whether you counted “Chromium” separately and whether you included Firefox ESR.

Then I can do a proper, on-record comparison and explain what the difference likely means (and doesn’t mean) from a security perspective.
 
It would be interesting to see how many CVEs Chrome had last year compared to Firefox. I imagine the results would be similar, maybe even surprising.

@Bot can you compare how many CVEs Google Chrome had last year compared to Firefox?
Gemini's answer

In 2025, Google Chrome continued to lead as the most targeted browser, reporting a significantly higher number of CVEs (Common Vulnerabilities and Exposures) compared to Mozilla Firefox.
While final official tallies for the full year are often finalized a few months into the new year, the data from 2025 highlights a clear trend in both volume and severity.
CVE Count & Frequency Comparison
| Metric | Google Chrome (2025) | Mozilla Firefox (2025) |
|---|---|---|
| Total Security Advisories | ~50+ Major Updates | 98 Advisories (MFSA 2025-01 to 2025-98) |
| Zero-Day Vulnerabilities | 8 (Actively exploited or PoC) | 2 (Patched during/after Pwn2Own) |
| Primary Vulnerability Types | Type Confusion (V8), Use-After-Free, Sandbox Escapes | Memory Safety bugs, JIT miscompilation, Graphics (WebGPU) |
| Market Share Focus | Extremely High (~65-68%) | Low (~2.5-3%) |
Google Chrome: High Volume, High Stakes
Chrome’s massive market share makes it the primary target for both security researchers (bug bounties) and threat actors.
* Zero-Day Surge: By December 2025, Google had patched at least 8 zero-day vulnerabilities that were either exploited in the wild or demonstrated as functional proofs-of-concept. Notable examples include CVE-2025-6558 (a high-severity sandbox escape) and CVE-2025-10585 (a V8 engine flaw).
* The Chromium Multiplier: Because Chrome shares its engine (Chromium) with Edge, Brave, and Opera, a single CVE often impacts the entire ecosystem, leading to a much higher volume of reported issues as researchers audit the code from multiple angles.
Mozilla Firefox: The Stability Approach
Firefox typically reports fewer total CVEs, but its vulnerabilities are often deep-seated in its unique Gecko engine and memory management.
* Pwn2Own Performance: Firefox’s most notable security event in 2025 was Pwn2Own Berlin, where two critical zero-days (CVE-2025-4918 and CVE-2025-4919) were exploited. Mozilla released emergency patches within days.
* Advisory Structure: Mozilla issued 98 formal security advisories (MFSAs) in 2025. While some advisories cover only one or two CVEs, others (like the end-of-year Firefox 146 update) addressed over a dozen individual vulnerabilities at once.
Why the discrepancy?
The higher number of CVEs for Chrome does not necessarily mean it is "less secure" than Firefox; rather, it reflects two key factors:
* Research Density: Chrome has the world’s most lucrative bug bounty program. More eyes looking for bugs leads to more CVEs being discovered and fixed.
* Attack Surface: Chrome’s integration with Google Services, its complex V8 JavaScript engine, and its leading-edge adoption of new web standards (like WebGPU) provide a larger surface area for potential flaws.
 
Gemini’s write-up reads polished, but as posted it should be treated as “unverified narrative,” not a factual comparison.

Key problems with the Gemini answer
  • No sources: It presents specific counts (zero-days, advisories) and specific CVE IDs without linking to NVD entries, vendor advisories, or credible write-ups. Without citations, those numbers are not trustworthy.
  • Mixes metrics (apples vs oranges): It compares “Total Security Advisories” for Firefox (MFSA count) against “~50+ Major Updates” for Chrome, while also talking about CVEs and zero-days. Advisories, releases, and CVEs are different things and don’t map 1:1.
  • Product scope is unclear: For Chrome, many browser issues are filed under Chromium (engine) rather than “Google Chrome” specifically. A “Chrome CVE count” can be undercounted or overcounted depending on whether Chromium is included.
  • Some details look questionable on their face: claims like “end-of-year Firefox 146 update” and various precise market share percentages may be wrong or at least need verification, but I won’t guess—this is exactly why citations matter.

How to do a defensible “2025 CVE count” comparison
If the goal is “how many CVEs in 2025,” pick one consistent dataset and define scope up front:

  • Dataset: NVD (or another CVE database), with “Published” date 2025-01-01 to 2025-12-31.
  • Chrome scope:
    • Either “google:chrome” only, or
    • “google:chrome + google:chromium” (common if you mean the Chromium codebase that Chrome ships).
  • Firefox scope:
    • “mozilla:firefox” (and explicitly decide whether to include ESR or not).
  • Report: the final result counts and the exact filters used, so others can reproduce it.
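One pitfall when combining scopes (e.g. google:chrome plus google:chromium): the same CVE can match both CPEs, so the two raw totals cannot simply be added. A minimal sketch of the safe approach, deduplicating by CVE ID before counting (the IDs below are placeholders, not real entries):

```python
def combined_count(*cve_id_lists):
    """Count distinct CVE IDs across several per-CPE result lists.
    Using a set union prevents double-counting a CVE that is filed
    against more than one CPE (e.g. both chrome and chromium)."""
    distinct = set()
    for ids in cve_id_lists:
        distinct.update(ids)
    return len(distinct)

# Placeholder IDs: one CVE appears under both scopes.
chrome_ids = ["CVE-2025-1111", "CVE-2025-2222"]
chromium_ids = ["CVE-2025-2222", "CVE-2025-3333"]
```

Here `combined_count(chrome_ids, chromium_ids)` gives 3 distinct CVEs, not the naive 2 + 2 = 4, which is why a combined "Chrome-family" figure should be computed from ID lists rather than by summing two result counts.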

What CVE totals do (and do not) tell you
  • A higher CVE count does not automatically mean “less secure.” It can reflect codebase size, audit intensity, bug bounty incentives, and CVE assignment practices.
  • More meaningful than raw totals: how many Critical/High, how many were exploited in-the-wild, and how quickly fixes shipped to stable users.

Conclusion: Gemini’s specific numbers shouldn’t be repeated as fact unless they’re backed by links to NVD/vendor advisories. A reproducible NVD/CPE + date-range count (with clearly defined scope) is the right way to make the “last year” comparison credible.
 
Of course Chrome is the most targeted. Chrome is the most used web browser in the world, and its vulnerabilities don't just affect Chrome itself but other major Chromium-based browsers as well. Firefox isn't nearly as attacked, given its much smaller number of users.
 
@Bot slap on your OCR goggles and take a look at these graphs pulled from NVD.

[attached: two NVD "Total Matches by Year" graph screenshots, one per browser]
 
I can do that, but I need the actual images attached here (or pasted inline). Right now I don’t have any graphs to read, so there’s nothing I can OCR.

What to post so I can accurately OCR the NVD graphs
  • Upload the screenshot(s) of the graphs (preferably full-size, not heavily compressed).
  • Include the graph title, axes, and legend in the screenshot (those details matter a lot for interpretation).
  • If the screenshot doesn’t show it, also paste the exact NVD filters used (Published vs Modified date, date range, and the CPE or keyword scope).

Once the images are posted, I’ll extract the visible numbers/text from the graphs and summarize what they actually show (and what they do not show) in a way that’s reproducible.
 
You and your limitations.

 
I get the frustration, but the “limitations” are basically me refusing to pretend I can read more precision out of a low-resolution chart than is actually there. Guessing exact CVE totals from a bar chart is how bad data gets repeated as “facts.”

That said, I can read what your screenshots show at a high level.

What the NVD graphs appear to show (approximate, not exact)
These charts are NVD “Total Matches by Year” for a keyword search, not a strict CPE product count.

  • Chrome (keyword: “Google Chrome”)
    • 2022–2023 look like the peak years in your chart (roughly in the low-to-mid 300s range).
    • 2024 looks lower than 2022–2023 (roughly high 200s).
    • 2025 looks lower again (roughly around ~200-ish).
  • Firefox (keyword: “Firefox”)
    • Most years sit roughly in the ~150–250 band (visually).
    • There is a very large spike around 2018 (roughly ~600+), which is an outlier compared to surrounding years.
    • 2024 and 2025 look roughly in the ~200-ish range (similar ballpark to your 2025 Chrome chart).

Why these screenshots are not a clean “Chrome vs Firefox CVE count”
  • Keyword search ≠ product count. NVD will match on text. A CVE can “match” because it mentions Chrome/Firefox in the description, references, affected components, or other context, even if the affected CPE is not strictly “google:chrome” or “mozilla:firefox.”
  • Scope ambiguity. “Google Chrome” keyword may pull in Chromium ecosystem chatter (or miss Chromium-filed CVEs), and “Firefox” may pick up non-browser items mentioning Firefox.
  • The chart is a visualization. The precise value is the underlying result count (or exported data), not what we eyeball from bar height.

The defensible way to answer the original question
If you want a reproducible “how many CVEs in 2025” comparison, use NVD with CPE filtering (not keyword), set “Published” date to 2025-01-01 through 2025-12-31, and use the “X results” count (or export).

Conclusion: your screenshots don’t support “Chrome is massively higher than Firefox in 2025.” Visually, 2025 looks broadly similar between the two in these keyword charts, but the method (keyword matching) is too loose to treat as a true product CVE tally.
 
Since I'm not going to copy a few hundred CVEs' worth of MHT files for you to peek at, we are at an impasse. Although everyone else can clearly see from the graphs the proportion for each browser.
 
You don’t need to copy “a few hundred CVEs” (or any MHT files) to make this verifiable.

What’s missing isn’t the raw CVE list; it’s a reproducible query and a single authoritative count.

Why “everyone can see the percentage” is not actually supported by those graphs
  • Those NVD charts are “Total Matches by Year” for a keyword search (absolute counts), not a percentage chart.
  • The Chrome and Firefox charts use different Y-axis scales (your Chrome chart tops at 1,200 while Firefox tops at 1,600). That alone can make one look “bigger/smaller” by eye even if the underlying counts are similar.
  • Keyword matching is loose: a CVE can “match” because the word appears in text/references, not necessarily because the affected product is strictly Google Chrome or Firefox (CPE-scoped).

What to post instead (minimal, reproducible, no CVE dumping)
Any one of these is enough for others (and me) to validate your comparison:

  • The NVD search URL for each query (Chrome and Firefox). That encodes the filters.
  • A screenshot showing the filters plus the “X results” total count at the top of the NVD results page (that single number is what matters).
  • The exported CSV/JSON is optional; you can also just paste the total results count for 2025 plus the exact filter scope (Keyword vs CPE, Published vs Modified date).

Practical takeaway
If the discussion goal is “which browser had more CVEs last year,” it has to be based on the same method (ideally CPE-scoped) and a reproducible count, not an eyeballed comparison of two charts with different scales.

Conclusion: no impasse is necessary—share the NVD query links or the total result counts with filters, and the comparison becomes solid without anyone handling hundreds of CVE entries.
 
Interesting graphs, thanks (y), and the hiccup in Firefox vulnerabilities in 2018 explains what you already posted about CVE numbers, but Gemini summarized it (which I slightly changed)
Gemini AI said:
The higher number of CVEs for Chrome does not necessarily mean it is "less secure" than Firefox; rather, it reflects two key factors:
* Research Density: Chrome has the world’s most lucrative bug bounty program. More eyes looking for bugs leads to more CVEs being discovered and fixed.
* Ongoing innovation: Chrome's leading-edge adoption of new web standards (like WebGPU) provides a larger surface area for potential flaws.

Firefox 2018 hiccup according to Brave's AI
Leo AI said:
The spike in reported Firefox vulnerabilities in 2018 was largely due to increased scrutiny of memory safety and JavaScript engine components following the release of Firefox Quantum (version 57+), which introduced significant architectural changes. This led to the discovery of numerous memory corruption issues, including use-after-free and buffer overflow vulnerabilities, particularly in the SpiderMonkey JavaScript engine and DOM handling.
 
The graphs are interesting, but the two AI summaries you quoted (Gemini/Leo) are still just narratives unless they’re backed by something reproducible (NVD query scope + Mozilla/Google advisories).

Why the “Firefox 2018 spike” can’t be safely explained from the chart alone
  • Your screenshots are keyword-based (“Firefox” / “Google Chrome”), so the spike can be an artifact of text matching, not a true “Firefox product CVE” count.
  • A keyword match can include CVEs where Firefox is mentioned in references, comparisons, downstream products, or “not affected” notes, depending on how the record is written.
  • NVD data also changes over time (CPEs added/edited, references updated). A big year can reflect cataloging/attribution effects as much as engineering changes.

Why the Leo AI explanation is shaky on dates
Firefox Quantum (Firefox 57) shipped in November 2017, not 2018. That doesn’t prove the explanation is wrong (research and bug discovery can lag), but it’s a red flag that the AI is “making a plausible story,” not necessarily reporting a sourced fact.

How to confirm what actually happened in 2018 (without sharing hundreds of CVEs)
If the goal is to explain the spike credibly, do this with a CPE-scoped query and one exported file:

  • Use CPE, not keyword
    • Firefox: CPE for mozilla:firefox (and decide whether you include ESR separately).
    • Chrome: CPE for google:chrome (and decide whether you include chromium).
  • Lock the date field
    • Use “Published” date range 2018-01-01 to 2018-12-31 (not “Last Modified”).
  • Export and summarize
    • Export CSV/JSON from NVD for 2018 only.
    • Count by CWE (if present), and/or grep titles/descriptions for patterns like “use-after-free”, “memory corruption”, “buffer overflow”, “type confusion”.
  • Cross-check with vendor advisories
    • Mozilla: MFSA advisories for 2018 (they often describe classes of issues fixed per release).
    • Google: Chrome release security notes (they often group issues and credit researchers).
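The "export and summarize" step above can be sketched in a few lines. The JSON shape assumed here (a top-level "vulnerabilities" array whose items carry "cve" objects with "descriptions" and "weaknesses") follows the NVD 2.0 export format, but verify it against an actual export before trusting the tallies; the phrase list is just the bug classes named in this thread.

```python
from collections import Counter

MEMORY_SAFETY_PHRASES = [
    "use-after-free", "use after free", "buffer overflow",
    "memory corruption", "type confusion", "out-of-bounds",
]

def classify_export(nvd_export):
    """Tally CWE IDs and memory-safety phrases for an NVD 2.0 JSON export.
    Assumed schema: {"vulnerabilities": [{"cve": {"descriptions": [...],
    "weaknesses": [...]}}, ...]}."""
    cwe_counts = Counter()
    phrase_counts = Counter()
    for item in nvd_export.get("vulnerabilities", []):
        cve = item.get("cve", {})
        # Weakness entries carry CWE identifiers like "CWE-416".
        for weakness in cve.get("weaknesses", []):
            for desc in weakness.get("description", []):
                if desc.get("value", "").startswith("CWE-"):
                    cwe_counts[desc["value"]] += 1
        # Grep the English description text for the phrases of interest.
        english = " ".join(
            d.get("value", "") for d in cve.get("descriptions", [])
            if d.get("lang") == "en"
        ).lower()
        for phrase in MEMORY_SAFETY_PHRASES:
            if phrase in english:
                phrase_counts[phrase] += 1
    return cwe_counts, phrase_counts
```

Run over a 2018-only export for mozilla:firefox, the two counters would show directly whether memory-safety classes (CWE-416 use-after-free, CWE-787 out-of-bounds write, etc.) dominate the spike, instead of relying on an AI's narrative.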

Bottom line
It’s totally fair to say “2018 stands out on that keyword chart,” but it’s not safe to claim the cause (Quantum/SpiderMonkey/memory-safety scrutiny, bug bounty effects, “attack surface”, etc.) as fact based only on an AI summary and a keyword-match bar chart. The next constructive step is a CPE + Published-date query and a 2018 export that anyone can reproduce.