Hot Take: Hold Software Makers Liable for Selling Insecure Tech - CISA 2023


Thread author
Staff Member
Malware Hunter
Jul 27, 2015

Quote: " I would submit to you that these cyber-intrusions are a symptom, rather than a cause, of the vulnerability we face as a nation. The cause, simply put, is unsafe technology products. And because the damage caused by these unsafe products is distributed and spread over time, the impact is much more difficult to measure. But like the balloon, it’s there.

It’s a school district shut down; one patient forced to divert to another hospital, a separate patient forced to cancel a surgery; a family defrauded of their savings; a gas pipeline shutdown; a 160-year-old college forced to close its doors because of a ransomware attack. And that’s just the tip of the iceberg, as many—if not most—attacks go unreported. As a result, it’s enormously difficult to understand the collective toll these attacks are taking on our nation or to fully measure their impact in a tangible way. The risk introduced to all of us by unsafe technology is frankly much more dangerous and pervasive than the spy balloon, yet we’ve somehow allowed ourselves to accept it. As we’ve integrated technology into nearly every facet of our lives, we’ve unwittingly come to accept as normal that such technology is dangerous-by-design:

We’ve normalized the fact that technology products are released to market with dozens, hundreds, or thousands of defects, when such poor construction would be unacceptable in any other critical field. We’ve normalized the fact that the cybersecurity burden is placed disproportionately on the shoulders of consumers and small organizations, who are often least aware of the threat and least capable of protecting themselves. We’ve normalized the fact that security is relegated to the “IT people” in smaller organizations or to a Chief Information Security Officer in enterprises, but few have the resources, influence, or accountability to incentivize adoption of products in which safety is appropriately prioritized against cost, speed to market, and features. And we’ve normalized the fact that most intrusions and cyber threats are never reported to the government or shared with potentially targeted organizations, allowing our adversaries to re-use the same techniques to compromise countless other organizations, often using the same infrastructure.

This pattern of ignoring increasingly severe problems is an example of the “normalization of deviance,” a theory advanced by sociologist Diane Vaughan in her book about the ill-fated decision to launch the space shuttle Challenger in 1986. Vaughan describes an environment in which “people become so accustomed to a deviant behavior that they don't consider it as deviant, despite the fact that they far exceed their own rules for elementary safety.” "

Quote: " In sum, we need a model of sustainable cybersecurity, one where incentives are realigned to favor long-term investments in the safety and resilience of our technology ecosystem, and where responsibility for defending that ecosystem is rebalanced to favor those most capable and best positioned to do so. What would such a model look like? It would begin with technology products that put the safety of customers first. It would rebalance security risk from organizations—like small businesses—least able to bear it and onto organizations—like major technology manufacturers—much more suited to managing cyber risks.

To help crystalize this model, at CISA, we’re working to lay out a set of core principles for technology manufacturers to build product safety into their processes to design, implement, configure, ship, and maintain their products. Let me highlight three of them here:

First, the burden of safety should never fall solely upon the customer. Technology manufacturers must take ownership of the security outcomes for their customers.

Second, technology manufacturers should embrace radical transparency to disclose and ultimately help us better understand the scope of our consumer safety challenges, as well as a commitment to accountability for the products they bring to market.

Third, the leaders of technology manufacturers should explicitly focus on building safe products, publishing a roadmap that lays out the company's plan for how products will be developed and updated to be both secure-by-design and secure-by-default.

So, what would this look like in practice?

Well, consumer safety must be front and center in all phases of the technology product lifecycle—with security designed in from the beginning—and strong safety features, like seatbelts and airbags— enabled right out of the box, without added costs. Security-by-design includes actions like transitioning to memory-safe languages, having a transparent vulnerability disclosure policy, and secure coding practices. Attributes of strong security-by-default will evolve over time, but in today’s risk environment sellers of software must include in their basic pricing the types of features that secure a user’s identity, gather and log evidence of potential intrusions, and control access to sensitive information, rather than as an added, more expensive option.

In short, strong security should be a standard feature of virtually every technology product, and especially those that support the critical infrastructure that Americans rely on daily. Technology must be purposefully developed, built, and tested to significantly reduce the number of exploitable flaws before they are introduced into the market for broad use. Achieving this outcome will require a significant shift in how technology is produced, including the code used to develop software, but ultimately, such a transition to secure-by-default and secure-by-design products will help both organizations and technology providers: it will mean less time fixing problems, more time focusing on innovation and growth, and importantly, it will make life much harder for our adversaries. In this new model, the government has an important role to play in both incentivizing these outcomes and operationalizing these principles. Regulation—which played a significant role in improving the safety of automobiles—is one tool, but—importantly—it’s not a panacea. "
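The speech's call for "transitioning to memory-safe languages" and secure coding practices refers to defect classes like buffer overflows, which unbounded C string operations make easy to introduce. As a minimal sketch of the secure-by-design idea at the API level, the hypothetical helper below (`copy_bounded` is an illustrative name, not a standard function) takes the destination size and truncates rather than writing past the buffer, the way `strcpy` would:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical illustration: an unbounded copy such as strcpy() writes past
 * the destination buffer whenever the input is longer than the buffer --
 * the classic exploitable defect. A safer-by-design API accepts the
 * destination size, never writes beyond it, and reports truncation. */
int copy_bounded(char *dst, size_t dst_size, const char *src) {
    if (dst_size == 0)
        return -1;
    /* snprintf writes at most dst_size bytes and always NUL-terminates;
     * its return value is the length the full output would have had. */
    int n = snprintf(dst, dst_size, "%s", src);
    return (n >= 0 && (size_t)n < dst_size) ? 0 : -1; /* -1 = truncated */
}
```

A caller with `char buf[8];` gets `"hello"` copied intact, while a 19-character input is cut to 7 characters plus the terminator and flagged with `-1`, instead of silently corrupting adjacent memory. Memory-safe languages enforce this kind of bounds checking everywhere by default, which is why the speech singles them out.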

Full source:

Zero Knowledge

Level 20
Top Poster
Content Creator
Dec 2, 2016
The problem is that you could not prove intent in 99% of cases. Are they negligent for creating bad software, or was it just an honest mistake in development?

The only way I see a law being successful in this case is in relation to medical incidents, where a bug/exploit causes injury or death.

And the way software development will improve (besides A.I.) is when a big company gets sued for buggy software and has to pay millions in compensation.

ForgottenSeer 98186

The only way I see a law being successful in this case is in relation to medical incidents, where a bug/exploit causes injury or death.
There have already been such cases. The control software of the Therac-25 radiation therapy machine caused the equipment to "dose" patients with lethal levels of radiation in the 1980s.

With regard to civil cases involving software and security, the law is far behind the technology. In any case, the system of regulations and laws will never hold a software publisher liable for what a user does or does not do under general circumstances. Plaintiffs have been trying for decades to place blame and culpability onto software and device manufacturers. Almost all of that effort has been for naught.

Interestingly, because of the proximity of Carnegie Mellon and all the expertise in that region - where Jen gave this presentation - the greatest number of software legal cases at the federal level are brought before the United States District Court for the Western District of Pennsylvania. A great deal of case law and legal precedent (stare decisis) has been established in the legal forums there.

The law as it applies to software is a fascinating subject matter.

And the way software development will improve (besides A.I.) is when a big company gets sued for buggy software and has to pay millions in compensation.
"sued for buggy software"

Many have tried. Nobody has succeeded, at least not in the context you are referring to. Software is governed by product and consumer protection laws. It is virtually impossible to prove a software publisher's liability for bugs - whether in tort or in contract - even ones that cause financial losses. Most importantly, there is no willingness on the part of state and federal lawmakers to make software publishers assume responsibility and liability. If they did, they would basically regulate software out of existence.
