Open Source and Apple Pie

Software security can be achieved through liability; companies must be held responsible for the software they choose to adopt.

*This essay was sparked by a WYSK section on open source deletions.*

The White House recently held a meeting—reported on by The Verge—to address the security of open source software:

The White House will meet with leaders of major tech companies including Apple, Google, Amazon, Meta, IBM, and Microsoft on Thursday to discuss the security of open-source software. The issue has become urgent in the wake of the extremely serious Log4j vulnerability, discovered in December 2021.

According to CNN, White House national security adviser Jake Sullivan described open source software as "a key national security concern" because it is used broadly but maintained by volunteers.

It is telling that responsibility for security is laid at the feet of the open source developers (who provide their products for free) rather than at the feet of the companies that voluntarily decide to adopt that software.

In the paradigm the White House appears to be using, the national security concern comes from the fact that open source software is "maintained by volunteers" who may not develop it securely, not from the fact that it is "used broadly" by companies and individuals who are unwilling to invest in auditing the code they blithely ingest.

This is absolutely the wrong approach. Companies need to be held responsible for the software they choose to adopt. Let's walk through this with an analogy...

A thought exercise:

The Tree:

Imagine that you plant an apple tree and allow your community to eat as many apples as they want (let's stretch our imaginations and imagine that there is a limitless quantity of apples, and that they can be eaten by anyone anywhere).

Some people will complain that the apples are green and not red, or that they have too many or too few seeds, but everyone is free to eat these apples or to not. And they are free to use the seeds to plant their own apple trees and grow them differently as they please.

The Pies:

Now imagine that a pie shop wants to start selling some apple pies, and so they use some of your apples in their pies. Fair enough, you've said anyone can use them!

Calamity!

Someone finds a worm in the apple pie they just purchased! Very gross; something needs to be done about this...

The Decision:

Is the consumer's experience your fault, or the pie shop's?

Yes, the worm originated in your apple. And maybe you didn't even follow gardening best practices or treat your trees against worms. But then again you didn't sell the apple, make the pie, sell the pie, or do anything other than offer your apples for people to freely use, copy, or modify if they want to. You may not even have been aware your apples were being used in pies.

But then we have the pie shop: if a business treats its pie-making so carelessly that it lacks pie-making standards and fails to do the due diligence of inspecting its ingredients to make sure they are up to snuff, then it has chosen that behavior as a business decision. And as I've written previously, fixing that business calculus is the right way to achieve better security:

Ransomware Get-well plan
The US Government should pursue five policies through legislation, investment, and action, to drive the right incentives for asset-owners to improve security.

The Solutions We Don't Need:

The solution to better, safer pies is not to apply more burdens and restrictions to apple growers (after all, apple growers don't decide whether their apples will be used only by a few friends or by hundreds of global pie makers and fruit basket sellers) but to require higher standards from pie makers.

To end the analogy: we will achieve higher software security by requiring software developers either to accept liability for their products or to make all of their software open source, so users can accept or disable any of its capabilities.

Good Solutions:

There are two main actions that should be taken.

First, a Software Bill of Materials (SBOM) is a good and necessary thing to be required of all developers (open source as well as closed source).
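To make that concrete, here is a minimal sketch of what an SBOM records, loosely modeled on the CycloneDX JSON format (the specific fields, version numbers, and the single Log4j component below are illustrative assumptions, not a complete or schema-valid document):

```python
import json

# A minimal, illustrative SBOM fragment in the spirit of CycloneDX.
# The fields and the single Log4j component are examples only.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {
            "type": "library",        # the kind of dependency being declared
            "name": "log4j-core",
            "version": "2.14.1",
            # A package URL recording exactly where the component came from.
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1",
        }
    ],
}

# Emit the SBOM so it can ship alongside the release artifact.
print(json.dumps(sbom, indent=2))
```

In the terms of the analogy, this is the ingredient list on the pie box: anyone deciding whether to buy the pie can see which apples went into it, and from whose orchard.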

Second, Dan Geer provided a specific framework for handling software liability and responsibility for software security in his 2014 Black Hat keynote. It is still the best proposal I have seen, and I've included the relevant text as collapsible content here.

Dan Geer's Keynote

3. Source code liability -- CHOICE

Nat Howard said that "Security will always be exactly as bad as it can possibly be while allowing everything to still function,"[NH] but with each passing day, that "and still function" clause requires a higher standard. As Ken Thompson told us in his Turing Award lecture, there is no technical escape;[KT] in strict mathematical terms you neither trust a program nor a house unless you created it 100% yourself, but in reality most of us will trust a house built by a suitably skilled professional, usually we will trust it more than one we had built ourselves, and this even if we have never met the builder, or even if he is long since dead.

The reason for this trust is that shoddy building work has had that crucial "or else ..." clause for more than 3700 years:

If a builder builds a house for someone, and does not construct it properly, and the house which he built falls in and kills its owner, then the builder shall be put to death. -- Code of Hammurabi, approx 1750 B.C.

Today the relevant legal concept is "product liability" and the fundamental formula is "If you make money selling something, then you better do it well, or you will be held responsible for the trouble it causes." For better or poorer, the only two products not covered by product liability today are religion and software, and software should not escape for much longer. Poul-Henning Kamp and I have a strawman proposal for how software liability regulation could be structured.

.......................

0. Consult criminal code to see if damage caused was due to intent or willfulness.

.......................

We are only trying to assign liability for unintentionally caused damage, whether that's sloppy coding, insufficient testing, cost cutting, incomplete documentation, or just plain incompetence. Clause zero moves any kind of intentionally inflicted damage out of scope. That is for your criminal code to deal with, and most already do.

.......................

1. If you deliver your software with complete and buildable source code and a license that allows disabling any functionality or code the licensee decides, your liability is limited to a refund.

.......................

Clause one is how to avoid liability: Make it possible for your users to inspect and chop out any and all bits of your software they do not trust or want to run. That includes a bill of materials ("Library ABC comes from XYZ") so that trust has some basis, paralleling why there are ingredient lists on processed foods.

The word "disabling" is chosen very carefully: You do not need to give permission to change or modify how the program works, only to disable the parts of it that the licensee does not want or trust. Liability is limited even if the licensee never actually looks at the source code; as long has he has received it, you (as maker) are off the hook. All your other copyrights are still yours to control, and your license can contain any language and restriction you care for, leaving the situation unchanged with respect to hardware-locking, confidentiality, secrets, software piracy, magic numbers, etc.

Free and Open Source Software (FOSS) is obviously covered by this clause which leaves its situation unchanged.

.......................

2. In any other case, you are liable for whatever damage your software causes when it is used normally.

.......................

If you do not want to accept the information sharing in Clause 1, you fall under Clause 2, and must live with normal product liability, just like manufacturers of cars, blenders, chain-saws and hot coffee.

How dire the consequences, and what constitutes "used normally" is for your legislature and courts to decide, but let us put up a strawman example:

A sales-person from one of your long-time vendors visits and delivers new product documentation on a USB key; you plug the USB key into your computer and copy the files onto the computer. This is "used normally" and it should never cause your computer to become part of a botnet, transmit your credit card number to Elbonia, or copy all your design documents to the vendor. If it does, your computer's operating system is defective.

The majority of today's commercial software would fall under Clause 2 and software houses need a reasonable chance to clean up their act or to move under Clause 1, so a sunrise period is required. But no longer than five years -- we are trying to solve a dire computer security problem here.

And that is it really: Either software houses deliver quality and back it up with product liability, or they will have to let their users protect themselves. The current situation -- users can't see whether they need to protect themselves and have no recourse to being unprotected -- cannot go on. We prefer self-protection (and fast recovery), but others' mileage may differ.

Would it work? In the long run, absolutely yes. In the short run, it is pretty certain that there will be some nasty surprises as badly constructed source code gets a wider airing. The FOSS community will, in parallel, have to be clear about the level of care they have taken, and their build environments as well as their source code will have to be kept available indefinitely.

The software houses will yell bloody murder the minute legislation like this is introduced, and any pundit and lobbyist they can afford will spew their dire predictions that "This law will mean the end of computing as we know it!"

To which our considered answer will be:

Yes, please! That was exactly the idea.

Software security does need to be improved. But allowing companies to continue escaping responsibility for their business decisions is not the way to do it. The US needs to codify product liability for software and stop shielding businesses from the market consequences of their own decisions to minimize security investment in an attempt to maximize profit.



Photo by Kavya P K on Unsplash