A Short Note on Secure Operating Systems ~ by Victor Yodaiken, CEO, FSMLabs
Sunday, July 04 2004 @ 07:16 AM EDT


Remarks attributed to Gene Spafford and Cynthia Irvine in EE-Times and a marketing offensive by Green Hills against Linux don't provide an accurate picture of software security issues for operating systems and, in fact, add to the confusion. In what follows, I want to try to move the discussion to a less emotional and more balanced basis. One of my points is that security certifications have serious limitations and costs. That's not to say that certifications are bad or useless -- far from it -- but certification is not a cure-all or without problems, and people need to be able to distinguish between marketing and actual engineering.

1. Professor Spafford's complaint about the "provenance" of code in Linux's open development model is unfounded. There is no assurance that any software development effort is free from people who have bad intent or who just write lousy software. It's not even clear that it is easier to get code into Linux than it is to get code into other operating systems. In fact, because Linux code is developed under an open model and is tracked by a comprehensive source control system (see www.bkbits.com), it may be relatively harder to smuggle malicious code into Linux. In any case, "provenance" is a side issue, one that is easily turned into cheap fear-mongering and xenophobia (http://www.ghs.com/news/20040426_linux.html). For the same reason, I very much dislike the terminology "subversive code", which is an emotionally charged substitute for the standard term "malicious code". The real issues are whether software is designed and tested for security and whether the development process assures certain levels of quality.

2. The state of the art should not be overestimated. When, according to EE-Times, Professor Irvine contrasts Linux with "'high-assurance' operating systems with the smarts to prove that subverting code doesn't exist" we should understand that there are no, none, zero, existing operating systems that can prove that they don't contain malicious code or other security flaws. The Common Criteria is a "best practices" standard developed by agencies of several governments (in the US, the NSA and NIAP) but the standard is relatively new and untested. It seems as if it should produce more secure software, but we don't know for sure. Experts like Bruce Schneier argue that standards themselves are of limited use. When Linux is criticized for not having Common Criteria certifications, we should note that there are no operating systems certified at the highest level of the Common Criteria and not even any widely accepted Common Criteria specifications for the security of embedded operating systems. In fact, there is considerable disagreement among researchers over best methods, a shortage of empirical data, and limits to what can be verified at the highest level.
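
To make the point concrete, here is a deliberately toy example (my illustration, not anyone's shipping code). The function below satisfies the property a verifier might actually prove -- "returns 1 whenever the supplied password matches the stored one" -- for every input, and yet it is malicious:

    /* Hypothetical illustration: a proof of the property "returns 1
     * whenever supplied matches stored" goes through for this function,
     * because the property says nothing about when it may ALSO return 1.
     * Proving a property is not proving the absence of everything bad. */
    #include <string.h>

    int check_password(const char *supplied, const char *stored)
    {
        if (strcmp(supplied, "open-sesame") == 0)  /* backdoor the property never forbids */
            return 1;
        return strcmp(supplied, stored) == 0;
    }

A specification strong enough to exclude the backdoor would have to say "returns 1 only when they match", and a real operating system has thousands of interfaces, each needing an equally airtight statement.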

3. The Common Criteria standard defines "evaluation assurance levels" (EAL) from 1 to 7, with 7 being the highest. These levels are being used in a grossly misleading manner to try to sell software. There are no existing EAL7-certified operating systems -- not a single one (except maybe hidden in some NSA lab). Furthermore, the EAL is just a measure of the level of effort and rigor put into proving that a software program satisfies a security specification. If the specification does not correctly identify the actual threats to the system, the software is not secure, no matter what the level of evaluation assurance. A rigorous proof that the Titanic is unsinkable in the Caribbean Sea is no comfort on a voyage through the North Atlantic iceberg belt. Flooding attacks that can cause a real-time operating system to have scheduling delays are among the threats we consider in our design and test process at FSMLabs. Green Hills is currently embarked on an EAL6 (not 7) effort against a specification that does not have a single real-time requirement. Proof that an operating system meets a specification which does not address the flooding threat provides no actual security assurance if the threat is possible. One of the virtues of the Common Criteria is that it repeatedly warns against claims like "my software is more secure than yours". The Common Criteria framework encourages people to define "more secure" against the actual type and role of the software and a clear threat analysis. It would be unfortunate indeed if the fundamental message of the Common Criteria were ignored and it were used only as a tool for frightening people into purchasing or not purchasing software based on meaningless buzzwords. (See http://eros.cs.jhu.edu/~shap/NT-EAL4.html for some similar comments and a very different point of view.)
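
To give a feel for what testing against the flooding threat involves (a minimal sketch of my own, not FSMLabs' actual test suite), the following C program records the worst-case lateness of a periodic task. Run it once on an idle machine and once while the machine is being flooded with network traffic; any growth in worst-case lateness is the scheduling-delay threat that a specification without real-time requirements never even mentions.

    /* Minimal latency probe (assumed setup, not a certification artifact):
     * wake up every millisecond and record how late each wakeup was.
     * Build with: cc -O2 probe.c -o probe -lrt */
    #include <stdio.h>
    #include <time.h>

    #define PERIOD_NS  1000000L   /* intended 1 ms period */
    #define ITERATIONS 10000

    int main(void)
    {
        struct timespec next, now;
        long late, worst = 0;
        int i;

        clock_gettime(CLOCK_MONOTONIC, &next);
        for (i = 0; i < ITERATIONS; i++) {
            /* advance the absolute deadline by one period */
            next.tv_nsec += PERIOD_NS;
            if (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec++;
            }
            /* sleep until the absolute deadline, then see how late we woke */
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
            clock_gettime(CLOCK_MONOTONIC, &now);
            late = (now.tv_sec - next.tv_sec) * 1000000000L
                 + (now.tv_nsec - next.tv_nsec);
            if (late > worst)
                worst = late;
        }
        printf("worst-case wakeup lateness: %ld ns\n", worst);
        return 0;
    }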

4. Even the higher EALs are not ironclad, but they are very costly. EAL7 only requires a "semi-formal" specification of the low-level implementation. Formal specification of complex software is not a solved problem. At the top level, EAL7, there are significant limitations on the practicability of meeting the requirements, partly due to the substantial cost impact on the developer and evaluator activities, and also because anything other than the simplest of products is likely to be too complex to submit to current state-of-the-art techniques for formal analysis. (Cf. http://niap.nist.gov/cc-scheme/cc_docs/cc_introduction.pdf)
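
To give a sense of what even a tiny formal specification involves (a toy example of mine, not drawn from any Common Criteria document), consider fully specifying a one-line function:

    /* Toy "semi-formal" specification in Hoare-triple style:
     *
     *   precondition:   0 <= n <= INT_MAX / 2
     *   call:           r = double_it(n)
     *   postcondition:  r == 2 * n
     *
     * Even a one-liner needs a side condition to rule out signed
     * overflow; scaling such specifications, and proofs against them,
     * to every interface of a full operating system is the complexity
     * wall described above. */
    #include <limits.h>

    int double_it(int n)
    {
        /* caller guarantees 0 <= n <= INT_MAX / 2 (see precondition) */
        return 2 * n;
    }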

5. Despite Professor Spafford's complaints about the intrusion of mere cost considerations into software purchase decisions, in the real world, resources are limited and tradeoffs are inescapable. Developers will limit functionality in an effort to limit the costs of certification or just to make certification practical. If a limited certified operating system causes the complexity of applications to increase and the reliability of those applications to decrease, use of that software may have a negative effect on the security of the whole system. Is it more or less dangerous to use Linux to control a power plant than it is to use an EAL5 (say) OS? Suppose the EAL5 OS comes with no device drivers, costs enough to reduce the amount of test time that can be used in development or the number of trained operators used to monitor the plant, and requires application developers to produce their own math library. Suppose the Linux system is not connected to the network. Suppose the EAL5 evaluation is against a specification that does not cover the most likely threats. The answer is: you'd better do a real whole-system security analysis instead of relying on buzzwords.

6. Finally, I want to point out that the mere existence of a "certified" version of some company's operating system does not mean anything about the other software produced by that company. For both Common Criteria and the FAA's DO-178 reliability certification, the general practice is to set up a separate development team and a separate, limited product line. Most projects, even the military projects that Green Hills CEO Dan O'Dowd cites, are unable to bear the costs and/or the limitations of the certified product lines, and so they purchase the standard commercial versions, which receive PR benefits, but not necessarily any reliability or security benefits, from the certified product lines. The interesting comparison is between Linux (or some other solution) and the actual products being used, not the highest-assurance component sold by the vendor. Software security requires strong engineering and solid cost/benefit analysis, even though that is probably not the best marketing tactic and it means we have to admit that there are no magic bullets to make the problem go away.

© Victor Yodaiken, FSMLabs

