GROKLAW
When you want to know more...
CEOs of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Sunday, April 11 2004 @ 07:39 AM EDT

FUD hit embedded Linux this week, when Green Hills Software CEO Dan O'Dowd said Linux is a national security risk. The open source process should rule it out of defense applications, he claimed, because anyone can contribute, even developers in Russia and China, who might disguise their identities and slip in some subversive code. His company sells a competing proprietary RTOS.

He wasn't satisfied with just saying so in a speech to the Net-Centric Operations Industry Forum in Virginia; his company also put out a press release summarizing his speech. So he meant to say this, and he tried to give his remarks the widest possible audience. Dr. Inder Singh, CEO of LynuxWorks, and Victor Yodaiken, CEO of FSMLabs, have provided Groklaw with statements responding to Mr. O'Dowd's FUD.

WHAT HE SAID

First, the press release:

"The Linux operating system is being developed by an open source process -- a cooperative effort by a loose association of software developers from all over the world. 'The very nature of the open source process should rule Linux out of defense applications,' O'Dowd said. 'The open source process violates every principle of security. It welcomes everyone to contribute to Linux. Now that foreign intelligence agencies and terrorists know that Linux is going to control our most advanced defense systems, they can use fake identities to contribute subversive software that will soon be incorporated into our most advanced defense systems.'

"In addition, developers in Russia and China are also contributing to Linux software. Recently, the CEO of MontaVista Software, the world's leading embedded Linux company, said that his company has 'two and a half offshore development centers. A big one in Moscow and we just opened one in Beijing -- so much for the cold war.' Also, the CEO of LynuxWorks, another embedded Linux supplier, acknowledged that his company has a development center in Moscow.

"Linux software, including contributions from Russia and China, is spreading rapidly through the Defense Department because it can be freely downloaded from the Internet without a license agreement or up-front fees, bypassing legal, purchasing and security procedures. A recent survey conducted over a two-week period by the Mitre Group, found 251 Department of Defense deployments of Linux and other open source software.

"Linux has been selected to control the functionality, security and communications of critical defense systems including the Future Combat System, the Joint Tactical Radio System and the Global Information Grid. 'If Linux is compromised, our defenses could be disabled, spied on or commandeered. Every day new code is added to Linux in Russia, China and elsewhere throughout the world. Every day that code is incorporated into our command, control, communications and weapons systems. This must stop,' O'Dowd said.

"'Linux in the defense environment is the classic Trojan horse scenario—a gift of "free" software is being brought inside our critical defenses. If we proceed with plans to allow Linux to run these defense systems without demanding proof that it contains no subversive or dangerous code waiting to emerge after we bring it inside, then we invite the fate of Troy,' O'Dowd said.

"Advocates of the Linux operating system claim that its security can be assured by the openness of its source code. They argue that the 'many eyes' looking at the Linux source code will quickly find any subversions. Ken Thompson, the original developer of the Unix operating system—which heavily influenced Linux—proved otherwise. He installed a back door in the binary code of Unix that automatically added his user name and password to every Unix system. When he revealed the secret 14 years later, Thompson explained, 'The moral is obvious. You can't trust code that you did not create yourself. No amount of source-level verification or scrutiny will protect you from using untrusted code.'

"'Before most Linux developers were born, Ken Thompson had already proven that "many eyes" looking at the source code can't prevent subversion,' O'Dowd noted.

"'Linux is being used in defense applications even though there are operating systems available today that are designed to meet the most stringent level of security evaluation in use by the National Security Agency, Common Criteria Evaluation Assurance Level 7 (EAL 7),' O'Dowd said. 'We don't need cheaper security. We need better security. One "back door" in Linux, one infiltration, one virus, one worm, one Trojan horse and all of our most sophisticated network-centric defenses could crumble. We must not abandon provably secure solutions for the illusion that Linux will save money. We must not entrust national security to Linux,' O’Dowd concluded."

THE CEOs RESPOND

Dr. Inder Singh, CEO of LynuxWorks, provided Groklaw this response:

"The shrill broadside of FUD by Dan O’Dowd against the use of Linux in defense systems is in my view just a reflection of the pain that vendors of proprietary systems with closed interfaces are experiencing as the embedded world moves towards Linux. Linux is rapidly becoming the open multi-vendor standard across the embedded industry including the defense market. The fact is that embedded Linux is an unstoppable force -- the largest of the vendors of proprietary RTOS's, Wind River, has already switched from Linux bashing to embracing Linux.

"The release is full of inaccurate statements and wild generalizations. Mr. O’Dowd would have us believe that every foreign developer working with Linux is a spy or terrorist, contributing subversive software to the Linux sources, and that their contributions are automatically included in Linux by Linus Torvalds and the Linux kernel team without any scrutiny. Further he implies that all the professionals in the military and defense industry would blindly use it for mission-critical programs without addressing security concerns! His reference to Ken Thompson’s backdoor in Unix ignores the difference between a binary versus source.

"According to Mr. O'Dowd, 'Linux is being used in defense applications even though there are operating systems available today that are designed to meet the most stringent level of security evaluation in use by the National Security Agency, Common Criteria Evaluation Assurance Level 7 (EAL7).' In fact there is no operating system today that is EAL7-certified. He is presumably referring to their MILS Separation Kernel (see http://www.omg.org/docs/realtime/03-01-26.pdf - MILS Architecture), which is 'designed to meet' EAL7 requirements although it is not there yet; so this is, at the very least, rather misleading. Now, their Separation Kernel is a small microkernel with proprietary interfaces, which would require all application code to be written from scratch, and would only be suitable for relatively simple deeply embedded systems. You could not reuse the large body of existing Linux software in the implementation of large, complex, mission-critical systems, including command and control systems, and you would be locked into the one vendor of the kernel, which is of course financially attractive for the vendor but not the best use of our tax dollars.

"At LynuxWorks, we are also working on a MILS Separation Kernel which is designed to meet EAL7 requirements. However, with our focus on open standards and Linux, we have made sure that our Separation Kernel supports multiple execution environments including Linux or SELinux (Security Enhanced Linux – see http://www.nsa.gov/selinux/) running within a partition. By implementing critical security related functionality in partitions separate from the partitions running Linux or SELinux, the overall system is capable of being evaluated at EAL7 while running existing Linux applications. This approach provides both EAL7 security assurance as well as the compelling benefits of Linux."

Victor Yodaiken, CEO of FSMLabs, added this:

"Mr. O'Dowd certainly knows that huge numbers of foreign students and immigrants have helped create the US software industry and that all the giant US technology companies depend on engineers based all over the world. His alarm over outsourcing by small Linux companies is hard to credit as genuine given these well-known facts. Furthermore, Ken Thompson did not introduce a back door into UNIX or assert that open source software was more or less secure than closed source. Mr. O'Dowd has presented a very peculiar take on the famous Turing award lecture by Thompson. One lesson that can be drawn from Thompson's lecture is that the security of software cannot be separated from the trustworthiness of the vendor. A Linux company that develops software globally and that stresses integrity, solid engineering methods and organizational processes to assure quality and security should inspire some trust. A company that depends on factually challenged and emotional appeals to fear of foreigners should inspire some caution."

Jim Ready, CEO of MontaVista, earlier was quoted in Alexander Wolfe's article in EE Times:

"'Mr. O'Dowd makes the common mistake of confusing obscurity with security,' said Ready. 'Open Source is actually more secure than closed source proprietary software because the oversight of technology content is broader and deeper. Instead of just one company monitoring its own contributions — or potentially hiding security holes and exploits — a worldwide community of interested parties actually oversees Linux to make it strong and secure. That's why the NSA — the most security-conscious organization in the world — chose to standardize on Linux, and even supplies its own version of secure Linux.'"

AN ANALYSIS - IS HE RIGHT ABOUT INTERNATIONAL DEVELOPMENT BEING A THREAT?

First, does Mr. O'Dowd view outsourcing as a national security threat? No? Then what is the difference? Outsourcing is a fact of life in the industry now, and it's on the rise. Before massive outsourcing, the US proprietary software industry "in-sourced," importing software developers on H-1B visas. If there is a danger from international software developers, it's the Alamo for the US no matter how you look at it: it's a little late to worry about who writes the software, and all those outsourced jobs had better come back stateside this very minute. Not going to happen? Why not? Is he not right that this is a threat? Surely if it is, the open source methodology, where you can at least check to see what someone has written, is an advantage. And by the way, if I recall correctly, aren't all those chips inside PCs these days being made in China and Taiwan?

Next, he misunderstands how open source software is written. If faking your identity were all it took to insert subversive code, he might have a point. But take another look at OSDL's press release, where they explain the Linux kernel development process, and click on the graphic. As you will see, masking your identity to slip in some evil code isn't likely to work, because the code itself is examined, piece by piece:

"The Linux operating system kernel is the result of the efforts of its creator, Linus Torvalds, and thousands of dedicated software developers from around the world. These developers are self-organized into specific subsystems defined by a developer's interests and technical expertise (for example, I/O, storage, networking). Each of these subsystems has a domain expert developer, called the subsystem maintainer, who oversees the work of others. Subsystem maintainers review the code submitted to them and orchestrate broader peer review of code to ensure its quality.

"All Linux code, both the current version and that submitted for future inclusion, is also available on-line for public examination. This allows literally thousands of interested parties to scrutinize submitted code in what amounts to a massive code review. Only when a subsystem maintainer accepts software code is it passed along to one of the two developers at the top of the Linux hierarchy, Torvalds himself or Andrew Morton.

"Torvalds maintains the 'development kernel' where new features and bug fixes are tested. Morton maintains the 'production kernel', which is the version released for public use. Torvalds is the final arbiter of what is included in Linux. OSDL, with the help of Torvalds and Morton, created a simplified Linux Development Process graphic to help illustrate these key points. The graphic is available at http://www.osdl.org/newsroom/graphics/linux_dev_process_final.png ."

You may remember that in November of 2003, someone tried to do exactly what O'Dowd posits: they attempted to bypass the normal submission procedures for Linux code in order to get a back door incorporated into the kernel. Alert Linux coders quickly spotted the alteration in a routine file integrity check and picked up on its hidden intent, despite the clever way it was coded to obfuscate its purpose. The code never got anywhere near the mainline kernel, and the attempt failed.

IS HE RIGHT ABOUT KEN THOMPSON?

If you would like to read Ken Thompson describe his back door, you can do so right here. Eric Raymond's account provides some interesting details:

"In this scheme, the C compiler contained code that would recognize when the login command was being recompiled and insert some code recognizing a password chosen by Thompson, giving him entry to the system whether or not an account had been created for him.

"Normally such a back door could be removed by removing it from the source code for the compiler and recompiling the compiler. But to recompile the compiler, you have to use the compiler — so Thompson also arranged that the compiler would recognize when it was compiling a version of itself, and insert into the recompiled compiler the code to insert into the recompiled login the code to allow Thompson entry — and, of course, the code to recognize itself and do the whole thing again the next time around! And having done this once, he was then able to recompile the compiler from the original sources; the hack perpetuated itself invisibly, leaving the back door in place and active but with no trace in the sources.

"Ken Thompson has since confirmed that this hack was implemented and that the Trojan Horse code did appear in the login binary of a Unix Support group machine. Ken says the crocked compiler was never distributed."

You really need to read the entire Thompson paper, with graphics, to see how it worked, but what he says he set out to prove was a problem with compilers written in their own language:

"The C compiler is written in C. What I am about to describe is one of many 'chicken and egg' problems that arise when compilers are written in their own language."

If we were to draw a conclusion, that would be the one. Anyway, the compiler wasn't widely available for a lot of eyeballs to examine in the first place. And it was binary code, not source code, that was involved. And UNIX, as I believe The SCO Group's Darl McBride will inform you, is not open source but proprietary software.

IS HE A HYPOCRITE?

A brief visit to Green Hills' website will demonstrate that the company supports Red Hat Linux, although they call it "Linux Redhat". Green Hills announced Linux support for its MULTI-IDE development tool in early 2003, and for RTLinux in December 2001. Here's their explanation for why they support Linux, from their Linux FAQ:

"Q: Does Green Hills Software support Linux?

"Yes, Green Hills Software has extensive support for Linux. Green Hills Software's entire family of development tools runs on Linux. Products that run on Linux or are used when developing from Linux include:

• INTEGRITY and ThreadX embedded and real-time operating systems
• MULTI and AdaMULTI integrated development environments
• TimeMachine 4-D debugger
• Optimizing C, C++, and Ada compilers
• SuperTrace, Green Hills, and Slingshot probes

"In addition, many of Green Hills Software's development tools support Linux as a target operating system:

• Optimizing C, C++, and Ada compilers - with the GNU C compatibility in our compilers, we have reduced the code size of the Linux kernel by up to 35%. Read more.
• MULTI and AdaMULTI integrated development environment - with support for multi-threaded application-level development as well as Linux kernel and driver development.
• SuperTrace, Green Hills, and Slingshot probes - these hardware debug devices are Linux Memory Management Unit (MMU) aware, allowing them to be used effectively for kernel-level software development.

"Q: Why does Green Hills Software support Linux as a target when it has its own operating systems?

"As the leading vendor of embedded software development tools, Green Hills Software is committed to providing and supporting an open development environment on which our customers can standardize on across a wide variety of projects, whether embedded or general-purpose, legacy or new. Consequently, we support not only our own operating systems but also customers' homegrown solutions, Linux, and commercial real-time operating systems such as VxWorks."

Here is a joint press release from Green Hills and FSMLabs in 2001 about MULTI working with RTLinux. I think, though, that when the FAQ says Green Hills is committed to Linux, we should take that with a grain of salt, and maybe the government and other companies should factor in Green Hills' forked tongue when deciding who to work with. As Thompson pointed out, you need to be able to trust your vendors. You may have noticed that, despite Ken Thompson's cautionary tale, Green Hills was not scared away from C compilers.

IS LINUX TOO INSECURE BY DESIGN TO EVER BE CERTIFIED?

The company makes another statement in that FAQ that is pertinent:

"With Linux, its size, monolithic implementation, and lack of formal, traceable design mean that it can never be certified or trusted in applications with high reliability requirements."

Is that true? Adam Doxtater, co-author of Snort 2.0 Intrusion Detection and co-founder and Chief Technology Editor of Mad Penguin, responded like this to O'Dowd's remarks:

"Is he also stating that Linux software isn't now or will in the future be incapable of meeting security standards put in place by the government in EAL7? Since the EAL is only applicable to specialized products, does Linux, Windows, or any other OS need to meet the criteria out of the box? The answer is no. If a product is to meet the EAL7 level, it will become a specialized product by default due to the modifications necessary to meet the requirements. We must also take into consideration that Mr. O'Dowd is speaking strictly of government applications, not real world Linux. That said, it has no bearing on the operating system at all. It pertains to special circumstances and must be viewed as such. Is the DoD deploying specialized versions of Linux and/or other Open Source applications? I can't be sure, but one thing I can be sure of is this: If they are, you can say with a fair amount of certainty that these deployments are secure... probably more secure than you or I will ever know."

Anyway, IBM announced last year that it would "work with the Linux community to enter the Common Criteria certification process for the Linux operating system early this year and proceed with a progressive plan for certifying Linux at increasing security levels through 2003 and 2004":

"The Common Criteria (CC) is an internationally endorsed, independently tested and rigorous set of standards used by the Federal government, and other organizations around the world, to evaluate the security and assurance levels of technology products.

"'With Linux experiencing significant traction among governments around the world, securing Common Criteria certification for Linux will demonstrate that Linux is secure for government applications,' said Jon 'Maddog' Hall, President and Executive Director of Linux International. 'The Linux community is actively working on security enhancements to make Linux even more secure than it is today, which will enable progressively higher levels of certification in the future.'

"'Linux is the fastest-growing operating system in the world today, and we see governments and customers across all industries worldwide adopting it at an ever rapid pace because it frees them from dependence on a proprietary approach,' said Jim Stallings, IBM General Manager, Linux. 'This investment represents the next step in IBM's ongoing commitment to accelerate the development of Linux as a secure, industrial strength operating system.'

"The United States Federal government requires that all commercially-acquired information technology products used in national security systems be independently certified by approved testing facilities against the Common Criteria, and many other countries adhere to similar standards."

For O'Dowd to say it can't be done is obviously nonsense. You probably remember Jim Stallings' announcement at Novell's Brainshare conference that some of those milestone goals have already been reached. And did you notice that the federal government requires that all commercially-acquired information technology products used in national security systems be independently certified by approved testing facilities against the Common Criteria? Do you think it is likely that Mr. O'Dowd does not know this? Or does he hope that we don't?

IS HE RIGHT THAT PROPRIETARY SOFTWARE DEVELOPMENT IS MORE SECURE?

"Windows has more holes than a sieve." So writes Andrew Briney, editorial director of Information Security Magazine, in the current issue.

Even some insurance companies, who don't make such decisions on a hunch, have announced they would charge more for using certain Windows products. What does that tell you? Frankly, it terrifies me to think that any security-related function might run on Windows. I hope they at least use Linux as a firewall, as Jay Beale suggests for those who can't wean themselves from Redmond. Beale is the lead developer of Bastille Linux and the editor of Syngress Publishing's Open Source Security series.

If you are really interested in a comparison of closed and open operating systems, head on over to David Wheeler's collection and read the Security subheading, which is number 6. He tells about a couple of examples of deliberate back doors in proprietary software, and he also has links to many security experts stating that FOSS has security advantages over proprietary software.

Whitfield Diffie, the co-inventor of public key cryptography, who is listed on that page, wrote that the key to better security is less reliance on secrets:

"As for the notion that open source's usefulness to opponents outweighs the advantages to users, that argument flies in the face of one of the most important principles in security: A secret that cannot be readily changed should be regarded as a vulnerability.

"If you depend on a secret for your security, what do you do when the secret is discovered? If it is easy to change, like a cryptographic key, you do so. If it's hard to change, like a cryptographic system or an operating system, you're stuck. You will be vulnerable until you invest the time and money to design another system.

"It isn't that secrets are never needed in security. It's that they are never desirable. This has long been understood in cryptography, where the principle of openness was articulated as far back as the 1870s (though it took over a century to come to fruition). On the other hand, the weakness of secrecy as a security measure was painfully evident in World War II, when the combatants were highly successful at keeping knowledge of their cryptosystems out of general circulation but far less successful at keeping them from their enemies.

"Today, at least in the commercial world, things are very different. All of the popular cryptographic systems used on the Internet are public. The United States recently adopted a new, very public system as a national standard and it is likely that before long, this Advanced Encryption Standard--based on an internationally accepted algorithm--will be used to protect the most sensitive traffic.

"It's simply unrealistic to depend on secrecy for security in computer software. You may be able to keep the exact workings of the program out of general circulation, but can you prevent the code from being reverse-engineered by serious opponents? Probably not.

"The secret to strong security: less reliance on secrets."

You may remember the time it was reported that Microsoft admitted its programmers had deliberately planted a secret password, along with the comment "Netscape engineers are weenies". They were fired, but it wasn't Microsoft that discovered the problem; two security experts finally found it three years after it had been planted. The Wall Street Journal's account of the incident said the file, called "dvwssr.dll", was planted in Microsoft's Internet-server software with FrontPage 98 extensions. "A hacker may be able to gain access to key Web site management files, which could in turn provide a road map to such things as customer credit card numbers," the Journal reported. Later, clarifications were issued that it was a cipher key, not a password, and Microsoft admitted to a bug, not a back door, which didn't make everyone feel better. Contrast that sorry tale (or others, such as this one involving Intel) with the above-mentioned failed attempt to insert a back door into the Linux kernel. Which methodology proved more secure? There is no need for theories when real-world events have already eloquently spoken.

You might enjoy fortifying your ability to counter this type of FUD by reading John Viega and Bob Fleck of Secure Software's rebuttal [PDF] to some remarkably similar FUD from the Microsoft-funded Alexis de Tocqueville Institution, which released a white paper in 2002 entitled "Opening the Open Source Debate". Here's the rebuttal to one of the myths in that paper:

"Myth #4: If you want secure software, you shouldn’t trust free software, because it is written by untrusted people, and has not been audited by a trusted source. The implication of this myth is that non-free software is written by trusted people, and that it perhaps is audited. While many development organizations may like to think their staff is fully trustworthy, the reality of the security industry is that software developers are not to be fully trusted. Furthermore, many development organizations have nonexistent or inadequate security auditing procedures for software. In the past few years, several back doors that were unknown to management were found in proprietary electronic commerce applications, including a well-publicized case where a back door was found in Microsoft’s FrontPage 98. Similarly, many developers waste company time and resources building unauthorized 'easter eggs', such as the pinball game in Microsoft Word, or the flight simulator in Microsoft Excel. At least when source code is available, those organizations that are highly security conscious can pay qualified, trusted auditors to do the work. Without the source code, there are far fewer trusted sources for audits due to the extra skill required."

John Viega at the time the paper was written was a Senior Research Scientist at the Cyberspace Policy Institute, the CTO of Secure Software, Inc., and an Adjunct Professor of Computer Science at Virginia Tech. Bob Fleck was listed as a Research Scientist at the Cyberspace Policy Institute, the Director of Methodology Development at Secure Software, Inc., and the co-author of the book Deploying Secure Wireless Networks.

There is also a very thorough rebuttal [PDF] by Julião Duartenn, Director, Security Skill Center, Oblog Software, who brings out an interesting point:

"Another consideration for the U.S. government is that all source code developed under the GPL could have mirrored availability to the public. As far as nondeliberate disclosure (code stealing or accidental disclosure) is concerned, there is no difference from closed source software to open source software. If there is a difference, it is in favor of open source. Open source is, by definition, designed to be world-readable, so no secrets are embedded in it. Closed source, on the other hand, comes with the temptation to embed secret data, and to achieve any degree of security by means of clever tricks, and not through proven algorithms."

Bruce Perens has also written about such back door incidents, including one where the back door wasn't discovered for six years, and not until the product was open sourced:

"So, if you don't publish your source, expect that only black hats, and the few people inside of your company who work on the product, will look at your code. Apparently, the black hats are very successful at finding security flaws this way, and the folks on the inside aren't very effective at stopping them. . . .

"One great example in this regard is Borland's Interbase database server, because it was both proprietary and open source, and had an undisclosed security problem during its transition from one to the other.

"Interbase is an enterprise-class database product that ran airline reservation systems and other mission-critical applications of large companies. Certainly Borland had the funds to do security reviews on the product. But some time between 1992 and 1994, an employee at Borland inserted an intentional back door into the database. The back door completely circumvented the security of both the database and the operating system hosting it--in some cases, the back door would have allowed an outsider to gain a system administrator login. The back door was not well hidden. I assume that it was done maliciously, and not on orders of Borland's executives.

"Anyone could have found this back door by running an ASCII dump of the Interbase executable, for example, by using the 'strings' command on Unix or Linux. But if anybody found it, they kept it to themselves, and perhaps used the exploit for their own gain. The back door remained in the product for at least six years. At least one person knew of it, and could have exploited it, for this entire time. How many friends did he tell?

"Borland released Interbase to open source in July 2000. An open-source programmer who wasn't looking for security flaws discovered the back door by December 2000, and reported it to CERT."

There is a new issue involving X-Micro that has just appeared on Bugtraq that some might be interested in following.

IS HE RIGHT ABOUT ANYTHING?

This is the same man who said the industry recession was over in February of 2003 and who in January of 2004 predicted the death of the Linux embedded tools market.

Kevin Dankwardt, President of K Computing and Education Chair of the Embedded Linux Consortium, provided an answer to O'Dowd's editorial:

"The majority of O'Dowd's article revolves around developer tools and the business model of tool purveyors. The article implies that an important feature for embedded developers is that the tools should lead to smaller or more efficient implementations. This is clearly a narrow and backward-facing definition of the role of developer tools. While this role may have had support in the proprietary, closed past with seemingly insurmountable hardware limits, times have most definitely changed. To conclude as O'Dowd does that there is no market for tools because the old business model no longer functions in an open source world is just backward facing logic. Just as Linux has proven to be a disruptive technology in operating systems, open source has proven to be a disruptive technology for software in many areas including development tools and methodologies. Many vendors have created successful new business models. Revenue success has been achieved, for example, even when the product (the operating system) is given away."

THE MITRE REPORT SAID SECURITY WOULD BE DAMAGED IF FOSS WERE BANNED

Mr. O'Dowd quoted from the Mitre Report on FOSS that was released in January of 2003. He neglected to mention that the report had already studied the question of whether FOSS should be banned from DoD use, and that it concluded that, on the contrary, banning FOSS would damage US security. Here is a snip from the Executive Summary:

"The main conclusion of the analysis was that FOSS software plays a more critical role in the DoD than has generally been recognized. FOSS applications are most important in four broad areas: Infrastructure Support, Software Development, Security and Research. One unexpected result was the degree to which Security depends on FOSS. Banning FOSS would remove certain types of infrastructure components (e.g., OpenBSD) that currently help support network security. It would also limit DoD access to -- and overall expertise in -- the use of powerful FOSS analysis and detection applications that hostile groups could use to help stage cyberattacks. Finally, it would remove the demonstrated ability of FOSS applications to be updated rapidly in response to new types of cyberattack. Taken together, these factors imply that banning FOSS would have immediate, broad, and strongly negative impacts on the ability of many sensitive and security-focused DoD groups to defend against cyberattacks. . . .

"Neither the survey nor the analysis supports the premise that banning or seriously restricting FOSS would benefit DoD security or defensive capabilities. To the contrary, the combination of an ambiguous status and largely ungrounded fears that it cannot be used with other types of software are keeping FOSS from reaching optimal levels of use."

So, in conclusion, it appears that Mr. O'Dowd is suggesting that Linux is a security threat, not because it is true, but because Linux is affecting his bottom line. The solution to that problem, Mr. O'Dowd, is to follow Wind River's example.


Murry Shohat, Executive Director of the Embedded Linux Consortium, was instrumental in the preparation of this article. The Embedded Linux Consortium, Inc. is conducting a broad study of Linux in government in furtherance of ELC platform standardization activities. Paul Iadonisi, known as LinuxLobbyist on Groklaw, helped research the article. Paul has worked as a system administrator for several software companies over the past eighteen years, and he is currently building his own business around providing complete IT solutions to K-12 schools using GNU/Linux and other Free and Open Source Software. Credit goes to rjamestaylor as well, for finding the CSC press release on offshore development.


  


CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD | 308 comments
Comments belong to whoever posts them. Please notify us of inappropriate comments.
Corrections Here Please
Authored by: PJ on Sunday, April 11 2004 @ 09:44 PM EDT
Please record my mistakes for posterity here, so I can find them easily. I also

tried a new way of showing what is a quotation instead of long chunks of
italicized quotations. Let me know if you don't like it or if you have better
ideas. I can change it.


Even Ken Thompson's compiler trojan scenario can be greatly mitigated by cross-vendor bootstrap
Authored by: NZheretic on Sunday, April 11 2004 @ 09:59 PM EDT
Thanks to the portability of the GCC compiler source code [quoting myself]:
Other than manual inspection of the resulting compiler binary, a solution for this is to use several third-party C compilers and environments for the original bootstrap compiler build, and to compare the resulting code after each resulting compiler has rebuilt itself for the third time. If the results differ significantly, then manually inspect the generated code where they differ.
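The cross-vendor bootstrap check described above (a comparison along the lines of what is now called diverse double-compiling) can be sketched as follows; the stage-3 files here are hypothetical stand-ins for the compiler binaries produced from two independent bootstrap lineages:

```shell
# Sketch of the cross-vendor bootstrap comparison (all names are
# hypothetical). Conceptually:
#   vendor-a-cc + gcc source -> gcc-A; gcc-A rebuilds itself 3x -> stage3-A
#   vendor-b-cc + gcc source -> gcc-B; gcc-B rebuilds itself 3x -> stage3-B
# If neither bootstrap compiler injects a trojan, the two stage-3
# binaries should be byte-identical. Here we stand in the stage-3
# outputs with plain files to show the comparison step itself:
printf 'stage3 binary image\n' > stage3-A
printf 'stage3 binary image\n' > stage3-B
if cmp -s stage3-A stage3-B; then
  echo "MATCH: no divergence between bootstrap lineages"
else
  echo "DIFFER: inspect the generated code where it differs"
fi
```

A trojaned bootstrap compiler would have to reproduce its payload identically through both independent lineages to escape this check.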


CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: Anonymous on Sunday, April 11 2004 @ 10:05 PM EDT
Here's an example of how proprietary software can be just as dangerous as this
guy is claiming open source is.

http://securityresponse.symantec.com/avcenter/venc/data/promail.trojan.html

"The Promail.Trojan is a full function POP client which allows you to
obtain your email from your designated POP server(s). However, in addition to
the documented functions, the program also sends your account information
including your password to an anonymous email address.

"The program Promail has been widely distributed on freeware and shareware
repositories. The file is generally distributed as a zip file named
proml121.zip. This program is touted as a completely free POP client that
provides many standard email client functions. This zip file uncompresses into
the file promail.exe. This is the executable that provides POP client services.
These POP client services include the ability to retrieve mail from multiple POP
servers."


CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: Anonymous on Sunday, April 11 2004 @ 10:15 PM EDT
I'll make an attempt to be fair and provide an example of how poor security in
open source projects can be dangerous.

http://securityresponse.symantec.com/avcenter/security/Content/6171.html

"It has been announced that the server hosting tcpdump and libpcap,
www.tcpdump.org, was compromised recently. It has been reported that the
intruder made modifications to the source code of tcpdump and libpcap to include
trojan horse code. Downloads of the source code of tcpdump and libpcap from
www.tcpdump.org, and numerous mirrors, likely contain the trojan code."

"Additionally, the trojan displays similarity to those found in irssi,
fragroute, fragrouter, BitchX, OpenSSH, and Sendmail."


Disingenuous or ignorant?
Authored by: rjamestaylor on Sunday, April 11 2004 @ 10:20 PM EDT
A quick Google search uncovers the hidden source of proprietary DoD software development: off-shore outsourcing.

Computer Sciences Corporation Off-Shore Development Funded By the DoD!

EL SEGUNDO, Calif., May 19 - Computer Sciences Corporation (NYSE: CSC) today announced that its Indore, India, center has attained the Software Engineering Institute's (SEI) Software Capability Maturity Model (CMM) Level 5 rating, the highest level possible. ...

CSC's center in Indore is part of a network of company-owned operations in lower-cost countries such as India, South Africa, Canada, Ireland, Malaysia and Australia. Augmented by CSC's third-party alliances in Russia, Belarus, Bulgaria, India, Canada and Mexico, these operations provide flexibility in delivering applications management, information technology (IT) outsourcing and business process outsourcing services to CSC's clients in higher-cost regions such as the United States and Western Europe.

"This achievement of a Level 5 rating by our team in India underscores CSC's commitment to developing and maintaining the highest quality global resources and capabilities on behalf of our clients," said Jim Cook, president of CSC's Financial Services Group.

About the Software Engineering Institute

The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the U.S. Department of Defense through the Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics [OUSD (AT&L)]. The SEI contract was competitively awarded to Carnegie Mellon University in December 1984. The SEI technical program is executed by approximately 225 members of the technical staff from government, industry, and academia with an average of 10+ years of experience in the field of software engineering.

So exactly what goes into these programs written by the subversive elements in other countries? How many reviewers review each line -- in and out of context -- to ensure no intentional or unintentional security flaws exist? Once it's compiled and approved functionally, how many people check the code closely? Frankly, I would require all these off-shore development projects to be open source for the very reason that scrutiny is good!

I guess we need to rely on the proprietary vendors to make secure software. Ones like Microsoft, that use off-shore developers (in India, at least), who deliver robust, secure OS implementations routinely and without fail.

Hahahahaha!

---
SCO delenda est! Salt their fields!


Transcript of Internet Caucus Panel Discussion: Interesting fallout from Clipper chip discussion
Authored by: NZheretic on Sunday, April 11 2004 @ 10:25 PM EDT
On September 28, 1999, an Internet Caucus Panel Discussion was held to discuss the issues surrounding the Clipper chip and export restrictions on encryption in general.

Congressman Curt Weldon raised a couple of interesting questions:

Schwartz: Congressman Weldon, thank you very much for being here. Do you have any questions.

Rep. Curt Weldon: Thank you. Let me see if I can liven things up here in the last couple of minutes of the luncheon. First of all, I apologize for being late. And I thank Bob and the members of the caucus for inviting me here.

Pardon me if I seem a little bit confused to our panel, but, I am, and have been, with the change in direction which has occurred. But before I begin, let me say at the outset one of my biggest projects for the past four years has been to build what is becoming the first smart region in America, linking up all of the institutions within a four state region -- Pennsylvania, Delaware, New Jersey, and Maryland -- _____. In fact, over the weekend, I hosted the Minister _____, who is the Minister of Information Technology for Malaysia. As we signed an ____ with them for uplink downlink ties between our hub initiative in the four states, and the new Malaysian super-computing corridor project that they are building in Malaysia. So, I am a strong advocate for the use of information technology.

But my other hat is to chair the Research Committee for National Security. And when Bob introduced his bill three years ago, my door was pounded incessantly by the Defense Secretary and his staff, by the Director of the CIA, and by the head of the NSA, and I would note for the record neither the CIA nor the NSA is here today.

Who is actually speaking for them today, I might add? OK.

NSA and CIA came in, and in a very intense way, lobbied me personally, and I am not a computer expert, nor am I a lawyer, and they asked me to give access to my subcommittee and the full Armed Services Committee to look at the security implications of the change in Bob's legislation. I respect Bob. I think that he is an outstanding member. But I felt that I owed it to my committee, and my responsibility to Congress to listen to what the administration was going to tell me.

We arranged a series of classified hearings and briefings. And, as with any Member of Congress expressing concern about the ability for our forces involved in a hostile environment to be able to respond quickly, ____ back to 1991 in Desert Storm where my understanding is that our commanders in the field had Saddam Hussein's commands before his own command officers had them, because of our ability to intercept and break the codes of Saddam's military. I want to make sure that we have that capability in the future. I responded in a very positive way to the argument that was being made by the CIA, by the NSA, and by DOD. And we took some very tough positions.

In fact, Ron Dellums and I offered the amendment last year that had only one dissenting vote in the House, and this year passed by a vote of 48 to 6.

In the past year none of those briefings have changed. And the people who have come to me as a Member of the National Security Committee, there has been no lessening of their impression of the threat. Yet all of a sudden I am told, and John Hamre, I think, he made the courtesy of calling me in advance, that there was a change.

Now, I agree with the gentleman from the White House, for the administration, that it was coincidence that this happened the day before Vice President Gore went to Silicon Valley. I agree that that was just a coincidence.

But the point is that when John Hamre briefed me, and gave me the three key points of this change, there are a lot of unanswered questions. He assured me that in discussions that he had had with people like Bill Gates and Gerstner from IBM that there would be, kind of a, I don't know whether it's a, unstated ability to get access to systems if we needed it. Now, I want to know if that is part of the policy, or is that just something that we are being assured of, that needs to be spoke. Because, if there is some kind of a tacit understanding, I would like to know what it is.

Because that is going to be subjected to future administrations, if it is not written down in a clear policy way. I want to know more about this end use certificate. In fact, sitting on the Cox Committee as I did, I saw the fallacy of our end use certificate that we were supposedly getting for HPCs going into China, which didn't work. So, I would like to know what the policies are. So, I guess what I would say is, I am happy that there seems to be a coming together. In fact, when I first got involved with NSA and DOD and CIS, and why can't you sit down with industry, and work this out. In fact, I called Gerstner, and I said, can't you IBM people, and can't you software people get together and find the middle ground, instead of us having to do legislation.

But I am not convinced that what we are doing here is necessarily logical. And I am not convinced that all of us, in fact, have the same understanding of what it is that you are coming out with in terms of a new policy position. And I guess we won't know that until the terms of the December 15th regulations are spelled out, and then we can debate the fine points, which is part of what Bob's question alluded to today

I don't want to hurt industry. In fact, I have advocated that we give significant new tax breaks to the encryption and software industry in this country to give them more incentive to stay in America and do their work here. But, I am also, as a senior member of the Security Committee, as a Chairman of the Research Committee, to seeing 47 billion dollars a year of our tax money going to Pentagon's IT systems, I want to be absolutely certain that in terms of our ability to deal with intelligence overseas, to be able to have information dominance overseas, to be able to use the kinds of tools that the CIA and the Defense Department needs in adversarial relationships that we are in fact providing that through this new policy.

So, I guess the devil is in the details, the proof is in the pudding, and I am going to withhold my support for what you have done until I have seen the details that you are supposedly going to review for us on December 15.

My question is also why wasn't the head of the NSA and CIA invited to appear? Was that the panel? Or, was that the decision of the administration?

Jerry Berman: [He said he invited the administration to send whoever they wanted.]

Weldon: My only question is, since, the administration used the CIA, and the NSA, to come to me as a Member of Congress to argue their position for the past two years. I would like to have had the NSA and the CIA here at the table so I could ask them the same questions that I am posing you. And I am not going to be happy until I get that opportunity.

______?: Congressman, we will make that opportunity available to you.

Weldon: I think it should have been done though in a public forum.

______?: Thank you.

Swire: Just one small, in the announcement on the 16th that Deputy Secretary Hamre spoke for Defense and national security, Attorney General Reno spoke for Justice and law enforcement. Secretary Daley for Commerce. I was asked to speak on privacy, as a representation of important goals that we were trying to meld together for this overall policy.

Weldon: I understand that. And John Hamre told me that when he called me a couple of days before the announcement was going to be made. My point is, that when the administration wanted people to carry their water up on the Hill, they sent the head of the CIA and the head of NSA to see us personally. They did not have John Hamre do it. Although John did part of that. And I think that we should be hearing from the CIA and NSA directly because they are the people I am concerned, in terms of being able to break into systems of foreign adversaries, of both real and potential adversaries. I want to hear from them.

And I think we owe it to the public, as we have had an about face in this policy, and that is what I think that it is. I want to hear what has changed, and whether or not they are satisfied. Once again, I am not an information technology expert. I am not a lawyer. But, I want to hear from them. I want to get them to look me in the eye to tell me they are satisfied, and they are satisfied because what we have done here is consistent with their ability to provide the kind of level of security that we need in the future.

Wells: If I could say Congressman, one of the pieces of the rollout was that the national security community will need additional tools. And, we look forward to the Congress to support that with appropriations.

Weldon: And we will do that. We have given, for the past five years, more money for the issue of information dominance in our defense bill, than the administration's request in each year. In fact, both ______ and John Hamre have had full and unequivocal support for all of their needs, as well as the needs of the CIA and the FBI, I mean the CIA and the NSA.

Schwartz: Congressman, I didn't really think we headed off into dull before, but when you said you were going to liven it up, you sure delivered on your promise.

So can you trust the software from Microsoft or even IBM?
The moral is: unless you have access to all the source code and the right to recompile and compare the binaries, you cannot verify that the software you are using is free of backdoors.

If you do not have the resources to examine every line of source code, then your best bet is to use source code that is fully open to peer inspection.

In my opinion, an open source license opens up the code to true peers in the industry: people who work with the source code to build solutions.


Wow, Green Hills Admits It Is Not Trustworthy!!!!
Authored by: Anonymous on Sunday, April 11 2004 @ 10:29 PM EDT
Clearly, since we cannot verify Green Hills' software, it cannot be trusted! I am well aware of the early trojan in early Unix, and we all knew the simple approach to avoid the reinsertion of password backdoors into our code. But this raises the interesting dilemma of what trojans the Green Hills and MSVC compilers are inserting into our software! I can fool my GCC compiler into building code without it knowing its target, and thereby avoid trojans, but Green Hills compilers offer no such visibility. I will remember this on our next embedded cross-compiler purchase!


It's the code, stupid.
Authored by: tbdavis on Sunday, April 11 2004 @ 10:33 PM EDT

they can use fake identities to contribute subversive software that will soon be incorporated into our most advanced defense systems.

This is an example of an all too common problem in our society, confusing the metaphor with the object: the idea that credentials are more important than the person they purport to identify.

In our personal worlds, in our most intimate circles, we don't do that. We recognize our friend's voice on the phone, but if they say bizarre things, we may question their identity, or at least their state of mind. The open source community is a bit like that. There are so many avenues for review that it's the code that matters, not the submitter. If Bob writes a bad piece of code, it won't be included as-is, no matter what he submitted before or after. So pretending to be Bob doesn't get you anywhere.

Unfortunately (for security, anyway), the proprietary development process often follows the opposite concept. If you have a card clipped to your clothes that says you have a right to be in the building, and you know the password to the right computer, you could conceivably add whatever code you desired -- even if you happen to be a foreign agent, not employed by the company, which never expected to be the victim of espionage activity.


Open source vs. out-source
Authored by: brian-from-fl on Sunday, April 11 2004 @ 10:46 PM EDT
"Green Hills Software CEO Dan O'Dowd said Linux is a national security
risk. The open source process should rule it out of defense applications, he
claimed, because anyone can contribute, even developers in Russia and China, who
might disguise their identities and slip in some subversive code."

Earth to Dan. Earth to Dan. Come in, Dan. Do you read me?

US software companies are already outsourcing their proprietary software
development jobs to India, Russia, and China. Those foreign programmers have
already had the chance to slip in subversive code to all our software,
proprietary and open source alike.

So, Dan, if you're right about the untrustworthiness of foreign programmers, the choice becomes: do we want Russian and Chinese code hidden inside proprietary products, where we cannot hope to see it, or openly hidden inside open source products, where we at least have a chance to find it?


CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: inode_buddha on Sunday, April 11 2004 @ 11:05 PM EDT
A few points to ease your mind:

1: ISTR that SuSE Linux met the Common Criteria last year, with IBM funding, putting it at least on a par with military installations of WinNT. Sorry, no links handy at the moment.

2: The entire question is rendered moot by the usual DoD acquisition process, which requires its own rather thorough auditing regardless of the vendor's development philosophy.

3: Are people truly so naive as to think that systems like missile launch controllers are COTS?

---
"When we speak of free software, we are referring to freedom, not price." -- Richard M. Stallman


Tiny fix suggestion
Authored by: Anonymous on Sunday, April 11 2004 @ 11:31 PM EDT
There are "funny quotes" in one part of the main text where you mention the title of a paper:

"?Opening the Open Source Debate?."


Darl speaks to Spain (spreading the good FUD)
Authored by: vruz on Sunday, April 11 2004 @ 11:36 PM EDT

I just got the news from barrapunto.com (the Slashdot of the Spanish-speaking world).

Today, the Sunday edition of Diario El Pais de Madrid weighs more than normal, thanks to an additional helping of the FUD Darl McBride is spreading in Europe.

The comments from our fellow Spanish slashdotters are nothing but justified indignation.

That newspaper section is for subscribers only, but I translate from Barrapunto:


El Pais, Sunday edition, in its Economy supplement, today includes an extensive interview (pay-per-view, two pages long) with Darl McBride, SCO's ineffable CEO. The interview (well informed, incisive and fair, by Patricia Fernandez de Lis) took place in Salt Lake City, and it opens with a threatening title: "Linux users: think it over well."

But it doesn't contain anything new for slashdotters, just the same claims that have already been answered from within our community. Not even the pretension of repeating a lie many times until it becomes truth is new, nor that of presenting themselves as a "victim," misunderstood in their particular crusade against companies, users and institutions that make use of free software.

One thing is true: it sounds even harsher when you read the same false claims and half-truths in Castilian [Spanish], printed in salmon colour.

So reading it is not recommended for sensitive people, or those who believe that the principle of veracity still has some validity.

For example, McBride tries to convince us not only that the Linux kernel contains "more than 80% of the code" that is his property, but that THE WHOLE operating system is the property of SCO!! There is also FUD, and pearls like this one against the GPL:

[And follows an excerpt from the newspaper:]

Patricia: You have also said that the GPL is anti-american and puts systems security into risk.

McBride: We can't export to Iran, North Korea or Cuba. Open software has a free-pass. Someone in North Korea can download the latest version of Linux and create a supercomputer that will allow them to do all kind of dangerous things. We're not saying Linux is bad, but the GPL is the destruction of this industry.

But not only the GPL. He also mentions Lessig, CC and that people:

Lawrence Lessig defended his case in the Supreme Court a year ago, and he lost. The Court said that copyright is important, and that the Constitution of the US "celebrates" benefit as an incentive.

The Court used those words: "to celebrate". This brings us back to the basic question, which is: how do you make money? If the possibility of reaping benefits from your investment is gone, the world is going to be very different.

Among the few truths he expresses, one is the acknowledgement of their good relationship with Microsoft: commercial agreements and a strategic identification of objectives (defense at all costs of an intellectual property model based on restricting the rights of citizens).

He also agrees with Microsoft on the old arguments used against free software ("[FOSS] is not good for making money"), which [Microsoft CEO] Steve Ballmer reminded us of a few months ago during his visit to Spain.

Well, it is the rigorous opposite of virtuous and rational thinking, as defined by the physicist Robert Boyle:

"he who knows dignity can recognize a madman" (or, someone childish who has to grow up, according to Linus)

Maybe it would be appropriate to ask El Pais for the chance to respond to such a collection of false statements (published in a medium that sells more than 1.5 million copies on Sundays), or at least to publish an open letter to the editor.


Sorry for the many translation mistakes that may have occurred. This was translated from English into Spanish, and then back into English.

You can find the report at Barrapunto, and the original article at El Pais - Spain (subscribers only).

---
--- the vruz


CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: chaz_paw on Sunday, April 11 2004 @ 11:59 PM EDT
I have one question: do the people who make important decisions concerning software purchases really believe this sort of FUD?

O'Dowd = Wormtongue

Charles

---
United we stand.


CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: bonzai on Monday, April 12 2004 @ 12:00 AM EDT
I'm sure Green Hills Software sales offices/distributors outside the USA will not like this press release, especially the 3 offices selling Green Hills products in China.


The headline we didn't read
Authored by: Anonymous on Monday, April 12 2004 @ 12:46 AM EDT
Slashdot just posted a blurb about a Yahoo/Washington Post article

http://story.news.yahoo.com/news?tmpl=story&cid=1804&ncid=1804&e=1&u=/washpost/20040410/tc_washpost/a349_2004apr9
(ugh)

Apparently the latest Netsky worms are designed to attack P2P companies' websites and websites offering tools to bypass copy controls, as well as swamping peer-to-peer networks.

That's not so interesting by itself, but what I did find fascinating is that we aren't seeing headlines about how the music industry community are a bunch of immature zealots who write viruses and use other illegal methods to viciously attack their competitors, or about how the P2P company officers fear for their lives and need to hire armed bodyguards and sharpshooters for public appearances. I'm not saying any of those things are true, but those were all things said about the Linux community after the alleged DDoS attacks against SCO's website. Where's the equal treatment by reporters? Why do we get singled out for abusive coverage?


Such silliness
Authored by: Anonymous on Monday, April 12 2004 @ 01:02 AM EDT
"If Linux is compromised, our defenses could be disabled, spied on or
commandeered." That statement is utterly ludicrous.

It takes people in positions of trust and authority, such as the Walkers and Jerry Whitworth, to 'disable, spy on or commandeer' our military communication network.

It also takes secrecy. And GNU/Linux is anything but secret.

But what about the Walkers? And their cohort in espionage Jerry Whitworth? All
doing *life* sentences in federal prisons.

Why? Because for over 15 *years* (starting around 1969 - 1970) these scumbags
*sold* U.S. crypto secrets to the Soviet Union.

These secrets enabled the USSR at the time to break our purportedly secure record communications into plain language, which the USSR then passed on to the North Vietnamese government.

We lost, as in KIA, B-52 flight crews (last Rolling Thunder campaign) because of
these vermin.

The Walkers and Whitworth were *all* honorably retired from the U.S. military.
One or both of the Walkers were officers and Jerry Whitworth was a Chief
Radioman (U.S. Navy).

Yet all of them *conspired* to sell, and *did* sell, to our cold war enemy (the USSR) the most sensitive and tightly guarded secrets of the U.S. government.
And, let me tell you, the only reason -- and I do mean the only reason -- these slimeballs were caught is that the wife of one of the Walkers called the FBI and reported it (though she put up with it for over *15* years).

GNU/Linux is *open*. People can see it. People can correct it. People can
improve it. It is not a secret. And that is a good thing.

krp

[ Reply to This | # ]

Hasn't closed source software been used for that purpose in the past?
Authored by: Thomas An. on Monday, April 12 2004 @ 01:51 AM EDT
Not very familiar with the details of the story... wasn't it closed source software that did the damage to the Trans-Siberian gas line?

Here is the Related Story

[ Reply to This | # ]

Foreign Coders
Authored by: dmscvc123 on Monday, April 12 2004 @ 03:03 AM EDT
I guess O'Dowd has been living under a rock for the past 20 years to miss how
big an issue outsourcing software projects (primarily proprietary software) to
other countries has become. If this guy has any issues with foreign coding of
software, it's ridiculous to frame it as a matter of open versus closed source,
since such esteemed companies as SCO and Microsoft code offshore - SCO even has
a job open in India on their website.

[ Reply to This | # ]

Embedded Windows is a security risk because it doesn't follow security rules too well IMO
Authored by: Anonymous on Monday, April 12 2004 @ 03:31 AM EDT
A core principle of secure software is building on secure foundations. You
start with a small, controlled core which you examine until it creaks, and then
you add functionality until you have what you want. At each stage you're thus
controlling the risk profile of what you add, plus the interactions of new code
with old (well, OK, attempting to, within what's commercially possible). Does
this look like kernel + code? Yep. See BSD, where this is even more strongly
enforced than in Linux.

Now look at 'embedded Windows' and what do you see? Bits have been removed to
make it 'safer'. But that still doesn't mean you know for sure what
interactions (read: risks) you have left.

And in the military domain security is as important as in the medical world: get
it wrong, and people die. It is NOT an office environment. Ask the NSA - they
built their test security architecture on Linux (SELinux, see www.nsa.gov). I
guess they'll have a good laugh if you suggest using Windows ...

[ Reply to This | # ]

Who could introduce trojans where
Authored by: Anonymous on Monday, April 12 2004 @ 03:43 AM EDT
You have correctly pointed out the extent to which people audit FLOSS code at source level, and Mr. O'Dowd goes on about such matters in closed code.

What you (and possibly he) are not aware of is that such auditing is probably nearly impossible with MS code. Apparently there is no central code base anywhere at MS; each developer has their own private one and contributes semi-compiled code to the overall linkage team.

That's how those people got away with the netscape weenies comment for so long.

This may have changed with the security initiative last year, but I doubt it.

[ Reply to This | # ]

Trojans
Authored by: jmc on Monday, April 12 2004 @ 05:31 AM EDT

I should think that it would be well-nigh impossible to put a trojan into GCC, as building GCC involves building a first version with the manufacturer-provided compiler. The build suite then builds a second version of the compiler with that first version, and then yet another version with the second version. Finally the object files for the final two versions are compared.
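The self-check at the end of that bootstrap can be sketched in shell. This is a minimal illustration, not GCC's actual build machinery (the real build drives this via `make bootstrap` and compares every object file); the function name and file names here are hypothetical.

```shell
# Sketch of the bootstrap self-check: after stages 2 and 3 of the build,
# the object files produced by each stage are byte-compared. If the
# stage-2 and stage-3 compilers generate different output, the bootstrap
# fails - which is what makes a hidden, self-propagating change hard.
compare_stages() {
  if cmp -s "$1" "$2"; then
    echo "stage2 and stage3 objects identical"
  else
    echo "bootstrap comparison failed" >&2
    return 1
  fi
}
```

In a real build the comparison runs over every object file from the two stages, not just one, so a trojan would have to reproduce itself byte-for-byte through two generations of compilation to go unnoticed.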

In order for the trojan to be propagated it would have to be in the manufacturer-provided compiler, and it would have to insert the trojan not only when compiling itself but also when compiling GCC or some other arbitrary compiler.

Also the trojan would have to be applied to the compilation of all sorts of things which accept passwords - not just login and su, which was all Ken Thompson had to worry about, but all sorts of web browsers, FTP clients etc.

I don't see how it could be done myself, or at least how it could be done without some huge and sophisticated logic which would make itself obvious. And what if it gave a "false positive" and started inserting trojan code into some innocent program which happened to look a bit like a login program?

[ Reply to This | # ]

  • Trojans - Authored by: Anonymous on Monday, April 12 2004 @ 12:54 PM EDT
  • Trojans - Authored by: Anonymous on Monday, April 12 2004 @ 01:00 PM EDT
  • Trojans - Authored by: jjs on Monday, April 12 2004 @ 02:27 PM EDT
EE Times US Navy
Authored by: SilverWave on Monday, April 12 2004 @ 06:25 AM EDT
EE Times US Navy

O'Dowd's digs at Linux appear to already be having some effect. "We've had five or six people calling us up saying we were thinking of using Linux, and now they're thinking again," he said. O'Dowd mentioned that one of those potential customers was the U.S. Navy, but his public relations representative cut in and cautioned him not to talk about that any further.

---
Oxymoron of the day is … “SCO's Ethics”

LOL!

I dare you to say it and not laugh!!!


[ Reply to This | # ]

CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: blacklight on Monday, April 12 2004 @ 06:52 AM EDT
"The Linux operating system is being developed by an open source process --
a cooperative effort by a loose association of software developers from all over
the world."

There is no cause and effect relationship between loose association and poor QA.
In fact, Microsoft is one example of a disciplined commercial organization whose
QA practices have historically been poor. Linux has tight QA processes, and has
the benefit of QA by probably the largest contingent of top UNIX code experts in
the world. Needless to say, their input would have been physically impossible
were Linux code closed. In addition, Linux's greatest strength is the QA
contribution from expert users throughout the world, including IBM, SGI, the
national labs, DoD and allied defense agencies throughout the world and, of
course, the security experts from NSA.

As a counterexample, BSD is universally recognized as the most secure UNIX
anywhere. Yet BSD is open source software.


"Every day new code is added to Linux in Russia, China and elsewhere
throughout the world. Every day that code is incorporated into our command,
control, communications and weapons systems."

I presume that DoD runs line by line audits of every piece of code that it uses.
If DoD cannot understand the functionality of a particular piece of code, it
can contact Linus Torvalds' team, the copyright owner of that code, or the NSA.
It's not a situation where someone can introduce code without a full accounting
of its functionality, or without accountability: if Linus Torvalds' experts
can't see it, if the corporate code experts from IBM and SGI can't see it, if
the military code experts from DoD and allied nations throughout the globe
can't see it, if the security experts from NSA can't see it, then the chances
are vanishingly small that there is something there.

Yes, Russia has some of the world's most talented crackers, but she also has
some of the most talented software developers in the world. To cut Russia and
China out because of this linkage makes about as much sense as cutting the
United States out because she also has some of the world's most talented
crackers.


"Linux software, including contributions from Russia and China, is
spreading rapidly through the Defense Department because it can be freely
downloaded from the Internet without a license agreement or up-front fees,
bypassing legal, purchasing and security procedures"

The statement presumes that DoD does not have security policies in place that
specifically address the modalities of introducing software into DoD, including
Open Source software - and that if these security policies exist, they are
breached by those who are supposed to comply with them. If security could be
bought by simply paying license fees, those of us who are Microsoft license
holders should feel safer than almost anybody else.


"Ken Thompson, the original developer of the Unix operating
system—which heavily influenced Linux—proved otherwise. He
installed a back door in the binary code of Unix that automatically added his
user name and password to every Unix system."

Ken Thompson kept his code quarantined in his own lab, according to ESR. It's
impossible for others to audit code that they don't have physical access to.

[ Reply to This | # ]

US slips code into Russian Pipeline
Authored by: managementboy on Monday, April 12 2004 @ 07:07 AM EDT
I thought I had read something like this. A quick search on Google revealed it:
www.washingtonpost.com
Looks like you can't trust your own people... ohh, if the Russians had used Linux back then it would have been so much easier to get code into their systems! ;-)

[ Reply to This | # ]

CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: blacklight on Monday, April 12 2004 @ 07:11 AM EDT
There is no nice way to say it, so I'll be blunt: Dan O'Dowd destroyed his
professional credibility, his credibility as an official of his company, and his
company's credibility in one fell swoop. I don't think anybody in his or her
right mind wants to do business with fools, or with outfits that are run by
fools. Dan O'Dowd may contest his designation as a fool; unfortunately, his
speech is incontrovertible proof that he is one - and he obligingly provided the
proof through his own hand, on his own company's website.

The reason we rely on processes rather than people for security is that people
have been known to fail - even good, reliable people: Linux has both processes
and people.

Dan O'Dowd says we can't trust processes, that we have to trust people. Well,
let's look at Dan O'Dowd: did Dan O'Dowd as a person demonstrate the
trustworthiness of his judgment just now? Is making arguments based on
innuendoes, deliberate misstatements of verifiable facts and outright lies to be
considered a badge of trustworthiness? Is Dan O'Dowd's firm the kind of
contractor that the DoD should trust enough to do business with?

[ Reply to This | # ]

CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: eggplant37 on Monday, April 12 2004 @ 07:12 AM EDT
"In addition, developers in Russia and China are also contributing to Linux software. Recently, the CEO of MontaVista Software, the world's leading embedded Linux company, said that his company has 'two and a half offshore development centers. A big one in Moscow and we just opened one in Beijing -- so much for the cold war.' Also, the CEO of LynuxWorks, another embedded Linux supplier, acknowledged that his company has a development center in Moscow."

The cold war ended nearly 15 years ago and this idiot's still raving about it? Paranoia will destroy ya.

[ Reply to This | # ]

Trojan horse metaphor
Authored by: Anonymous on Monday, April 12 2004 @ 07:28 AM EDT
---
To use the trojan horse metaphor,

Let
City of Troy = High Security application
Trojan Horse = Possible High Security application
Greeks = backdoor/malicious code

I'm sure the Trojans would never have pulled the horse into Troy if they had
had blueprints showing a big hollow cavity and the label "place Greeks
here".

The reason the Trojan Horse actually worked was that no one knew a small army
was hidden inside. Closed source/binaries are where these things tend to get
slipped in. With open source, if you don't trust that particular wooden horse,
you can always build another on a similar framework and leave out the Greeks,
since you have the blueprints.
---

PJ, thanks for all you have done and will do. Get some rest. The timing of some
of your posts makes me wonder if you actually get any rest.

Sennith
Anonymous Coward
Groklaw Lurker for the past year or so
AJtLtSUfaA
(Actually just too lazy to sign up for an account)

[ Reply to This | # ]

OT: A Thought on the Sun / MS deal
Authored by: Anonymous on Monday, April 12 2004 @ 08:08 AM EDT
I was just thinking about this deal and it occurs to me that $1.95Bn is not just
pocket change even to Microsoft. Only $48Bn left in the kitty (4%).

The purpose of this settlement seems to be directed at undercutting Open
Source.

The question for me is why settle if they think that the SCO case is going to
succeed?

[ Reply to This | # ]

CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: Anonymous on Monday, April 12 2004 @ 08:08 AM EDT
"One "back door" in Windows, one infiltration, one virus, one
worm, one Trojan horse and all of our most sophisticated network-centric
defenses could crumble. We must not abandon provably secure solutions for the
illusion that Linux will save money. We must not entrust national security to
Linux"

I wonder if the phrase above was actually referring to Windows... how many back
doors, infiltrations, viruses, worms and trojan horses on Windows would be
needed to abandon the illusion that Windows could be used as a secure platform?

Of course, he is not comparing Linux with Windows but with his own OS, but the
comparison is still interesting.



[ Reply to This | # ]

Tin Hat Speak
Authored by: Anonymous on Monday, April 12 2004 @ 08:47 AM EDT
It could be argued that M$ could in theory plant 'crash code' in their products
which could be activated by, for example, a certain sequence of characters
appearing (in memory, on screen, on disk, or in a network frame).

Such activation could be at the request of the US government, and methods could
exist for a 'master key' so that the US & its military allies would not
be affected.

Also, in the event of 'activation' the code could attempt to spread like a
virus, triggering other computers. It is interesting to suppose that M$
development tools could automatically place the code in all products created
with them!

With M$ products being used extensively around the world, *OTHER COUNTRIES* have
every reason to use Open Source for their own security (against the interests of
the US perhaps).

This story is exactly the same, but with the players reversed.

With open source though, you do get the chance to examine the code and make
changes before you deploy it!

Paul, UK

[ Reply to This | # ]

Open Security
Authored by: Anonymous on Monday, April 12 2004 @ 09:01 AM EDT
I'm remembering a scene in Fleming's [i]You Only Live Twice[/i]:

James Bond, on assignment to Japan, receives a classified briefing from Tanaka,
Japan's intelligence chief, at Tanaka's home. Tanaka explains one of the
cultural differences between Japan and the western world:

In the West, he explains, people who wish to discuss secrets hide themselves in
closed rooms with locked doors. They close the windows, and use heavy curtains,
to make certain the information stays inside the room. In Japan, where many
walls were traditionally made of paper, people who wish to discuss secrets begin
by opening all of the windows and walls, so they can be certain that there is no
one hiding behind a wall, listening.

[ Reply to This | # ]

OT: request for comments - My SCO page
Authored by: soronlin on Monday, April 12 2004 @ 09:42 AM EDT
Could people please check this page for inaccuracies, confusion and so forth, please. Any other suggestions will be gratefully received. Thanks.

[ Reply to This | # ]

M$ worms etc.
Authored by: Anonymous on Monday, April 12 2004 @ 10:09 AM EDT
Could somebody explain why most of the viruses, worms etc. that hit M$ systems
only perform some relatively trivial function, and do not do something really
malicious like reformatting the hard drives of the affected machines?

[ Reply to This | # ]

  • M$ worms etc. - Authored by: archanoid on Monday, April 12 2004 @ 10:58 AM EDT
  • Why damage something useful? - Authored by: rjamestaylor on Monday, April 12 2004 @ 11:02 AM EDT
  • Resources - Authored by: Anonymous on Monday, April 12 2004 @ 11:07 AM EDT
  • Simple, actually... - Authored by: Anonymous on Monday, April 12 2004 @ 11:32 AM EDT
  • M$ worms etc. - Authored by: Anonymous on Monday, April 12 2004 @ 12:39 PM EDT
    • OR... - Authored by: Anonymous on Monday, April 12 2004 @ 06:20 PM EDT
  • M$ Update - Authored by: Anonymous on Monday, April 12 2004 @ 12:40 PM EDT
    • M$ Update - Authored by: Anonymous on Monday, April 12 2004 @ 04:16 PM EDT
  • M$ worms etc. - Authored by: Anonymous on Monday, April 12 2004 @ 02:25 PM EDT
    • M$ worms etc. - Authored by: Anonymous on Monday, April 12 2004 @ 05:39 PM EDT
Now I get the argument against binary drivers
Authored by: Anonymous on Monday, April 12 2004 @ 10:15 AM EDT
This article really opened my eyes to one hell of a reason
to absolutely refuse binary drivers (even NVidia ones):
they are potential backdoors.

The only way to ensure that there is no backdoor in kernel-level
stuff is to review the code, and the fact that NVidia
is able to 'insert' a piece of unreviewed code into what
otherwise is a certified clean and peer-reviewed system
taints that system as a whole.

Keep in mind that any driver linked into the kernel
automatically has root privileges by extension. Even an
'insmod' would do.

So, if you want a secure system you'd better stay away
from any binary-only closed source drivers (or hardware
that requires them); you could easily be unwittingly
installing someone's backdoor into your trusted facility.
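One way to act on that advice is to audit what the kernel has actually loaded against a list you've reviewed. A minimal sketch, assuming both inputs are plain files of newline-separated module names (the allowlist file and the module names are hypothetical); on a live system the loaded list would come from something like `lsmod`.

```shell
# unapproved_modules: print loaded modules that are not on the
# reviewed allowlist. Both arguments are files with one module
# name per line.
unapproved_modules() {
  loaded="$1"; allow="$2"
  # comm -23 prints lines unique to the first (sorted) input,
  # i.e. modules that are loaded but not approved.
  comm -23 <(sort "$loaded") <(sort "$allow")
}
```

Anything this prints is kernel-resident code nobody on your side has reviewed - exactly the gap the comment above is warning about.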

Jacques Mattheij.

[ Reply to This | # ]

This does raise an important issue
Authored by: rben13 on Monday, April 12 2004 @ 10:28 AM EDT

This article has actually reassured me greatly about Linux security, but it does spark some interesting thoughts. It seems to me that it might make sense for there to be a group that randomly tests the security of Open Source projects that desire security testing.

Such a group would try to introduce bogus code or find some other way to compromise the target project at random intervals as a test of the security for the project. The target project and the testing group would agree to a reasonable interval of time within which the target group should detect the attempt. After that time has passed, the testing group should notify the target project and post the results of the tests.

It's important that the testing group introduce code that is only used for the purposes of the test and does not introduce any new security flaws within the target project.

I believe that a system like this would provide several benefits. It would provide clear evidence of how well the tested projects do at discovering compromising code. This should increase public confidence in the Open Source process and help the projects increase their level of security awareness. It should also increase overall knowledge and awareness of security procedures in the community of Open Source developers.

For all I know, though, someone may already be doing this.

[ Reply to This | # ]

On secrets
Authored by: archanoid on Monday, April 12 2004 @ 10:47 AM EDT
I am by no means a crypto or security expert, but I am an avid follower. Secrets are difficult things to keep. The fewer you have to keep, the easier it is to keep them. Of course, the danger with secrets is their discovery. And when you're relying on fewer of them, the risk in their discovery goes up. That's why Diffie says you have to keep your secrets to only the ones you can easily change.

Bruce Schneier wrote this essay over a year ago. It is not specifically about open source vs. closed source, but it fits. This sentence in particular:
This position in the debate ignores the fact that public scrutiny is the only reliable way to improve security -- be it of the nation's roads, bridges and ports or of our critically important computer networks.
I don't know about you, but I trust what Bruce has to say in regards to security.

[ Reply to This | # ]

Backdoors and trojans article
Authored by: edumarest on Monday, April 12 2004 @ 10:49 AM EDT
I found this interesting article about backdoor programs and trojans at

http://www.spirit.com/Network/net0702.html



---
...if you cannot measure it then you cannot troubleshoot it, you can only
guess...
SuSE 9.0 on hp pavilion ze 4560us

[ Reply to This | # ]

"Without a license agreement" ???
Authored by: Anonymous on Monday, April 12 2004 @ 10:58 AM EDT
"Linux software, including contributions from Russia and China, is
spreading rapidly through the Defense Department because it can be freely
downloaded from the Internet without a license agreement or up-front fees,
bypassing legal, purchasing and security procedures...."


What planet is this guy from? Freely downloaded from the Internet without a
license agreement? That would only be true of PUBLIC DOMAIN software. Linux is
definitely not public domain. It is a copyrighted work which includes lots of
other copyrighted works. You generally do need a license to make copies of
Linux. Fortunately you have one--the GPL (GNU Public License), given to you by
the developers and contributors to Linux. You had better *not* bypass your
"legal, purchasing and security" procedures when you start running
your business or IT system on Linux. You should treat it like any other
acquisition--run it by your lawyers, work out what your rights and
responsibilities are, and decide if it meets your needs better than a
proprietary solution (which is likely).

[ Reply to This | # ]

Security by Obscurity
Authored by: Anonymous on Monday, April 12 2004 @ 11:11 AM EDT
The whole premise of his argument is basically security by obscurity;
unfortunately that also obscures flaws until it's too late.

As a corollary, how does DoD keep classified information secure? Is simply not
distributing it to the general public sufficient? What about contractors who
work with these systems? Is that considered publication under the GPL?

[ Reply to This | # ]

Colinux
Authored by: Anonymous on Monday, April 12 2004 @ 11:12 AM EDT
I wonder what the M$ execs are making of this news?

Poo and Pants would seem appropriate.

[ Reply to This | # ]

OT - Rocket Scientists discovered?
Authored by: codswallop on Monday, April 12 2004 @ 11:27 AM EDT
Our slightly paranoid but always interesting friend Stats_for_all thinks he has
found Darl's MIT rocket scientists. They are ex-MIT and are in fact rocket
scientists. They're also old buddies of Darl - at "PointServe, an Austin, TX
company", which he says Darl ran at one point. Interestingly, he includes a
newspaper article quote about a lawsuit against them by a company called Brazen
for disclosing source code in violation of a merger agreement.

The case was filed in 2003, but the agreement was signed in 2000. The case is
"Travis County District Court, Cause No. GN-1300836", for those who
can easily get such things. From the articles I googled, it looks like it's
basically a dispute about being paid in private company stock that was
hyped. The question is: was it a failure of Brazen's due diligence, or was it
fraud?


http://messages.yahoo.com/bbs?.mm=FN&action=m&board=1600684464&tid=cald&sid=1600684464&mid=122059&um=50&.sig=wnot6cMQBLLsj1dyD12WxA--

The Yahoo message number is 122059 under SCOX.

[ Reply to This | # ]

CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: Anonymous on Monday, April 12 2004 @ 11:41 AM EDT
Dan O’Dowd = Chicken Little

'nuff said!

[ Reply to This | # ]

Linux vs Common Criteria
Authored by: swillden on Monday, April 12 2004 @ 12:05 PM EDT

O'Dowd is right in one way, sort of, when he says that Linux can not be certified to the highest levels of the Common Criteria standards, but that is the fault of the standards, not the fault of Linux.

The problem is that CC standards have an underlying premise which, while not false, is not the whole truth, either. That premise is that security can be achieved through rigorous design and development processes, with each phase being carefully checked to ensure that it fulfills the requirements in the previous phase, and with all developed materials being kept under extremely tight security throughout the entire process.

This means that to get an EAL-7 certification, you have to start with security-focused requirements specifications, thoroughly documented and reviewed. Then you take those requirements and produce design documentation, verifying carefully and thoroughly that the design precisely fulfills the requirements and does nothing else. Then you implement the code, faithfully and rigorously adhering to the design documentation, occasionally verifying back to the original requirements specifications. Next, you test, and any discovered defect must be analyzed, not only to fix the defect but also to find and correct the flaw in the process that permitted that defect, going back to previous phases to see what other discrepancies might have been introduced by the process flaw. Finally, the build process must be tightly controlled and verified and the process of delivering the binary to the customer must be similarly secured. At every stage, documentation must be thorough and access to all documentation and code must be rigorously controlled, lest the bad guys be able to make some modifications.

That process can produce secure software, but the faulty premise, which lies not in Common Criteria but in the way the world uses it, is to suppose that *only* such a process can produce secure software (or that the process guarantees security -- it doesn't, it only helps to ensure that the end product meets the requirements, and that any flaws discovered can be traced to a source, who can be held accountable).

Open source provides a completely different approach to security. Open source uses the democratic approach, where openness and transparency are used to ensure that no malicious changes can be made, because they'd be noticed. In contrast, the CC approach to security presumes that there is insufficient manpower available to adequately screen modifications, so malicious changes must be prevented via restricted access.

Both models have their strengths and weaknesses, but both are good routes to security. Unfortunately, they are almost completely incompatible.

To obtain CC certification for Linux, or anything developed without the CC-specified process, is difficult for two reasons. First, it's hard because it requires reverse-engineering all of the code to build the requisite requirements and design documentation. That is at least as difficult as building the software from scratch. Second, it's not clear how to go about certifying such a package, since the higher-level CC specifications do not contemplate such an approach. Getting low-level CC certifications EAL-1 through EAL-3 is easy to do, because they don't require much more than that the software be provided by a reputable supplier. Going beyond that, though, gets very, very hard.

Even if you manage, through tremendous effort, to get such a package certified, the software is going to be very hard to change. Where new features and capabilities flow smoothly into Linux, a CC-certified Linux will have to apply the same extremely rigorous -- and redundant, since everything has already been thoroughly scrutinized -- analysis to every new piece of software that is to be added.

All of which raises the question that if CC certification is so much effort, and might not even be possible without changing the CC standards, why bother with it? It won't really change the security of the operating system. The only thing CC certification gives you is CC certification. In the most cynical view, CC certification is a sales and marketing tool, pure and simple, and has nothing to do with security.

But, it's an *important* sales and marketing tool, I suppose. Enough organizations have specified internal policies requiring specific CC certifications for particular applications that lack of certification is preventing Linux deployment. To sell to certain segments of the market, the CC hoop jumping is required, even if the hoops are 2000 feet above the ground.

What's especially ironic is that the organizations that *really* know and care about security don't especially care about CC certification. The US DoD, for example, isn't letting lack of certification stop them from using Linux, particularly since they have their own security teams and those at the NSA to help decide what's secure and what's not. The NSA doesn't tell us what they are or are not doing with Linux, but the fact that they created SELinux speaks volumes.

Common Criteria is a good tool for building and maintaining secure software, but it's a fantastic tool for building press releases, marketing brochures and FUD campaigns. Be wary of suits spouting EALs.

[ Reply to This | # ]

The Trojan Horse with the see-through skin!
Authored by: moogy on Monday, April 12 2004 @ 12:35 PM EDT
I am getting rather tired of seeing this new FUD that uses
the Greek Trojan Horse analogy. The Greek Trojan Horse was
effective because no one could see what was inside. In a
similar manner no one can see what's inside the proprietary
hidden code, therefore it's the proprietary code that fits
the analogy, but you CAN see what's inside of OSS software.

---
Mike Tuxford - irc.fdfnet.net #Groklaw
First they ignore you, then they laugh at you,
then they fight you, then you win. --Gandhi

[ Reply to This | # ]

Which philosopher will be next?
Authored by: chrism on Monday, April 12 2004 @ 01:40 PM EDT
Anyone want to take bets on which famous philosopher or parable of the ages will
next be used to attack open source methods?

I'm betting Zeno of Elea and his famous paradoxes of movement will be used to
claim that Linux is clearly not secure compared to closed source alternatives
like Windows.

I wonder how the argument will read? Will the author hope that the average
reader won't know that the paradoxes were solved by the invention of the
calculus (well, I mean by some of the results commonly grouped in with calculus
these days)? Will our rebuttal take the form of an elementary lesson in
infinite series that have a finite sum?

These are the days of our lives...

Chris Marshall

[ Reply to This | # ]

Speaking of Backdoors
Authored by: tielman on Monday, April 12 2004 @ 01:48 PM EDT
The issue of backdoors in propriatary code was brought up last week by none other than Cisco.

In this advisory entitled A Default Username and Password in WLSE and HSE Devices they clearly state that they put one in: "A default username/password pair is present in all releases of the Wireless LAN Solution Engine (WLSE) and Hosting Solution Engine (HSE) software. A user who logs in using this username has complete control of the device. This username cannot be disabled. There is no workaround."

Ooops.. I guess that wouldn't have been caught if it were Open Source 'eh?

[ Reply to This | # ]

I think closed source is a security risk
Authored by: Night Flyer on Monday, April 12 2004 @ 02:22 PM EDT
I think closed source is a bigger security risk. Microsoft has back door entry
points to Windows. Note: I am not paranoid, I am merely observant.

I am told, and I have observed, that Windows 2000 at work "phones
home" on a regular basis. I have been unable to catch the packets sent in
order to analyze them, but the firewall records the (blocked) attempts. What is
not clear, to me at least, is whether some attempts bypass the firewall block
while disguised as benign communications.
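The kind of log check described above can be sketched as a small shell function that counts blocked outbound connections to a given host in a firewall log. The log line format and the hostnames are hypothetical; real firewalls each have their own format.

```shell
# count_blocked: count blocked outbound connections to a given
# destination host in a firewall log. The "BLOCK OUT ... dst=" line
# format used here is hypothetical, for illustration only.
count_blocked() {
  log="$1"; host="$2"
  grep -c "BLOCK OUT .*dst=$host" "$log"
}
```

A count that keeps climbing for a destination you never asked to contact is exactly the kind of "phoning home" the comment describes - though, as noted, a log of blocked attempts says nothing about traffic that slips through disguised as something benign.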

What if someone builds a virus that appends itself to this back door, and sends
whatever to whomever it wants. (Now that I think about it, it has probably been
done already.)

Isn't that a built in security risk of major proportions?

It appears to me that, for whatever reasons, Microsoft does not want to close
its backdoor(s) into its operating systems.

I was talking to a manager of my local bank and she was boasting that their ATM
uses Microsoft Windows. Ummmmm. I find this more than vaguely upsetting.

(I know this is a bit of an apples and oranges argument. Windows 2000 doesn't
meet Common Criteria Evaluation Assurance Level 7 (EAL 7), but this concern is
real, verifiable and generally apparent, and it relates to security risks in
closed source software.)



[ Reply to This | # ]

Att PJ: (OT) SCO's Rocket Scientists finally found?
Authored by: doughnuts_lover on Monday, April 12 2004 @ 02:34 PM EDT
From Yahoo's SCOX board:

Rocket Scientist ID'd
by: stats_for_all
04/12/04 10:23 am
Msg: 122059 of 122097

I believe I have identified Darl McBride's much-discussed team of MIT rocket scientists who analyzed code. PointServe, an Austin, TX company where McBride was CEO and raised venture capital, fits the bill. McBride ran PointServe prior to his stint at Franklin Covey.

Look at the following quote from the PointServe company profile:
-----begin quote-----
"You don't have to be a rocket scientist to start a software company, but PointServe's founder and CEO proves it doesn't hurt. Ed Powell, a former MIT rocket scientist, applied artificial intelligence designed to increase satellite performance to the more earthly problem of scheduling pest control, plumbing, and other home services."

G. Edward Powell, Chairman and Chief Executive Officer
Dr. Powell founded PointServe in 1996 with the vision of applying economic optimization technology to service supply chain management. For eight years prior to founding PointServe, Dr. Powell was a Member of the Technical Staff at MIT's Lincoln Laboratory, where he developed advanced forecasting, modeling, and simulation algorithms for autonomous satellite navigation. These algorithms now represent the core of the Company's economic optimization technology. Dr. Powell received his Ph.D. and M.S. in Aerospace Engineering from the University of Texas at Austin and a B.S. in Aerospace Engineering from Auburn University. Dr. Powell is currently a member of the President's Information Technology Advisory Committee.

Mark T. Lane, Chief Scientist
Dr. Lane has been with the company since shortly after its founding, designing algorithms that represent the state-of-the-art in forecasting and real-time optimization. He has over 15 years of experience in numerical and data analysis, optimization and combinatorial techniques, and genetic and evolutionary computations. Prior to joining PointServe, Dr. Lane spent 10 years as a member of the Technical Staff at MIT Lincoln Laboratory. He was the Co-Principal Investigator for the Midcourse Space Experiment, the largest SDIO-funded satellite mission. Dr. Lane is the author or co-author of more than 20 refereed journal articles and more than 30 other published papers. He holds a Ph.D. from Vanderbilt University in Mathematics with a specialization in Abelian group theory and related abstract algebras.

PointServe
110 Wild Basin Rd
Suite 300
Austin, TX 78746
Ignite Your Mobile Workforce!
-----end quote-----

Researching PointServe, I came across a reference to a lawsuit filed against the company on 4/14/03 for stealing and distributing another company's source code.

From the Austin Business Journal article:
-----begin quote-----
During the fall of 2000, PointServe was struggling with its mobile workforce software, according to the suit, and decided to scrap its version of the technology and replace it with Brazen's.

The suit claims PointServe offered to buy Brazen in late 2000 and agreed on a fair market value of $9 million. Brazen's founders licensed exclusively, without royalties, the source code to PointServe, the suit states.

Brazen's founders allege they discovered recently that PointServe breached the merger agreement and the license agreement by wrongfully disclosing Brazen's source code to Southern Union Co., a PointServe investor and customer.

"In the software industry a developer's software source code is his very life-blood," the suit states. "Disclosing software source code is the equivalent of giving away a company's most valuable assets."

-----end quote-----
www.bizjournals.com/austin/stories/2003/04/14/story5.html

[ Reply to This | # ]

O'Dowd ALTERED Thompson's quote
Authored by: Anonymous on Monday, April 12 2004 @ 03:00 PM EDT

The distortion of the meaning of Ken Thompson's quote has been covered already, here and elsewhere, but I wonder if anyone else has noticed so far that O'Dowd actually edited the quote, quoting sentences 1, 2 and 4 of a paragraph but omitting number 3. Here it is the way Thompson said it, at least as ACM reports it:

"The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code."

Now when I was going to school, we were taught that if you took anything out of a quote, even so much as a single word, you replaced it with an ellipsis to show readers that you had changed the words you were quoting, so that if they wanted to know exactly what the person had really said, they could look up the original source. O'Dowd's press release doesn't do that. It simply drops the sentence without any indication.

It may not technically be illegal to deliberately misquote in that manner (and I think it must have been deliberate) but it's surely unethical.

[ Reply to This | # ]

CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: juhl on Monday, April 12 2004 @ 05:05 PM EDT
I just couldn't help writing them a letter.
 

For the attention of Mr. Dan O'Dowd
-----

First of all, please excuse my bad grammar and spelling; I am not a native English speaker.

I read in this article http://home.businesswire.com/portal/site/google/index.jsp?ndmViewId=news_view&newsId=20040408005676&newsLang=en over at businesswire.com the not-so-flattering things you have to say about the Linux operating system and the development community behind it.
I am myself a part of the Open Source software community, and besides being a long-time user and contributing to a number of userspace applications, I have even contributed the odd patch here and there for the Linux kernel, so I can't just ignore what you say in that article.

One thing you are reported as saying is
"The open source process violates every principle of security. It welcomes everyone to contribute to Linux. Now that foreign intelligence agencies and terrorists know that Linux is going to control our most advanced defense systems, they can use fake identities to contribute subversive software that will soon be incorporated into our most advanced defense systems."

That everyone (with the appropriate skills) is welcome to contribute to Linux development cannot be disputed. That it violates every principle of security certainly can be. It is well recognized in security circles that "security through obscurity" provides no actual security - keeping the workings of your code closed in binary-only applications, and relying on the assumption that no one can thus find out how the system works, is a flaw. Just as a determined attacker will quickly discover that a Windows user renamed his Administrator account to Admin, he will also (with the right skills and tools) reasonably quickly be able to reverse engineer parts of a binary application to look for vulnerabilities.
Hiding your code gains you nothing - quite the opposite: with a closed system you rely completely on your vendor's developers to write secure code, and you have no way to verify it yourself or hire someone to do it. With an open system you have not only the developers looking over the code, but also a lot of users, and you are able to do security audits of your own.
Your statement that all it takes to get code accepted into Linux is a fake identity is not only wrong, it's insulting.
Getting code into the Linux kernel is *hard* - I know this from personal experience. When you submit your code you will be asked to explain in detail what it does, and it will be scrutinized and criticized by a great number of people. Only if you can give satisfactory answers about the workings of your code, and are willing to make any changes suggested by the other developers, will your code stand a chance of being accepted into a maintainer's tree - and that's just the first step. Once you've convinced a subsystem maintainer of your code's usefulness, it may sit in a niche development tree for ages. Only when the maintainer is satisfied will your change go upstream to the maintainer of either the stable or development kernel tree (where it will once again be scrutinized and discussed). If your code gets that far, it usually goes into a beta or test release where a larger user base will also be looking at it and submitting comments and suggestions. And even if it makes it through all those steps and into a production release, there is never any guarantee that it will stay - if someone finds a flaw, you can be quite sure the code will be pulled or fixed almost instantly.
Not only are you saying that the kernel developers will accept code from anyone who offers it, but you are also implying that the developers lack the skill to spot subversive code - this I find quite insulting.
Yet another flaw in your comment is the assumption that just because code goes into the Linux kernel, it will automatically end up in defense systems. I am confident the US military thoroughly reviews, tests and validates any system it chooses to deploy.

Another thing you mention is Ken Thompson's famous trojan. You seem to use it as an argument that even though you can inspect the code, you can't be certain it is safe. To a small degree that may be true, but you are neglecting some huge points.
First, you ignore the fact that a specific version of a binary compiler was needed in order to propagate the trojan - switch compilers and you are rid of it, or at the very least you can compare two different binaries, which makes suspicious code easier to spot.
Secondly, you ignore the fact that a developer changing the code substantially could very likely produce code that would not be recognized by the trojaned compiler and would thus be immune.
Thirdly, you ignore the fact that closed systems are even more vulnerable to attacks of this kind, since there you don't even have the source, but *only* the binary - and occurrences such as the "easter eggs" in certain versions of Microsoft Word and Excel (and other similar incidents) certainly prove that you can't just blindly trust developers of commercial software to include only the code that management wants them to (and who says you can trust the management, by the way?).
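The comparison step mentioned above can be sketched in a few lines. This is a hypothetical illustration, assuming you already have two artifacts built from the same source by independent compilers; in practice benign differences (timestamps, symbol ordering) have to be normalized away first, so a mismatch only flags the builds for closer inspection rather than proving a trojan:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 digest of a build artifact's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical artifacts: the same source compiled by two independent compilers.
build_a = b"\x7fELF...code emitted by compiler A"
build_b = b"\x7fELF...code emitted by compiler B"

if fingerprint(build_a) != fingerprint(build_b):
    print("builds diverge - inspect before trusting either compiler")
```

The design idea is that a trojaned compiler cannot make its output byte-identical to that of a clean, unrelated compiler without that second compiler also having been subverted, so comparing independent builds raises the bar considerably.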

Other parts of the article give the impression that the fact that some Linux code is produced in Russia and China makes it less secure. That is very prejudiced and quite offensive.

I will stop now before this mail becomes too long, but I would like to conclude by referring you to the following article on the groklaw.net website, which goes into much greater detail about the statements you made and (in my opinion) manages to effectively debunk all of the Fear, Uncertainty and Doubt you are trying to spread: http://www.groklaw.net/article.php?story=20040411073918151

I don't expect a reply to this mail, but if you want to give one I would certainly like to hear you explain your views in more detail. Feel free to write me.

[ Reply to This | # ]

Warning against Green Hills!
Authored by: swedulf on Monday, April 12 2004 @ 05:40 PM EDT
It is kind of funny that O'Dowd has just supplied us with a great reason for NOT
buying Green Hills products if you are a security-conscious agency anywhere
outside of the US. It only takes one loyal US citizen, employed by Green Hills,
to include the type of trojans that O'Dowd implies. Such an act could probably
be regarded as patriotic, and included even without the knowledge of Green Hills
management. I am sure that Green Hills management takes every precaution to
market trustworthy products, but due to their closed nature, we have no way to
be sure. And all it takes is one good patriot!

I am sure that Green Hills resellers all over the world are grateful for Mr.
O'Dowd's sales support effort.

Of course the same reasoning applies to any boxed, closed source software that
is to be used where security is important - government, administration,
defence, banking and so on - and in some cases we know that software is
designed to "call home" and report; see the MS XP EULA. And considering the
trend towards outsourcing and international software development, I would be
very cautious even if I represented an American agency. Remember, all it takes
is one good patriot, from anywhere in the world!

[ Reply to This | # ]

From the "stupid technology patents" dept.
Authored by: Anonymous on Monday, April 12 2004 @ 05:54 PM EDT
It's not just software that must endure stupid technology patents.

http://www.gamasutra.com/php-bin/news_index.php?story=3660
(registration possibly required)


April 8, 2004

Microsoft Registers Hard Drive Patent
Rumours have once again begun to fly regarding whether the next-generation Xbox
will or will not contain a hard drive. After announcing a significant deal with
flash memory producer M-Systems many had assumed that a hard drive would
definitely not be included in the new console, but a new U.S. patent seems to
contradict this.

The new patent concerns "a gaming system [that] includes a hard disk drive
for storing applications and other data." This has started new speculation on
whether Microsoft might be considering two variations of the new console, at
different price points, in the same manner as Sony are apparently planning with
the PlayStation 3.

Source: Gamer Feed (http://microsoft.gamerfeed.com/gf/news/6065/ )

[ Reply to This | # ]

CEO's of LynuxWorks and FSMLabs Reply to Green Hills' FUD
Authored by: hikingpete on Monday, April 12 2004 @ 07:00 PM EDT
I'm going to cry. That is so freakin' wrong. Can anyone be that blatantly
stupid? The only reasonable explanation is that it's malicious. I mean, there
are some really stupid people out there, but that - that is just wrong.

He suggests that other options should be considered when security is a must.
His option is definitely one that should not be considered. Either he's a
moron, or amazingly deceitful. Not qualities you want in someone controlling
your security software.

[ Reply to This | # ]

OT P.J. in Linux Journal
Authored by: RevSmiley on Monday, April 12 2004 @ 07:20 PM EDT
Pamela Jones has an article on "Open Legal Research" in my copy of Linux Journal this month.

Very good. Very nice Pamela.

---
Never accredit to unalloyed evil what simple greed compounded with stupidity can explain.

[ Reply to This | # ]

The irony here is amazing
Authored by: Jude on Monday, April 12 2004 @ 07:31 PM EDT
Dowd is telling lies in an attempt to convince people that his products are
trustworthy.

I think the irony is mind-boggling.

[ Reply to This | # ]

Reminds me of Microsoft's 1999 NSA key fiasco
Authored by: seeks2know on Monday, April 12 2004 @ 07:43 PM EDT

All of the discussion above reminds me of the hoopla surrounding the 1999 discovery of a second public key, named _NSAKEY, in Microsoft's Windows 9x, Windows NT, and Windows 2000 operating systems.

In 1999, Andrew Fernandes, Chief Scientist at the Ontario-based Cryptonym Corporation, noticed that Microsoft provided a second public key, used by Windows for encryption purposes, which was named _NSAKEY.

In the weeks just prior to Fernandes' claim, news began circulating that the U.S. Department of Justice was asking for special legislation that would let it spy on computers without a warrant or a user's knowledge. The discovery of an additional, unnecessary crypto key named after the U.S. National Security Agency (which is in charge of crypto activities) fueled the flames of paranoia that the U.S. government already possessed a trojan horse permitting it to decrypt any protected data on any Windows system.

Of course, Microsoft denied that the key provided a back door. Scott Culp, Windows NT security product manager, said, "They're in there because that's how we comply with export controls that the NSA is overseeing."

After the dust settled, experts determined that NSAKEY did not provide a back door. Privacy Software Corporation stated:

"We have determined that the "_nsakey" discovered by Cryptonym is in fact nothing more than a second "public key" and is not in any way, shape or form a "back door.""

However, what they did discover was still very alarming:

The risk in Microsoft's error is not that the libraries contain a "trojan horse" or "back door." We have determined this much for certain. The flaw involves providing a second public key as well as a serious design flaw that allows this second key to be manipulated and/or REPLACED with another key that might not originate from a certified source. Providing more than one valid key for a cryptography process effectively cuts the effectiveness of the cryptography in half since someone trying to break the cryptography now has two chances to "win" instead of one. This is significant enough of a security risk but Microsoft goes one better by allowing their Cryptography API to REPLACE this second key with simple calls to the API which copies out the second key, replaces it with another and saves it - all without raising any alarms.

As if this alone was not enough to be concerned about, they noted this in their conclusions:

The Microsoft Cryptography API is so seriously compromised that it must be considered to be untrustworthy and owing to the severity of the security issues involved, no Microsoft product which uses any of these functions or libraries should be used for any sensitive purposes until such time as Microsoft corrects these problems universally. Because these functions are integrated into the operating system itself (and other programs manufactured by Microsoft as set forth in the independent information contained in the links above) and so many portions of the operating system's functionality is dependent upon "trusted functions" whose "trust" is established by a flawed encryption system, the operating system itself should be considered "toxic" from a security standpoint.

So, when I hear someone claim that closed systems are more secure than open systems, I always remember this incident. It is a stark reminder that there are many hidden pieces of code lurking unknown to users. Who knows when your system will unexpectedly dial home to an unknown entity, or provide unauthorized access to a stranger.

Give me open software where many eyes have scrutinized each software function - and where I can inspect the code directly if I have a concern. I don't need an "expert" to tell me which is more secure.

---
"The least initial deviation from the truth is multiplied later a thousandfold."
-- Aristotle

[ Reply to This | # ]

OT: Trolltech Interview
Authored by: dmscvc123 on Monday, April 12 2004 @ 07:52 PM EDT
PF: Do they have any influence on you?
ME: Not really. They have a 5.7% stake in Trolltech. Historically Canopy became
an investor because we cooperated with Caldera. As you might know we made and
delivered the graphic install, which was the first graphical install for Linux,
for Caldera Linux. The Canopy Group as the main investor in Caldera was so
impressed by the work we had done that they wanted to invest in Trolltech, to
make sure that Trolltech could become a solid company that could continue to
deliver software to the Linux community. It's pretty ironic to see what has
happened historically after that of course. But they don't have any influence on
Trolltech. Trolltech is employee-owned, 65% of the shares are owned by the
employees and we control the business so they have a small stake in us and that
is it.
....
PF: The thing that SCO is asking and preparing to sue everybody about some code
they pretend they own in Linux.

EE: I can tell you that we do not support these actions from SCO. Trolltech in
many ways is dependent on the success of Linux. We think Linux is a Good Thing.
We support Linux in many ways. On the other hand everybody has the right to
bring his case to court. In this case it is very strange that they have not
pinpointed exactly where in the code there is a problem and we feel that if they
really had a problem with this, they could have acted very differently in
presenting this to the community. So again we do not support these actions.
http://dot.kde.org/1081772638/

[ Reply to This | # ]

Russia and China?
Authored by: Anonymous on Tuesday, April 13 2004 @ 12:56 AM EDT
Russia and China?

What year is this guy living in?

Russia and China are now the US's
sources-of-cheap-labor^H^H^H^H^H^H^H^H^H^H^H^H^H^H^H^H^H^H^H^H^H^H
valued-economic-partners.

The cold war's over man. Even the KGB decided they preferred mobile phones to
mobile missile launchers, MTV to MIRVs, IBM to ICBM, etc.

Everybody knows Russia and China ain't the "enemy" anymore...

In 15 years or so, expect this guy to tell us how US airlines are flying
"subversive" French-made aircraft.




[ Reply to This | # ]

The Inquirer also has the news - "Man goes ballistic, says Linux is a security threat"
Authored by: Anonymous on Tuesday, April 13 2004 @ 08:07 AM EDT

The story can be found here: http://www.theinquirer.net/?article=15274

[ Reply to This | # ]

Reverse engineering
Authored by: apessos on Tuesday, April 13 2004 @ 09:32 AM EDT
Could someone refresh my memory about reverse engineering? I thought it was
part of our fair use of products to be able to do that. Or have I completely
lost my mind? Being an engineer, it's fun to take things apart and get an idea
of how they work. It's called learning.

I ask because all these clauses and all this talk about reverse engineering not
being allowed seem counterintuitive to me.

[ Reply to This | # ]

Not just eyeballs, microscopes...
Authored by: tz on Tuesday, April 13 2004 @ 11:23 AM EDT
One other thing about open source is that it can be, and often is, instrumented
with a window onto the kernel (think /proc and lsof).

If someone put a backdoor in, it would likely be discovered when someone asked
why there was some extra process, open file or socket, or something else.

Linux, the various flavors of BSD, and most other FOSS projects have these
introspection tools available, or they can easily be added.

A closed OS may or may not have such tools (where is lsof with -i and other
options for Windows?). Even then, they can't be added to, and they usually
limit the scope of what they show.
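As a minimal sketch of the kind of window /proc provides, assuming a Linux-style /proc/net/tcp layout, the listening sockets can be pulled out with a few lines of parsing. The sample snapshot below is hypothetical, chosen to show how an unexplained listener would stand out:

```python
def listening_ports(proc_net_tcp: str) -> list[int]:
    """Return local ports of sockets in the LISTEN state (st == 0A)
    from the text of /proc/net/tcp."""
    ports = []
    for line in proc_net_tcp.splitlines()[1:]:   # skip the header row
        fields = line.split()
        if len(fields) >= 4 and fields[3] == "0A":
            local_addr = fields[1]               # e.g. "00000000:0016"
            ports.append(int(local_addr.split(":")[1], 16))
    return sorted(ports)

# Hypothetical snapshot: sshd on port 22, plus a surprise listener on 31337.
sample = (
    "  sl  local_address rem_address   st\n"
    "   0: 00000000:0016 00000000:0000 0A\n"
    "   1: 00000000:7A69 00000000:0000 0A\n"
)
print(listening_ports(sample))   # [22, 31337]
```

On a real system you would read the live file with `open("/proc/net/tcp")` and ask why each port is open; that is exactly the sort of question that exposes a backdoor's extra socket.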

Unlike Windows, which includes the HTML renderer and Media Player as PART OF
THE OS by marketing edict, and opens dozens of ports and services whether they
are needed or not, most Linux systems are modular and minimalistic. No open
socket - no way of exploiting it. Compromising a mail reader doesn't compromise
the kernel.

A lot of FOSS development is to minimize, eliminate, or isolate the use of
root.

Closed source could work on a very secure OS, but it would need a lot of very
creative auditors constantly looking at the code. Some attacks are very
creative (like the timing attack against private SSL keys - you could determine
the bits of the key from the time the math took).
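The class of leak behind such timing attacks is easy to demonstrate in miniature. The sketch below uses hypothetical names, and string comparison rather than the modular exponentiation the real SSL attack exploits, but the pattern is the same: a comparison that returns at the first mismatching byte leaks information through its running time, while the standard library's hmac.compare_digest examines every byte regardless:

```python
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    """Leaky comparison: returns at the first mismatching byte,
    so the running time reveals the length of the matching prefix."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

secret = b"hypothetical-key"

# Timing naive_equal(guess, secret) lets an attacker learn the key
# one byte at a time; compare_digest takes the same time either way.
print(naive_equal(b"hypothetical-kex", secret))          # False
print(hmac.compare_digest(b"hypothetical-key", secret))  # True
```

This is the sense in which some attacks need "very creative auditors": the code computes the right answer, and only the time it takes to do so gives anything away.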

It is not that FOSS would be invulnerable, but if you noticed strange traffic,
you could add some introspection, find out what was happening, and fix it. With
closed source, call the vendor (and leave a message if this happens late on a
Friday).

[ Reply to This | # ]

Linux Speak: Separating Fact from Fiction
Authored by: SpinyNorman on Wednesday, April 14 2004 @ 09:23 PM EDT
The Indian IT Trade 'zine CXOToday has a response to O'Dowd's blather (and the
Yankee Group TCO FUD) from the Director of Red Hat India.

http://www.cxotoday.com/

[ Reply to This | # ]

Slightly OT - security through diversity
Authored by: Chaosd on Thursday, April 15 2004 @ 09:07 AM EDT

I've been thinking about this for a while, and at the risk of getting flamed I'll make a modest proposal.

One of the whole points about Unix is that there are many similar versions. All use similar command syntax and many apps will cross-compile (big cheer for POSIX), but core design philosophies can differ, and so implementations of common processes will differ.

Off the top of my head, there are three main FLOSS Unix variants: Linux, FreeBSD and the Hurd - feel free to rant about any I've missed - it'll help my case ;)

If an organisation is serious about security it will mix and match these platforms - even to the extent of cycling them in and out of various roles. One week you could use a FreeBSD firewall with a GNU Hurd-based router; the next week you drop in a Linux firewall and a FreeBSD router. (Yes, I know this is a lot of work, but that's what sysadmins are paid for...)

The point is that in a heterogeneous environment no single exploit will work everywhere. If somebody can hack a FreeBSD box, they will get held up by the Linux box next in line. A binary virus written for 80x86 won't do much damage on an Alpha box.

Contrast this with a modern, homogeneous setup where everybody runs Windows 2000 on 80x86. Once an exploit exists, all machines are vulnerable.

IMHO - given that 'bug free' is an oxymoron - if the world mainly used Linux, the security risk would be the same as it is now with most people using Windows. There would certainly be fewer exploits, but each exploit would still have global scope.

---
-----
There are no stupid questions

[ Reply to This | # ]

Eugene Spafford spits up the same tripe on EE Times
Authored by: Asynchronous on Monday, April 19 2004 @ 12:44 PM EDT
But at least he was a bit more even-handed... sort of.
Purdue University professor Eugene Spafford and Cynthia Irvine of the Naval Postgraduate School warned that the highest-level, but little-understood, security concerns are sometimes ignored during the development of control systems for tanks, bombs, missiles and defense aircraft. Linux, Windows and Solaris operating systems should not be used in such applications, Spafford said.

...

Now... from an MS press release in 2003:
REDMOND, Wash. -- Feb. 20, 2003 -- Microsoft Corp. and leading academic security and privacy research scientists from around the world today gathered for the first meeting of the company's Trustworthy Computing Academic Advisory Board. The board was formed to advise the company on security, privacy and reliability enhancements in Microsoft® products and technologies, so that Microsoft can obtain critical feedback on product and policy issues related to its Trustworthy Computing initiative.

...

The board is composed of 19 leading research scientists and privacy policy experts, each with a significant track record in his or her field of expertise:

Martín Abadi, University of California, Santa Cruz
Elisa Bertino, University of Milan, Italy
Fred Cate, Indiana University School of Law
Dawson Engler, Stanford University
Virgil Gligor, University of Maryland
Richard Kemmerer, University of California, Santa Barbara
Chris Mitchell, Royal Holloway, University of London
Greg Morrisett, Cornell University
Deirdre Mulligan, Samuelson Law, Technology and Public
Policy Clinic, University of California, Berkeley
David Patterson, University of California, Berkeley
Fred Schneider, Cornell University
Paul Schwartz, Brooklyn Law School
Eugene Spafford, Purdue University
Neeraj Suri, TU Darmstadt
Peter Swire, Ohio State University
Vijay Varadharajan, Macquarie University, Australia
Eugene Volokh, UCLA School of Law
James Whittaker, Florida Institute of Technology
Jeannette Wing, Carnegie Mellon University

[ Reply to This | # ]

Groklaw © Copyright 2003-2013 Pamela Jones.
All trademarks and copyrights on this page are owned by their respective owners.
Comments are owned by the individual posters.

PJ's articles are licensed under a Creative Commons License. ( Details )