Chuvo's Page, Malware, and Scalable Systems
Saturday, July 15 2006 @ 09:08 AM EDT

A reader sent me a link to Sun's corporate research home page, because it stood out to him as a good example of corporate internet policy. What belongs on a corporate website? When should companies take down a page as not getting enough visitors? Of course the context was SCO's rapidly disappearing site. Does a page like Chuvo's page belong? That dog is so cute, and I definitely like Sun better for having his page up there.

That led me to the page on Contrarian Minds at Sun, which is seriously impressive, and then to this page about Dr. David Yen, who heads up Sun's Scalable Systems Group. It got me thinking. I've been talking with a law professor about the future of the Internet, as it happens, because he's writing a paper and asked me my opinion. His view, essentially, is that security issues have become so dire, some regulatory response is inevitable, and he offers some suggestions. He feels that most users have more computer power than they can safely use, because they lack the skills. My view is that the problem isn't the users, it's the software. If Steve Ballmer can't do it, who can? Obviously, it can't be a matter of skill. It's more a question of choosing a different operating system.

I've never had a virus on my Mac or my GNU/Linux computers. Ever. Never, ever. I can open up attachments in email and click on stuff and nothing ever happens. Why is that? Because the operating systems are built for normal users to be able to use them relatively safely.

I'm not saying nothing can ever go wrong, just that nothing ever has, and it's been a lot of years. He responds that once either operating system becomes popular, it will have troubles too, which shows you how effective Microsoft PR is. I don't believe either could ever be as unsafe as Microsoft Windows, because they are designed differently, but even if it were true, the immediate amelioration of the malware problem that would result from large groups switching would be worth it for the short term alone. I've been trying to express that the problem is the software monoculture, in a world where almost everyone uses Microsoft's software, and that it's a design issue, not a matter of software popularity, although popularity surely doesn't help. Just get a large segment of the population to switch to Macs or GNU/Linux and you'd find the problems shrinking, I think. If someone is itching to regulate, there are surely ways to encourage such a development.

And then there is the proposition that purists need to compromise because the Internet will not be able to handle the demands that are going to be made on it. Hollywood wants to use the Internet to sell movies and the article on Dr. Yen mentions WalMart wanting to use RFID, which clogs things up plenty too, I gather.

And while I stay away from politics, my logical mind asks this question: if folks want to use the Internet for things that put a strain on everything, essentially for their own business benefit, then shouldn't they be the ones to pay to build it out or build their own, instead of negatively affecting everyone else's experience on what is surely as much a public commons as a public park or a river or the air? Are movies important enough to alter the very functionality of the Internet?

When I read Dr. Yen's page, I wondered: is there something else we could try to scale? Is there really no tech solution? Anyway, it interested me enough to want to share it with you, because you know more than I do about the tech, and I wondered what you thought. Politics is off topic here, but just from a tech perspective, what solves the malware problem, short of such horrifying suggestions as I read in the paper, that governments require software writers to write to certain specifications? How do you answer his firmly held belief that the problem is a matter of popularity, not design? Do you have any papers I could share with him? What about scalability? His paper has me quite alarmed. How can folks regulate something they don't understand? How can we help them understand before they ruin everything?

The professor suggests having two Internets: one for business, which will be seriously locked down to keep it "safe" for business, where users have very few options, since in his view they don't have the tech skills to be safe (talk about blaming the victim); and the other a "quiet backwater," more or less the way the Internet used to be, where folks can be as creative and experimental as they wish, free from regulations, to encourage innovation. If there were no degradation in the nonmarketers' track, I'd say, fine. Do it.

Now, I view the movie track as being the quiet backwater, personally, but I'd say please do take all the marketers and "entertainment" sellers and give them their own corner ASAP. If they can persuade their customers that all they are allowed to use on the Internet is a purchasing appliance, go for it. They don't understand the Internet, so they want to remake the Internet in their own old-fashioned business model image, whereby they send consumers stuff on a one-way pipe, hobbled with DRM, and all consumers get to do is pay for it, sit there passively and watch it -- however many times they are allowed to before it turns the key and shuts them out. That isn't what the Internet was designed for. It's like their brains have been DRM'd to only allow them to think in old ways. That's fine, until they wish to impose their old ways on everyone else. And the rest of us, who do get what the Internet is for, would very much like them to go away and leave the Internet alone. Let them set up a mall for movie lovers, by all means, and I hope they make millions, trillions, whatever -- their wildest dreams come true. If the folks who can't live without Tom Cruise want to pay for that, let them, by all means. But the Internet is a public good, and the telcos' and the entertainment industry's bottom line isn't all that is at stake here.


  


Chuvo's Page, Malware, and Scalable Systems | 387 comments
Comments belong to whoever posts them. Please notify us of inappropriate comments.
Off Topic Here, Please
Authored by: TheBlueSkyRanger on Saturday, July 15 2006 @ 09:13 AM EDT
Hey, everybody!

Clickies if you got 'em.

Dobre utka,
The Blue Sky Ranger

[ Reply to This | # ]

Corrections Here, Please
Authored by: TheBlueSkyRanger on Saturday, July 15 2006 @ 09:15 AM EDT
Hey, everybody!

You know the drill.

Dobre utka,
The Blue Sky Ranger

Coding is an art form that fights back.

[ Reply to This | # ]

Monoculture
Authored by: kenryan on Saturday, July 15 2006 @ 09:48 AM EDT
The monoculture argument is a powerful one. Note that even if the world switches
wholesale to Linux, there still won't be a monoculture, at least not to the
extent we have today. There will be the Debian folks, the Fedora folks, the
Slackware folks and so on - distros with very different approaches to ease of
use, update methods, and feature mix.

Eventually I would expect alternate CPU architectures to become more popular
since the OS won't be so heavily biased towards Intel - I'd expect to see more
PPC-based hosts, ARMs, some SPARCs, maybe MIPS will make a comeback, and likely
some that don't yet exist (you know, the sort of thing that results from free
and fair competition).

What malware might exist would have to drag along source code, and possibly its
own compilers!

There will always be a gradient in user skills - someone who happily clicks
"okiedokie" to any prompt that pops up is not going to be helped by
any OS he has any degree of control over. Even reasonably clued folks aren't
immune (I consider myself more clued than most but I still got a server nailed
by a spammer - a minimalist Slackware system, essential services only, but I was
tardy getting security updates in place [tardy = approx. 2 years; I deserved
what I got]).

So long as there is a range of systems and a range of skillsets out in the
marketplace virus writers and zombie providers are going to be getting pretty
bored.



---
ken
(speaking only for myself, IANAL)

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: MathFox on Saturday, July 15 2006 @ 09:54 AM EDT
The professor suggests having two Internets, one for business, which will be seriously locked down to keep it "safe" for business, where users have very few options, since in his view they don't have the tech skills to be safe
What does "safe for business" mean? What rules would govern this businessnet? How does one enforce this rules internationally? Are "respected" "pop up and email publicity" companies allowed on businessnet?
Bonus question: How do you keep businessnet separated from the free internet?

---
If an axiomatic system can be proven to be consistent and complete from within itself, then it is inconsistent.

[ Reply to This | # ]

"SCO's rapidly disappearing site." Guess who.....
Authored by: tiger99 on Saturday, July 15 2006 @ 10:06 AM EDT
I did a wget of the whole thing a week or more ago, but didn't get all the options right so many links are incorrect on my local copy.

But, having a quick look at the files, I noticed that under the root I had precisely the following files and directories:

  • index.html
  • partners.sco.com
  • wdbi.sco.com
  • websurveyor.net
  • www.enderlegroup.com
  • www.sco.com

Seems that a well-known paid shill is quite closely coupled into their site.....

[ Reply to This | # ]

OLD Humorous Page at IBM
Authored by: Anonymous on Saturday, July 15 2006 @ 10:09 AM EDT
Check this, it's ancient (when should it be removed - NEVER). Napoleon Complex

[ Reply to This | # ]

Methinks your friend is an idiot
Authored by: The Mad Hatter r on Saturday, July 15 2006 @ 10:14 AM EDT


Tell him that he's in the same position as I am when I'm writing on the SCO-IBM court
case. I am not a judge or lawyer, so anything that I say is personal opinion.

He is not a techie, so anything he says is personal opinion.

This doesn't mean that there's anything wrong with personal opinion. Opinion is
very important as a driver in our society.

But when you are advocating wholesale societal change, you need to provide
evidence of why your proposal will work. This means talking to techies, business
people, telcos, etc. and learning. Then do the report.

We know that a Gnu-Linux/Panther/BSD world would be safe and why. As a victim of
Redmond he doesn't - and he can't be expected to as a non-expert. And of course
as a non-expert anything he writes is questionable.

Please pass my post along to him. Get him to read Groklaw for 2-3 weeks, as well
as Newsforge, and maybe the Inquirer. Send him to the issues index at Automation
Axcess (www.aaxnet.com), in other words get him to do his research before he
publishes.

If it helps PJ I will be happy to talk to him - you have my contact info.

Wayne



---
Wayne

http://urbanterrorist.blogspot.com/

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: Totosplatz on Saturday, July 15 2006 @ 10:17 AM EDT

By the early 80s the system of permissions and ownership now found in *NIX systems had already emerged. Historians, please correct me, but that is how I remember it. These particular protections against malware came into existence mostly (I believe) to thwart the various tricks that grad students and upstart undergrads were playing on one another. That may not be reality, but in any event, before any flavor of Windows even existed, *NIX systems already had a system of permissions and of ownership which still does not exist in any flavor of Windows, as far as I am aware.

*NIX systems had a long association with the gestating Internet in the 70's and 80's, and they have a long association with networked computing generally. The Internet experience of *NIX based systems is not without flaw, but it is all in all a fairly calm experience, precisely because of that long history of creating an architecture of permissions and of ownership to thwart grad student pranks and more serious stuff. In other words, long before Congress opened the Internet up to the general public, *NIX systems ALREADY had in place an architecture which was sufficiently protective to prevent wildfire outbreaks of malware.
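(Editorial aside, not part of the original comment: a minimal sketch of that ownership-and-permissions model as it looks from a script on any Unix-like system. The file path is only an illustrative assumption; the point is that the kernel, not the application, decides who may write what.)

    # Inspect the *NIX ownership/permission model using Python's standard library.
    # /etc/passwd is world-readable but writable only by root on typical systems.
    import os
    import pwd
    import stat

    path = "/etc/passwd"
    st = os.stat(path)

    owner = pwd.getpwuid(st.st_uid).pw_name
    mode = stat.filemode(st.st_mode)      # e.g. "-rw-r--r--"

    print(f"{path} is owned by {owner} with mode {mode}")

    # Run as an unprivileged user, the kernel refuses writes regardless of the program:
    print("writable by me?", os.access(path, os.W_OK))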

I believe that the impression that the Internet is too complicated for the great majority of us is based upon the bad experiences of Windows users attached to the Internet. This impression is false.

---
All the best to one and all.

[ Reply to This | # ]

Most offensive concept?
Authored by: BigBadBob on Saturday, July 15 2006 @ 10:18 AM EDT
PJ tells us her law professor friend "feels that most users have more computer power than they can safely use, because they lack the skills." Thus we need government regulation. Am I alone in the view that this is offensive elitism? I mean, what is he talking about:
  • a 7 day waiting period before buying an Athlon 64?
  • a one megabyte-per-month law?
  • background checks before getting licensed for Linux?
Fooey! I think law professors have more power than they can safely use! They need to be regulated in the court of public opinion.

[ Reply to This | # ]

Obviously, it can't be a matter of skill
Authored by: Anonymous on Saturday, July 15 2006 @ 10:21 AM EDT
While I see and acknowledge your underlying point, I'd like to offer a different
data point to your experience.

I've been in computing for quite a while, and my "skill" is probably
above average, but there's been a PC in the living room of my house for over 15
years now (not the same one, of course), always under the control of an IBM then
MS OS, actually Windows for most of that time. It's seen three kids from grade
school up and out to college, so it's been far from a "lab"
environment. And I can say, just as you do, that I have never had a malware problem.
Why? Because I am knowledgeable and careful, and take reasonable -- albeit far
from excessive -- precautions.

Don't confuse me with an MS apologist, because I'm not. I was a big OS/2 fan,
and I have Linux systems now as well, which I like (and I like the presence of
choice even more). My only point is that it is not impossible to run a clean and
stable MS Windows system for a very long period of time.

[ Reply to This | # ]

Re : David Yen
Authored by: urzumph on Saturday, July 15 2006 @ 10:27 AM EDT
I must admit, I found this article confusing. You mention a lot of stuff about
network scalability ('clogging up the internet with movies', etc) but the David
Yen article talks about how to produce CPUs that handle network loads well.

This may sound similar, but they are actually very different. Network
scalability is simply about having enough wire and processors to correctly
transmit all the data. Actually designing a processor to do that well is another
matter.

Network servers have a distinctive profile as to how they use the CPU
(many but simple processes), and catering to that is a difficult task.
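(Editorial aside, not from the original comment: the comment is about CPU design, but the "many but simple" load profile it describes can be sketched from the software side. The sketch below is a minimal Python echo server; the address and port are arbitrary assumptions, and the hardware question - how to build a CPU that juggles thousands of such connections efficiently - is exactly what it leaves out.)

    # Each connection does trivial work, but thousands may be in flight at once.
    import asyncio

    async def handle(reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
        data = await reader.readline()   # tiny amount of work per client...
        writer.write(data)               # ...just echo the line back
        await writer.drain()
        writer.close()
        await writer.wait_closed()

    async def main():
        server = await asyncio.start_server(handle, "127.0.0.1", 8888)
        async with server:
            await server.serve_forever()

    if __name__ == "__main__":
        asyncio.run(main())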

[ Reply to This | # ]

Network Congestion and the future of the Internet
Authored by: Anonymous on Saturday, July 15 2006 @ 10:32 AM EDT
A couple of points that the folks in congress and some professors don't
understand.

1 - The father of the internet says that this congestion issue and the drive to proprietize the internet is not what the design was for.

2 - IPv4 has packet retransmission needs of 50-70 percent, depending... That means that 50-70 percent of traffic on the internet is retransmitted packets (that is why VOIP is sometimes not as clear as the regular phone lines). When the routers and the other parts of the internet are all switched to IP version 6 (IPv6), the estimated packet retransmission problem drops to 7-11 percent or less, depending (that is a huge improvement in internet speeds, simply because all those packets that are being retransmitted will no longer be retransmitted, and so the existing pipes will effectively be that much bigger). Most equipment out there now is already able to do IPv6 (it just requires an easy switchover), not a replacement expense. A minimal application-level sketch of such a switchover follows this point.
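(Editorial sketch, not from the original comment: at the application layer a "switchover" mostly means resolving names with getaddrinfo() and using whichever address family comes back. The host and port below are placeholders.)

    # Protocol-agnostic TCP connect: works over IPv6 or IPv4, whichever is offered.
    import socket

    def connect(host: str, port: int) -> socket.socket:
        last_err = None
        for family, socktype, proto, _name, addr in socket.getaddrinfo(
                host, port, type=socket.SOCK_STREAM):
            try:
                s = socket.socket(family, socktype, proto)
                s.connect(addr)      # IPv6 is tried first when the resolver offers it
                return s
            except OSError as err:
                last_err = err
        raise last_err or OSError("no usable address")

    if __name__ == "__main__":
        with connect("example.com", 80) as s:
            print("connected over", "IPv6" if s.family == socket.AF_INET6 else "IPv4")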

The telephone companies FEAR IPv6, because then VOIP will be as good as their proprietary lines are now! They have competition in VOIP, so they want Congress to build a dual network where they can charge the competition and retain a monopoly on local service (which they now pretty much have). NOW YOU KNOW WHY THEY HAVE NOT SWITCHED TO IPv6 as of now. They want to tell the unknowing that the pipes are too clogged and that they need a way to make more money to unclog them... well, IPv6 is just a setting away for most right now, and they delay this feared improvement because it is competition aimed directly at their regular phone business.

3a - Fiber - Fiber will clean up many more line-quality packet transmission issues (making the internet not only faster but cleaner). Since the invention of multi-spectrum fiber, the amount of dark fiber has become an issue for many, because a single strand of fiber is so efficient and there is all that dark fiber waiting to be filled with multi-spectrum data transmission. There is no real need to build out the existing fiber trunks in the short term. Fiber, and the use of fiber to the business or house, will mean a cleaner and faster internet and will reduce user-end packet retransmission to less than 3 percent. That means, with IPv6 and fiber, the existing pipes will in a best-case scenario be 67% better able to handle traffic (and this is a snapshot, as the congestion problem and the percentages scale relative to a factor, meaning that the amount of internet capacity freed up could be a factor times 67% greater in real numbers; of course this is debatable and depends on how well the sniffer packets in IPv6 perform going through certain equipment, and on how some equipment is better than others). But the improvement with IPv6 and fiber will be astounding (something that I have not yet heard mentioned in one congressional hearing)!

3b - Fiber to the business or house. Existing copper has problems with degradation at the points where the ends meet the terminals (copper ages, the wires get moisture in them, and that is a problem for data), and the corrosion at the points where one piece of copper joins to another means there will be a 100% chance of a non-revenue service call to clean the ends, and LABOR is EXPENSIVE. I know of one fellow at the telephone company who is moving to the battery division: the use of fiber means there are only about 12 hours or so (depending on the backup power solution) of battery backup for rural locations where fiber is converted to copper, so maintenance is needed on those battery and power backup solutions. HOWEVER, the reason he wants to move from his regular line job to the battery job is that fiber to the house means NO NEED for the usual maintenance calls for low-quality lines: the fiber does not degrade, and it does not serve as a conductor for lightning, which causes massive repair bills for equipment and labor. So with fiber the telcos will save big on labor and equipment repair (and the old copper they take out they can resell at the market rate for copper, now very high, and copper will always be in demand for needs other than telco wiring).

Conclusion - this hype that the pipes are too full and there is no hope in sight is FUD. IPv6, underused existing fiber, and fiber to the user are technologies that can be paid for by existing Internet pay-for structures.

Internet neutrality needs to be an AMENDMENT TO THE CONSTITUTION... until it has this protection, then on this budget bill or the next (where things always get passed no matter how bad they are as ideas) the internet will not be protected, because special interest groups who WANT TO OWN THE INTERNET WILL NOT LET UP AND WILL KEEP THE PRESSURE ON TO SOMEHOW OWN IT... and until you have the 2/3rds voting protection of an amendment to the constitution... an amendment is, and will be, the only way to protect a free internet in a free speech sense. To preserve the internet as it was born... we need this level of protection. Any bill in congress can get changed too easily to offer any protection for the internet at all!

4 - Security! As a footnote, read this: computer security is an oxymoron, and whether it exists depends on getting 3 things right: Electronics, Math, and Humanity. Any of the 3 can fail, and Humanity you cannot perfect... so failure is always going to happen. Getting rid of a monoculture, where we have data standards like ODF so that hundreds or thousands of OS versions can use the same data formats RUN ON DIFFERENT apps and operating systems, will help. A two-tiered internet for security is a dream, as it will still be vulnerable. Read Secrets and Lies, by Bruce Schneier, AND read a ton of other materials about computer security, and you will begin to understand how big an issue it is (and how it cannot be solved by dividing the internet into 2 parts... for then whose computers are you going to allow on the "secure part", and what assurance do you have that there isn't an electrical bridge from one internet to the other somewhere, or a mathematical bridge, or a human bridge where a valid user has a mental problem and a whole lot of computer skill to go along with his access rights to the business network?)... this professor is dreaming.

[ Reply to This | # ]

Users
Authored by: Carlo Graziani on Saturday, July 15 2006 @ 10:45 AM EDT
I still feel that there is no solution to the malware problem without ascribing
at least *some* responsibility to users.

The fact of the matter is, allowing untrained people to plug computers into the
network as if they were toasters is a lot like letting people drive cars without
making them pass a drivers' test. A poorly administered and configured
networked computer endangers everyone's Internet use.

I have no brief to defend Microsoft's bloated accretion-ware, and I agree fully
that it is architecturally un-securable. But to imagine that a Linux-Mac-heavy
population would make malware go away is to forget what the Internet was like
*before* everyone's Mom was using it, and most nodes were Sun/SGI/Vax/IBM. It
was essentially an all-Unix environment, and after about the mid-80s, attacks on
Sendmail, ftp, finger, etc. were extremely common.

Keeping those systems secure took professional sysadmins watching over users
like hawks, patching software assiduously, and monitoring logs for a living (it
also took an evolution of programming style by developers of network software,
who did not start out with security as a priority).

Best practices have evolved out of that experience that help make Unix-esque systems orders of magnitude more secure than MS's stuff. But they are not in any sense "impenetrable", and I think it is likely that as more talented but morally-retarded programmers turn their attention back to Unix-like stuff, we will find that Mac and Linux home users have been living in a fool's paradise.

The fact of the matter is, the most secure architecture is no protection against
inept administration. Home users can turn even an OpenBSD box into a
script-kiddies' playground, by making bad configuration choices. I just talked
to a guy last week who was planning on getting a new computer, because he
believes his current one "has a virus". It didn't seem to occur to
him that unless he changed something about how he uses his computer, his next
one will likely become infested as well. I'm starting to get really tired of
having this kind of conversation.

I don't believe legally mandating software design principles can possibly work.
However, I do believe that laying at least some responsibility at users' door is
at least part of the answer.

My legal toy model for doing this is mandating that ISPs (1) require users to post a "good admin practices" bond, (2) disconnect any user whose machine shows signs of being 0wned by a cracker (participates in DDOS, sends spam, etc.), and (3) declare the bond forfeit upon disconnection, requiring another to be posted for re-connection.

If this happened, a lot would change in a hurry. Users would be a lot more
serious about demanding secure software designs from their vendors -- possibly
in court. Licensed and bonded home computer repair and maintenance businesses,
capable of backup-disinfest-restore ops would fill ten pages in the phone book.
And quick courses on secure computer maintenance for home users would become
common, and well-attended.

I'm sure there are other models (state licensing, like for drivers, comes to
mind). The point is, it makes no sense to pretend users aren't part of the
problem, because they are. Recognizing that fact can allow them to become part
of the solution as well.

[ Reply to This | # ]

It's not monoculture.
Authored by: Stumbles on Saturday, July 15 2006 @ 10:57 AM EDT
I have to disagree, to a point, that monoculture software environments are the problem. Yes, such an environment can and does greatly magnify the weaknesses that might be present in any given application or operating system.

But the real problem is the design of software. The analogy of users having way too much power is really a bogus one. When I hear these sorts of arguments I often think of things like pro auto racers who scamper about at 100+ mph, have a wreck which totally demolishes their car, and yet walk away with nary a scratch. Why is that? Well, for one, protections are built into such cars from the get-go, and equally important, the drivers (users) must wear the proper protective clothing.

That may not be the best analogy, but the point is... it's the base design of the operating system... stupid (that's not directed at anyone in particular). And the things users must do to accomplish some task.

We have seen over the years how a billion dollar company is, in my view, pretty much incapable of providing timely patches for their problems, while at the same time they seem to have a propensity for just ignoring other security issues.

And yet, after watching the open source folks respond to the same type of problems, there do not seem to be the same kind of time-related issues. Go figure.

Regulatory action certainly will not fix a billion dollar a year company
and their problems. Such actions seldom fix anything and at worst
simply add an additional layer of um, er oversight that ends up further
impeding corrective actions.

The proprietary and open source folks don't need such government
meddling. There are many many businesses and individuals who have a
real interest in helping both camps correct problems. The real problem
is the proprietary folks seem a whole lot less willing to do the most that
they can to fix them.

---
You can tuna piano but you can't tune a fish.

[ Reply to This | # ]

Users or Capabilities
Authored by: Anonymous on Saturday, July 15 2006 @ 11:00 AM EDT
The malware problem isn't the user per se, and it isn't the software per se; it's that the software requires a higher degree of knowledge than the users possess, and more time than the users have to apply to the administration of their computers.

And that's true for all the major OSs. I'm running Ubuntu, Suse, Windows and Mac. The Mac is the easiest to administer, but even it is too much for my mother.

No, the Internet won't be safer until those without a clue are made to use computers that can't be used to harm others.

(Posted using Ubuntu)

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: jmac880n on Saturday, July 15 2006 @ 11:01 AM EDT

if folks want to use the Internet for things that put a strain on everything, essentially for their own business benefit, then shouldn't they be the ones to pay to build it out or build their own, instead of negatively affecting everyone else's experience on what is surely as much a public commons as a public park or a river or the air? Are movies important enough to alter the very functionality of the Internet?

It seems to me that my net connection came advertising a certain bandwidth - with no limitations other than a fairly reasonable Acceptable Use Policy.

In other words, I have already paid for the privilege of impacting the rest of the net. And so have the sites that I visit, when they pay for their fat pipes.

If the telecoms now want more, then it strikes me as poor planning, greed, or panic about competition from alternative business models (VOIP).

What makes the Internet such an incredible tool is that you can use it for so many different things (many of which have not even been developed - or even thought of - yet). A lot of people and companies justify their net connection's cost on the basis of the useful things that you can do now. The ability to innovate on the network comes as just another perq.

If the net is segregated, then I believe that many fewer people would be able to attach to the developmental side, making it much more difficult to actually prove the value of new, innovative applications.

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: Anonymous on Saturday, July 15 2006 @ 11:05 AM EDT
This is going to sound rather blunt, but I'm afraid the good professor referred to above is clueless about the internet. I can't say how much she/he knows about computers in general, but if that is anything to go by, some serious reading is called for.

The Internet started out life as a military tool. It morphed into an academic network that became popular as a general method of information sharing. 'Business' only discovered the internet very late.

Evolution on the internet is very fast, even by computer standards. If it works, people will flock to it. Think of eBay, Groklaw, Craigslist and others.

Suggesting that regulation would 'help' is simply daft. No one could have predicted the popularity of Groklaw. It simply filled a niche that was needed. No regulator would have dreamed that one up.

If there were to be a 'business only' internet, it would go bust very fast. The internet isn't a typical media stream where space is precious. The internet has a world wide reach where a 'Mom and Pop' store can compete - in theory - with anyone. Some have done so very successfully. Limiting your users to a subnet would simply reduce your throughput by orders of magnitude.

This is not to say that there is not a place for VPNs - virtual private networks. They do have their uses. But by definition these are limited to those allowed access to them. If you only have (say) 10 customers, then a VPN will work for you. If you are (say) Dell, you have a lot more, and a VPN won't work; it will simply be ignored.

If there is a problem with malware, then the correct approach is to fix the software it targets. Bug fixing in Linux is a BIG business and occurs very fast. There is very little malware for Linux. This isn't simply because Windows is more popular - although that may contribute - but because 'security by obscurity' is an intrinsically flawed concept.

If someone really, really wanted the Coke formula today, they would go off to a shop, buy a bottle of the black stuff, and spend a few weeks with a mass spec and a set of chemical databases. Reverse engineering this formula isn't hard any more - just very tedious.

Virtually anything that is in common use today can be reverse engineered, given enough time and eyeballs. Computer SW is no exception. The question is, why do they do this?

Some do this because it's fun. Others because they think they can make money out of it. Still others because they are paid to do it - the military are interested in these things.

On a practical level, if the prof in question thinks that regulators can draw up enough rules to make the system safe, then why is he in business? Are there not surely enough rules governing human behavior, acquired over centuries of practice? The mere fact that s/he is still teaching law proves the point: there comes a point where it is close to impossible to regulate behavior like this.

There are also international considerations. Whatever regs one country sets up, all others have to agree to. That will take years. In the meantime the internet will have moved on.

In summary then

(1) a subnet is a bad idea that would end up dying of lack of use. MS tried this before adopting TCP/IP.

(2) regulations should be as light as practical and should be for guidance only rather than prescriptive. Anything else won't work or will simply be ignored.

(3) work with the existing tech bodies. The W3C has done a fairly decent job of regulating what can and cannot be done on the internet so far. Not perfect, but not bad.

--

MadScientist

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: Anonymous on Saturday, July 15 2006 @ 11:07 AM EDT
I think the solution to the issue of internet security is simple: stop telling people it's safe. We are in this strange place where the government is in fear of cybercrime, but still says go ahead and put your banking details online. I think we need to acknowledge it's a jungle out here, and unless you understand that there are things like phishing, understand what a security certificate is and what it means, and know that email from out of the blue should not be trusted, then you shouldn't be using the web for anything important like finances. While I agree that Mac and Linux are safer, I think it's got as much to do with the fact that the system forces you to understand things like user permissions and be generally computer literate to use it. These operating systems are also vulnerable to attack, but the users are generally smart enough to understand that. I.e., you don't open random attachments as root, PJ; Windows users often do the equivalent.

[ Reply to This | # ]

Monoculture and Design: both are problems
Authored by: joef on Saturday, July 15 2006 @ 11:28 AM EDT
I believe the problem has two aspects that, taken together, have created an
extremely hostile environment that we call The Internet.

1. The MS set of philosophies leads to high vulnerability in a given instance
of a computer running their offerings. I call theirs a "set" of
philosophies, because they aren't consistent at any one time, nor are they
consistent over time. As a result they are regularly faced with dilemmas of
their own making. Example: whether to preserve backward compatibility of their new offerings or to force upgrades on unwilling customers. All of their product line is tightly coupled to the operating system's basic functions so as to provide the convenience of an automatically invoked application, yet the vast majority of exploited security vulnerabilities are based on just these "convenience" features. An aggravating factor is that its genetic ancestry was never intended to face more security risks than one would find in a locked office. Its evolution since then hasn't yet rid the system of
all of these weak genes. There is obviously a lot more to it than described
here, but this certainly sets the stage.

2. The "Monoculture" aspect of the problem indeed compounds it
thoroughly. It creates so many vulnerable targets for a given vile act that
there is a lot of profit in acting, and and in doing the research necessary to
discover yet another way to perform a vile act. And given the poor design and
practices in the MS code, it is a very rewarding activity.


Regarding the larger issue of regulation, I believe it is inevitable. There are too many competing interests, and at some level there will always be a constraint that becomes the nexus of a conflict of interests. Given what the world of realpolitik has become, it is naive to assume that those who write the laws are going to look out for the interests of the general public. Maybe we'll just have to decide for ourselves whose special-interest bed we will crawl into.

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: toads_for_all on Saturday, July 15 2006 @ 11:41 AM EDT
"Ballmer spent the better part of the next two days trying to rid this PC of worms, viruses, spyware, malware, severe fragmentation, and well, you name it."

One has to wonder if incidents like this might be behind the delays in releasing Vista. Maybe they're finally "getting it".

"...but I don't believe either could ever be as unsafe as Microsoft Windows, because they are designed differently,..."

But that is not to say that Windows, given time and several large cluesticks, could not be safer. At one time there was talk that MS developers were advocating a different approach to building Vista, by using a "Linux" development model. I can't seem to find a link to it right now. Anyway, I believe MS could produce a more secure version of Windows, but they would have to start over from scratch, lose a lot of the fluff, and treat Office, Internet Explorer, and Media Player like third-party apps that don't have access to "root". To me, one of the biggest security problems has been not isolating apps from the OS.

"How do you answer his firmly held belief that the problem is a matter of popularity, not design?"

IMHO, in a way it's both. If the Windows/Linux user ratio were reversed, Windows would still have malware, but the problem would not be perceived as being as "bad" because fewer users would be affected. The high number of Windows users makes it a bigger problem. Which is why I believe that, just as any company has an obligation to make its products safe, MS has an obligation to make Windows more secure.


Maybe the CPSC should start looking into MS?

[ Reply to This | # ]

Using multicast mbone
Authored by: Anonymous on Saturday, July 15 2006 @ 11:45 AM EDT
From PJ article:
" And while I stay away from politics, my logical mind asks this question: if folks want to use the Internet for things that put a strain on everything, essentially for their own business benefit, then shouldn't they be the ones to pay to build it out or build their own, instead of negatively affecting everyone else's experience on what is surely as much a public commons as a public park or a river or the air? Are movies important enough to alter the very functionality of the Internet?
When I read Dr. Yen's page, I wondered: is there something else we could try to scale?
"

It is hard to respond to something expressed as generally as this, but it caused me to think of something very specific:

That is this (this is a tech view of things):
As an example, if movie distribution on the internet means one pipe (controlled by DRM or whatever) for each view of a movie, it will be very wasteful of bandwidth. A much better approach would be local caching, much more along the lines of some of the html proxy servers (or the dns caching system, except on a larger scale). Imagine 200 or so people from one local isp requesting the same movie from across the country (read: long distance, or more than 3-5 hops). If there is one pipe per connection, then there is 200x traffic on most of the path that is not necessary. Most of the path should be shared and then split up to 200 at the isp level (or one or two machine hops away).
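(Editorial aside, not from the original comment: rough numbers for that 200-viewer example, with the movie size and hop counts as purely illustrative assumptions.)

    # Back-of-the-envelope comparison of per-viewer pipes vs. a local cache.
    movie_gb = 4.0          # assumed size of one movie download
    viewers = 200           # viewers behind one local ISP
    long_haul_hops = 10     # assumed hops across the backbone
    local_hops = 2          # hops from the ISP's cache to each viewer

    # One pipe per viewer: every copy crosses the whole path.
    naive = movie_gb * viewers * (long_haul_hops + local_hops)

    # Shared path plus a local cache: one copy crosses the backbone,
    # then 200 copies travel only the last couple of hops.
    cached = movie_gb * long_haul_hops + movie_gb * viewers * local_hops

    print(f"per-viewer pipes: {naive:,.0f} GB-hops")
    print(f"local cache:      {cached:,.0f} GB-hops")
    print(f"reduction:        {100 * (1 - cached / naive):.0f}%")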

I did a simple google search to see if I could find someone else thinking along the same lines as I was, and (while not an exact fit) I found these: Bbone or Bbone summary.

This would be a service that would make sense on mbone, or a multicast backbone. I would think users of the service would schedule their downloads, and local caches might keep popular movies in cache, so there would not be the need for a lot of cross-internet transfers. (If the service were really popular, then download times might not be longer than they would be from an overloaded server -- maybe wishful thinking here.) For a description of the basic idea behind mbone, see Multicasting and the Mbone. (That page has a link at the end saying you can watch old 'Scooby Doo' over mbone today. I have not tried it, so I do not know how good it is.)

The trick is to prevent rebroadcasting of identical information. I hope this helps with a tech perspective on this. (This tech perspective does not help with identifying what would be required for drm or any other 'control' tech. I would think content could be sent pgp (pretty good privacy) encoded and the keys could be sent over the 'control' network. This would get most of the benefit with very little of the overhead involved in drm. However, for something like broadcast TV the DRM stuff would not be necessary, and the pgp would not be needed either :) .)

If there is anything patentable in this idea, I am publishing it now. (So only I can patent it within 1 year in the US, anyway. I do not think it would be hard to get a pledge from me not to file a patent on it. Now, does publishing on Groklaw put this into prior art, so someone else does not try to patent it? (If you cannot tell, I do like first-to-invent in the US.))

a florida resident.

[ Reply to This | # ]

Firebreaks
Authored by: rsteinmetz70112 on Saturday, July 15 2006 @ 11:58 AM EDT
Having a diverse ecosystem of different operating systems would act as a "firebreak" for any particular exploit. If a particular exploit were to work on some version of some package in Linux/Unix/BSD/Mac/whatever, then other versions of Linux/Unix/BSD/Mac/whatever would not be vulnerable.



---
Rsteinmetz - IANAL therefore my opinions are illegal.

"I could be wrong now, but I don't think so."
Randy Newman - The Title Theme from Monk

[ Reply to This | # ]

Complex Problem
Authored by: tyche on Saturday, July 15 2006 @ 12:00 PM EDT
PJ,

There seems to be a mixture of causative situations associated with the problem
of insecurity on the internet.
1.) Monoculture - It has been noted, on various programs and movies that I've
seen concerning wildlife, that predators tend to go after the weakest elements
in a herd. The sick, old, very young and injured are prime targets. Crackers
may be termed predatory programmers, and the Microsoft operating system appears
to be the weakest member of the "herd". This makes it a prime target
to those who would do harm.
2.) Business of selling - Otherwise known as "those who wish to control". When I was producing pages for a web site I continually had to tell my boss that if you put it out there, it'll get taken. Post nothing that you are unwilling to lose. It is human nature to take that which is put out in an unguarded place. Putting movies, pictures, books and/or articles on the net causes them to be an "attractive nuisance", much like putting a swimming pool in your back yard and expecting the neighborhood children to stay out of it. "Those who wish to control" are predators, as much as any cracker could be. In this case, instead of direct destruction, they want to restrict the ability of others to communicate in order to make money - feed off the herd.
3.) The carrier ("backbone") that provides the connections. Recently
telephone companies have lobbied to be able to charge more for certain services.
A sub-set of "those who wish to control", they are also predators.
They, too, wish to feed off the herd without incurring the displeasure of other
predators (who might have bigger teeth).

When predators breed out of control, the herd suffers and/or dies off and/or
moves away. Then the predators die off or move away. Basic wildlife management
imposes restrictions on the levels of predators to prey. I would propose that
the best "security" for the internet is to impose such restrictions on
the predators, rather than allowing them to indiscriminately prey on the prey.

As a human being, one has a choice of whether to be the predator or the prey.

Craig
Tyche

---
"Do not meddle in the affairs of wizards, espicially simian ones. They're not
all that subtle."
"Lords and Ladies", Terry Pratchett

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: Anonymous on Saturday, July 15 2006 @ 12:12 PM EDT
Like many others posting I have to say I don't think much of your academic friend's ideas.

For starters, just what is meant by "an internet for business"? Is it just for business domains, or does it include anyone who has not taken out a licence to experiment? The latter version means that all of those home users whose machines have been trojanned because they have no idea how to keep a machine secure (or, more likely, simply don't care) will be on the 'business domain'; how does he plan to keep their machines secure? The former includes a lot of small businesses who tend to have the same lack of clue about the responsibilities of running a computer. The only way I can see of dealing with this problem is to insist (on pain of death or something similarly nasty) that all machines are checked regularly by a registered professional - an idea that will go down with the majority about as well as a plutonium brick.

While monocultures are part of the problem, they are not the whole of it. Another important part is the attitude that computers can be made suitable for anyone to use. Yeah, Bill G invented that idea, but these days it's being pushed by a lot of populist politicians, including the famous Tony B Liar. I don't know what's happening across the pond, but the UK still regularly gets treated to a load of garbage (I'd like to use much stronger words) about the 'digital divide' by those seeking the popular vote. Until the latter get it into their heads that a lot of people should never be let loose with a computer, we are going to have problems. Unfortunately, using a computer is still seen as requiring a less responsible attitude than driving a road vehicle.

Incidentally, the idea of insisting that some form of examination be passed before one could put a computer on the internet was touted about 2 years ago in the British Computer Society members' journal - by a senior figure of Microsoft UK, no less!

[ Reply to This | # ]

Kindly point that professor...
Authored by: The_Pirate on Saturday, July 15 2006 @ 12:16 PM EDT
...to http://news.netcraft.com/ ,and ask him to read the Web Server Survey. And
check a few servers.

Every single virus or worm that has 'paralyzed the Internet' has been running on M$. If popularity==risk, then how come nobody can kill the Apache servers? On the Internet, F/OSS is the big player.

I cannot help thinking this professor seems rather biased - or maybe he is just attempting to provoke? Who is funding him?

[ Reply to This | # ]

Your viewpoint is technically naive
Authored by: Anonymous on Saturday, July 15 2006 @ 12:24 PM EDT
Unix is not immune. A Unix virus was demonstrated long before Linux even
existed. It will propagate more slowly on a well administered system than in
the Windows world. I doubt however, that many personal Linux systems would
qualify as "well administered".

Your browser runs in the space controlled by your UID and malware can easily
take over everything you have privileges for. Every browser I've tried has been
maddeningly buggy. After penetrating the user account, it wouldn't be that hard
to boost privileges to root (e.g. when the user invokes su(1) ) and take over
the entire system. This is obvious to anyone w/ significant system admin
skills.

You don't see as severe a problem in the Linux world because the user base is
not as large. This makes it a less attractive target. Also, despite rampant
complexity, Linux is still not as complex as Windows.

Andy Tanenbaum is addressing the underlying technical issues in Minix 3.
Linux won't ever get there because Linus has a different objective. There's
nothing wrong with that, but it is a limitation.

Real security is very much harder than most ever imagine. For a better grasp
read Ken Thompson's Turing award paper, "Reflections On Trusting
Trust". Then ask yourself what would happen if a tainted compiler got
placed in a popular Linux distribution.

As noted by another writer, without expertise one only has a personal opinion, which is generally of dubious correctness. In my case I have 20+ years of programming and system administration experience w/ over a dozen operating systems (all Linux & Unix variants counted together as one OS).

rhb

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: fozzy on Saturday, July 15 2006 @ 12:56 PM EDT
I think the best counter to the "Linux isn't attacked because of its low market share" FUD is the monthly Netcraft webserver surveys:

http://news.netcraft.com/archives/web_server_survey.html

Apache is currently running 63% of all webservers, compared to IIS at 30%. Heard of many Apache attacks of late?

Another piece of FUD to the dustbin.

--
Fozzy

[ Reply to This | # ]

Congestion
Authored by: Anonymous on Saturday, July 15 2006 @ 01:37 PM EDT
If the Internet is getting crowded then there is one way of clearing a lot of
space. Clobber junk mail hard.

Take out the viruses, junk etc at the internet level. If mail servers responded
by not sending this stuff then DDOS attacks, botnets, spam etc etc would get
clobbered.

We cannot make all users take all the precautions they need for many reasons.
Perhaps this needs to be taken to a higher level.

The question would be how to control it so it was not used for censorship.

Tufty

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: Jude on Saturday, July 15 2006 @ 01:44 PM EDT
> And while I stay away from politics, my logical mind asks
> this question: if folks want to use the Internet for
> things that put a strain on everything, essentially for
> their own business benefit, then shouldn't they be the
> ones to pay to build it out or build their own, instead
> of negatively affecting everyone else's experience on what
> is surely as much a public commons as a public park or a
> river or the air?

If I am watching a streaming movie that I bought from $CONTENT_PROVIDER, who
exactly is using the bandwidth? The idea of charging the provider more for
"clogging the internet" makes no sense to me, because the usage is
determined by the customers, not the provider. In any case, any extra charges
to the provider will just get passed to the customers anyway, so what's the
point of charging the provider?

What I think is particularly evil about the "charge the business"
approach is that it not only hides the cost of the bandwidth usage, but it also
can lead to telcos collecting extra money from people who are not even their
customers.

Imagine that (for example) $TELCO decided to charge iTunes for giving $TELCO
customers decent connections to iTunes. If iTunes doesn't pay, they risk losing
the business of people who get ISP service from $TELCO because of slow or
unreliable downloads. OK, so where does iTunes get the money to pay $TELCO? If
they charge higher prices ONLY to people who connect from $TELCO IP's, they
probably annoy the $TELCO customers just as much as poor connections would, and
they lose the business anyway. So instead, they raise their prices to
everybody, which effectively means that iTunes customers who do not use $TELCO
ISP service end up paying $TELCO anyway. That is hardly fair.

Also, look at who is most anxious to start delivering video over the internet:
The telcos who want to compete with cable companies. The problem for the
telcos is that IPTV customers will use huge amounts of bandwidth, but if they
try to make those customers pay for the bandwidth the price will be too high. I
think the telcos are trying to get competing content providers to subsidize the
infrastructure improvements the telcos need to do IPTV.

[ Reply to This | # ]

Two Internets
Authored by: PolR on Saturday, July 15 2006 @ 02:07 PM EDT
This is such a ridiculous idea. But I understand its appeal. I spend my days at work struggling with people defending this same kind of theory to keep some critical applications safe from security threats. There is a category of persons that thinks it would be nice if we could have all the rogue software in a ghetto and convince it to obediently stay there.

One problem is how many computers people will have. One for safe business and one for the "innovative" but risky applications? Because if you use the same computer for both networks, they are not really separate. Once the computer is infected, it is infected for both networks.

Suppose that by any chance you convince people to have separate computers; won't they need to transfer data from one computer to another? Again, if you connect the computers, you no longer have separate networks.

Finally, how do you make sure you can separate business from other applications? Do you really think malware will stay obediently on one side and never invade the other? Malware is after money. Spam, spyware, phishing, they are all parasites that exist for the sole purpose of supporting someone's sick idea of a business model.

Security by regulation. Ask the RIAA and MPAA how well this worked for them.

We already have plenty of regulations against malware. That didn't stop Sony from implementing rootkits.

[ Reply to This | # ]

Two Internet levels sounds like a Microsoft dream
Authored by: Anonymous on Saturday, July 15 2006 @ 02:16 PM EDT
Why not just charge Microsoft (CASH) for any security breach that occurs because Windows is insecure and was easily compromised?

A fiasco with CREDIT CARDS comes to mind.

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: Anonymous on Saturday, July 15 2006 @ 02:19 PM EDT
>> The professor suggests having two Internets, one for business, which
will be seriously locked down to keep it "safe" for business, where
users have very few options,

What does "locked down" mean? What is "business"? What are
"options"?

The Internet is a communication protocol which allows information to be transported from one end to the other. The problem is how that information is processed. Having a new network will not change how the information transmitted on it is processed, as far as I can tell.

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: sgtrock on Saturday, July 15 2006 @ 02:39 PM EDT
PJ, you and the law professor are working from the false premise that we are
running out of bandwidth due to economic factors. I have worked on data
communications systems and network communications systems of various flavors for
almost 30 years. Over the years I've been a tech, designer, installer,
superviser, project manager, and architect, so I've seen those systems from all
sorts of angles. During my career, I've seen the cost of moving bits go from
several hundred dollars a month for 110 baud teletype links to less than fifty
dollars to move 1.5 Mb/s to my home. Korea, Japan, and elsewhere are hooking up
100 Mb/s to their houses for less than what it costs me. That's roughly 8
orders of magnitude decrease in cost to deliver the same amount of traffic that
I saw at the beginning of my career.

We don't have a bandwidth problem because of either economic or technical
factors. We have an attempt to control traffic by people who would benefit from
restricting us from utilizing the bandwidth as we see fit. Anyone who thinks
otherwise hasn't really looked at who is pushing for restrictions and/or
additional costs for certain customers or traffic.

In my experience, ANY attempt whatsoever to divide or inhibit the flow of
information is ultimately doomed. Why? Because as soon as you divide the
traffic in an attempt to segregate it, people and organizations will gravitate
toward any solution which provides them with more freedom to do what they want.
If SBC, Time Warner, or whoever else attempts to control traffic, another, freer
solution will inevitably show up. If Congress is stupid enough to
legislate some sort of restricted or segregated Internet, those of us in the US
will suffer. The traffic will simply move elsewhere, and everyone else will
benefit. :)

[ Reply to This | # ]

Linux is still vulnerable: Much malware is OS-independent
Authored by: McMartin on Saturday, July 15 2006 @ 03:17 PM EDT
> I can open up attachments in email and click on stuff and nothing ever
> happens. Why is that? Because the operating systems are built for normal users
> to be able to use them relatively safely.

I'd say this is actually because you got very, very lucky. JavaScript-based
worms are quite common, and cross-site scripting attacks (usually as parts
of identity-theft attacks on banks) have cost millions. Neither touches the OS
level at all; they rely on *conforming* JavaScript implementations in the
browsers. (The cross-site scripting attacks are typically exploiting bugs in the
business-specific code on the server.)

Running Firefox -- even on Linux -- would not have protected you from the
MySpace worm. Your system would have contributed to what was essentially a DDoS
attack against them, and your MySpace account would have been corrupted just as
much as any Windows user's.

Clicking unknown links is *never* safe, and it's swiftly getting to the point
where visiting unknown pages isn't, either. At minimum, restart your browser
instance before doing anything requiring personally identifiable information.

People around here like to call anyone who says Linux is unsafe a troll -- and in
fact already have in the comments above. In reference to Knuth, no less. I
invite those who consider *this* trolling to think for two seconds about
universal JavaScript compliance, and what that means in terms of
"monoculture."

(Or, for that matter, HTML compliance. But since people aren't blaming Windows
for people falling for old-school phishing and 419 scams, it's not directly
relevant here. Obviously, scams relying on the gullibility of the recipient are
OS-independent. The point I'm trying to make here is that Linux doesn't protect
you from scams and hacks relying only on HTML and JavaScript compliance -- and
this class of security vulnerabilities, such as Web-interfaced SQL injection and
XSS resulting in identity theft, has become vastly more important in the past
few years.)

[ Reply to This | # ]

Dire? Sez who?
Authored by: fb on Saturday, July 15 2006 @ 03:25 PM EDT

I question the entire frame of the discussion. Security issues becoming so dire? Massive regional power outages are a significant threat, too. Shall we have government regulations mandating backup power supplies?

A vague premonition of disaster is a bad foundation for major, innovative changes in policy affecting everyone. This is one case where I'd like to see the lawyers weighing in last.

[ Reply to This | # ]

  • Dire? Sez who? - Authored by: Anonymous on Sunday, July 16 2006 @ 10:39 AM EDT
The 'movie' internet is already here
Authored by: Anonymous on Saturday, July 15 2006 @ 03:49 PM EDT
It's called cable TV (or satellite). Why do they need the internet other than for
the consumer uplink?

Philip

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: Anonymous on Saturday, July 15 2006 @ 03:57 PM EDT

The largest problem we face as a society at the moment is persuading children that they should and can learn to program computers. (If no one learns, there will be no one to fix the inevitable defects.)

A regulatory approach will squash people's desire to learn ... if there are circumstances where you can be punished by the state for programming a computer, my employer will probably tell me not to do it ...

Besides, a computer with its electricity bill paid and its Internet access bill paid is a 'resource'. One hopes that it will do its bill-payer's bidding, but if the bill-payer does not know what it is capable of and how to control it, then it is likely to be taken over by someone else who does know these things and abused for their purposes.

So I think all we can do is forgive, learn, and understand.

I know how they work; I can keep mine safe, and do the same for my kids' schools and any charities I give time to; and my employer can hire me out to keep his clients safe if they pay his price. We're good at taking responsibility for stuff.

It's not just Microsoft Windows. It's fundamental: you get so many identical systems on the Internet, and their owners don't understand them or have anyone to outsource responsibility to.

[ Reply to This | # ]

Security and Microsoft internal politics
Authored by: Anonymous on Saturday, July 15 2006 @ 04:21 PM EDT
Microsoft is currently engaged in an immense internal political battle. The
results of that battle will determine how Microsoft repositions itself for the
future and includes how Microsoft will solve the security problem and the
development problems that are currently overwhelming Microsoft.

The repositioning goes through two phases. First there is an internal political
battle. The people in that battle form political factions based on ideology.
Most business people would use the term "business plan" instead of
ideology, but I will stick with ideology. The ideologies in this case are the
various ideas on how Microsoft will operate in the future. All during the
political infighting people are jockeying for position within their faction so
that they will get as powerful a position as possible when their faction,
hopefully, destroys the other factions.

When the battle is over then the winning ideology is adopted. The winning
ideology is that of the best political infighters. The winning ideology is not
necessarily the best ideology so choosing long range strategy by political
infighting is a very chancy way to do business.

Ray Ozzie has been a spectacular success at Microsoft political infighting. He
joined the company as an outsider spare part in an acquisition. In a very short
period of time he politicked his way into being Bill Gates' number one yes man,
ahead of Steve Ballmer. Concurrently Bill Gates was (and still is) under a lot
of pressure to get completely out of Microsoft. Ray Ozzie managed to become
simultaneously the leader of the opposition and to convince Bill Gates that
making Ray Ozzie the Microsoft number two man would deflect criticism from Bill
Gates.

Bill Gates knows that his time is up but emotionally he just cannot face the
fact. This is common among powerful men. The only powerful man that I have
ever seen face his imminent fall from power with grace and dignity was Pierre
Trudeau when he realized that his political career would be over in the next
Canadian election. Instead of reacting with the grace of Pierre Trudeau, Bill
Gates has reacted with the anger and unreasonable stubbornness of Lyndon
Johnson. When Lyndon Johnson realized that his political career was over
because of the Viet Nam war issue he promised to give the 1968 Democratic
presidential nomination to Hubert Humphrey provided that Hubert Humphrey would
give continued unwavering support to Lyndon Johnson's failed Viet Nam war
policy. Hubert Humphrey thus obtained the Democratic nomination in the 1968
presidential race shackled with an issue that made his defeat certain.

In his early days Bill Gates created an interpreted BASIC compiler that was
useful for student exercises when training programmers. Bill Gates is very
proud of his BASIC compiler. Microsoft has invested a lot of money in changing
the interpreted BASIC compiler into an interpreted C compiler. They have added
an IDE. The result, called .NET, is a much improved interpreted compiler that
is useful for student exercises when training programmers. In Bill Gates' mind
.NET will be combined with another of Bill Gates long cherished ideas, running
programs from the network on a per demand basis, into the next broad based
computing platform called Windows Live in Bill Gates' fantasy world. Running
programs from the network on a per-demand basis stumbles over the formidable
obstacles of performance and security, but fifteen years of abysmal Windows
security has not yet beaten that lesson into Bill Gates' head. So Microsoft is
spending an excessive amount of development money creating an interpreted
compiler that is useful for student exercises when training programmers and not
much else. Part of the price that Ray Ozzie had to pay in his rapid climb to
power is that he had to fervently commit to Windows Live. Bill Gates is not
going to leave Microsoft until Ray Ozzie vindicates Bill Gates by making Windows
Live the next universal computing platform. Unfortunately for Microsoft and Ray
Ozzie Windows Live will never be any more than .NET.

Microsoft development is dead in the water. To get out of their rut Microsoft
will have to get rid of Bill Gates entirely. He cannot be Chief Software
Architect. He cannot be Chairman of the Board. He cannot sit on the Board of
Directors. He must be GONE.

Now for an aside about Steve Ballmer. The only thing that supports Steve
Ballmer as President of Microsoft is Bill Gates. Steve Ballmer is incompetent.
He has no loyalty from any of the people working for him. He has no support on
the Board of Directors other than Bill Gates. The shareholders detest him.
When Bill Gates leaves Microsoft then Steve Ballmer will be right behind him.

Now back to solving the Microsoft ideology problem. When Bill Gates goes the
mother of all political infighting will come to a head. Which political faction
wins has nothing to do with the correctness of their ideology. Microsoft could
go through a period of worse mismanagement than is the current situation with
the Gates-Ballmer team. I would like to introduce some ideas from Lou
Gerstner's book, "Who Says Elephants Can't Dance". "Who Says
Elephants Can't Dance" is Lou Gerstner's autobiographical account of how he
managed the famous IBM turnaround. Now the situation at Microsoft and the
situation that Lou Gerstner faced at IBM are very different so only some of Lou
Gerstner's ideas are applicable to the Microsoft problem. I don't advocate
slavishly following "Who Says Elephants Can't Dance" as Bill Hilf
seems to be advocating. That gets into solutions looking for problems that
Microsoft doesn't have. The first idea from Lou Gerstner's book is that the
next leader of Microsoft cannot be one of the participants in the Microsoft
political infighting. Microsoft must look for a candidate turnaround leader
from outside Microsoft. Neither Bill Gates nor Steve Ballmer can have anything
to do with the selection process. The new outsider should be installed as
President and CEO of Microsoft after Steve Ballmer is booted so that the
newcomer does not have to fight the battle of ejecting his predecessor.

The second idea was given to Lou Gerstner by his brother who happened to work
for IBM. The idea is that the new leader must not allow political infighting to
determine ideology. The new leader must not reward political infighting. So
where do the new ideas come from if they are not presented as part of the
executive turf wars? Lou Gerstner found a treasure trove of valuable ideas in
the TJ Watson Research Center. Google has two programs to encourage innovative
ideas among the rank and file. Google allows their employees to spend twenty
per cent of their time working on their own ideas and Google hires students for
a Summer of Code where the students work on their own ideas. I don't think that
Microsoft has any equivalent to the IBM TJ Watson Research Center or Google's
paid employee innovation time. Microsoft has always been convinced that Bill
Gates was genius enough to design the future of world technology by himself. I
suggest that Microsoft look for new ideas among the lower ranks of Microsoft
staff. What would be the winning ideas? I don't know. It is conceivable that
the current Microsoft method would work with a genius in charge and Microsoft is
trumpeting Ray Ozzie as the required genius. It will be 10 years before
Microsoft knows whether Ray Ozzie is a genius or a flop. It would be a much
safer bet for Microsoft to turn all of their employees loose to do their
own thing and let the marketplace winnow out the geniuses from the flops. It
would be much more economical for Microsoft to add resources to the winners and
bury the losers with no regrets, just like Open Source does.


--------------------
Steve Stites

[ Reply to This | # ]

You will never convince people like this
Authored by: Anonymous on Saturday, July 15 2006 @ 04:26 PM EDT
Back in 1994 I attended the 4th International Conference on Computers, Freedom
and Privacy (CFP). At that time there were people in attendance advocating the
mandatory licensing of all programmers "because properly functioning
software is too important in our lives. Why, just anybody can write software
that goes into aircraft control systems!".

Today's Dr. Yen is no different. You could point out the disproportion between
MS vulnerabilities and other OSes' vulnerabilities, in comparison with the ratio
of users. You could shift the argument and point out the same ratio in
web servers, or firewall appliances from various manufacturers, or whatever. In
almost all cases I can think of, you won't get nearly the same ratio of user base
to malware. But it won't matter. Some people just hunger for a solution imposed
from the top. Usually, those people see themselves as being the ones doing the
imposing.

Frankly, the burden of proof is on him, not us. He's the one who wants to make
big, sweeping changes.

Regards,
Karl O. Pinc kop aatt meme dott com.

[ Reply to This | # ]

The Problem Wears Many Coats
Authored by: Bill The Cat on Saturday, July 15 2006 @ 08:07 PM EDT
Software hacking (both good and bad) has evolved over the decades since the
first personal computer. What is also happening is that single country systems
are being rapidly replaced with international systems. Filtering and checking
input before passing to functions is a much more difficult process than it was
in the days of ASCII and bit counters.

Often it is necessary to break a system to make it work. Take, for example,
Linux. I recently had to install a Russian language system here in America.
Not a problem really, except that I had to break the password controls to make it
work. Yes, I could switch languages and keyboards and tell the OS that I wanted
Russian Cyrillic, but the password file had to remain in English ASCII! So,
here's a system installed for Russian, with a Russian keyboard and Russian strings
everywhere, but you need an English keyboard and UI to log in! The password
file would not accept Cyrillic Unicode character sets at all. I tried SUSE, Red
Hat, and some others. Each had the same limitation.

Open Source to the Rescue! Because I had access to the source, I could make the
necessary changes to the software so that it would accept non-ASCII input, test,
validate and store to the password and shadow files. While I was at it, I
changed some other things to Unicode too.

The problem was that a lot of what was there to prevent unauthorized access was
now broken and, I'm still trying to secure the system again. Why an OS is
designed to be installed in different languages but still requires English to
login is beyond me but my client is happy with the progress so far.

The bottom line is that having access to the source written by somebody else is
a challenge to adapt to unique situations. It is extremely easy to break some
things while trying to fix a single element. So, while Linux may be secure (yes
there have been patches and security warnings), what you do with it may break a
lot of that in the process.


---
Bill Catz

[ Reply to This | # ]

an ordinary joe loses patience...
Authored by: Latesigner on Saturday, July 15 2006 @ 08:15 PM EDT
...or, save a tree, kill a professor.
They're not the problem and they're getting really tired of guys like him.
Does he have any idea how offensive he is?
I would have thought one William Shockley was enough.

---
The only way to have an "ownership" society is to make slaves of the rest of us.

[ Reply to This | # ]

Too Many Issues Jumbled Together
Authored by: LionKuntz on Saturday, July 15 2006 @ 08:23 PM EDT
There are too many distinct issues jumbled together in one topic to deal with.

The one that I am most concerned with is related to malware.

Firstly, the public premise is wrong. Because of that faulty premise malware is
able to exist, and without that premise malware is reduced to insignificance.

When I acquired this computer I went to Fry's and made purchases. I alone. Bill
Gates wasn't there, Linus wasn't there, no PJ, and the Professor wasn't there.
Nobody was standing by my side slipping me $500 for part interest in this
computer. I alone own this computer.

The operating system I installed created multiple directories and emplaced
multiple files, most with obscure names and unknown functions. I was invited to
make some prompted choices of modules and accessories, but I was not informed
what was being added to a machine I alone own.

This is the beginning of malware. No log of installations was created. I could
have created my own log of the files placed, their locations, and their
file lengths, to later detect malicious additions or modifications, but that
would serve no useful purpose, because many data modules are automatically
created (without informing me, without being logged, without asking me) on a
computer I alone own.

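What the poster describes -- a log of files placed, their locations, and their lengths -- is essentially an install-time manifest that can be diffed later. As a rough illustration only (the directory to scan and the manifest filename are arbitrary choices, not anything any operating system actually ships), a sketch in Python:

    import hashlib, json, os, sys

    def manifest(root):
        # Walk a directory tree, recording size and SHA-256 digest for every readable file.
        entries = {}
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    digest = hashlib.sha256()
                    with open(path, "rb") as f:
                        for chunk in iter(lambda: f.read(1 << 20), b""):
                            digest.update(chunk)
                    entries[path] = {"size": os.path.getsize(path),
                                     "sha256": digest.hexdigest()}
                except OSError:
                    pass  # unreadable files are simply skipped in this sketch
        return entries

    def diff(old, new):
        # Report what was added, removed, or modified since the last snapshot.
        added = sorted(set(new) - set(old))
        removed = sorted(set(old) - set(new))
        changed = sorted(p for p in set(old) & set(new) if old[p] != new[p])
        return added, removed, changed

    if __name__ == "__main__":
        root, logfile = sys.argv[1], sys.argv[2]   # e.g. /usr manifest.json
        current = manifest(root)
        if os.path.exists(logfile):
            with open(logfile) as f:
                previous = json.load(f)
            for label, paths in zip(("added", "removed", "changed"), diff(previous, current)):
                for p in paths:
                    print(label, p)
        with open(logfile, "w") as f:
            json.dump(current, f, indent=1)

Nothing in that detects an infection by itself; it only makes unannounced changes visible, which is the poster's complaint.
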
Over time I am riding herd on 60 megabytes of file data on my hard drive. 90%
of it I recognize, but thousands of modules and files I never consciously put
there, and I have no log of who put them there.

Using software modifies files and creates additional data files. Sometimes the
program even has self-modifying code that records data in the executable in a
perpetually morphing file. Again, I am not notified, nor are the changes logged.

The average person doesn't want to know the details -- I don't usually want to
know the details, but the point is that I am not even permitted to know the
details in the event that I ever need to root out maliciousness.

You, the software vendor or programmer, are a guest on my computer. Yes, you
perform desired services for which I am grateful, but I alone paid for this
computer and it is mine alone. Any contract made under duress is void -- I have
no contracts with you that I need honor if you forced the terms after you
received my money. I simply disregard shrinkwrap EULAs as void in all cases.
Copyright law as applied to books applies to software -- the book publisher has
no further say over my use once he has accepted payment and transferred the
goods. When I buy it, it is 100% mine and 0% yours, that copy, from then until
forever after.

Even if I don't really want to know the gritty details, you have an obligation
to tell me anyway, maybe unobtrusively with "read.me.text" files, or
in the documentation, or by a legibly titled log file, what you are doing to the
computer I alone own. This is an on-going obligation you owe me in exchange for
the money you received and permission to become a guest on my computer.

When there is a culture which respects my ownership rights, then malware cannot
hide. When you deviate from respect for my ownership rights, you create a
culture of camouflage within which anonymous malware modules can hide and prey.

So the first step in the eradication of malware predation is a culture of disclosure
based on respect for my ownership rights. If that in some manner impinges on your
competitive posture in the marketplace, that is 100% your problem and 0% my
problem.

You owe value to customers who have already paid, not just an expectation of
reaping new customers in future competitive sales.

Your rights in guarding your proprietary information, methods, and concepts
do not exceed the obligation you incurred by offering the product for sale to
me, to be used on a computer owned 100% by me.

You perpetually relinquished certain property rights by making the sale --
that's the definition of a sale.

When there is a culture of respect, malware is not a difficult problem to detect
and to cure.


On the other subjects of the professor I have no comments.

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: philc on Saturday, July 15 2006 @ 08:26 PM EDT
There is, in my opinion, nothing wrong with a company, or any website
owner, completely controlling what is on their site: what gets added, what gets
removed, and when. As in many other aspects of life, you get to experience the
results of your actions.

Malware is a Microsoft problem. It has been around since DOS got access to the
internet, and Microsoft has never seen fit to fix it. At present malware is big
business that makes Microsoft and other anti-virus vendors billions of USD per
year. They have absolutely no motivation to reduce their revenue stream. So they
won't.

The only way that Microsoft will fix the malware problem is when their OS
business suffers more than their malware business makes. Until then nothing
(not even government regulation) will change anything. So if you want to
eliminate the malware problem, switch to Linux or MacOS or one of the BSDs and
get your friends to switch too. (Good luck -- most people don't care.)

The way to get the malware problem under control is to make the software vendors
legally responsible for the damage their products cause. Car makers get sued for
bad safety, so why don't software vendors? If Microsoft had to pay to clean up the
mess their software causes, they would fix it.

Scalable internet? It is. It always will be. There is more dark fiber (by a
lot) than in-use fiber. The dot-com era created huge (as yet untapped)
capacity. Five years ago I worked for a company that installed a coast-to-coast
OC-192c connection (capable of roughly 10 Gb/s); most of it went unused. Not
enough traffic. There are thousands of coast-to-coast fibers, and the bandwidth on
each fiber is increasing. Companies will add routers and other equipment as the
opportunity for profit presents itself.

The big problem for the internet is to keep it open to everyone. Proprietary
protocols should be banned. ISP-imposed restrictions on individual users should
also be banned.

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: rusty0101 on Saturday, July 15 2006 @ 11:13 PM EDT
I have owned, bought, and built computers running MS-DOS 4.01 (and a variety of
the later releases), Windows 3.0, 3.1, 3.1wfw, 95, 98, 2k, OS/2 2.1, 3.0, 4.0,
BeOS 4.5, 5.6, Mac OS 8.5, 9.0, 10.4, and a variety of Linux distributions.

I agree that conceptually it is not difficult to create a virus that will run
on any of those platforms. The question becomes: at what level does a virus need
to execute for it to serve its own purpose, and how easy is it to get to that
level?

My first experience with a virus was in the days of DOS 4.01. And what I
encountered was a computer virus. There are a lot of other variations of malware
out there now, some called viruses, some not.

A significant amount of the malware in existence today has nothing to do
with viruses. Adware and things like the Sony rootkit are malware that
is being distributed by people via completely legal means, and ostensibly for
completely legal reasons. That doesn't change the fact that they are a bad idea
to begin with, but being legitimate seems to make a lot of people feel much more
comfortable about doing things that many of us look upon with derision.

Does malware like the Sony rootkit fall into the area of 'reasonable' software
to the person putting together the paper? How about when it sets up the user's
system to install and run other software without the user being able to detect
its existence at all? Or when attempting to uninstall the software breaks the
system, because it has replaced drivers that the uninstall process doesn't know how to
restore?

There are side effects to requiring programmers to work only on authorized
projects as well. Most 'authorized' projects are part of a business somewhere.
Has anyone looked at the recent statistics for projects that a business
starts, then discards before completion? If some software is created as a result
of one of these projects and gets out into the wild, it is an 'authorized'
software product, but one essentially without any form of support. If there
is a bug in the software that someone later discovers, and there is no
mechanism in place for putting out updates (the project was canceled, recall, and
the programmer can no longer work on that product), then we are in no better a
position than we are in today, and potentially a worse one. At least today
someone can examine the software and write a replacement tool that provides
that functionality without the bug included. That is not true if programmers
are only allowed to work on 'approved' software.

As another consideration, it is not difficult at all to write software in
another country that runs perfectly well here, since the computers use the
same physical architecture. What happens when something an approved developer
creates is used by someone in another country, who is not an approved developer,
to do something malicious?

As an example of this, it is not particularly difficult to write a Basic script
to be run in Word or Excel that does something completely unexpected to a
system.

Likewise, macro recorders are effectively programs that create software based
upon a user's keystrokes or mouse clicks. Does every user of a computer who
might create a macro to speed up a task, like indenting every paragraph in a
story they are writing by three spaces, now have to go through some process of
becoming an approved developer, and then go through some other process to get
their macro authorized? This sounds like a lot of extra work to put on an
administrative assistant who is just trying to get their job done more efficiently.

Macro recorders and embedded scripting languages are not restricted to Windows
platforms, or even to Microsoft applications. There is a scripting language built
into OpenOffice, which has been used to create a template of a virus. Perl and
Python are often embedded within Linux applications as scripting languages for
those applications, but the ability to use either does not prevent either from
being used for activities outside of that application.

All of that said, Linux and Mac OS X are built on a substantially more
secure design, one that in most cases will provide a safer platform for users to
work within. That doesn't prevent bad things from ever happening, but it does
make it a bit more difficult for a malicious programmer to ensure that what
they are doing will remain hidden from the user for an extended period of time.

That is my opinion, and it is subject to change.

-Rusty

[ Reply to This | # ]

Simple answer: Read-only Operating Systems
Authored by: cjames on Sunday, July 16 2006 @ 01:31 AM EDT

There is an obvious answer: All operating systems should be read-only.

Imagine the world worked like this:

  • Everyone carries a bootable USB thumb drive with their personal data. Today that's 8GB; in a couple of years or so, we're talking about 100GB in your pocket.
  • This little pocket device has a read-only boot partition, and a writable, but encrypted, data partition.
  • On your boot partition, you have hash signatures of every OS and version you trust.
  • You stick your favorite OS CDROM in, boot from your USB drive, and your boot loader verifies the hashed signature of the operating system CDROM. Only if the signature matches a system that you trust does it boot. (A minimal sketch of that check follows this list.)
  • Once the OS is booted, you enter the encryption key for your data partition, and a program on the boot partition verifies the integrity of your applications (Thunderbird, Firefox, etc.).
  • If everything checks out, you're in business. A virus-free computer, guaranteed.

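As promised above, here is a minimal sketch of that boot-time check, in Python purely for illustration (a real boot loader would do this before any OS runs; the trusted-hash file and device paths are hypothetical):

    import hashlib

    def sha256_of(path, chunk=1 << 20):
        # Stream a possibly very large OS image and return its SHA-256 digest.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                digest.update(block)
        return digest.hexdigest()

    def load_trusted(path):
        # One "digest  name" pair per line, e.g. the output of sha256sum.
        trusted = {}
        with open(path) as f:
            for line in f:
                if line.strip():
                    digest, name = line.split(None, 1)
                    trusted[digest] = name.strip()
        return trusted

    trusted = load_trusted("/media/usb/trusted-systems.sha256")  # hypothetical path
    digest = sha256_of("/dev/cdrom")                             # the inserted OS medium
    if digest in trusted:
        print("Booting trusted system:", trusted[digest])
    else:
        raise SystemExit("Refusing to boot: unknown operating system image")
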
In such an environment, infections are impossible. Even if someone infects your OS, you just shut off the computer and the virus vanishes.

The genesis of this technology is already here. Lots of people are carrying around thumb drives with Thunderbird, Firefox and OpenOffice, along with all of their personal data. Many of these programs come in PC-, Mac- and Linux-compatible executables. It would take a trivial amount of space to keep digital signatures for dozens or hundreds of trusted operating systems.

Linux is already designed for this; witness Knoppix, Ubuntu, Slax, and a host of other great bootable CDs. Windows could probably do this if they tried. I'm sure the guys and gals at Apple could make a bootable CD for the Macintosh if they haven't already.

Linux is relatively secure because you have to very deliberately become the super-user before you can cause serious damage, in contrast to Windows which encourages users to always give themselves administrator permission. (In fact, most Windows programs require this, whereas with Linux you can almost always install a program in your own personal bin directory and it will work.)

Why not take this to the next logical step? Don't let anyone install anything into the operating system, period. Operating systems should always be unwritable. If you need to upgrade, you burn a new CD or DVD. What does that cost these days, ten cents?

If the PC manufacturers got on board, you could take it one step further: IDE and SATA disks with a hardware switch on your front panel that disables changes to the operating system partition. That way, you get the performance of a hard drive without the risk of a writable file system. When it's time to upgrade the OS, you boot, ensure that you're connected to the right web site, flip the write switch, and do your update. When you're through, you flip the switch back to read-only, then reboot from a CD and run a hash signature on the updated disk. If it passes, you're in business with a high-performance, but absolutely uninfectable, system.

In such a system, the boot loader would ensure compliance by refusing to boot from a writable partition. This would keep you from accidentally leaving your read/write switch in the "writable" position.

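The "refuse to boot from a writable partition" rule could be enforced with a check like the following sketch. It only inspects /proc/mounts on an already-running Linux system; how a front-panel write switch would actually be exposed to software is, of course, hypothetical hardware:

    def mounted_read_only(mountpoint="/"):
        # True if the filesystem at `mountpoint` carries the "ro" mount option.
        with open("/proc/mounts") as f:
            for line in f:
                device, where, fstype, options = line.split()[:4]
                if where == mountpoint:
                    return "ro" in options.split(",")
        raise RuntimeError("mountpoint not found: " + mountpoint)

    if not mounted_read_only("/"):
        raise SystemExit("OS partition is writable -- refusing to continue")
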
Notice that this protects against all attacks, including root kits. Since the disk is verified at the raw level by a separate OS that also boots from a read-only medium, there is no way to ever sneak anything into the OS. Period.

Craig

[ Reply to This | # ]

Net Neutrality
Authored by: Anonymous on Sunday, July 16 2006 @ 02:31 AM EDT
I see the hesitancy in addressing this issue, perhaps because you know that
those against Net Neutrality are rather unpopular online, and again perhaps
because you're not completely confident in your own understanding of it.
Whatever the case may be, I will give you my perspective on it.

First, walled gardens are not the Internet, and I capitalize that for a reason.
The Internet has value only because of everyone who connects to it. Splitting
it lessens its value. Its value lies at the edge, the users. There is no
changing this, any more than you can change that 5 is greater than 2. AOL
proved that this will fail, but the attack the telcos try now is new.

Next, I would add another word about movies "clogging the tubes," as
it were. Bandwidth does cost money, yes. But there is bandwidth and there is
latency. Large movie transfers consume bandwidth--there are just so many bits
to move, it's not critical that any given bit gets there quickly, just that all
of them get there before too long. Latency is another thing. In part, it's
restricted by the speed of light. You will never see ping times shorter than a
certain amount from the US to Japan. The laws of physics themselves absolutely
forbid it. The closest to a "cheat" on this anyone has dreamed up is
to lessen the distance by taking a shortcut through a black hole (or rather two
black holes, connected--a wormhole). Suffice it to say, there is absolutely no
reason to believe this will ever be feasible--it would require more energy than
exists in our solar system.

I note, too, that you mention "let them pay for it." We do.
Bandwidth costs money. Internet providers budget for this. ISPs simply buy
bandwidth in bulk and average the rate across their subscribers. If one
subscriber's use impacts another's, it means that the ISP under-budgeted the
bandwidth to hide the cost from their customers, i.e., to charge them less by
giving a lower-quality service and hoping that those who buy from them aren't
savvy enough to realize it. Or, possibly, they use a shared connection like
cable. But this is last-mile connectivity -- not the bandwidth between ISPs. And
we paid the telephone companies something like billions of dollars to
improve this.

So no, movies aren't clogging up the tubes. Where there are problems, it's
because they treat the quality of their offerings as "blah blah fast blah
blah dedicated" and slap any knowledgeable consumer in the face (yes,
that's as literal a quote as I can give from the Cox ad--and make no mistake, it
is a direct insult to their customers' intelligence). Cable is very fast. It
offers high burst speeds. It sucks once more than a couple in the neighborhood
actually try to take advantage of that.

Finally, the real aim here isn't actually about money. Not *directly*, that
is -- it certainly does boil down to money in the end, or rather, to being in a
position to forcibly extract it from us (something like "rent-seeking
behavior," I think). As I've already mentioned, they can and -do- charge
for bandwidth use. They can also use QoS to, say, prioritize all VOIP calls
(where latency is critical) over FTP (where latency is not critical). This is
not what they intend to do, however.

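That distinction -- draining latency-sensitive traffic first rather than blocking anyone -- can be illustrated with a toy priority scheduler. The traffic classes and packets below are invented for the example; real routers do this in hardware queues:

    import heapq
    from itertools import count

    PRIORITY = {"voip": 0, "web": 1, "ftp": 2}   # lower number drains first (illustrative classes)

    class Scheduler:
        def __init__(self):
            self._queue = []
            self._order = count()   # tie-breaker keeps FIFO order within a class

        def enqueue(self, traffic_class, packet):
            heapq.heappush(self._queue, (PRIORITY[traffic_class], next(self._order), packet))

        def dequeue(self):
            _, _, packet = heapq.heappop(self._queue)
            return packet

    s = Scheduler()
    for pkt in [("ftp", "ftp chunk 1"), ("voip", "voice frame 1"),
                ("ftp", "ftp chunk 2"), ("voip", "voice frame 2")]:
        s.enqueue(*pkt)
    while s._queue:
        print(s.dequeue())   # voice frames come out first; FTP still gets through

Prioritizing by traffic type like this is a different thing from degrading a competitor's traffic by its origin, which is the abuse being warned about next.
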
They want to control us and our use of the Internet. They want to make their
competitors' offerings suck so that they can push us into using their own.
Verisign put up SiteFinder and instantly broke many important applications, such
as spam filters (mail from domains that don't exist == spam, more often than
not). Other companies have blocked websites silently, such as the one blocking
the union website campaigning against them.

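The filter being referred to is roughly "if the sender's domain doesn't resolve, treat the mail as forged." A sketch of that check (a real filter would look at MX records; plain address resolution keeps the example dependency-free), which SiteFinder broke by making every unregistered .com and .net name resolve:

    import socket

    def domain_resolves(domain):
        # True if the domain has at least one address record.
        try:
            socket.getaddrinfo(domain, None)
            return True
        except socket.gaierror:
            return False

    def looks_forged(envelope_from):
        # Crude heuristic: a nonexistent sender domain is almost certainly forged.
        domain = envelope_from.rsplit("@", 1)[-1]
        return not domain_resolves(domain)

    print(looks_forged("someone@example.com"))                    # False: example.com resolves
    print(looks_forged("bounce@no-such-domain-xyzzy.invalid"))    # True, unless a wildcard answers
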
The new attacks are things like slowing iTunes because they want you to use
their own music service. In other words, anti-competitive measures from
near-monopolies.

We've fought incursion so far because the networks are dumb. They're *supposed*
to be dumb--they work better that way. "The Internet interprets censorship
as damage, and routes around it" is so often quoted, but they've finally
discovered the corollary: "What if censorship is built into the
Internet?"

Censorship may evoke the wrong feeling -- at least *most* of the time, they're not
going to censor something because they disagree with its ideals. Rather, they want
to extort money from us: "pay extra if you want to use services other than
ours, and even then they won't be as good." And if you've seen the
contempt they have for customers who even know what a dedicated connection is,
well... True, to be completely fair, DSL can certainly be badly oversold, too,
but "dedicated" is NOT a "blah blah" sort of thing ...

Finally, as to regulation of online offenses -- we don't need more. We need more
clueful judges, and to be fair, there *are* a few out there. For example,
simple things like "exceeding authorization" can throw them for loops
if they don't understand how a computer authorizes activities. They can think
that visiting a "secret" URL that is actually public exceeds
authorization because of an AUP you couldn't reasonably have read before
following the link, since reading it would have required visiting the site
first. If you know how the normal authorization mechanisms work, you know when
they're being bypassed deliberately and when some idiot just left the gate open
and someone wandered past. True, some might presume that that's too lenient in
some cases (perhaps they deliberately searched out the secret URL?), but I
conclude that it's bad public policy not to hold people responsible for keeping
their gates closed when leaving them open should rightfully be considered negligent.

Perhaps there should be special judges to handle computer crimes? Or at least,
they should defer to special masters or whatnot when it's clear that the exact
workings of the technology elude them. They should not reason based on which
lawyer's metaphor appeals most to them, but upon the *actual workings* of the
technology. Or find someone to do that for them.

That aside, we *do* need more concentrated law enforcement. Right now, if you
have an internet crime, who do you report it to? Yes, there _are_ places to
contact for everything from 419 scams to kiddie porn. But they're _not_ well
known! And they're scattered across many agencies, such that I would have to
spend quite a while researching them to compile any kind of decent list.

So one thing we *could* use is somewhere online to file online criminal reports:
someone who will figure out who *does* have jurisdiction over whatever the
offense is and who can take charge of it. In other words, something like an
online dispatcher who routes communications to the proper agencies. Yes, there
are technical issues. If they hire competent people who use proven technology,
it will work just fine. If they don't, well, that's always going to be a
problem.

I see so many people these days who think they don't need to know how computers
work and who therefore don't bother. It's only going to get worse. But perhaps
Kuhn's theory is right, and attitudes will only change once those who believed
such things all grow old and die, leaving new people with different perspectives
to replace them...

Sorry this is so long, I have a tendency to ramble. Most anything else you want
to know can be found on SaveTheInternet.com though.

[ Reply to This | # ]

  • Net Neutrality - Authored by: Anonymous on Sunday, July 16 2006 @ 08:40 AM EDT
    • Net Neutrality - Authored by: Anonymous on Monday, July 17 2006 @ 03:12 PM EDT
  • Net Neutrality - Authored by: Wol on Monday, July 17 2006 @ 04:04 AM EDT
"Catastrophe" by Richard Posner
Authored by: CraigV on Sunday, July 16 2006 @ 03:59 AM EDT
I think the book "Catastrophe" by Richard A. Posner
(former Judge and now law professor) might be relevant to
this discussion. His book was interesting and
provocative, but needed a counter discussion, particularly
to the later chapters where he seemed a bit too sure of
himself.

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: Anonymous on Sunday, July 16 2006 @ 04:37 AM EDT
I don't think the good professor knows what he's talking about.

Security is most definitely a matter of design. One of the big problems with
Windows is that its security model is vast and complex -- so vast and complex that
it is very difficult to write a secure program on Windows. Furthermore, the model
is so clunky that a big chunk of users (I'd say the majority, but I don't know that
for a fact) run their systems at all times under an admin account.

A complex security model, combined with Microsoft's public/private APIs, combined
with zero public scrutiny of the design or implementation (that's what happens
when you use proprietary operating systems), means that there are loads and loads
of possible attack vectors.

Systems like Linux have much simpler security models. (Don't confuse simple with
less powerful; they are just as powerful.) Combine that with fully publicly
audited APIs and implementations and you have a much more secure system.

Look at the attack mechanisms used in most of the Win32 viruses. They tend
to exploit interfaces which would never have been allowed in a system like
Linux. Things like VBA/ActiveX are perfect examples.

I also think it's worth considering the implications of the "popularity
versus design" argument. In other fields, do we dismiss product failings
because the product is the most popular? I don't think we do. If a particular
popular automobile is susceptible to certain types of accidents, we don't dismiss
that as a consequence of the automobile being popular; rather, we insist on
the design being fixed. Why does the good professor think that software should
be any different?

The other thing that really makes me wonder is how the professor expects to
produce this "secure internet." Presumably by designing it to be
secure. But if it's going to be the business internet then presumably it's going
to be popular, which, if we accept his own premise, means it's not going to be
secure. So it seems that on one hand the professor thinks that popular operating
systems can't be secure, but that a popular internet can be. He can't have it both
ways. He can't argue that popular operating systems will inherently be insecure
without accepting that the same problem confounds the idea of a secure internet.
Of course, if he accepts that the problem is solvable by design, then the entire
theory is reduced to ashes.


[ Reply to This | # ]

Regulation for fun and profit
Authored by: emmenjay on Sunday, July 16 2006 @ 09:23 AM EDT
I'd like to see one level of regulation for Internet use:
If you connect a computer to the Internet, then you are responsible for it.

- If spam is sent from your computer, you are guilty of a crime and can be
prosecuted.
- If your computer hosts a phishing site, same deal.
- If somebody "owns" your computer, that is your fault for not using
adequate security.

- If you are an ISP, you are responsible for knowing who put something on your
computer. You cannot be expected to monitor every file on every web page, but
if something illegal is found then you should have logs that identify who is
responsible.

- If an end user's computer is spewing out spam, the ISP ought to notice that
and cut them off until they fix their system, or else the ISP must bear
some of the blame. (A rough sketch of such a detector follows this list.)

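Here is that rough sketch: the detector counts outbound SMTP connections per source address over a sliding window. The window length and threshold are invented for illustration, not drawn from any real ISP's practice:

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 300          # look at the last five minutes (arbitrary)
    MAX_SMTP_CONNECTIONS = 100    # far more than a typical home user ever opens (arbitrary)

    recent = defaultdict(deque)   # source IP -> timestamps of outbound port-25 connections

    def record_smtp_connection(source_ip, now=None):
        # Call once per outbound connection to port 25 seen at the ISP edge.
        now = time.time() if now is None else now
        q = recent[source_ip]
        q.append(now)
        while q and q[0] < now - WINDOW_SECONDS:
            q.popleft()
        return len(q) > MAX_SMTP_CONNECTIONS   # True means "quarantine and notify the customer"

    # Simulated burst from one infected host:
    for i in range(150):
        flagged = record_smtp_connection("198.51.100.7", now=1000.0 + i)
    print("flagged:", flagged)
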
If end users and ISPs are forced to take responsibility for their systems, you
will see a huge reduction in illegal activity.

Certainly there are ISPs and users in certain countries who don't seem to care
about such things. However, if most countries enforce the principles above,
those who don't might find themselves isolated as "reputable" users
start to block their IP addresses.

It won't solve every problem, but it would be a *very* good start. However, it
won't happen until there are financial incentives (big fines), as most ISPs are
not going to spend money on security without a big push.

Michael J Smith
[warning, this advice may be worth every cent you paid for it]

[ Reply to This | # ]

Business internet is already evolving
Authored by: obiwan on Sunday, July 16 2006 @ 10:51 AM EDT
Tell the professor to google for TPM,NGSCB,DRM,VEIL,LaGrande,...

[ Reply to This | # ]

Solved problem in computer science (;-))
Authored by: davecb on Sunday, July 16 2006 @ 11:16 AM EDT
Yen suggests having two Internets, one for business, which will be seriously locked down to keep it "safe" for business, where users have very few options, since in his view they don't have the tech skills to be safe.

I suspect Dr Yen hasn't been talking to Dr Yankelovich (who put up Chuvo's page). She's been working on secure window systems, and might be able to tell him and us about MAC.

Back when I was pretty junior, I did a unix-related project based on cross-compilers hosted at HI-Multics.ARPA. That machine was running "Access Isolation", a form of mandatory access control (MAC), which kept us from messing around with material that didn't belong to us.

If I was to work on a project in cooperation with some different group, we were both added to a "category", and could work on anything in that category together. We couldn't export the result back to our parent groups except via an agreed-upon process which had been put into place. A lot like two people writing a contract, except the terms were defined by code.
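
The category mechanism described above boils down to a simple rule: a subject may touch an object only if the subject holds every category attached to the object, and the system, not the file's owner, makes that call. A toy model of the idea (this is an illustration, not Multics or SELinux syntax; the names are made up):

    # Toy mandatory access control by category.
    subjects = {
        "dave":  {"unix-port", "gcos"},
        "alice": {"unix-port"},
        "bob":   {"payroll"},
    }

    objects = {
        "cross-compiler-notes": {"unix-port"},
        "joint-project-spec":   {"unix-port", "gcos"},
        "salary-table":         {"payroll"},
    }

    def may_access(subject, obj):
        # Access is allowed only if the object's categories are a subset of the subject's.
        return objects[obj] <= subjects[subject]

    for who in subjects:
        for what in objects:
            verdict = "allowed" if may_access(who, what) else "denied"
            print(f"{who:5} -> {what:22}: {verdict}")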

And it was extremely easy for me to understand: far easier than file permissions or ACLs!

It's still genuinely difficult: there are lots of problems with MAC (ask me about the *-property some time (:-() or about thousands of individual data-owners), but the primitives to build a scalable, general system of electronic cooperation already exist, in commercial systems like Trusted Solaris 10 and in free software like SELinux and Trusted BSD.

The hard part was making it simple enough for me, the then junior programmer seconded to a Unix project from a GCOS shop, who had trouble even copying files.

I suspect the same applies to my 100-year-old friend Mary, who probably needs something like a pen drive with her secure kernel on it that authenticates her to her bank, and authenticates the bank right back, and can thereby block fake copies of the bank from receiving her business because they don't have a banking contract with her in her MAC table.

--dave

---
davecb@spamcop.net

[ Reply to This | # ]

The lost professor
Authored by: blang on Monday, July 17 2006 @ 01:04 AM EDT
Your professor sounds a bit like someone who has just heard about the internet and,
before he understands how it works, goes ahead and proposes solutions to problems
that don't exist, etc.

One example was his comments about the users' tech skills.

If the professor were in any way up to date with current events, he would know the end users
are not the problem. In many cases they are the solution, and have taken steps
such as installing Firefox despite IT's direction that they use MSIE. The big
identity theft scandals were not caused by end users.

True, many fall victim to phishing scams, but you will find just as many large,
fancy corporations falling prey to social engineering. We routinely hear of
large government organizations getting hacked, such as DOD and NASA. Mobile phone
companies, email providers (Hotmail), credit card clearing houses (CardSystems):
all examples of exposing their customers' data to hackers.

You COULD inform the professor that there are, in a way, two different networks for
web traffic. One is the open network, and the other is encrypted traffic, over
for example https, which usually is accompanied by a site certificate.

There is only one place where more regulation is needed on the internet:
regulation of the government. There are certain laws that need to be upheld,
and a thing called the Constitution, and the Bill of Rights. Over the last half
decade, these laws have been ignored, loopholed, trampled on, and pooh-poohed by a
renegade administration.

We don't need no stinkin' "law professor Frankenstein remakes the internet in
his image." What we need is for our existing laws to be enforced, our
existing civil liberties not sold up the river, and the three branches of government
playing their respective roles as prescribed.

Now, this might sound a bit political, but regulation and legislation don't
live in a vacuum.

[ Reply to This | # ]

Popularity vs design
Authored by: blang on Monday, July 17 2006 @ 01:26 AM EDT
"How do you answer his firmly held belief that the problem is a matter of
popularity, not design? "

First of all, I don't believe linux was necessarily DESIGNED to be secure. I
think it is instead an accumulationof decisions, refusing insecure features, and
a layered philosophy that minimizes the harm that could be done by regular
users. Some of those decisions were made already during unix and predecessors.
Unix was a multi-user system. Only root was allowed to install new syste
software.

DOS started out as single-user, even single-process. The first Windows versions were
just eye candy over that DOS engine. Now we have threads, processes, and other
features that traditionally might have belonged in a multi-user environment, but
most Windows users still run as a single user, with all admin privileges, and at
all times have the power to install something that will hose the system. So it
seems Windows has grown into something outside its comfort zone.

But where the popularity argument really falls flat on its face is when
comparing early DOS to contemporary Linux. In the first years, viruses on DOS
travelled by floppy: boot sector viruses. Lots of machines were infected. Since
they were for the most part off the network, it didn't bring everything to a halt,
but viruses were abundant. In those early years, I am rather confident that the
number of DOS users was not more than the current number of Linux users today.
The number of people writing viruses has gone way up, and they have better skills.


Also, consider that one of the most popular targets for hackers is web sites.
Linux holds a very significant market share for web servers, but you don't often
hear about a Linux web server being defaced.

[ Reply to This | # ]

Myth about trusted computing
Authored by: blang on Monday, July 17 2006 @ 01:47 AM EDT
The myth is that it is possible.

No matter how many obstacles, handshakes, encryptions, device lockdowns, and
biometrics are applied, there will always be ways to emulate devices.

How will a server know that the end user has a secure environment?
It simply cannot. Secure monitor? No good, they can just add a monitor
scanner.

In order to be truly trusted, all our programs would need to be monitored 24x7,
and all users would need to be monitored as well. And the computer would need to
be physically secured, maybe by video, and to certify its continued operation
and non-tampering by showing a stream of data such as a hash of the real-time clock,
a serial number, and a secret key.

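The "hash of real-time clock, serial number, and a secret key" described above is essentially a keyed MAC, and the sketch below shows why it proves possession of the key rather than the integrity of the machine: anything holding the key, an emulator included, produces exactly the same stream. All names and values are illustrative:

    import hashlib, hmac, time

    SECRET_KEY = b"device-secret-shared-with-verifier"   # illustrative only
    SERIAL_NO = b"WS-000042"

    def attestation_token(now=None):
        # Keyed hash over (serial, timestamp); proves key possession, nothing more.
        now = int(time.time() if now is None else now)
        msg = SERIAL_NO + b"|" + str(now).encode()
        return now, hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

    def verify(now, token, max_skew=30):
        expected = hmac.new(SECRET_KEY, SERIAL_NO + b"|" + str(now).encode(),
                            hashlib.sha256).hexdigest()
        fresh = abs(time.time() - now) <= max_skew
        return fresh and hmac.compare_digest(expected, token)

    t, tok = attestation_token()
    print(verify(t, tok))   # True -- whether produced by real hardware or by a perfect emulator
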
But in practice, every workstation hooked to the net would need to reside in a
heavily guarded vault. The professor's internet is shrinking fast. With these
kinds of measures, there would only be a few thousand users who could afford to be on
the net.



[ Reply to This | # ]

Commons
Authored by: Yossarian on Monday, July 17 2006 @ 12:32 PM EDT
>And while I stay away from politics, my logical mind asks
>this question: if folks want to use the Internet for things
>that put a strain on everything, essentially for their own
>business benefit, then shouldn't they be the ones to pay to
>build it out or build their own, instead of negatively
>affecting everyone else's experience on what is surely as
>much a public commons as a public park or a river or the air?

The idea that rich people should have the right to fence the
commons for their own benefit is very English. See
http://www.wealthandwant.com/docs/Williams_IdK.html

"They hang the man and flog the woman
That steals the goose from off the common,
But leave the greater criminal loose
That stole the common from the goose" - - (Anon.) 1821.
(Referring to the British General Enclosure Acts from the 14th to 19th
centuries, in which common land was fenced and placed under private ownership)

[ Reply to This | # ]

Chuvo's Page, Malware, and Scalable Systems
Authored by: BassSinger on Monday, July 17 2006 @ 06:43 PM EDT
PJ asked, "How would you counter the popularity argument?"

First, I'd do some research into when the first virus was found. I'd gather
data on the size, shape, scope and denizens of the internet as it existed at the
time. Then I'd compare that to the number of Mac & Linux systems on the web
*now* and see if the actual number of virus free systems now isn't more than the
number of total systems that were available then. Hmm. That would pretty much
belie the "popularity" argument. Secondly, I would point out that
*before* windows came along there were many systems on the web, almost all of
them UNIX or some variant, and yet there were no viruses on *the most popular*
systems of the day. Thirdly, I'd point out that the majority of the *servers*
on the web are not running Windows, and that if a malevolent personality (the
kind who create and release worms) wanted to do damage (create and release
worms), surely more damage could be done via the servers than via the desktops,
if the two were equally vulnerable. Hence, the major server OS would be the
prime target, all things being equal.

A thought that has been running through my brain: Perhaps we have a lot more
malware now because a system is available that any old script kiddie can damage,
whereas before (there was an "internet" in the '70s, it was just very
limited in scope, if not capabilities) you needed a certain degree of
intelligence to write malware that would work on the operating systems of the
time.

I know there are bugs in Linux and FreeBSD (the basis for OS-X), but I would bet
they are more secure now than their predecessor, UNIX, was way back in the day,
when we had no viruses and worms to worry about.


---
In A Chord,

Tom

Proud Member of the Kitsap Chordsmen
Registered Linux User # 154358

[ Reply to This | # ]

Groklaw © Copyright 2003-2013 Pamela Jones.
All trademarks and copyrights on this page are owned by their respective owners.
Comments are owned by the individual posters.

PJ's articles are licensed under a Creative Commons License. ( Details )