Correcting Microsoft's Bilski Amicus Brief -- How Do Computers Really Work? - Updated 2Xs
Saturday, October 31 2009 @ 08:55 PM EDT
Groklaw member PolR sent me some observations on Microsoft's Bilski amicus brief [PDF; text] submitted to the US Supreme Court in the case In re Bilski. Oral argument will be on November 9th, so presumably the brief's arguments will be before the court. But are they technically accurate? PolR thinks they are not, and he decided to correct some material in it, both historical facts and the description of how today's computers work.
Is it true, as Microsoft wrote in its brief, that computers are at heart just a "collection of tiny on-off switches--usually in the form of transistors"? Or that "The role of software is simply to automate the reconfiguration of the electronic pathways that was once done manually by the human operators of ENIAC"? Are computers just a modern equivalent to the telegraph or the Jacquard loom, a series of on-off switches, as the brief asserts?
Or is that hyperbole, and technically inaccurate hyperbole?
How do modern computers really work? What impact did the discovery of the universal Turing machine have on how computers work, compared to prior special-purpose computers like ENIAC? What are the differences between how analogue and digital computers work?
We have heard from the lawyers, but what about from those whose area of expertise is the tech? I think you'll see how this technical information ties in with the questions the US Supreme Court would like answered -- presumably accurately -- as to whether or not software should be patentable and whether computers become special purpose machines when software is run on them. PolR has collected some very useful references from experts. Feel free to add more references in your comments.
Update:
Groklaw user fncp sent us URLs demonstrating that the ENIAC was doing symbolic manipulations. It didn't perform calculations in the manner of a differential analyzer:
Official history of the ENIAC
The ENIAC on a chip project
The ENIAC was one of the first electronic devices able to do symbolic manipulations. What it didn't do is store programs in the same memory as data. This is the crucial step that was missing to incorporate the principles of a universal Turing machine into an actual device. It was only when this step was accomplished that programs manipulating or generating other programs became possible. Research on how to implement this possibility led to the development of what is now known as the von Neumann computer architecture.
The Stanford Encyclopedia of Philosophy writes on this history:
In 1944, John von Neumann joined the ENIAC group. He had become ‘intrigued’ (Goldstine's word, [1972], p. 275) with Turing's universal machine while Turing was at Princeton University during 1936–1938. At the Moore School, von Neumann emphasised the importance of the stored-program concept for electronic computing, including the possibility of allowing the machine to modify its own program in useful ways while running (for example, in order to control loops and branching). Turing's paper of 1936 (‘On Computable Numbers, with an Application to the Entscheidungsproblem’) was required reading for members of von Neumann's post-war computer project at the Institute for Advanced Study, Princeton University (letter from Julian Bigelow to Copeland, 2002; see also Copeland [2004], p. 23). Eckert appears to have realised independently, and prior to von Neumann's joining the ENIAC group, that the way to take full advantage of the speed at which data is processed by electronic circuits is to place suitably encoded instructions for controlling the processing in the same high-speed storage devices that hold the data itself (documented in Copeland [2004], pp. 26–7). In 1945, while ENIAC was still under construction, von Neumann produced a draft report, mentioned previously, setting out the ENIAC group's ideas for an electronic stored-program general-purpose digital computer, the EDVAC (von Neuman [1945]). The EDVAC was completed six years later, but not by its originators, who left the Moore School to build computers elsewhere. Lectures held at the Moore School in 1946 on the proposed EDVAC were widely attended and contributed greatly to the dissemination of the new ideas.
Von Neumann was a prestigious figure and he made the concept of a high-speed stored-program digital computer widely known through his writings and public addresses. As a result of his high profile in the field, it became customary, although historically inappropriate, to refer to electronic stored-program digital computers as ‘von Neumann machines’.
Update 2: A comment by Groklaw member polymath is worth highlighting, I think:
Reductio ad absurdum
Authored by: polymath on Sunday, November 01 2009 @ 11:04 AM EST
The last extract reveals that Microsoft's brief fails even on its own terms.
While Morse's telegraph was patentable the sequence of 1's and 0's used to send any given message was not patentable. While Jacquard's loom was patentable the arrangement of holes on the cards that produced cloth was not patentable. While the machines for manipulating Hollerith cards were patentable the arrangement of holes representing the information was not patentable. Even a printing press is patentable subject matter but the arrangement of type to produce a story is not patentable. Likewise computer hardware may be patentable subject matter but the pattern of transistor states that represent programs and data are not patentable.
All of those patterns are the proper subject matter of copyright law. Neither patents nor copyright protect ideas or knowledge and we are free to create new implementations and/or expressions of those ideas.
To allow a patent on a program is akin to allowing a patent on thank you messages, floral fabric designs, census information, or vampire stories. It is simply nonsense.
*********************
An Observation On the
Amicus Curiae Brief from Microsoft, Philips and Symantec
~ by PolR
I have noticed something about the amicus brief [PDF] from Microsoft, Philips and Symantec submitted to the US Supreme Court in the In re Bilski case. This amicus brief relies on a particular interpretation of the history of computing and on its own description of the inner workings of a computer to argue that software should be patentable subject matter. I argue that both the history and the description of the actual workings of a computer are inaccurate.
I note that the authors of the brief are lawyers. Presumably, then, they are not experts in the history of computing. The statements in the brief directly contradict information found at the expert sources I've collected here.
How Do Computers Work
According to the Brief?
Here is how the brief describes how computers work:
The fantastic variety in which computers are
now found can obscure the remarkable fact that
every single one is, at its heart, a collection of tiny
on-off switches--usually in the form of transistors.
See generally David A. Patterson & John L.
Hennessy, Computer Organization and Design (4th
ed. 2009); Ron White, How Computers Work (8th ed.
2005). Just as the configuration of gears and shafts
determined the functionality of Babbage's computers,
it is the careful configuration of these on-off switches
that produces the complex and varied functionality of
modern computers.
Today, these on-off switches are usually found in
pre-designed packages of transistors commonly
known as "chips." Thin wafers of silicon, chips can
contain many millions of transistors, connected to
one another by conductive materials etched onto the
chip like a web of telephone lines. They are organized such that they can be turned on or off in patterned fashion, and by this method, perform simple
operations, such as turning on every transistor
whose corresponding transistor is off in the
neighboring group. From these building blocks,
mathematical and logical operations are carried out.
Patterson & Hennessy, supra, at 44-47 & App. C.
The challenge for the inventor is how to use
these transistors (and applying the principles of
logic, physics, electromagnetism, photonics, etc.) in a
way that produces the desired functionality in a useful manner. Computer programming is an exercise
in reductionism, as every feature, decision, and
analysis must be broken down to the level of the rudimentary operations captured by transistors turning on and off. This reductionism is matched by the
detail with which transistors must be configured and
instructed to carry out the thousands or millions of
operations required by the process.
Early electronic computers were "programmed"
by laboriously rewiring their electrical pathways so
that the computer would perform a desired function.
ENIAC--the first general-purpose electronic digital
computer, functioning at the mid-point of the Twentieth Century --
could take days to
program, with operators physically
manipulating the
switches and cables. Patterson &
Hennessy, supra,
at 1.10. [ed: graphic of ENIAC]
Fortunately, this is no longer the case. Transistors, packaged onto silicon chips, permit electronic
manipulation of the pathways between them, allowing those pathways to be altered to implement different processes without direct physical manipulation.
The instructions for this electronic reconfiguration
are typically expressed in computer software. See
Microsoft Corp. v. AT&T Corp., 550 U.S. 437, 445-46
(2007) (noting that, inter alia, Windows software renders a general-purpose computer "capable of performing as the patented speech processor").
To allow more sophisticated control over the millions of transistors on a chip, inventors rely on a
multi-layered scheme of pre-designed software "languages" that help bridge the gap between the on-off
language of the transistor and the words and grammar of human understanding. These allow control of
the transistors on a chip at various levels of specificity, ranging from "machine language," which allows
transistor-level control, to "programming languages,"
which allow operations to be defined through formal
syntax and semantics that are more easily understood by humans. Each language pre-packages the
mathematical and logical operations that are most
useful for the users of that particular language. See
Patterson & Hennessy, supra, at 11-13, 20-21, 76-80.
Using these languages, the inventor can create "software" that
defines the operations of semiconductor chips and other hardware.
These operations are the steps of a computer-implemented process. The
role of software is simply to automate the reconfiguration of the
electronic pathways that was once done manually by the human
operators of ENIAC.
What is the Glaring Error in this
Description?
The brief fails to mention a most important mathematical discovery -- the universal Turing machine -- and how it influenced the development of the computer. The ENIAC didn't use this discovery, while computers built afterwards do. Because of Turing's discovery, the programming of modern computers doesn't operate on the same principles as the ENIAC's.
An article titled "Computability and Complexity", found on The Stanford Encyclopedia of
Philosophy's website, describes
the contribution of universal Turing machines to the development
of computers:
Turing's
construction of a universal machine gives the most fundamental
insight into computation: one machine can run any program whatsoever.
No matter what computational tasks we may need to perform in the
future, a single machine can perform them all. This is the insight
that makes it feasible to build and sell computers. One computer can
run any program. We don't need to buy a new computer every time we
have a new problem to solve. Of course, in the age of personal
computers, this fact is such a basic assumption that it may be
difficult to step back and appreciate it.
What, In Contrast, Was the Operating
Principle of the ENIAC?
Martin Davis is Professor Emeritus in the Department of Computer Science at the Courant Institute of Mathematical Sciences, New York University. He is described here as one of the greatest living mathematicians and computer scientists, and he is interviewed here. In his book Engines of Logic: Mathematicians and the Origin of the Computer, Chapter 8, page 181, here's how he says ENIAC worked:
An
enormous machine, occupying a large room, and programmed by
connecting cables to a plugboard rather like an old-fashioned
telephone switchboard, the ENIAC was modeled on the most successful
computing machines then available -- differential analyzers.
Differential analyzers were not digital devices operating on numbers
digit by digit. Rather numbers were represented by physical
quantities that could be measured (like electric currents or
voltages) and components were linked together to emulate the desired
mathematical operations. These analog machines were limited in their
accuracy by that of the instruments used for measurements. The ENIAC
was a digital device, the first electronic machine able to deal with
the same kind of mathematical problems as differential analyzers. Its
designers built it of components functionally similar to those in
differential analyzers, relying on the capacity of vacuum-tube
electronics for greater speed and accuracy.
Davis, according to Wikipedia, is also the co-inventor of the Davis-Putnam and DPLL algorithms. He is a co-author, with Ron Sigal and Elaine J. Weyuker, of Computability, Complexity, and Languages, Second Edition: Fundamentals of Theoretical Computer Science, a textbook on the theory of computability (Academic Press: Harcourt, Brace & Company, San Diego, 1994, ISBN 0-12-206382-1; first edition, 1983). He is also known for his
model of Post-Turing machines. Here is his
Curriculum Vitae [PDF], which lists his many other papers, including his famous article, "What is a Computation?" published in Mathematics Today, American Association for the Advancement of Science, Houston, January 1979 [elsewhere referenced as "What is a Computation?", Martin Davis, Mathematics Today, Lynn Arthur Steen ed., Vintage Books (Random House), 1980].
What is a Differential
Analyzer?
Here is an explanation,
complete with photographs at the link:
There are two distinct
branches of the computer family. One branch descends from the abacus,
which is an extension of finger counting. The devices that stem from
the abacus use digits to express numbers, and are called digital
computers. These include calculators and electronic digital
computers.
The other branch descends from the
graphic solution of problems achieved by ancient surveyors. Analogies
were assumed between the boundaries of a property and lines drawn on
paper by the surveyor. The term "analogue" is derived from the
Greek "analogikos" meaning by proportion. There have been many
analogue devices down the ages, such as the nomogram, planimeter,
integraph and slide rule. These devices usually perform one function
only. When an analogue device can be "programmed" in some way to
perform different functions at different times, it can be called an
analogue computer. The Differential Analyser is such a computer as it
can be set up in different configurations, i.e. "programmed", to
suit a particular problem.
In an
analogue computer the process of calculation is replaced by the
measurement and manipulation of some continuous physical quantity
such as mechanical displacement or voltage, hence such devices are
also called continuous computers. The analogue computer is a powerful
tool for the modelling and investigation of dynamic systems, i.e.
those in which some aspect of the system changes with time. Equations
can be set up concerned with the rates of change of problem
variables, e.g. velocity versus time. These equations are called
Differential Equations, and they constitute the mathematical model of
a dynamic system.
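To make the contrast concrete, here is a minimal sketch in Python (purely illustrative; the equation and step count are invented for the example). It approximates, step by step, the kind of integration a differential analyzer performed by measuring a continuous physical quantity. The analyzer produced its answer as a continuously drawn curve; the digital version below produces it by repeated, precisely defined operations on symbols, which is exactly the distinction drawn here.

    # A digital, step-by-step stand-in for the mechanical integrator of a
    # differential analyzer (illustrative only). We approximate the solution
    # of dy/dt = -y with y(0) = 1 using Euler's method.
    def euler_integrate(f, y0, t0, t1, steps):
        """Approximate y(t1) for dy/dt = f(t, y), starting from y(t0) = y0."""
        h = (t1 - t0) / steps          # discrete time step
        t, y = t0, y0
        for _ in range(steps):
            y += h * f(t, y)           # arithmetic on symbols, not measurement
            t += h
        return y

    print(euler_integrate(lambda t, y: -y, 1.0, 0.0, 1.0, 1000))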
How Do Universal Turing
Machines Differ from the ENIAC?
The
previous explanation spelled out the differences between a digital
computer and an analog computer. But what is the unique
characteristic of a digital computer? The Stanford Encyclopedia of
Philosophy's The Modern History of Computing describes
the programming of a universal Turing machine as the manipulation
of symbols stored in readable and writable memory:
In 1936, at Cambridge University, Turing invented the principle of
the modern computer. He described an abstract digital computing
machine consisting of a limitless memory and a scanner that moves
back and forth through the memory, symbol by symbol, reading what it
finds and writing further symbols (Turing [1936]). The actions of the
scanner are dictated by a program of instructions that is stored in
the memory in the form of symbols. This is Turing's stored-program
concept, and implicit in it is the possibility of the machine
operating on and modifying its own program.
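A few lines of code can make the description above concrete. The sketch below (Python, illustrative only; the states, symbols and sample program are invented for the example) simulates a tiny Turing-style machine. Notice that the machine's program is just a table of symbols held in ordinary memory, the same kind of data the machine reads and writes on its tape.

    # A minimal Turing machine simulator (illustrative sketch, not any
    # particular historical machine). The program is plain data: a table
    # mapping (state, symbol) to (symbol to write, head move, next state).
    def run(program, tape, state="start", blank="_"):
        tape = list(tape)
        head = 0
        while state != "halt":
            symbol = tape[head] if head < len(tape) else blank
            write, move, state = program[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            else:
                tape.append(write)
            head += 1 if move == "R" else -1
        return "".join(tape)

    # A hypothetical program that flips every bit, then halts at the blank.
    flip_bits = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }

    print(run(flip_bits, "010011"))   # prints 101100_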
The ENIAC was manipulating currents and voltages to perform its calculations in the manner of a differential analyzer. But symbols are different. They are the 0s and 1s called bits. They are like letters written on paper. You don't measure them. You recognize them and manipulate them with precisely defined operations. This is the fundamental insight: the program is data. That is what makes it possible to have operating systems and programming languages, because when a program is data, you can have programs that manipulate or generate other programs.
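As a small illustration of "the program is data" (a sketch only, with names invented for the example), here is a Python program that builds the source text of another program as an ordinary string of symbols and then hands that string to the interpreter to run. An operating system launching a word processor, or a compiler translating source code into machine instructions, does the same kind of thing on a much larger scale.

    # Illustrative sketch: one program generates another program as data,
    # then executes it. The generated code exists only as symbols (a string)
    # until the interpreter is asked to run it.
    def make_greeter(name):
        return "def greet():\n    print('Hello, %s')\n" % name

    source = make_greeter("Bilski")   # 'source' is just data at this point
    namespace = {}
    exec(source, namespace)           # the symbols become a running program
    namespace["greet"]()              # prints: Hello, Bilski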
Martin
Davis, in the same book previously quoted, on page 185, explains the role
of symbols:
It is well understood that the computers developed after World War II
differed in a fundamental way from earlier automatic calculators. But
the nature of the difference has been less well understood. These
post-war machines were designed to be all-purpose universal devices
capable of carrying out any symbolic process, so long as the steps of
the process were specified precisely. Some processes may require more
memory than is available or may be too slow to be feasible, so these
machines can only be approximations to Turing's idealized universal
machine. Nevertheless it was crucial that they had a large memory
(corresponding to Turing's infinite tape) in which instructions and
data could coexist. This fluid boundary between what was instruction
and what was data meant that programs could be developed that treated
other programs as data. In early years, programmers mainly used this
freedom to produce programs that could and did modify themselves. In
today's world of operating systems and hierarchies of programming
languages, the way has been opened to far more sophisticated
applications. To an operating system, the programs that it launches
(e.g. your word processor or email program) are data for it to
manipulate, providing each program with its own part of the memory
and (when multitasking) keeping track of the tasks each needs carried
out. Compilers translate programs written in one of today's popular
programming languages into the underlying instructions that can be
directly executed by the computer: for the compiler these programs
are data.
Does This Mean That It Is
Possible for Computer Algorithms to be Generated by Programs as
Opposed to Being Written by Humans?
Absolutely.
For example, consider the language SQL that is used to access
information stored in a relational database. We can write in SQL
something like:
select name from presidents where birthdate < '1800-01-01'
This statement may be used to retrieve the names of all presidents whose date of birth is prior to January 1, 1800, assuming the database contains such information. But the SQL statement doesn't specify the algorithm to use to do so. The database engine is free to use the algorithm of its choice. It may read all the presidents in the database one by one, test their birth dates and print those that pass the test. Or it may instead read an index that lists the presidents in order of their birth dates and print each president in the list until it finds one born after January 1, 1800. Which algorithm is best will depend on how the database has been structured, and the choice is left to the engine's discretion.
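A minimal sketch of the two strategies just described (Python, with hypothetical table contents and field names) may help. Both functions answer the same query; the SQL statement names the result that is wanted, not the algorithm used to obtain it.

    # Hypothetical data standing in for a 'presidents' table.
    presidents = [
        ("Washington", "1732-02-22"),
        ("Adams",      "1735-10-30"),
        ("Jefferson",  "1743-04-13"),
        ("Lincoln",    "1809-02-12"),
    ]
    CUTOFF = "1800-01-01"   # ISO-format dates compare correctly as strings

    def full_scan(rows):
        # Strategy 1: read every row and test its birth date.
        return [name for name, born in rows if born < CUTOFF]

    def index_scan(rows):
        # Strategy 2: walk an index sorted by birth date, stopping as soon
        # as a president born on or after the cutoff is reached.
        result = []
        for name, born in sorted(rows, key=lambda row: row[1]):
            if born >= CUTOFF:
                break
            result.append(name)
        return result

    print(full_scan(presidents))    # ['Washington', 'Adams', 'Jefferson']
    print(index_scan(presidents))   # same answer, different algorithm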
Why Does a Program
Require Setting Transistors on a Modern Computer?
This is because the RAM of a modern computer, where the program's symbols are stored, is made of transistors. Transistors are involved only because the computer would not remember the symbols unless the transistors were set. If the computer's memory is not made of transistors, a program can be loaded into memory without setting any transistors at all. This was the case with some early models of computers, as Martin Davis reports in the same book, on page 186:
In the late 1940s, two devices offered themselves as candidates for
use as computer memory: the mercury delay line and the cathode ray
tube. The delay line consisted of a tube of liquid mercury; data was
stored in the form of an acoustic wave in the mercury bouncing back
and forth from one end of the tube to another. Cathode ray tubes are
familiar nowadays in TVs and computer monitors. Data could be stored
as a pattern on the surface of the tube.
Today the symbols may be represented by transistors in silicon chips, but also as groove patterns on optical disks, magnetic patterns on hard drives or tapes, wireless electromagnetic signals, optical waves, etc. A diversity of media is possible. Symbols are information, like ink on paper, except that computers use media other than ink. Programs may be downloaded from the Internet, and in transit the same symbols may take any of these forms at one point or another. The symbols are translated from one form to another as the information is transferred from one piece of equipment to another. At each step the meaning of the information is preserved despite the change in physical representation. It is this symbolic information that is used to program the computer.
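The point that meaning survives a change of physical representation can be shown with a short sketch (Python, illustrative only). The same fragment of program text is rendered as bits, as hexadecimal and as base64; any of the three forms can be turned back into the identical original symbols.

    import base64, binascii

    program_text = b"print('hello')"   # a fragment of program text, as bytes

    as_bits   = " ".join(format(byte, "08b") for byte in program_text)
    as_hex    = binascii.hexlify(program_text).decode()
    as_base64 = base64.b64encode(program_text).decode()

    print(as_bits)      # the same information as 0s and 1s
    print(as_hex)       # ... as hexadecimal digits
    print(as_base64)    # ... as base64 text

    # Round-tripping either encoded form recovers the identical symbols.
    assert binascii.unhexlify(as_hex) == program_text
    assert base64.b64decode(as_base64) == program_text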
Why Does the Amicus Brief
Argue That the Setting of Transistors is Relevant to Patentability of
Software?
It is because the authors argue that this makes software similar or comparable to some patentable Industrial Age devices. From the brief, pp. 14-17:
In this respect, modern computer-related inventions are no different
from other patent-eligible innovations that have produced a new and
useful result by employing physical structures and phenomena to
record, manipulate, or disseminate information.
Perhaps the most celebrated example of such technological innovation
is Samuel Morse's invention of the electric telegraph, which (like
modern computers) employed binary encoding in conjunction with the
sequential operation of switches. Although petitioners focus almost
exclusively on the Court's rejection of his eighth claim (on which
more below), the Court allowed a number of other claims, including
the fifth. O'Reilly v. Morse, 56 U.S. 62, 112 (1854). That claim
was for "the system of signs, consisting of dots and spaces, and of
dots, spaces and horizontal lines." Id. at 86. This system, an
early version of Morse Code, was nothing other than a system for
manipulating an on-off switch -- the telegraph key in a prescribed
manner to produce the useful result of intelligible communications
between two parties. Indeed, although much less complex, the
telegraph system -- a web of interconnected switches spreading around
the globe, enabling binary-encoded communication -- was comparable to
the modern Internet.
The Industrial Age also knew software and hardware in a literal
sense; the core concepts in computer design and programming were
developed in this period. The principle of encoded instructions
controlling a device found application at the opening of the
Nineteenth Century, with the famous Jacquard loom, a device (still in
use today) that adjusts the warp and weft of a textile in response to
"programming" contained on punch cards. The loom's control
apparatus consists of a series of on-off switches which are
controlled by the pattern of holes punched in the cards, just as the
pattern of microscopic pits and lands on the surface of a CD can be
used to control the transistor switches inside a computer. Hyman,
supra, at 166; Patterson & Hennessy, supra, at 24.
Inventors soon seized on the "programming" principle applied in
the Jacquard loom. A defining characteristic of Babbage's
Analytical Engine, for example, was the use of punch cards, adopted
from the Jacquard loom, to store the programs run by the machine.
"Following the introduction of punched cards early in 1836 four
functional units familiar in the modern computer could soon be
clearly distinguished: input/output system, mill, store, and
control." Hyman, supra, at 166. Babbage's close friend, Ada
Lovelace (the daughter of Lord Byron), is now recognized as "the
first computer programmer" for her work developing software
programs for the Analytical Engine. Nell Dale et al., Programming and
Problem Solving with C++, 406-407 (1997).
Later in the Nineteenth Century, Herman Hollerith, a U.S. Census
Office employee, developed a means of tabulating census results using
punch cards and mechanical calculation. His method allowed the
country to complete the 1890 census two years sooner and for five
million dollars less than manual tabulation. William R. Aul, "Herman
Hollerith: Data Processing Pioneer," Think, 22-23 (Nov. 1972). The
company he founded became the International Business Machines Corp.,
and the once-prevalent IBM punch-cards were both the direct
descendent of the means used to program a Jacquard loom and the
immediate predecessor to today's CDs and other media, which contain
digitized instructions for modern computers to open and close
millions of switches.
As has been often noted by historians of technological
development, our perceptions of innovation and modernity are often
misguided -- the roots of technological change are deep, and run
farther back in our history than we perceive. See Brian Winston,
Media Technology & Society: A History 1 (1998) (arguing that
current innovations in communications technology are "hyperbolised
as a revolutionary train of events [but] can be seen as a far more
evolutionary and less transforming process"). That is certainly
true with respect to computer-related inventions.
While the hardware and software implemented by a modern e-mail
program may be orders of magnitude more complex than the dot-dash-dot
of a telegraph key, the underlying physical activity that makes
communication possible -- the sequential operation of switches -- is
fundamentally the same.
Are Modern Computers
Really Similar to These Industrial Age Devices?
No, because none of these devices implements the principle of a universal Turing machine. For example, Andrew Hodges, a well-known Alan Turing historian and maintainer of the Alan Turing Home Page, describes the difference between Babbage's Analytical Engine and computers as follows:
So I wouldn't call Charles Babbage's 1840s Analytical Engine the
design for a computer. It didn't incorporate the vital idea which is
now exploited by the computer in the modern sense, the idea of
storing programs in the same form as data and intermediate working.
His machine was designed to store programs on cards, while the
working was to be done by mechanical cogs and wheels.
The
cards storing programs for the Babbage engine were punched cards like
those used for the Jacquard loom mentioned in the brief. Martin
Davis, in the same book on pages 177-178, also mentions the Jacquard loom:
The Jacquard loom, a machine that could weave cloth with a pattern
specified by a stack of punched cards, revolutionized weaving
practice first in France and eventually all over the world. With
perhaps understandable hyperbole, it is commonly said among
professional weavers that this was the first computer. Although it is
a wonderful invention, the Jacquard loom was no more of a computer
than is a player piano. Like a piano it permits a mechanical device
to be controlled automatically by the presence or absence of punched
holes in an input medium.
The Microsoft amicus brief did not explain that modern computers are programmed from symbolic information. Likewise it did not discuss the difference in nineteenth-century patent law between patenting an Industrial Age device that manipulates information and patenting the *information* in the device. Had the authors explored this difference, they would have found that the hole patterns in Jacquard punched cards and piano rolls are a much closer equivalent to computer software than any Industrial Age physical apparatus.
Authored by: Anonymous on Saturday, October 31 2009 @ 09:09 PM EDT
"modern computers are programmed from symbolic information." ...
symbolic information as in text .... which makes it more appropriate to use
copyright instead of patents .... as open source software does ....
Authored by: Totosplatz on Saturday, October 31 2009 @ 09:26 PM EDT
Please make links clicky, and thanks to PoIR...
---
Greetings from Zhuhai, Guangdong, China; or Portland, Oregon, USA (location
varies).
All the best to one and all.
Authored by: Anonymous on Saturday, October 31 2009 @ 09:40 PM EDT
The key thing to note here is that the loom may well be patentable, but the
pattern of holes punched in the cards is definitely not, although the pattern of
holes is definitely copyrightable. In other words, the hardware is patentable,
but the software is not, but is instead protected by copyright. The same is true
for computer hardware and software.
- Bingo! - Authored by: Anonymous on Saturday, October 31 2009 @ 10:06 PM EDT
- Bingo! - Authored by: old joe on Sunday, November 01 2009 @ 03:40 AM EST
- Bingo! - Authored by: old joe on Sunday, November 01 2009 @ 03:52 AM EST
- So, you can patent sticks, cloth, cable and pullies - Authored by: Anonymous on Sunday, November 01 2009 @ 04:36 AM EST
- So, you can patent sticks, cloth, cable and pullies - Authored by: The Cornishman on Sunday, November 01 2009 @ 05:27 AM EST
- So, you can patent sticks, cloth, cable and pullies - Authored by: Anonymous on Sunday, November 01 2009 @ 05:36 AM EST
- So, you can patent sticks, cloth, cable and pullies - Authored by: Bernard on Sunday, November 01 2009 @ 05:48 AM EST
- So, you can patent a wing, not the mere use of its constituents. - Authored by: Winter on Sunday, November 01 2009 @ 08:32 AM EST
- The patents were broken. - Authored by: Anonymous on Sunday, November 01 2009 @ 10:02 AM EST
- So, you can patent sticks, cloth, cable and pullies - Authored by: Anonymous on Sunday, November 01 2009 @ 11:15 AM EST
- So, you can patent sticks, cloth, cable and pullies - Authored by: tinkerghost on Sunday, November 01 2009 @ 02:04 PM EST
- configuration?..they call it ASSEMBLY language don't they? - Authored by: Anonymous on Sunday, November 01 2009 @ 05:41 AM EST
- So, you can patent sticks, cloth, cable and pullies - Authored by: tinkerghost on Sunday, November 01 2009 @ 01:58 PM EST
Authored by: JamesK on Saturday, October 31 2009 @ 09:48 PM EDT
IIRC, according to what I've read, ENIAC was closer to a programmable
calculator. It worked in decimal, with registers, counters, etc.
---
IANALAIDPOOTV
(I am not a lawyer and I don't play one on TV)
Authored by: Anonymous on Saturday, October 31 2009 @ 10:05 PM EDT
While I definitely agree that programs are more the equivalent of the punched
cards, I see no relevance to the rest of your argument.
Modern computers, at least the common ones used by most people, are digital
computers. Modern digital computers run on VLSI transistor logic. This logic is
based on gates which are either on or off, and all computer programs cause the
switching on and off of the gates. A gate is equivalent to an on-off switch, a
one or a zero. I see no reason to go on to say MS is wrong on this fact, although
they have definitely twisted this fact to make a totally nonsensical argument.
They are however correct that a PC is just a bunch of switches at the raw
hardware level, Turing and all aside. Your whole argument is very strange, until
you got to the punched cards part.
- Modern PCs - Authored by: Totosplatz on Saturday, October 31 2009 @ 10:16 PM EDT
- I agree with Microsoft. - Authored by: Crocodile_Dundee on Saturday, October 31 2009 @ 10:17 PM EDT
- So do I - Authored by: Anonymous on Sunday, November 01 2009 @ 01:28 AM EST
- As I understand it . . . - Authored by: tyche on Saturday, October 31 2009 @ 11:05 PM EDT
- The point is - Authored by: Anonymous on Saturday, October 31 2009 @ 11:14 PM EDT
- Correcting Microsoft's Bilski Amicus Brief -- How Do Computers Really Work? - Authored by: PolR on Saturday, October 31 2009 @ 11:19 PM EDT
- Correcting Microsoft's Bilski Amicus Brief -- How Do Computers Really Work? - Authored by: DaveAtFraud on Sunday, November 01 2009 @ 10:15 AM EST
- Just a bunch of switches? - Authored by: hardmath on Monday, November 02 2009 @ 05:50 PM EST
- Building blocks to more complex structures (with natural languages as an example) - Authored by: rhdunn on Tuesday, November 03 2009 @ 04:02 AM EST
- You've never designed a chip - Authored by: Anonymous on Tuesday, November 03 2009 @ 10:24 PM EST
Authored by: gvc on Saturday, October 31 2009 @ 10:16 PM EDT
The main fallacy in the amicus brief is that software -- even machine language
-- specifies the physical operation of the computer "at the transistor
level" as they say. It does not. All programming languages -- even
machine language -- implement "abstractions" rather than physical
processes.
I don't find the discussion of Turing machines at all compelling. The Turing
machine is an imaginary mathematical notion used to characterize the limits of
what can be computed by *any* computer.
A more compelling argument, I think, is that at some point a complex system
ceases to be simply the sum of its parts. The underlying implementation of a
modern computer is irrelevant to its operation. What is relevant is the
abstraction that it implements. Software that harnesses that abstraction to
perform particular operations cannot reasonably be characterized as a physical
process.
Authored by: tpassin on Saturday, October 31 2009 @ 10:29 PM EDT
It's probably better not to rely on similarities to Turing machines too much.
Current computers aren't very much like Turing machines. The Turing machine was
a mental device used to explore the mechanics and limits of computing.
While it's true that most current computers do function by setting bits to 0 or
1, that is not inherent to the idea of programmed computing (where the program
can be seen as data in itself). Given suitable hardware, other number bases
could work just as well, and in the early days of computers, sometimes were.
In addition, I/O operations (I mean interrupts) are not necessarily equivalent
to simply setting switches to 1's or 0's. And even internal operations on
current computers still depend on strobe and clock signals, which again are more
complex than simply setting binary values into switches.
So I agree that the Microsoft explanation of how a computer works is too simple
and leaves out a lot, but this article doesn't really hit the mark, either, in
its discussion of Turing machines, etc.
It's interesting to speculate about the relation of copyright to the various
steps needed to create a program. For example, a compiler does not produce a
copy of the source code as its product. Is the compiled program a derivative
work? How could it be, it's not even human readable? If someone claims
copyright to the source, how does that give him copyright to the compiled
program? After all, the compilation is not a creative act, it's mechanical, so
perhaps the compiled program isn't a work subject to copyright (or shouldn't
be), since it is not the result of a creative effort.
Then the compiled program is not the runnable program, anyway - that is the
result of the linker's activity. So what is the relation between the running
program and the source code? In the case of interpreted programs, the
computer's activity is the joint result of the interpreted program, the
interpreter, and the computer environment. In this case, what is the
(copyright-wise) relationship between the computer's activity and the
copyrightable source code?
Interesting questions to ponder ...
---
Tom Passin
Authored by: nsomos on Saturday, October 31 2009 @ 11:05 PM EDT
That one phrase jumped out at me .... about
"computers become special purpose machines when software
is run on them."
This is about the same as saying when a car is driving
to the hospital, it is an ambulance, when it is rushing
to a fire, it is a fire engine, when it is used to deliver
a package, it is a UPS truck, when dropping off kids it
is a school bus, etc etc etc.
Authored by: jesse on Sunday, November 01 2009 @ 12:03 AM EDT
So I wouldn't call Charles Babbage's 1840s Analytical Engine the
design for a
computer. It didn't incorporate the vital idea which is now
exploited by the
computer in the modern sense, the idea of storing programs in
the same form as
data and intermediate working. His machine was designed to
store programs on
cards, while the working was to be done by mechanical cogs and
wheels.
Well... I would because:
Most people do not include the punched cards as part of the system. I do, because one of the outputs planned was more punched cards.
Although Babbage did not consider having his machine write programs, it could IF the output punched cards were then used as a later program input.
That is why I personally consider the analytic engine the first computer. Even though it was never completed, the concepts embodied within are directly comparable to the modern computer.
Even having the program separate from the mechanical workings, it mirrors some computers that maintain security by marking some memory as execute only. This directly compares to the punched cards as being "read-only".
But as stated, this is only my opinion.
Authored by: Anonymous on Sunday, November 01 2009 @ 12:32 AM EDT
There is a level of confusion in the argument and response that keeps it from
hitting the point squarely.
If computers are just switches, on or off, why don't we use a row of LED's for
output, and use a row of switches for input? There are many layers of
transformation of abstraction between the input and the output. We are
confusing simple input with simple output. Using a computer in a human usable
way is nonsense to the computer itself.
There are also many processors that connect the many representations of on or
off in the "computer". Starting with the keyboard processor, the
serial input port processor, the DMA processor, the bus processor, the CPU, the
GPU, the monitor processor. When we say computer, we can mean all of that taken
together, or just the CPU at the heart of the general processing. All the rest
of the processing functions are transforming abstractions and representations of
various types of signals to a format suitable for the CPU, and back from the
CPU. What is meant when we say computer? ENIAC was just the CPU.
Ones and zeros are just a different representation of data that is abstracted as
visible light and sound. Humans cannot accept input in any other way. The
question is: Can we patent a class of human interaction, rather than a specific
implementation? Software is meaningless without some human interface at some
point in the system.
-- Alma
Authored by: tyche on Sunday, November 01 2009 @ 12:40 AM EDT
Please include the title (and maybe the URL) of the news pick, just in case it
already rolled off the screen.
Thanks
Craig
Tyche
---
"The Truth shall Make Ye Fret"
"TRUTH", Terry Pratchett[ Reply to This | # ]
|
|
Authored by: SpaceLifeForm on Sunday, November 01 2009 @ 12:42 AM EDT |
s/Po1R's collected/PolR's collected/
---
You are being MICROattacked, from various angles, in a SOFT manner.
Authored by: Anonymous on Sunday, November 01 2009 @ 01:22 AM EST
Off
On for short time
On for long time
It differs from the modern internet in that human operators
must interpret the routing information, and make the connections.
Although it would be feasible to build a machine to do this,
such a machine was never AFAIK built.
Authored by: Anonymous on Sunday, November 01 2009 @ 03:38 AM EST
It's a fairly sterile debate, and it's coming to the question of "Do you
want the next generation taught to develop and maintain software, or not".
If you do, then you had better say "Thanks!" to the teachers (and
maybe pay them), and not threaten to discipline them commercially for infringing
intellectual property laws.
I guess we'll see, but it's by no means clear which way this will fall in the
USA.
Authored by: Anonymous on Sunday, November 01 2009 @ 04:32 AM EST
None of this refutes the basic premise. Software turns switches (usually
transistors) on and off. This just goes off on some philosophical tangent.
As far as the "universal machine" goes, as if that would matter...
Vista can't even be run on machines it was meant for, let alone on a Mac or a
PDP 11.
What possible bearing should the theoretical supposition of running slow
emulators have on whether a process needs to transform something or be associated
with a machine in order to be patentable?
Do try to stay on point people.
Authored by: Ian Al on Sunday, November 01 2009 @ 05:09 AM EST
Thanks to PoIR for this analysis. It needs all the little grey cells to
understand why, but it shows the Microsoft brief to be hollow and lacking in
merit.
I'm letting my mind wander, for fun. Because it is such a long ramble, I attach
it as a comment. Please note that the references are not to expert opinions.
They are often to Wikipedia. You might find the inspiration to read PoIR's piece
again in case you missed something.
---
Regards
Ian Al
Linux: Viri can't hear you in free space.
Authored by: leopardi on Sunday, November 01 2009 @ 05:26 AM EST
I'll try to keep this brief.
- Software consists of one or more computer programs, and possibly includes initial data.
- A computer is a machine which automatically executes programs.
- The execution of a program transforms input and old stored data into output and new stored data.
- A computer program is a collection of instructions which is encoded for execution on a computer.
- The instructions in a program are usually structured as a collection of implementations of algorithms.
- An algorithm is a collection of instructions which describe how to perform a computation.
- A computation is a specific transformation from input+data into output+data.
I'll respond to this posting with a brief description of Universal Turing Machines and von Neumann machines.
- So then, BY DEFINITION (7), software passes - Authored by: Anonymous on Sunday, November 01 2009 @ 06:05 AM EST
- So then, BY DEFINITION (7), software passes - Authored by: Ian Al on Sunday, November 01 2009 @ 07:04 AM EST
- So then, BY DEFINITION (7), IEEE-USA is in favour of patenting algorithms? No. - Authored by: leopardi on Sunday, November 01 2009 @ 08:44 AM EST
- Deus ex machina - Authored by: Anonymous on Sunday, November 01 2009 @ 01:25 PM EST
- So then, BY DEFINITION (7), software passes - NOT - Authored by: PolR on Sunday, November 01 2009 @ 10:51 AM EST
- In short: Software is about Symbols going in and Symbols going out - Authored by: Winter on Monday, November 02 2009 @ 04:01 AM EST
Authored by: ThrPilgrim on Sunday, November 01 2009 @ 06:39 AM EST
1) A computer program is patentable because it creates a unique circuit on a computer. - from the brief.
2) A computer is an electronic device consisting of transistors. - from the brief.
3) If the computer program in 1 can be shown to do something other than create a unique circuit on a computer then point 1 is incorrect and a computer program cannot be patentable because it creates a unique circuit on the computer.
4) I can take a copy of the computer program in some symbolic form and give it to a group of individuals trained to recognise the symbols and perform actions depending on the symbols, such as but not limited to stand up, sit down, wave hands in the air.
5) Depending on the state of an individual's colleagues and the individual, the action taken on any particular symbol may change.
6) Output is generated by having an individual record the state of the group at discrete intervals and converting the state into characters by using a lookup chart.
7) The individual recording the output has no access to or knowledge of the symbols used in encoding the computer program.
8) Thus the computer program has not created a unique circuit on a computer but it has resulted in an output state that can be interpreted.
9) Therefore point one is incorrect and software is not patentable because of its ability to create a unique circuit on a computer.
Please pick holes in, improve this :-)
---
Beware of him who would deny you access to information for in his heart he
considers himself your master.
Authored by: Magpie on Sunday, November 01 2009 @ 07:09 AM EST
I think the argument being made is too obtuse. A more straightforward approach to attacking the brief would be to focus on the "rewiring the pathways" claim, and the apparent jump from "the program is a specification of the rewiring" to "it is a patentable machine".
Computers do NOT work by a program rewiring the pathways. The computer itself, or more exactly the CPU itself, undertakes the task dynamically over time in response to BOTH instructions from the program AND the interaction with it and external stimulus.
Authored by: Anonymous on Sunday, November 01 2009 @ 09:18 AM EST
The point is that the description of an algorithm, in C, C++, Perl, BASIC, or
whatever - is independent of how computers work.
One of the best-known of all algorithms - the Sieve of Eratosthenes - was devised more than 2000 years before any computer was invented.
Any discussion of how Babbage's computer worked, or of how an
IBM 360 worked, or of how an Intel chip works, or of how computers in 100 years'
time might work, is completely irrelevant to the nature of most computer
programs.
Authored by: Anonymous on Sunday, November 01 2009 @ 09:20 AM EST
"Is it true, as Microsoft wrote in its brief, that computers are at heart
just a "collection of tiny on-off switches--usually in the form of
transistors"? Or that "The role of software is simply to automate the
reconfiguration of the electronic pathways that was once done manually by the
human operators of ENIAC"? Are computers just a modern equivalent to the
telegraph or the Jacquard loom, a series of on-off switches, as the brief
asserts?"
Aside from pointing out Microsoft's brief is flawed, what can we accomplish with
this?
When a scientist attempts to explain something to a lay person, and then that
lay person uses that explanation in a legal situation, there is a huge
opportunity for error.
Perhaps this is the greatest service of Groklaw, not documenting the SCO case,
but encouraging those knowledgeable in the computer industry to enter the legal
field.
Authored by: TemporalBeing on Sunday, November 01 2009 @ 09:39 AM EST
PolR - very well said, and I very much agree. I'd also like to add the following
tidbit.
These allow control of the transistors on a chip at various
levels of specificity, ranging from "machine language," which allows
transistor-level control, to "programming languages," which allow operations to
be defined through formal syntax and semantics that are more easily understood
by humans. Each language pre-packages the mathematical and logical operations
that are most useful for the users of that particular language. See Patterson
& Hennessy, supra, at 11-13, 20-21, 76-80.
This is
technically inaccurate. To the programmer there is no such thing as transistor
level control any more, perhaps never was
since the advent of the assembler.[1]
Most all processors now implement what is called 'micro-code'; micro-code is the
true language
of the processor - and assembly language of sorts - and only the
processor maker writes any code in micro-code. The purpose of
micro-code is to
further abstract the processor from upper level languages and enable the
processor maker (e.g. Intel, AMD, Motorola,
etc.) to fix some logic bugs (e.g.
the Pentium Floating Point bug) that would in previous generations have required
rewiring the processor
and new hardware.
Processors are exposed to
programmers via the processor's Assembly Language, which can be coded either as
numeric digits (e.g. octal, hex, binary) or assembly mnemonics (e.g. mov ax,
bx). Assembly Language no longer maps to specific transistors; but to the
micro-code
described earlier. Additionally, each Assembly Language Instruction
is more equivalent to using a calculator than to rearranging the
use of
transistors. Assembly Language is essentially broken into two groups:
Mathematical Instructions, and Memory Instructions. The
Mathematical Instruction
set performs mathematical operations (add, subtract, multiply, and divide) and
directly relate to all
mathematical principles.
The Memory Instruction
set consists of two operational types: (i) data access, and (ii) instruction
access. The Data access operational
type basically tells the system where to
load and save memory to; sometimes (as in the Intel x86 instruction set) the
Data Access
is combined into the Mathematic - e.g. the mathematical instructions
can operate on memory regions as well as local data storage locations called
registers. The Instruction Access tells the system where to find the next
instruction. For all intents and purposes
Data Storage Devices such as Hard
Drives are seen simply as memory storage locations.
The entirety of the
Assembly Language for any computer can be fully mapped to mathematical equations
and formula. This is often the
basis of the reasoning behind why programming is
math.
Higher Level Programming Languages such as C, C++, Ada, Pascal,
and many others simply provide the Assembly Language functionality
in useful
functions and easier readability. For an Assembly Language to be useful, one has
to have a series of function sets; higher
level languages provide those function
sets with well-known interfaces.
Some Higher Level Programming Languages,
such as Squeak and Java, provide further abstraction from the hardware
level.
[1]Really, the only time there was transistor level control was
with systems like ENIAC where you
had to move the cables, etc. between vacuum
tubes to program the system; once we entered into the era of recording
information into
a computer readable form (e.g. punch cards, tape, etc.) to
program the computer we left the era of transistor level control.
Authored by: Anonymous on Sunday, November 01 2009 @ 10:03 AM EST
Describing a computer as a bunch of switches, even from a strictly
hardware-oriented reductionist viewpoint, is so severely lacking as to be lying
by omission. The connections between the switches are as important as the
existence of the switches.
I'm talking just of the architecture of the *wiring* here, not of how software
can be used.
Authored by: polymath on Sunday, November 01 2009 @ 11:04 AM EST
The last extract reveals that Microsoft's brief fails even on its own
terms.
While Morse's telegraph was patentable the sequence of 1's and 0's
used to send any given message was not patentable. While Jacquard's loom was
patentable the arrangement of holes on the cards that produced cloth was not
patentable. While the machines for manipulating Hollerith cards were patentable
the arrangement of holes representing the information was not patentable. Even
a printing press is patentable subject matter but the arrangement of type to
produce a story is not patentable. Likewise computer hardware may be patentable
subject matter but the pattern of transistor states that represent programs and
data are not patentable.
All of those patterns are the proper subject matter
of copyright law. Neither patents nor copyright protect ideas or knowledge and
we are free to create new implementations and/or expressions of those
ideas.
To allow a patent on a program is akin to allowing a patent on thank
you messages, floral fabric designs, census information, or vampire stories. It
is simply nonsense.
Authored by: cjk fossman on Sunday, November 01 2009 @ 03:27 PM EST
This statement is wrong:
The fantastic variety in which computers
are now found can obscure the remarkable fact that every single one is, at its
heart, a collection of tiny on-off switches
It is wrong because
a computer made up only of switches will not work.
A computer needs a
strobe signal, or clock pulse, to tell the processor to fetch the next
instruction. A simple switch will not supply these clock pulses.
The
multi-megahertz crystal on your computer's system board is the real heart. The
transistors are the lungs, kidneys, liver and stuff like that.
Authored by: BitOBear on Sunday, November 01 2009 @ 09:19 PM EST
The old business machines had wire-board and jumper panels. The act of
installing wires and moving jumpers effectively _created_ new instructions, all
based on electrical cascades. That is, by connecting the output of one switch to
the input of another the programmer could create "add" or
"add-with-carry" as opposed to "subtract". Programmers would
often have wire-boards and jumper panels at their desks and then physically
install them to perform a run.
In a modern computer, where "modern" starts somewhere in like the
forties (my detailed memory of timeline events fails me here), all of the
"instructions" are built into the computer when it is constructed.
That is, when you check the manual there is a list of data-patterns where one
pattern is "add" and another is "add-with-carry" and so on.
There is also a section on where/how the computer picks where to start looking
for instructions when it is powered on.
But what goes around comes around, so now days we have the Field Programmable
Gate Array (FPGA). This is a blind 3D array of transistors. When you fill it full
of data, many of the transistors are disabled. What is left behind is an
electrical cascade matrix very like the old patch boards. Basically you can fill
an FPGA (of sufficient base capacity) with the picture of any computer chip you
want, and it will become that chip. It's still data programming a device but the
distinctions become obscure. For instance I can go out and buy/license "the
picture of an 80286 for a FPGA" and then use it to build a PC/AT (circa
1984) should I so desire.
In all cases however, a compiler is used to make the FPGA image, converting an
instruction stream into a program that is loaded into the FPGA. Just like any
other program (barring a massively parallel execution stream for FPGAs) it is
data causing a machine with known limits to operate wholly within those limits.
Again, and always, think sewing machine. I can make an infinite number of
garments, drapes, and whatnot all by operating a "sufficient" sewing
machine on the proper raw materials. I don't get to patent the "duvet"
just because I used a particular machine to make it. Just like I didn't get to
patent making it by hand. Both the standard sewing kit, and the sewing machine,
had "duvet" as reasonable end product.
The _real_ problem with Microsoft's brief is that the guys and gals back in the
patch-panel and wire-board era of computers _didn't_ get to patent their
separate "programs" for the ENIAC etc. The guy/entity/company that
invented the patch-panel and the machine that used it got the "first fruit
of their invention". The "programmers" were all just using the
patented-or-not tool.
Authored by: Nivag on Sunday, November 01 2009 @ 10:32 PM EST
It is clear that computers are nothing but rather sophisticated mechanisms for doing chemical reactions, since all the electronics of the computer are doing is taking electrons from one group of atoms and moving them to another group of atoms. A knowledge of chemistry is vital in deciding which elements and compounds can have their electrons moved and which prevent the electrons going to the wrong places.
It is therefore inappropriate to suggest that it is an automated mechanical
device, let alone an engine of logic.
END SARCASM[ Reply to This | # ]
|
|
Authored by: Anonymous on Monday, November 02 2009 @ 01:36 AM EST |
Microsoft's argument hinges on the premise that software is analogous to a
setting of switches that cause the computer to perform a measurable
transformation.
Assume the switches specify the program and optionally any data (the program can
also read data from other sources), which gives a resultant computation (which
is physically measurable).
Either:
* A particular combination of switches is patentable
* Every combination of switches that produces the result is patentable.
In the first instance, the patent covers a particular expression of an idea.
This is what copyright covers anyway. A patent on this isn't very useful because
there are a practically infinite number of ways the switches could be set to
produce the same result. Instructions could be reordered, operations could be
substituted for equivalents (e.g. subtraction replaced by addition of a negative
value).
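For instance, here is a trivial C sketch (the function names are invented for the example) of two of those practically infinite switch settings that are indistinguishable from the outside:

/* Two different instruction sequences that compute the same result: one
   subtracts, the other adds a negative. To anything observing the output
   they are equivalent. Purely illustrative names. */
#include <stdio.h>

static int discount_v1(int price) { return price - 5; }
static int discount_v2(int price) { return price + (-5); }

int main(void)
{
    int price = 100;
    printf("%d %d\n", discount_v1(price), discount_v2(price)); /* 95 95 */
    return 0;
}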
In the other case, any combination of bits that produces the output is
patentable, but this means the patent covers a process of obtaining a result,
and processes are not patentable (it says so in Microsoft's brief).[ Reply to This | # ]
|
|
Authored by: halfhuman on Monday, November 02 2009 @ 05:08 AM EST |
It may help to summarise the extensive and at times entertaining contributions.
1. The point of PoIR's article is that the Microsoft brief attempts to equate a
modern computer, while running software, to a device that is actually far
simpler. The brief is therefore deceptive and legally unreliable.
2. This simpler device would indeed be patentable, if it existed. (As pointed
out in earlier discussions, the combination would then create patentable
machines at the rate of billions per second).
3. However, there is no physical reality to this simpler device. It exists only
in a symbolic sense. It is only known by symbolic means, that is, through the
transformation of symbols.
[Small diversion: We neither know nor care whether the transformation of symbols
required a unique pattern of transistors, but usually we are sure it did
not---this post, for instance, can be read using many different CPUs (and even
using identical CPUs the thread may use different transistors on different CPUs
or at different times in the same CPU). This is completely irrelevant to almost
all software. I say almost all, but in fact cannot think of an exception.]
4. The process of transformation of symbols into symbols gives a complete
description of the software. The material basis (transistors, valves, quantum
gates, rooms full of people) is irrelevant to the description of software as
software (though to be sure the choice of software is strongly influenced by the
device it is to run on).
5. The correct mathematical framework is the Universal Turing Machine. The
correct computer science framework is the von Neumann architecture.
6. (See 4 above) The relevance to the Bilski appeal is that software on von
Neumann architecture is purely a process, one which relies on the purely non-material
aspects of symbols. The most succinct summary was (from memory): "On a von
Neumann machine, programs and data are equivalent. Data is not patentable,
programs should not be patentable".[ Reply to This | # ]
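To make the program/data equivalence in point 6 concrete, here is a small illustrative sketch in C (a toy instruction set invented for the example, not any real machine's): the "program" is nothing but an ordinary array of numbers interpreted by a loop. Overwrite the array with different numbers and the behaviour changes; no hardware is reconfigured.

/* Toy stored-program machine: the program is plain data in an array. */
#include <stdio.h>

enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

static void run(const int *program)
{
    int stack[16], sp = 0, pc = 0;
    for (;;) {
        switch (program[pc++]) {
        case OP_PUSH:  stack[sp++] = program[pc++];       break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp];  break;
        case OP_PRINT: printf("%d\n", stack[sp - 1]);     break;
        case OP_HALT:  return;
        }
    }
}

int main(void)
{
    int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
    run(program);   /* prints 5; the "program" is just the data above */
    return 0;
}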
|
|
Authored by: s65_sean on Monday, November 02 2009 @ 07:17 AM EST |
What impact did the discovery of the universal Turing machine
have on how computers work, compared to prior special-purpose computers like
ENIAC?
This implies that the universal Turing machine was
something that already existed in nature, and man merely stumbled upon it. Is
there another definition of discovery that eludes me? None of the definitions of
discovery that I can find have anything to do with inventing.[ Reply to This | # ]
|
|
Authored by: reiisi on Monday, November 02 2009 @ 11:07 AM EST |
What is the technical name for the logic fallacy of trying to pile a whole bunch
of bad arguments together to make a
valid argument?
I would be ashamed. I mean, submitting this kind of brief should be grounds for
disbarment. The kind of careful
grafting together of non-sequiturs, begged questions, red herrings, inverted
implication (composition), ... .
Did they miss any at all?
Well, I'm at a loss for words. Anyway --
Collection of tiny on-off switches. No, collection of amplifiers being coerced
as switches. Constrained operation, only
the binary states at the extremes are used. The amplifiers operate in switching
mode (gate) and feedback mode
(oscillators, certain kinds of memory cells, logic gates, etc.) and several other
modes. And even the switches are more
than just on-off. Without multiplexing, you can't address a cell. And then
there's flash memory, more of an array of
capacitors (holes to be filled) than an array of switches, although you do
select the cell to read or write with a
multiplexor. (No, no, no, a multiplexor is not just an array of on-off
switches!)
The ENIAC. Are they going to suggest that anyone besides the patent owners on the
ENIAC should be allowed to patent
configurations of the cables? I mean, there's configuration and there's
configuration.
Configuring a series of pulleys, levers into a machine, yeah, there's invention
there. Implementation.
But, say you've "configured" a collection of more fundamental parts
into a programmable combination
microwave/conventional heating oven. Do we seriously want to suggest that
further configuration of the (designed to
be configurable) oven's memory cells (speaking extremely loosely) to
automatically bake a certain kind of bread is
somehow original enough to warrant a patent independent of the patent on the
oven? That could encumber the rights
of the owners of the patent on the oven?
Back to the ENIAC. It was designed to be configured. EVERY configuration of the
patented elements of the ENIAC
should be considered to be under the patent on the ENIAC, if the ENIAC is
patented. And, if not, since the
configurations are made available by the design of the inventor, then every
configuration should be considered
obvious.
Sure, it may not be obvious to attach an ENIAC to a missile. Or, rather, the
interface might be separately patentable
due to original invention necessary to connect the calculation outputs to the
controls. But that's not a configuration of
the ENIAC itself. It's a configuration of parts that are attached to the ENIAC.
Uhm, yeah, I'm not thinking of actually launching the original ENIAC. I'm
thinking of the micro-ENIAC or, if back in
the days of the original, of maybe radio control or of a device that selects and
feeds a flight table computed by the
ENIAC into the missile's controls. There's where you have something patentable.
CPUs are patentable. But if programs are patentable, we are going to have to
argue that there is something beyond
configuration going on in CPUs to argue patentable programs independent of the
patents on the CPUs. Not the
processing of symbols, not symbolic processing. Technically, the ENIAC had that.
Reprogrammability is a feature of
the function in question.
In order to argue patentability of software, we have to argue that the software
imparts a tangible existence to the
virtual machines being implemented, independent of the hardware the software is
running on. Microsoft's brief is
trying to argue that by obfuscation. (Yeah, Microsoft loves the way you can use
computers to obfuscate things.)
But even if you can convince a judge or jury of the tangible existence of the
virtual invention, you still have to show
that the existence of said invention is more correctly classified as the type of
invention covered by patent, rather than
the type of invention covered by copyright.
Symbolic processing is related to the question, but the real question is
complexity. (Not coincidental, that.)
Simple machines can only handle basic levels of complexity. They can't proceed
to a point, recognize an invalid state,
back up, and start down another path. (I'm talking about context free. And,
somewhere in the freedom-loving
philosophies of the 1960s, context-free existence somehow got valued as greater
than context-sensitive existence,
but that is just plain upside down.)
A machine that can back up and go down another path is at least context
sensitive. (Yeah, I'm mixing the engineering
term "machine" with the mathematical terminology of grammars.)
Humans operate in the range of context sensitive and unrestricted grammars. That
is, if we analyze our behavior
mathematically, we can (loosely) describe our behavior similarly to machines
designed to implement context sensitive
or unrestricted grammars.
Useful computer languages are specified as context sensitive grammars to make
them provable, but they are
implemented as unrestricted grammars.
CPUs, on the other hand, implement context free grammars. You have to add lots
of memory and programs that
access the memory in certain ways to get context sensitive or unrestricted
grammars.
Even then, actual computers in operation are technically context free machines
(sorry) implementing subsets of more
complex machines. Within the limits of memory, they behave as the more complex
machines. Very large memories
allow them to behave, for many practical purposes, as context sensitive or
unrestricted.
All patentable machines implement context free grammars. Anything that goes
beyond context free rightly belongs in
the class of invention known as literature. There is a somewhat grey area where
you implement a subset of a context
sensitive grammar, when a machine can reliably recover from a few simple error
states with minimal human
intervention. But when the machine is programmed, that is absolutely a
linguistic function, a literary function, and
should fall under copyright instead of patent.
I've wandered.
Anyway, Microsoft is trying to pile up as much indirectly related fact as
possible to hide the fact that they are saying
that literary works should be patentable, and deliberately trimming off branches
of the argument that would lead to
the recognition that programs are literary works, intended to be interpreted as
literary works by machines that
emulate intelligent behavior at a primitive level.
I have to go to bed. Lots of work tomorrow.
Joel Rees[ Reply to This | # ]
|
|
Authored by: Anonymous on Monday, November 02 2009 @ 04:46 PM EST |
You're arguing about architectures - von Neumann vs. Harvard, shared memory vs
memory segregated between instructions and data.
I guess the thrust of your argument is something like this:
- only the former can really implement a universal Turing machine;
- modern machines follow von Neumann, with shared data/instruction memory;
- being Turing machines, their software implements mathematics;
- Mathematics is not patentable subject matter.
This is a good strategy. But Microsoft lawyers will argue, correctly, that
modern machines are in fact a mixture of both. At the physical level - the
transistors they harp on about - there is indeed some segregation of data from
instructions, because that is efficient engineering.
It's worth pointing out though that the software everyone is worried about does
not take account of this. Indeed, high-level abstracted languages can't take
account of it because there are many different engineering shortcuts of the
Harvard sort. It's the job of a compiler to make a mixed architecture look like
von Neumann.
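A concrete, if rough, illustration in C (not strictly portable ISO C, and assuming a typical desktop OS where code pages are readable): a program can read its own machine code as ordinary data, which is exactly the unified von Neumann picture the compiler and OS present, whatever Harvard-style split caches sit underneath:

/* Sketch only: read a function's machine code as data. Works on common
   desktop systems; a strict Harvard microcontroller could not do this. */
#include <stdio.h>

static int square(int x) { return x * x; }

int main(void)
{
    const unsigned char *code = (const unsigned char *)(void *)square;
    printf("square(7) = %d\n", square(7));
    printf("first bytes of square, read as data: %02x %02x %02x %02x\n",
           code[0], code[1], code[2], code[3]);
    return 0;
}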
Upshot: for Microsoft to get so low-level as to talk about transistors is to
argue about the wrong hardware. [ Reply to This | # ]
|
|
Authored by: gvc on Monday, November 02 2009 @ 07:19 PM EST |
PJ,
Check Wikipedia, and you'll see that Turing machines predate ENIAC by about a
decade, and that ENIAC was a fully Turing-complete computer. So I think your
intro presupposes facts not in evidence.
I'm not sure how you, as a non-technical person, should adjudicate all the noise
in this thread. That's one of the unresolved issues about "open
source." If everybody has an opinion, and there's no consensus, how do you
resolve it?
Do you appeal to the wisdom of crowds, and if so, how do you measure the
prevailing opinion when all opinions are from self-selected authors? Or do you
appeal to authority, and if so, how do you establish the authority of those who
post?
Wikipedia has these problems, of course. But I trust you'll agree that
Wikipedia is more likely than a random opinion here to be correct.[ Reply to This | # ]
|
|
Authored by: /Arthur on Tuesday, November 03 2009 @ 09:24 AM EST |
The Microsoft amicus brief did not explain that modern computers are
programmed from symbolic information. Likewise it doesn't discuss the difference
in nineteenth-century patent law between patenting an Industrial Age device that
manipulates information and patenting the *information* in the device. Had they
explored this difference, they would have found that the hole patterns
in Jacquard punched cards and piano rolls are a much closer equivalent to
computer software than any Industrial Age physical
apparatus.
This tells us more about how Microsoft looks at its
own programs!
/Arthur [ Reply to This | # ]
|
|
Authored by: Anonymous on Tuesday, November 03 2009 @ 09:46 AM EST |
If Microsoft is correct and all software does is reconfigure the on/off switches
in a described manner, wouldn't running the same software twice in a row need to
produce the same device, i.e. one with the same configuration of switches?
Does starting Abiword ever cause it to be loaded in exactly the same memory
locations twice? Is it the same special purpose machine?
For sure the patented machine will never be the same as the machine I am running
it on.
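One can test this directly with a few lines of C (behaviour depends on the operating system; with address-space layout randomization enabled, as on most modern systems, the printed addresses usually differ from run to run):

/* The "same" program does not occupy the same memory locations, i.e. the
   same physical switches, on every run. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int on_stack = 0;
    void *on_heap = malloc(16);
    printf("stack object at %p, heap object at %p\n",
           (void *)&on_stack, on_heap);
    free(on_heap);
    return 0;
}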
I doubt the patent would ever list the configuration of on/off switches needed
to implement the invention, I doubt the inventor even could.
If the schema of switches is not supplied, doesn't the patent application lack
specificity?
If you load the allegedly patented program, there is only a random chance you
will actually replicate the "invention". How can you enforce a patent
when you cannot predict if it is actually going to be implemented by running a
specified program? Because you don't know if the switches will be configured
exactly in the way you described (if you ever did.)[ Reply to This | # ]
|
|
Authored by: Ian Al on Wednesday, November 04 2009 @ 06:02 AM EST |
My thoughts have been all over the place on this one.
I share the same problem
that most of the rest of you do. You know that computer programs on general
purpose computers are carrying out algorithmic processes on symbolic
representations of quantities and qualities. We know 'you can't patent
algorithms or scientific facts'.
I think I see Microsoft's approach. They
cite expert opinion that programs set and reset switches in computers:
"The company he founded became the International Business Machines
Corp., and the once-prevalent IBM punch-cards were both the direct descendent of
the means used to program a Jacquard loom and the immediate predecessor to
today's CDs and other media, which contain digitized
instructions thus altering paths. The role of software is
simply to automate the reconfiguration of the electronic pathways that was once
done manually by the human operators of ENIAC."
The Microsoft
argument starts much earlier. They start with the Bilski finding that a patent
can only be granted on a machine or material transformation of a tangible
article or substance. They start with the argument that the machine does not
have to be the invention, but what you do with the machine should be patentable
subject matter. But no particular "machine" is
required
They are really starting by saying that inventions are
applied science as opposed to pure science which is not patent subject material.
They point out that the inventive application of general machines and processes
should not, in itself, preclude patent awards.
While the popular
conception of "software" as something that is functionally distinct from
"hardware" can be useful, it tends to obscure our understanding of the physical
processes taking place within the computers all around us. This is reflected in
the commonly used term "software patent," employed by petitioners. So-called
"software patents" generally do not actually describe software at all, but
rather the process performed by a programmed computer. It is such a
computer-implemented process-- not software itself--that is potentially eligible
for patent protection. For this reason, the notion of "software patents" as a
category that is distinct from digital hardware patents lacks any coherent
technological or legal basis.
Purporting to analyze the patent-eligibility
of software, as opposed to that of hardware, relies on an illusory distinction.
The functionality of any digital device is the product of the same transistor
activity, and it is the configuration of the pathways between those transistors
that dictates their functionality. Like all patent-eligible processes,
computer-implemented processes combine physical activity with human-directed
logic. Irrespective of whether a particular configuration of transistors is
accomplished using a soldering iron or by means of software, the processes
conducted by these transistors are ultimately physical processes.
It follows
naturally from this understanding that the innovation that employs a computer to
do new and useful things is not necessarily encompassed by the innovations of
the transistor (or computer) itself--that is, a new way to use an existing
computer may itself be patent-eligible. ("The term 'process' ... includes a new
use of a known process, machine, manufacture, composition of matter, or
material"). Although the court dismissed this statutory text as "unhelpful" it
confirms that, among other things, each new application of computer technology
(at heart, each new use of transistors) which permits computers to perform a
useful function is the product of human innovation, the application of
principles to the functions of human needs.
In this respect, modern
computer-related inventions are no different from other patent-eligible
innovations that have produced a new and useful result by employing physical
structures and phenomena to record, manipulate, or disseminate
information.
If we can show that the programmer does not
know what switches are set and what paths are made then the setting of
the switches and paths (if any) is not part of the invention and is a moot
argument. The inventive subject matter must be what the inventor has created
and, with no link to the workings of the computer, that only leaves the form of
the program itself to be protected.
Many folk point out that, for general
purpose computers, programs do not set or change paths. They can only set or
change memory locations. Any other switch changes are a function of the computer
wiring and components. However, by using assembler language mnemonics and
compiling the resultant assembly language program the contents of registers and
the order of the processor's algorithmic steps can be controlled by the
programmer. This might be claimed as a descendant of changing the patches on the
patch panel of ENIAC. However, folk have commented that the different patch
patterns are no more patentable than the Jacquard loom patterns on the punched
cards.
TemporalBeing, whilst firmly being in the 'can't patent algorithms'
camp, points out that some programming, as in the case of embedded controllers,
is done in assembly language which is a direct representation of the binary
machine code instructions used by the processor, the data that will be processed
and the input and output to the real world that will do useful things. The
programmer is not setting pathways, but is directly setting switches rather than
using a higher level language which conceals that from the programmer. He went
on to comment that this is usually done to create an embedded component in a
larger machine and so we have to accept that such components may be of a
patentable nature. I have argued in previous articles that they still fail the
algorithm test and that only particular interconnections of electronic
components might qualify for patents. The pathways of embedded controllers are
not changed by the program.
Although I generated fierce opposing comment
when I asserted this, if assembly language code is run on a computer controlled
by an operating system and that code does not limit itself to the OS APIs and
other published software interfaces then it will interfere, probably
terminally, with the operating system. Folk point out that the processor knows
no difference between the machine code instructions from an assembler program
and those from a higher level language program, but I maintain that ignoring the
OS will result in a machine hang or a BSOD. The reason for this is that the
program will be changing memory states and peripheral configurations without the
OS being aware of it. The moment the OS does something on the basis of those
altered states then there will be a major operational failure. Machine code
programs used correctly with a modern OS's software interfaces are only replacing
the algorithmic steps of a higher level language and are not using in-depth
knowledge of the hardware architecture.
In previous articles I mentioned
Field Programmable Gate Arrays whereby the pathways are changed by the program
to create the specific circuit. The program is stored in memory on the circuit
board and the FPGA loads the program and creates the circuit when the board is
powered up. In this case, a very high-level language (the development tool) is used
and the programmer is only aware of a block diagram representation of what the
FPGA will do and is not aware of the detailed circuit design.
Vic pointed out
that 'the management of the memory space is an OS-level feature (except when the
OS does not have memory management, in which case it must be "handled"
elsewhere); the application will be handed some memory in which to operate. The
exact location of that memory is usually unknown to that application (and
irrelevant as well). It gets a memory space; a sibling instance will also get a
memory space, and the two will probably look
identical from the application's
perspective. But they are different memory spaces'. So the programmer
programming for a modern operating system (as opposed to a DOS) has no control
over the setting of any switches and there is no equivalence with the early
computers and programmed machines that Microsoft cite.
In summary, Microsoft
maintain that programs, once turned into binary blobs and loaded into a general
purpose computer, are no different from rewiring the computer with a soldering
iron to do a new and useful job. I hope I have shown that it is different
because, with a computer using a modern OS, like a smart phone, the programmer
has no knowledge of the detailed working of the hardware and is prevented by the
OS from knowing what detailed effect the program will have on the hardware.
Further, the programmer does not know which switches are switched by the program
unlike the programmers of the ENIAC. Programs do not change paths on a computer
with a modern OS and so the comparison with changing pathways with a soldering
iron or patching the ENIAC with patch cords is a false one. And, finally, back
to the starting point. The direct patching of the ENIAC does require an
understanding of what the hardware does. However, as PoIR points out in the
article, the ENIAC was a digital device, the first electronic machine able to
deal with the same kind of mathematical problems as differential analyzers.
Programming it with a mathematical problem is not patent subject matter. Then
general purpose computers were devised based on the concepts pioneered by Turing
and von Neumann's realisation of the importance of the stored-program concept
for electronic computing. They broke the conceptual link between inventions that
used programming in previous times and the inventions that are in widespread
use today. --- Regards
Ian Al
Linux: Viri can't hear you in free space. [ Reply to This | # ]
|
|
Authored by: Imaginos1892 on Wednesday, November 04 2009 @ 12:32 PM EST |
A modern digital computer - excluding the power supply, chassis, cables and
other parts that are
not relevant to its ability to be programmed - consists
of several integrated circuits containing
many billions of transistors,
diodes, resistors and capacitors which are interconnected to form
logic
circuits, registers, memory, instruction decoders, sequencers and many other
things. All of
those connections are permanently defined when the chips are
manufactured and the circuit boards
are assembled, and can never be changed
thereafter. Anything MS has put in their paper to imply
otherwise is either
a lie, or the result of a fundamental misunderstanding of how computers
work.
Which might explain some of the problems with their
software.
After the computer has been constructed, the only changes
possible are the voltages and currents
within the circuits. The computer
engineers have chosen to have certain voltage values at certain
nodes
represent binary bits set to 1 or 0 depending on the voltage present, allowing
the computer
to act as a binary finite state machine containing several
billion bits. From any possible state,
there are a number of defined
transitions to other states, and all possible states,
transitions,
and state sequences are already designed into the hardware. It
is not possible to "invent" a new
state that has not already been
anticipated by the designers. Some people will try to claim that
programming
an EEPROM modifies the circuit but that's not true either - it just places
persistent
stored charges in some parts of the circuits specifically
designed to hold them, which alters
current flow through adjacent parts of
the circuit.
A computer program is a list of state transitions. One of
the integrated circuit devices - the CPU
chip - reads the list in from
memory, interprets it as a sequence of instructions and operands,
performs
the specified state transitions, and hopefully performs some useful function. No
part of
the computer's circuits is modified in any way; only the voltages
and currents present in its
components change, in ways that the circuits are
specifically designed to support. And when a
user clicks on "Quit" or
switches the computer off, the program and whatever behavior or effects
it
bestows are gone without a trace.
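A hedged miniature of that description (purely illustrative, not a model of any real CPU): every possible state and transition is fixed in a table when the "machine" is written, the input merely selects among the pre-designed transitions, and nothing persists when the run ends.

/* Table-driven finite state machine: all states and transitions are fixed
   up front; the input only selects which pre-designed transitions occur. */
#include <stdio.h>

enum { EVEN = 0, ODD = 1, NUM_STATES = 2 };

/* next_state[current][input bit]: the complete, fixed transition table */
static const int next_state[NUM_STATES][2] = {
    /* EVEN: */ { EVEN, ODD },
    /* ODD:  */ { ODD, EVEN },
};

int main(void)
{
    const int input[] = { 1, 0, 1, 1 };   /* drives the state transitions */
    int state = EVEN;
    for (int i = 0; i < 4; i++)
        state = next_state[state][input[i]];
    printf("parity of the input stream: %s\n", state == ODD ? "odd" : "even");
    return 0;   /* the run ends; no circuit was modified along the way */
}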
Computer programming is not a
constructive or inventive process, it is a process of selection
and
exclusion - of paring down the vast number of possible successor states
at each point in the program
to the one that leads most efficiently toward
the required solution, or end state. At each line of
an assembly program I
select one of the machine instructions defined for the microcontroller I
am
working with and append one of the possible operands to get to the next
state. Did I hear "But
nobody uses assembly any more"? Not true. Some
operations are much more efficient in assembly,
and some can't be done at
all in C, like clearing all memory below the stack. C does not have a
syntax
for accessing the stack pointer register to make the comparison. On my current
project,
about 5% of the code is in assembly, and it's an important
5%.
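As a hedged example of the sort of thing C alone cannot express, here is one way to read the stack pointer using the GCC/Clang inline-assembly extension (x86-64 only; other compilers and architectures need different code):

/* GCC/Clang inline assembly, x86-64: copy the stack pointer register into
   an ordinary C variable, something plain C has no syntax for. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint64_t sp;
    __asm__ volatile ("mov %%rsp, %0" : "=r"(sp));
    printf("current stack pointer: 0x%llx\n", (unsigned long long)sp);
    return 0;
}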
High-level languages, standard libraries, system calls and other
such constructs simply represent
greater and greater degrees of aggregation,
reducing the programmer's choices to ever longer canned
state sequences in
which all the decisions have already been made. These constructs actually
limit
the programmer's flexibility, eliminating many approaches that would
be more efficient in terms of
code size and/or speed in order to make
programs easier to write and maintain. They are levels of
abstraction that
allow a programmer to operate at a coarser degree of granularity, but it does
not
matter to the computer. The CPU does not know whether you have linked a
pre-written library function
or pasted the instructions together yourself,
nor does it care.
The end result of running any computer program is the
conversion or translation of a bunch of bits
into a different bunch of bits.
That is all a microprocessor can ever do! The programmer and user
may
choose to agree that those bunches of bits represent some sort of object or
concept in the real
world, but that is irrelevant to the computer and to the
program. We are all able to communicate here
because we choose to agree that
certain patterns of bits in the video RAM of our computers
represent
characters, spaces, punctuation, words, etc. when processed and
converted into patterns of light on
our screens. None of that matters to our
computers, or to ibiblio or any of the internet gateways and
servers in
between.
Composing this post has been a lot like writing a computer
program. I started with an empty screen
that had the potential to hold any
possible combination of letters, numbers and symbols. For each
location I
selected one of those symbols which I hoped would lead efficiently toward my
goal of
communicating certain ideas and concepts. My choices at each point
were constrained by the conventions
of English spelling, grammar and syntax,
at least to the degree that I desired it to make sense to
a reader using
those conventions.
(ibiblio and PJ have imposed additional constraints;
some sequences of symbols are not welcome here)
Some of you have
mentioned programmable gate arrays but have also missed a similar point -
the
unprogrammed FPGA does nothing because it has ALL possible
connections "made". Programming it
involves removing or blocking the
undesired connections to leave only the necessary ones. How can
you patent
the result of a process of selective exclusion? Don't get me wrong, it's a lot
of work;
but so is writing a novel. Both deserve the same kind of
protection: copyright.
A phonograph is patentable. The specific pattern
of wiggly grooves on "Secret Treaties" is not.
A movie projector is
patentable. The sequence of sounds and images comprising "V For Vendetta" is
not.
A pipe organ is patentable. Playing "Toccata and Fugue in D Minor" on
it is not.
A computer is patentable. The collection of bits making up "DOOM"
is not.
---------------------------------
SCO's horse is dead, the race
is over, the prizes have been awarded, the bets are paid off, the
spectators
have gone home, it's dark, and the crickets are chirping, but out
on the deserted racetrack that fool jockey
Darl is still convinced that if
he just keeps flogging the dead horse long enough it's gonna win.
[ Reply to This | # ]
|
|
Authored by: Anonymous on Wednesday, November 04 2009 @ 03:22 PM EST |
A purely mechanical device made of iron or steel (such as a real world paper
clip) is something that can be patented (and has been in the past). So is a
general purpose metal-working machine such as might be found in a "machine
shop". So is a variant of that machine which can be configured or
programmed to perform a series of manufacturing steps again and again while the
factory worker is elsewhere occupied.
Walking up to a general purpose metal-working machine and manually operating it
to make the patented paper clip is not patentable, but is restricted by the
paper clip patent. Configuring the automated metal-working machine to make
thousands of patented paper-clips is not patentable but is still restricted by
the patent on the paper clip. A ready-to-use punched card or other template to
make this happen is not traditionally patentable, but remains restricted by the
paper clip patent (this is explicit in the legislation).
Computer programs are a lot like such ready-to-use templates: they should not be
patentable per se, but may be templates for making something that may or may not
be patentable independently of the involvement of a program.
A computer program which makes a common household computer behave like a common
household Television receiver might reasonably be covered by a patent on
Television receivers in general, but replacing the non-computerized Television
receiver by a suitably programmed computer should not be considered a new
non-trivial invention, just as changing the wooden box around the TV screen to
one made of plastic is not a patentable invention in its own right.
[ Reply to This | # ]
|
|
Authored by: Anonymous on Wednesday, November 04 2009 @ 03:34 PM EST |
I seem to recall another important historic court ruling to compare to: The
frequently cited old ruling (sorry, I don't have the precise name or number of
that case handy) that the copyright on a book introducing a certain accounting
method did not imply a monopoly on products implementing that method; only a
patent (which had not been applied for in time) would have done that.
Could anyone add any details (like the precise citation and quote)?
[ Reply to This | # ]
|
|
Authored by: Anonymous on Thursday, November 05 2009 @ 05:03 AM EST |
In an understandable manner, one could argue that
-it is generally known that computers can execute multiple programs at the same
time
-this is because a stored program computer remains a stored program computer
even if a program is loaded
The fact that a programmed computer remains a computer contradicts the 'special
purpose machine' doctrine, because a 'special purpose machine' is not a computer
and as such incapable of executing a second program.
We could make this theory more accurate by considering resource contention.
If two programs share a computers, there is resource contention: there is only
one disk/memory/cpu and it should be shared. (Like people sharing a road.)
Programs must have a protocol amongst themselves to do so in an orderly fashion.
(Like the traffic code.) Usually the operating system defines and enforces this
protocol. (Like the government.) The sharing is however not a property of the
operating system, but of the computer, in the same way a road can inherently be
shared and the government is only required as a facilitator.
(the argument is rather crude, but I think it is essentially correct)
[ Reply to This | # ]
|
|
Authored by: Anonymous on Friday, November 06 2009 @ 04:05 AM EST |
The biggest problem I have with Microsoft's account is this line, "Computer
programming is an exercise in reductionism, as every feature, decision, and
analysis must be broken down to the level of the rudimentary operations captured
by transistors turning on and off."
Computer programming is the opposite of reductionism; the reason high-level
languages were invented was to allow the programmer to ignore the concept of
logic gates and switches and focus on a much higher level. [ Reply to This | # ]
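A small illustration of the point, in C: the programmer deals in names, comparisons and library calls; nothing at this level mentions transistors or switches, and the same source runs unchanged on machines whose transistor layouts have nothing in common.

/* High-level code: names, comparisons and a standard library call,
   with no reference to gates or switches anywhere. */
#include <stdio.h>
#include <stdlib.h>

static int compare_ints(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;
}

int main(void)
{
    int scores[] = { 42, 7, 19, 3 };
    qsort(scores, 4, sizeof scores[0], compare_ints);
    for (int i = 0; i < 4; i++)
        printf("%d ", scores[i]);
    printf("\n");
    return 0;
}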
|
|
|
|
|