Is there a point to this?
Authored by: Anonymous on Thursday, November 29 2012 @ 11:24 PM EST
Writing software is "doing mathematics". Running the software on a
computer is also "doing mathematics". Mathematics encompasses both
coming up with proofs and then using the implications of those proofs to do
other useful things.

For example, consider the formula for calculating the derivative of a product of
functions:

(fg)' = fg'+f'g

In all calculus courses, students will use this formula to solve problems.

Students may be required to perform symbolic manipulation starting with the
product of two functions to get a second symbolic function. At this point, they
are doing abstract symbol manipulation using a known set of rules. This is
algebraic in nature and can be considered processing data and symbols using a
known algorithm. But the students are not coming up with a formula, they are
given an algorithm and they follow the predetermined steps to get the answer.
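
For example, with f(x) = x^2 and g(x) = sin(x), applying the rule mechanically
gives:

(x^2 sin(x))' = x^2 cos(x) + 2x sin(x)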

They may have problems with no real-world context where they are asked to
manipulate symbols and plug in numbers: "Find f'(3) where
f(x)=x^2-4x+3". This is a "plug and chug" problem, but it's still
doing math.
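
For concreteness, the worked answer to that exercise is just a mechanical
application of the differentiation rules:

f(x) = x^2 - 4x + 3
f'(x) = 2x - 4
f'(3) = 2(3) - 4 = 2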

They may also have word problems. For example, there are "related
rates" problems or other simpler things that might be expressed like:
"A vehicle travelling in a straight line shows an odometer reading of
f(t)g(t) at time t for t = 0 to 100. How fast was it moving at t=60?"
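
Applying the product rule from above, the answer is again just plugging into a
known formula; the speed at t=60 is:

(fg)'(60) = f(60)g'(60) + f'(60)g(60)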

These are "plug and chug" problems. The vehicle could either be real
or imaginary, but it's also still just doing mathematics. At this point, they
are doing abstract symbol manipulation and then computing things. But it's still
doing math.

In more advanced calculus courses, students will also see proofs of how to
derive the formula for a product of functions. They may even be given homework
problems or exam questions to come up with the proof of the formula, but they
are redoing work that was done by someone else.

But at some point, this formula didn't exist. Someone had to discover it and
prove that it was correct. There are questions over whether this was Newton or
Leibniz, but I will not get into that here. The point is that someone had to
"do mathematics" to even conceive of the idea of calculus and then
prove that certain algorithms were correct so that the rest of us can "plug
and chug". At that point "doing math" meant discovering
interesting patterns of abstract symbols to manipulate in interesting ways.

All of these steps map to writing software in the following ways:

1. When people come up with previously unknown algorithms (to them anyway), they
are doing the kind of math Newton/Leibniz did when they discovered calculus.
They are coming up with rules for symbol manipulation that they haven't
discovered before, and they're not just looking up the answer or using someone
else's answer. As in the story of the apple falling from the tree, these
discoveries may even be motivated by real-world problems, but that doesn't stop
them from being math. Also, the concept of rigorous formal proof wasn't widely
accepted and used at the time. In fact, Newton's "proof" of
(fg)'=f'g+g'f would not be considered rigorous today. But it's still math. It's
coming up with an algorithm and attempting to make an argument that the
algorithm is correct.

Almost nobody proves that algorithms are correct, but there are techniques used
in programming to try to get close to that goal. Hoare logic was an attempt to
rigorously prove software correct, but AFAIK it isn't used much. However, the
concepts of preconditions, invariants and postconditions are often used to
annotate functions or interfaces.
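
As a small illustration of that kind of annotation, here is a sketch in Python
(my own hypothetical example, using plain assert statements rather than any
particular contracts library):

import math

def integer_sqrt(n):
    """Return the largest integer r with r*r <= n."""
    # Precondition: n must be a non-negative integer.
    assert isinstance(n, int) and n >= 0, "precondition violated"

    r = math.isqrt(n)

    # Postcondition: r really is the floor of the square root of n.
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r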

Also, unit tests and other types of manual and automatic tests are an example of
"proof by example". Proof by example isn't a legitimate form of proof
(unless all possible examples are tested), but a good collection of test
examples can make it pretty likely that the underlying algorithm is correct. To
me, this is about percentages and not certainty.
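
To make "proof by example" concrete, a test along these lines (hypothetical
names, using Python's standard unittest module) checks the product rule at a
handful of points instead of proving it in general:

import unittest

def f(x):  return x * x          # f(x) = x^2, so f'(x) = 2x
def fp(x): return 2 * x
def g(x):  return 3 * x + 1      # g(x) = 3x + 1, so g'(x) = 3
def gp(x): return 3

def product_rule_derivative(x):
    # (fg)'(x) = f(x)g'(x) + f'(x)g(x)
    return f(x) * gp(x) + fp(x) * g(x)

class TestProductRule(unittest.TestCase):
    def test_matches_finite_difference(self):
        h = 1e-6
        for x in [0.0, 1.0, 2.5, -3.0]:
            numeric = (f(x + h) * g(x + h) - f(x) * g(x)) / h
            self.assertAlmostEqual(product_rule_derivative(x), numeric, places=3)

if __name__ == "__main__":
    unittest.main()

Passing these checks raises confidence without amounting to a proof, which is
exactly the percentages-not-certainty point above.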

2. When an algorithm is discovered, it can then be used and dissected by
other people for use in their own problems. This is akin to proofs and formulas
getting dispersed and refined and optimized, and perhaps improved upon.

The algorithm then gets re-implemented by other people, and some of them will
understand the implementations while others may just copy the symbols and pray
it all works. This is akin to the process of teaching students the proof of the
product rule in calculus. A lot of students will see it, and some may understand
it, but a lot probably won't. The point is the process of implementing or
understanding an algorithm is akin to reading a proof and trying to understand
it. Sometimes proofs are read more for results, and sometimes they are read more
for techniques. This is akin to using an algorithm as a small piece of doing
something else vs. looking at an algorithm that doesn't do what you want but
might be close, so you try to understand it so you can modify the parts you need
to do what you want to do.

3. When an algorithm chugs away in a computer with numbers plugged into it, this
is akin to the "plug and chug" problems on a calculus exam. The
formula or algorithm is there, and the students just have to plug numbers into
it to get the answer.
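
In code, that "plug and chug" step is nothing more than evaluating a formula
someone already worked out; for the earlier exam example it is a couple of
lines of Python (illustrative only):

def f_prime(x):
    return 2 * x - 4   # derivative of f(x) = x^2 - 4x + 3, found on paper

print(f_prime(3))      # prints 2, the same answer as the exam problem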

The point is that all the steps of doing actual mathematics (conceiving of and
writing a proof, having others dissect and improve the proof, and then using the
formulas and algorithms in the proof to do other things) are math.

These steps map directly to coming up with an algorithm based on a problem,
having others re-implement and reuse parts of an algorithm, and then finally
"plugging and chugging" using the algorithm in a computer to actually
compute something.

I will end with some interesting observations:

1. Newton and Leibniz came up with calculus at approximately the same time. This
is akin to the problem of multiple people coming up with an algorithm at the
same time. Many people may come up with similar solutions to math problems, so
it is probably not a good idea to give a person a monopoly on certain
mathematics for 20 years when other people are most definitely thinking about
similar ideas. Given a state of understanding of a field and a large number of
practitioners, it is quite likely that multiple people will come up with similar
solutions at approximately the same time. At the very least, independent
discovery of mathematics should be protected from patent infringement claims.

2. There is no difference between a "math problem" and a "word
problem". An application of an algorithm doesn't stop being math just
because someone decided that the symbols used in it have some real-world
meaning. It also means that a patent on a specific use of an algorithm covers
all uses of that algorithm. If a patent is mostly a description of a problem,
and if different algorithms that produce a similar result are considered
equivalent, it may even stop people from using a whole family of algorithms.

3. Toasters toast. Blenders blend. Computers compute. All computers do is compute
things, which means they are doing mathematics. They just do the "plug and
chug" kind of mathematics where the algorithms are already known. As we see
in the product formula example, if we consider fg to be an algorithm to
calculate the value of a certain function at a point, we can have algorithms
that turn algorithms into other algorithms by symbolic manipulation. All
computers can do is compute, but they do it quickly. Each layer above the
transistors and wires in a computer is a discrete mathematical abstraction of
the layer below it. This is akin to composing a finite set of functions, one for
each layer, in order to obtain an incredibly complex final function. But it's
still a function because the composition has a finite number of functions in it,
and each underlying function is also finite and discrete. However, it's still
mathematics.
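
A sketch of the "algorithms that turn algorithms into other algorithms" idea
(my own illustrative code, with a made-up tuple representation of expressions)
is a tiny symbolic differentiator that applies the product rule to an
expression tree:

# Expressions: ("x",) for the variable, ("const", c), ("add", a, b), ("mul", a, b)
def deriv(e):
    kind = e[0]
    if kind == "x":
        return ("const", 1)
    if kind == "const":
        return ("const", 0)
    if kind == "add":
        return ("add", deriv(e[1]), deriv(e[2]))
    if kind == "mul":
        # Product rule: (fg)' = f g' + f' g
        f, g = e[1], e[2]
        return ("add", ("mul", f, deriv(g)), ("mul", deriv(f), g))
    raise ValueError("unknown expression kind: %r" % (kind,))

# d/dx (x * x) comes back as x*1 + 1*x, i.e. 2x before simplification
print(deriv(("mul", ("x",), ("x",))))

The input and the output are both descriptions of functions, so the program is
manipulating algorithms as data, which is exactly the kind of symbol pushing
described above.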

4. Software consists of algorithms, but generally not proofs of those
algorithms. However, there are techniques used in software development that try
to help make it more likely that an algorithm is correct. People who write
software would then be performing the same kinds of steps Newton might have
done after seeing the apple fall from the tree in the story. Developers will
probably not get to the point of proving that their algorithms are correct, or
even come up with a whole general theory, but there are steps software
developers take (such as pre- and postconditions and automatic tests) to at least
somewhat prove and shore up the rigor of their algorithms. Computer scientists,
however, seem to spend a lot of time proving that things are correct.

5. There's no difference between doing mathematics on paper vs doing it on a
computer. The computer just happens to do it faster. But, it can still only do
things that it's been instructed to do. Even if it appears that the computer
"came up with an algorithm", it really didn't. It's akin to finding
the derivative of a function. There was some algorithm for creating other
algorithms that was used.

The emphasis in math is on proofs, and the algorithms either motivate the proof,
or fall out of the proof, whereas in software development, the emphasis is on
algorithms with just enough "proof" to make it likely that the
algorithms work. But remember, Newton didn't really "prove" calculus
according to the rigor that would be required today, but he was still doing
math. Computer science bridges this split with people working in very
proof-oriented areas like computational complexity, and others working in applied
areas where statistics and benchmarks are more important.
