Authored by: Anonymous on Saturday, October 13 2012 @ 11:43 PM EDT |
Anything that a modern computer can calculate can be calculated using
nothing but AND, OR and NOT gates. You can build hardware using
nothing but AND, OR and NOT gates which executes the same instruction
set as any modern digital computer.
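As a toy illustration (not part of the original comment), here is a one-bit full adder, the building block of binary addition, expressed with nothing but AND, OR and NOT:

```python
# A one-bit full adder built purely from AND, OR and NOT.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    # XOR expressed in terms of the three basic gates
    return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, carry_in):
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return total, carry_out

print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 = binary 10
```

Chaining these adders bit by bit gives multi-bit addition, which is the core of an arithmetic unit.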
But that's not even the point! The point is that computation is abstract! It
is a mental process that manipulates symbols. An algorithm is a set of
symbol-manipulating steps that can be carried out by you, with pencil and
paper (or even in your head) but can also be carried out mechanically by
a computer circuit (or by a robot holding a pencil and paper... or by a
Turing machine printing symbols on a paper tape, and reading them
back).
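The picture of a machine printing symbols on a tape and reading them back can be made concrete. A minimal Turing machine simulator, with a standard textbook rule table (an illustration, not from the comment) that increments a binary number:

```python
# A minimal Turing machine simulator: a rule table maps
# (state, symbol) -> (new state, symbol to write, move L/R).
def run_tm(rules, tape, state="right", blank="_"):
    cells = dict(enumerate(tape))
    pos = 0
    while state != "halt":
        sym = cells.get(pos, blank)
        state, write, move = rules[(state, sym)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Rules for binary increment: scan right, then carry back to the left.
rules = {
    ("right", "0"): ("right", "0", "R"),  # scan over the number
    ("right", "1"): ("right", "1", "R"),
    ("right", "_"): ("carry", "_", "L"),  # end of number: start carrying
    ("carry", "1"): ("carry", "0", "L"),  # 1 + carry = 0, carry continues
    ("carry", "0"): ("halt",  "1", "L"),  # 0 + carry = 1, done
    ("carry", "_"): ("halt",  "1", "L"),  # overflow: new leading 1
}
print(run_tm(rules, "1011"))  # 1100, i.e. 11 + 1 = 12
```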
It doesn't matter whether your computer hardware is made of logic gates
or memristors or Legos or giant insects or anything else. The computer
_programs_, the software, are mathematics and the algorithms used in
that software are mathematics and the _universal algorithm_ implemented
in the hardware is mathematics.
COMPUTATION IS MATHEMATICS. When computers execute code,
they are doing mathematics. Everything about it is mathematical, from top
to bottom.
Authored by: PolR on Sunday, October 14 2012 @ 12:06 AM EDT |
I doubt any computer has ever existed that consisted only of logic
gates. I know modern computers have resistors, capacitors, oscillators, fans,
mechanical storage devices, environmental sensors, keyboards, mice, monitors,
Ethernet ports, etc. All of these can and do impact calculations.
The argument doesn't depend on this. The real argument is about
the symbols, syntax and meanings. The boolean gates stuff is just a pedagogical
way to drive this point home.
For example, one of the first things
you learn about Turing machines is that one Turing machine can emulate two
separate Turing machines. The problem is, those Turing machines are assumed to
stay in lock step. Two separate CPUs with two separate oscillators will never
stay in lock step, so you need a random factor for one CPU to emulate two.
You are getting more technical than the scope of this
article.
The contents of the file linked to in the Sources section of the
article explain how multiple non-synchronized CPUs are handled in the language
of lambda-calculus. Given that lambda-calculus is equivalent to a Turing machine,
this demonstration doesn't suffer from the deficiency you mention.
Besides, the argument given in this article doesn't rely on Turing machines. Your
objection is moot.
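For reference, the classical construction by which one machine simulates two can be sketched with Python generators standing in for the machines' step functions (the names are illustrative, not from the article). Note that the simulated machines need no shared clock: the simulator's own control state decides whose turn it is.

```python
# Two "machines", each modeled as a sequence of computation steps.
def machine_a():
    for i in range(3):
        yield f"A step {i}"

def machine_b():
    for i in range(3):
        yield f"B step {i}"

def interleave(m1, m2):
    # The single simulating machine: one step of m1, one of m2, repeat.
    # No lockstep or shared oscillator is required; the simulator
    # alone schedules the steps.
    done1 = done2 = False
    while not (done1 and done2):
        if not done1:
            try:
                yield next(m1)
            except StopIteration:
                done1 = True
        if not done2:
            try:
                yield next(m2)
            except StopIteration:
                done2 = True

print(list(interleave(machine_a(), machine_b())))
```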
Furthermore, the proof that one Turing machine
is as good as two assumes that there is no data corruption. In the real world,
this isn't true either.
The definition of algorithm in
mathematics includes an explicit assumption that *if* the procedure is followed
without error the answer will be correct. Note the condition. It is not said that
no error must ever be made. It says the answer is correct only when no error is
made during execution. This is why a theoretical proof covers only the case
where no error occurs. It is understood that things may not work as expected
when the procedure is not followed correctly, and this makes theoretical analysis
pointless. Unless, of course, the point of the algorithm is to automatically
detect and correct errors.
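A minimal sketch of such an algorithm (an illustration, assuming nothing beyond the comment): a triple-repetition code whose whole point is to correct a single flipped bit per triple.

```python
# Triple-repetition code: send each bit three times, decode by
# majority vote. Any single flipped bit per triple is corrected.
def encode(bits):
    return [b for b in bits for _ in range(3)]

def decode(coded):
    out = []
    for i in range(0, len(coded), 3):
        triple = coded[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)  # majority vote
    return out

msg = [1, 0, 1]
sent = encode(msg)           # [1,1,1, 0,0,0, 1,1,1]
sent[4] = 1                  # corrupt one bit in transit
assert decode(sent) == msg   # the message is still recovered
```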
I don't believe you can argue that
software that interacts with people is mathematics without arguing that people
are mathematics.
Did you read the article until the end? It sounds
like you didn't notice the contents of the last section. People interacting with
a computer are inputting symbols into the machine and reading the symbols coming
out of the machine. This is still manipulation of symbols, isn't it?
You will find
in the file mentioned above how this particular issue is resolved.
If the goal is to argue that software that runs on real hardware
is mathematics, you're going to have to abandon Turing machines and Turing
computable and the definition of algorithm you learned in CS.
Register machines and RASP do just fine. Lambda-calculus does
just fine. There is more to computation theory than Turing machines.
Read the
material I have presented in the Sources section of the article. Read the
definition of Standard ML. Read the PDF file I have linked to both in the
article and this comment. You will even find in this PDF how to solve in
lambda-calculus the very problems you said I didn't solve. You will find how to
handle interactions with external sources of input and destinations of output.
You will find how to handle randomness and much more.
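As a small illustration of the lambda-calculus point (a sketch, not taken from the linked PDF): Church numerals carry out arithmetic using nothing but function abstraction and application, here transliterated into Python lambdas.

```python
# Church numerals: the number n is encoded as "apply f n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Read a Church numeral back out as an ordinary integer.
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(plus(two)(three)))  # 5
```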
Authored by: Anonymous on Sunday, October 14 2012 @ 12:03 PM EDT |
Hang on a sec, you've just brought the word "real" in there...
Software is NEVER written to run on a REAL machine (well, that's probably not
quite true, BIOSes and drivers and stuff like that... but even then it's written
to a Turing machine) but software is ALWAYS written in a language (even if
that's machine code), and ALL languages are interchangeable (that's a
mathematical proof) and all languages target an ideal (or Turing) machine. The
fact that simple Turing machines can be implemented in hardware means that that
software is capable of running on real machines :-)
So basically, there is a clear line between software (that is written to a
Turing machine), and the hardware that emulates a Turing machine and runs the
software. Hardware running a program isn't maths, but the program itself is.
Cheers,
Wol
Authored by: Anonymous on Sunday, October 14 2012 @ 03:36 PM EDT |
It is gates all the way down.
But Richard Feynman tells a story from his days with the Manhattan
Project. One of the best computers available in 1944 was on its way and as
the team waited, they tested the algorithm/program by assigning tasks to
people and then simulating the processing with the data they were
intending to feed the machine. Their throughput
exceeded the promised specs of the machine they had ordered!
Donald Knuth, in The Art of Computer Programming, likens an algorithm to a recipe.
In short, a machine is not necessary for instructions to be algorithmic.
Authored by: Anonymous on Monday, October 15 2012 @ 02:56 AM EDT |
Software is written to run on an *abstract machine model*. Software remains
purely abstract symbol manipulation.
There are very few, if any, exceptions. (If you find one, it might be
patentable. The last one I saw was the "software to make the hard drive
crawl across the floor".)
Combining particular hardware with software to achieve bizarre behavior like
making your hard drive 'walk' -- that might be patentable.
A new hardware design is obviously patentable.
Software which is designed to run on an *abstract machine model*, like 99.99% of
all software, is just pure symbol manipulation, and is not patentable.
Authored by: Anonymous on Monday, October 15 2012 @ 02:43 PM EDT |
Very nice point. Don't worry about those replies that say there are only logic
gates to be considered when speaking of software. See
http://en.wikipedia.org/wiki/Hardware_random_number_generator
Software is built that relies on such inputs (it would not function as intended
without a proper random number generator).
Software is more than what PolR makes it out to be; he/she (?) doesn't
consider the input-output relation (which is more complicated in the online
computation that you introduce), even in the section "Sources".
The best example I can think of is Eliza, http://en.wikipedia.org/wiki/ELIZA
To describe software such as this as math is just simply stretching definitions.
It is of course striking that the PolR article doesn't offer a definition of
software against which ELIZA can be matched.
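For reference, ELIZA's core mechanism is keyword spotting plus template substitution. A toy sketch (the rules below are illustrative, not Weizenbaum's originals):

```python
import re

# Each rule is a pattern to spot and a response template to fill in.
rules = [
    (r"i am (.*)",   "Why do you say you are {0}?"),
    (r"i feel (.*)", "Do you often feel {0}?"),
    (r".*mother.*",  "Tell me more about your family."),
]

def respond(text):
    # Normalize the input, then try each pattern in order.
    text = text.lower().strip(".!?")
    for pattern, template in rules:
        m = re.fullmatch(pattern, text)
        if m:
            return template.format(*m.groups())
    return "Please go on."  # fallback when no keyword matches

print(respond("I am tired of this debate."))
# Why do you say you are tired of this debate?
```

Whether one calls this symbol manipulation "math" is exactly the question the thread is debating; the sketch only shows the mechanism.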
- Very nice point - Authored by: Anonymous on Monday, October 15 2012 @ 05:59 PM EDT
- Another angle? - Authored by: Anonymous on Monday, October 15 2012 @ 07:12 PM EDT
- Another angle? - Authored by: PolR on Monday, October 15 2012 @ 07:25 PM EDT
- patentable? - Authored by: Anonymous on Monday, October 15 2012 @ 07:55 PM EDT
- Another angle? - Authored by: Anonymous on Monday, October 15 2012 @ 08:01 PM EDT
- Another angle? - Authored by: PolR on Monday, October 15 2012 @ 08:13 PM EDT
- Another angle? - Authored by: Anonymous on Monday, October 15 2012 @ 08:58 PM EDT
- Another angle? - Authored by: PolR on Monday, October 15 2012 @ 09:21 PM EDT
- Another angle? - Authored by: Anonymous on Monday, October 15 2012 @ 09:34 PM EDT
- What is math - Authored by: Anonymous on Wednesday, October 17 2012 @ 01:56 PM EDT
Authored by: celtic_hackr on Monday, October 15 2012 @ 09:50 PM EDT |
You say computers are more than logic gates. Certainly true, but misleading at
best.
The question isn't whether computers are composed of more than logic gates, but
whether the Central Processing Unit is composed of more than logic gates.
But more basic than that is what are logic gates constructed of?
I can build logic gates from nothing but diodes, resistors, and capacitors
(although modern logic gates are built from transistors, or resistors and
transistors). So what's your point about computers needing more than logic gates?
The rest of your argument in that first point is also wrong. As I sit here
typing on a keyboard, I have another computer to my right which has no mouse, no
keyboard, and no mechanical storage device. Oscillators can be built from
transistors, so that point is irrelevant too.
As for the rest of your comment, well perhaps. I'm not a big fan of using the
software-is-math argument. But I won't say it is not. Nor am I a big fan of
applying the Turing test to ideal or real computers. I think it's a dead and
irrelevant horse. Am I being sacrilegious? Perhaps.
The problem is, software is being granted patents for things that are:
a) obvious,
b) too vague,
c) already discovered,
or d) all of the above.
There have been a few software discoveries that perhaps were/are worthy of a
patent. But those numbers are extremely small, if not non-existent.
The biggest issue is there is no one qualified to evaluate software patents in
the entire chain of the patent process.