...Actually, I disagree that that's the distinction. I see the distinction as being between abstract and concrete. Software can only ever take numbers and alter them (in myriad and labyrinthine ways).
Programs and programming languages, from machine language to JavaScript, have been designed to leverage progressively more basic underlying programs in order to make it easier to instruct a computer. The fact that many once-difficult things are now easy is well known to those versed in the arts, but may not be obvious to the layman. Stuff that's done in software is really easy *now* because of all of the algorithms that have already been vetted in the past (math is often like that).
Software is abstract, not concrete (it cannot cause a direct physical effect). That's part of the *reason* that algorithms and thoughts are not the same thing as inventions.
Ideas and phenomena exist now. They are discovered, not created. Inventions do *not* exist now, and must be created. They may rely on a recently-discovered idea, but it is not the idea that makes the invention...it's the new machine that is needed to bring the invention to life.
The phenomenon of electromagnetism existed before it was discovered. The idea for a device that could take a voltage or current travelling through some conductive wire and multiply or divide it based on the number of windings in that wire and another wire in close physical proximity was not, in and of itself, an invention. The physical transformer *was*.
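The winding relationship described there is the standard ideal-transformer turns ratio (the voltage scales with the ratio of secondary to primary windings). A minimal sketch of that relationship, with hypothetical function and parameter names:

```python
def secondary_voltage(v_primary: float, n_primary: int, n_secondary: int) -> float:
    """Ideal transformer: secondary voltage scales with the turns ratio N_s / N_p."""
    return v_primary * (n_secondary / n_primary)

# Step-down: 120 V across a 100-turn primary, 10-turn secondary -> 12 V
print(secondary_voltage(120.0, 100, 10))

# Step-up: 10 V across a 10-turn primary, 100-turn secondary -> 100 V
print(secondary_voltage(10.0, 10, 100))
```

The point the comment makes survives the sketch: the ratio itself is a discovered relationship; only the physical device that exploits it is the invention.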
I think the fundamental difference in viewpoint of programmers and non-programmers hinges on how deeply one understands math and how deeply one understands the operation of today's general-purpose computers (PCs, smart/dumb-phones, mainframes, etc).

--- 'Murphy was an optimist' -O'Toole's Commentary on Murphy's Law