Authored by: BitOBear on Sunday, October 14 2012 @ 10:30 AM EDT |
Computers, long ago, were actually "digital" as in based on the ten digits,
zero (0) through nine (9). These digital electronic computers were designed
after the mechanical calculators whose wheels encoded the ten distinct
values as unique positions of a cog.
The distinction here is that each memory "bit" had ten unique values.
This system sucked for various reasons, so it was abandoned.
We call modern binary computers and on-off signaling "digital" because,
when the actual transition from base 10 (digital) to base 2 ("binary",
or more correctly "discrete") technology took place, the _culture_ was
newly and firmly invested in the word "digital" for its own sake as an
icon of future technology.
Technically you are watching "discrete" cable, not "digital" cable, and
nearly everywhere you normally use the word "digital" in your
understanding it is, by strict definition, incorrect.
It was easier to _redefine_ the word digital than it was to steer the ship of
dreams to the correct word: "discrete".
So really, the entire industry is plagued by marketing from root to tip. So much
so that even the technologists have given up on making other technologists
think rigorously.
There is no reason to be surprised that the mish-mash of jargon that is
modern computing is antithetical to the precision of language needed for the
law.
8-)
Authored by: Ian Al on Sunday, October 14 2012 @ 12:58 PM EDT |
I'm nodding furiously, but I would add that, as well as signed and unsigned
numbers, processors don't really understand numbers. Treating words made up of
binary bits as though one bit is more significant than another is an endian
concept in the processor designer's mind. Processors really only 'understand'
AND, OR and NOT.
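To illustrate the point above (my own sketch, not part of the comment): everything a processor does with "numbers" can be decomposed into AND, OR and NOT gates. Here XOR, a one-bit full adder, and multi-bit addition are built from only those three operations; which bit counts as "more significant" is purely a convention we impose on the bit list.

```python
# Building arithmetic out of nothing but AND, OR and NOT (illustrative sketch).

def AND(a, b): return a & b & 1
def OR(a, b):  return (a | b) & 1
def NOT(a):    return (~a) & 1

def XOR(a, b):
    # XOR expressed via the three primitive gates: (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, carry_in):
    # One bit position: returns (sum bit, carry-out bit)
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return total, carry_out

def add_bits(x_bits, y_bits):
    # Add two equal-length little-endian bit lists.
    # The "significance" of each position exists only in our reading of the list.
    carry = 0
    out = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out

# 6 (little-endian: 0,1,1) + 3 (1,1,0) = 9 (1,0,0,1)
print(add_bits([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1]
```

The adder never "sees" a nine; it only shuffles bits through gates, and we choose to read the result as a number.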
Your comment about a system comprising a multiplicity of servers and clients
interconnected by an interweb is spot on. Even a functional description should
be limited to the symbolic meaning of the data flowing between the computers and
the algorithmic manipulation of the symbols in each computer.
That could be done by describing the symbol manipulation at the highest level of
the software rather than the symbols used at processor level, since the high-level
software is what the inventor directly understands (allegedly!). But any
description of an invention divorced from the algorithmic layers in the
software has no specific meaning in patent law, because those layers are the
'machinery' of the invention.
Oh yes, did I mention that a process is not a machine or a system? If you have
invented a process, don't confuse us by patenting a system devised to run the
process.
---
Regards
Ian Al
Software Patents: It's the disclosed functions in the patent, stupid!