Authored by: jesse on Sunday, October 14 2012 @ 10:03 AM EDT
The problem with "objects", as I see it, is that the meaning depends on the context of use. A car, for instance, is an object. A data structure can be an object in the context of a particular programming construct. That mixes a word used for physical items with the vocabulary of data construction, in a paper aimed at beginners in software; the programming sense of "object" does not yet exist in the reader's mind.
Another way to explain the problem is that a programming object is a collection of symbols in a particular dialect, which again makes things confusing.
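To illustrate that dual meaning with a minimal Python sketch of my own (the class and attribute names here are invented for illustration), a programming "object" is nothing but symbols in the dialect of the language:

    class Car:
        # In program text, an "object" is just symbols: a name,
        # some attribute names, and the values bound to them.
        def __init__(self, colour, wheels):
            self.colour = colour
            self.wheels = wheels

    my_car = Car("red", 4)   # a symbolic construct, not a physical car

The name "my_car" denotes a collection of symbols, not the physical item a beginner pictures when reading the word "object".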
So avoiding the jargon of "object" is a good thing to do.
Authored by: soronlin on Sunday, October 14 2012 @ 10:20 AM EDT
Because it confuses matters. Objects are abstractions composed of multiple symbols, so the same arguments apply to them. Mathematics has objects, for example complex numbers and matrices. However, they are implemented, in both computer languages and other mathematical languages, as collections of symbols.
Anyone who has done high-school maths knows how to multiply two matrices by multiplying and adding their various elements. Although later coursework might just write A×B, the same algorithm is invoked. We can treat A as a single symbol or we can say it is a collection of symbols; both are correct. In a computer, each of the number symbols that make up A is itself a collection of the symbols that we call bits.
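To make that concrete, here is a minimal Python sketch of my own (not from PolR's article): however compactly we write A×B, evaluating it comes down to multiplying and adding the individual element symbols.

    def mat_mul(A, B):
        # Each entry of A x B is a sum of element-wise products:
        # the high-school rule, applied symbol by symbol.
        n, m, p = len(A), len(B), len(B[0])
        return [[sum(A[i][k] * B[k][j] for k in range(m))
                 for j in range(p)]
                for i in range(n)]

    print(mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
    # [[19, 22], [43, 50]]

Treating A as one symbol or as a grid of number symbols changes nothing about the computation that is performed.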
There is a complication there, because computers only deal in bits. The symbols called integers and real numbers are not exact copies of the symbols with the same names in mathematics; they behave differently. In mathematics, integers never overflow and real numbers never suffer from rounding errors. However, bits in a computer are the same as bits in mathematics. So if we confine the argument to bits, we keep it simple.
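Two short Python examples (again my own illustration) show the difference in behaviour:

    # Fixed-width integers wrap around: in an unsigned 8-bit register,
    # 200 + 100 cannot be represented, so the result wraps to 44.
    print((200 + 100) % 256)   # 44, not 300

    # Floating-point "reals" round: 0.1 + 0.2 is not exactly 0.3.
    print(0.1 + 0.2 == 0.3)    # False
    print(0.1 + 0.2)           # 0.30000000000000004

Mathematical integers and reals never do either of these things; the bits underneath, by contrast, behave exactly like mathematical bits.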
It is perfectly possible to make the same argument about any symbol in a computer, but to do so rigorously would involve proving that the computer's implementation of the symbol as a collection of bits is a valid implementation of the abstract mathematical symbol, within the confines of the particular algorithm. That is a whole lot of complex maths and arcane, abstract argument that would just confuse a court.
There's also a danger that a court would understand only half the argument and
conclude that "A=B+C" was not maths because computer numbers were not
the same as mathematical numbers.
So I think PolR chooses the right approach in talking about symbols in general, but only being rigorous about bits.
Authored by: PolR on Sunday, October 14 2012 @ 12:27 PM EDT
By objects you mean abstract objects? The courts have a hard time understanding what an abstract idea is, and abstract objects are a subcategory of this. Symbols are a concept that is easier to understand and more amenable to a fact-based analysis.
Read again the section titled "Mathematics Is a Language". Symbols are mentioned in the definitions of the fundamental concepts of mathematical logic. There is a strong theoretical basis for using symbols.