WRONG | 758 comments
Math Is Manipulation of Symbols
Authored by: BitOBear on Sunday, October 14 2012 @ 10:39 AM EDT
Most computers are _incapable_ of loading a single bit (i.e. exactly one zero or
one) from any memory source anywhere. And even where you _can_, that single bit
will be zero-padded when read, or masked off when written, to/from the register
where it lived with its siblings.

The get-single-bit instructions are only around ten years old, and there are
only a couple of them.

Most computers must load at least a byte (eight bits) at once.

Any computer running Windows CE can only load a 32-bit word at once (each
character in a string is 32 bits long in Windows CE, and there is no legal
"char" type). This was to "simplify" the instruction set, but
it explains why the platform was such a memory pig. Plus it had to explode
everything it got from the internet from 8-bit to 32-bit representation by doing
this ugly copy-shift-mask operation on every freaking byte. Go Microsoft Design
Guys! 8-)

So really, you cannot operate on bits, only bytes (i.e. symbols), in (almost
all) computers.
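The copy-shift-mask pattern described above can be sketched in C (the function name is illustrative, not from any real API):

```c
#include <stdint.h>

/* Extract bit n from a byte. The hardware cannot fetch a lone bit:
 * the whole byte is loaded into a register, then shifted and masked
 * to isolate the one bit -- every sibling bit is masked off. */
int get_bit(uint8_t byte, unsigned n)
{
    return (byte >> n) & 1u;
}
```

For example, `get_bit(0xA5, 0)` loads the full byte `1010 0101`, shifts it right zero places, and masks all but the lowest bit, yielding 1.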

[ Reply to This | Parent | # ]

not to get repetitive...
Authored by: BitOBear on Sunday, October 14 2012 @ 10:47 AM EDT
There are _lots_ of instructions that manipulate zero symbols. You have
previously said "they don't count" because they are "just
housekeeping". They still make the difference between programs that work and
those that don't.

Those instructions include, but probably aren't limited to:

Memory barriers (read barrier, write barrier, full barrier).
Hlt (a non-deterministic "pause", on the Intel platform).
Lock (bus locking, i.e. preventing others from reading or writing for one
instruction cycle).
NoOp (do nothing for one instruction cycle).

That is, these instructions neither read, write, nor manipulate any symbol at
all, and yet they are necessary to prevent or enable a myriad of behaviors
within real modern computers.

You don't agree. I am not going to bang my head against your wall of denial
again here. I'm not even going to read the response, since we have reached an
impasse over this before.

Their existence in the real world remains a mute accusation of the non-math
basis of _some_ computer programs, whether you agree or not.
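A minimal sketch of such zero-symbol operations, using C11 fences (which compilers lower to barrier instructions like those named above; the inline assembly is an x86-only illustration):

```c
#include <stdatomic.h>

/* These operations load and store no data value at all, yet they
 * constrain ordering and timing in real machines. */
void zero_symbol_ops(void)
{
    atomic_thread_fence(memory_order_acquire);  /* read barrier  */
    atomic_thread_fence(memory_order_release);  /* write barrier */
    atomic_thread_fence(memory_order_seq_cst);  /* full barrier  */
#if defined(__x86_64__) || defined(__i386__)
    __asm__ volatile ("pause");  /* spin-wait hint on Intel */
    __asm__ volatile ("nop");    /* do nothing for one cycle */
#endif
}
```

None of these lines names a memory cell to read or a value to compute, which is exactly the point at issue.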

[ Reply to This | Parent | # ]

WRONG
Authored by: Anonymous on Sunday, October 14 2012 @ 11:34 AM EDT
WRONG.

That math is an interpretation of lower-level math that tells the
device to send certain beams of light, and that device is
math, hard-coded to tell the atoms and molecules, in proper mathematical
sequences, to do stuff.

One could truly argue that all devices are in fact math,
because they are doing a sequential system of math. Go on,
USA, try and keep your backwards systems, because the rest of the
world has stopped laughing. It's not funny any more, it's sad...
no wonder your world rank in math is 32...

[ Reply to This | Parent | # ]

Some followup on your description:
Authored by: Anonymous on Monday, October 15 2012 @ 03:17 AM EDT
Computer instructions are of the following types:
1) Do a computation: take the contents of memory cell A and the contents of
memory cell B, and use them to compute a value, which is then put in memory
cell C.
2) Wait.

There are a couple of things the computer does *without* instructions.

Input is one of them. Input, in a typical modern computer, just gets dumped
into particular memory cells by the input subsystem. (To see whether they've
changed, software must issue instructions to copy the value of those cells to
other cells on a regular basis.)

Output is another. The output subsystem in a modern computer typically simply
reads a particular set of memory cells and blasts them to the output device at
some interval.

(There are some other I/O methods, but they actually don't add anything
interesting to the process of writing software. It still is just a matter of
writing a completely abstract list of computational steps, and encoding those
steps as numbers for the "Universal Computing Machine" to deal with.)
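The polling scheme above can be simulated in ordinary memory (the cell names are invented stand-ins for real device registers, which live at platform-specific addresses):

```c
#include <stdint.h>

volatile uint8_t input_cell;  /* stand-in: cell the input subsystem writes */
uint8_t output_cell;          /* stand-in: cell the output subsystem reads */

/* One polling step: copy the input cell, and if it differs from the
 * last copy seen, put the new value in the output cell. Returns 1 if
 * the input had changed, 0 otherwise. */
int poll_step(uint8_t *last)
{
    uint8_t now = input_cell;   /* copy the contents of memory cell A */
    if (now == *last)
        return 0;               /* nothing new: wait */
    output_cell = now;          /* put the value in memory cell C */
    *last = now;
    return 1;
}
```

Software calls `poll_step` on a regular basis; the "device" side of the exchange is just another writer and reader of those same cells.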

[ Reply to This | Parent | # ]

  • nope. - Authored by: jesse on Monday, October 15 2012 @ 08:31 AM EDT
    • nope. - Authored by: PolR on Monday, October 15 2012 @ 08:58 AM EDT
Groklaw © Copyright 2003-2013 Pamela Jones.
All trademarks and copyrights on this page are owned by their respective owners.
Comments are owned by the individual posters.

PJ's articles are licensed under a Creative Commons License. ( Details )