GROKLAW
When you want to know more...
Also dynamic vs. static | 400 comments
Also dynamic vs. static
Authored by: Anonymous on Friday, May 11 2012 @ 08:19 PM EDT
[NOTE: The following post is pretty much irrelevant to this case. Go ahead and ignore this, unless you want to know more, and didn't know where to look.]

The words "static" and "dynamic" are used in many different contexts in computer science. But usually, "dynamic" has a connotation of "changing" or "at run-time", and "static" means "not changing" or "at compile time".

Compilers usually have an "optimizer" which performs "static optimizations" by looking at the program (which is usually in some intermediate form at the time, neither source code nor the final binary output format) and analysing it and doing transformations that should make it run faster. The reason these are "static optimizations" is because they only use static (unchanging) information about the program -- that is, the program itself! They don't use any "dynamic" (changing) information, because that is information that can only be gathered at run-time: very short-term information about how things are going in a particular run. A "dynamic optimization" would be something like "this function has been called 100 times very recently, so we know it is hot and we will optimize it more". Or another example might be "of the last 100 times this if-test was executed, the condition was false 90% of the time. So we'll generate a version of the code that assumes it will be false, and is faster for that case."
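For instance, constant folding is a classic static optimization: the compiler evaluates constant sub-expressions once, using nothing but the program text. Here is a toy sketch (not any real compiler's code; the tuple-based expression form is invented for illustration):

```python
# A toy constant folder: a static optimization that needs only the
# program itself, never any run-time information.

def fold(expr):
    """Recursively fold constant sub-expressions.

    An expression is a number, a variable name (str), or a tuple
    ("+", left, right) / ("*", left, right).
    """
    if isinstance(expr, tuple):
        op, left, right = expr
        left, right = fold(left), fold(right)
        if isinstance(left, int) and isinstance(right, int):
            return left + right if op == "+" else left * right
        return (op, left, right)
    return expr

# (2 * 3) + x folds to 6 + x without ever running the program.
print(fold(("+", ("*", 2, 3), "x")))   # ('+', 6, 'x')
```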

From another point of view: Dynamic optimizations occur while the program is running and use information collected during that same run. Dynamic optimizations react to the dynamically-changing run-time conditions of the program. Generally only JIT ("just-in-time") compilers do dynamic optimizations, because they can generate new code whenever they want, in response to the dynamic information they have collected.

Traditional static compilers (also known as "off-line" compilers, or "ahead of time" compilers) only have a few options. They have to decide what code to emit, once and for all, during the compiling (before the program is even running). They can usually use only static information, and do only static optimizations (the most common kind). Some of them can also do something called "profile-guided optimization" (PGO), where you first compile an "instrumented" version of the program, and then you run that version and the instrumentation stuff collects the dynamic info while you run it (a "profile"). Then you compile it again, and this time it uses your collected profile to make better optimization decisions. If the profile says the if-statement's condition is true 90% of the time, the optimizer can take advantage of that to generate better code (which branches are taken/not taken is dynamic info that a traditional compiler doesn't usually have). But even with PGO, this is still considered a "static optimization" because the program is not running at the time the optimization is done, and because the optimized code is generated once, and not changed anymore at run-time. (So if the optimizer guesses wrong, it can't correct its mistake later. And if your "profile" is not very representative, the optimizations might actually make the program slower.)
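The two-pass PGO workflow can be mimicked in miniature. Everything below (the counters, the training inputs, the 50% threshold) is invented purely to illustrate the idea:

```python
# Pass 1: run an instrumented version and record how often the
# condition is true. Pass 2: use that profile to make an
# optimization decision, once and for all.

profile = {"cond_true": 0, "cond_total": 0}

def instrumented(x):
    profile["cond_total"] += 1
    if x > 0:
        profile["cond_true"] += 1
        return "positive path"
    return "negative path"

# Pass 1: the training run collects the profile.
for x in [5, 3, -1, 8, 2, 9, 4, 7, 1, -2]:
    instrumented(x)

# Pass 2: "recompile" -- here just the kind of decision the optimizer
# would bake in, e.g. which branch to lay out as the predicted case.
hot_ratio = profile["cond_true"] / profile["cond_total"]
predicted_taken = hot_ratio >= 0.5
print(f"condition true {hot_ratio:.0%} of training runs; "
      f"optimize for taken={predicted_taken}")
```

If the real workload doesn't look like the training run, the baked-in decision stays wrong forever -- exactly the limitation described above.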

A JIT ("just in time") compiler is a compiler that runs while your program is running. Some Java VMs have one of these (the Sun/Oracle VM has one called "HotSpot", unless they've changed its name). A JIT compiler generates all of its executable code while the program is running. Its optimizations have to be lightweight and fast, because you pay for them while the program is running. So they collect dynamic information to find out which parts of the program are most important, and spend more time doing a better job of optimizing those. The same method might be re-compiled several times, with different optimizations each time, as the VM learns how important it is. Also, when the program conditions change (e.g. the if-statement used to be 90% true, but lately it seems to be 80% false) a JIT compiler can notice this and re-optimize the method to be better in the new situation. This is why it's a "dynamic" optimization.
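The "count calls until hot, then recompile" idea can be sketched like this. HotSpot's real tiered machinery is far more elaborate; the threshold and the names here are made up:

```python
# Minimal sketch of JIT-style hotness counting: count invocations,
# and once a method crosses a threshold, pretend to recompile it
# at a higher optimization level.

HOT_THRESHOLD = 3
call_counts = {}
compiled_tier = {}   # method name -> "interpreted" or "optimized"

def invoke(name, fn, *args):
    call_counts[name] = call_counts.get(name, 0) + 1
    if call_counts[name] == HOT_THRESHOLD:
        compiled_tier[name] = "optimized"   # the "JIT" kicks in here
    else:
        compiled_tier.setdefault(name, "interpreted")
    return fn(*args)

square = lambda x: x * x
for i in range(5):
    invoke("square", square, i)

print(compiled_tier["square"])   # "optimized" after the 3rd call
```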

As for resolving references: "static linking" is done "off-line". You combine different chunks of code into one, and resolve the symbolic references between them. (Chunk A uses a method called Foo from chunk B, so when linking them into a program, the linker decides where exactly it's going to put Foo, and then it can replace that symbolic reference in A with the actual address. A forgets that it calls a function called Foo, and just remembers that it calls a function at address 12345.) "Dynamic linking" is the same thing, but it happens "on-line" (while the program is running). When a Windows application loads a .DLL, the symbolic references to the functions in the DLL (with DLLs, these references are called "imports") are resolved by the loader, at run-time.
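The symbolic-reference-to-address step can be modelled in a few lines. The chunk names and the address 12345 just follow the example above; nothing here is how a real linker stores things:

```python
# Toy model of symbol resolution: chunk A calls "Foo" by name, and
# linking replaces that symbolic reference with Foo's final address.
# Done ahead of time, this is static linking; done by the loader
# while the program runs, it is dynamic linking.

chunk_b = {"Foo": 12345}        # the linker decided where Foo lives
chunk_a = [("call", "Foo")]     # an unresolved symbolic reference

def link(code, symbols):
    """Resolve each symbolic call target to a concrete address."""
    return [("call", symbols[target]) for op, target in code]

resolved = link(chunk_a, chunk_b)
print(resolved)   # [('call', 12345)]
```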

Another example of a "dynamic optimization" would be a Polymorphic Inline Cache (PIC). This is a self-modifying code technique used in some Smalltalk VMs. Each call site keeps a small cache of methods that it has recently invoked. Before doing a full method lookup, it checks to see if one of the cached methods is the correct one. The cache is made with self-modifying code for performance reasons. On a "cache miss", it does a full method lookup and then updates the self-modifying code so that next time, if the same message is sent, it will be fast.
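A dict-based model of the inline-cache behaviour: real PICs patch machine code at the call site, while this only models the hit/miss logic, and the classes here are invented for illustration:

```python
# Sketch of an inline cache at one call site: check a small per-site
# cache before falling back to a full method lookup.

class Cat:
    def speak(self): return "meow"

class Dog:
    def speak(self): return "woof"

cache = {}                         # per-call-site cache: class -> method
lookups = {"full": 0, "cached": 0}

def send_speak(obj):
    cls = type(obj)
    if cls in cache:               # cache hit: fast path
        lookups["cached"] += 1
        return cache[cls](obj)
    lookups["full"] += 1           # cache miss: full lookup, then cache it
    method = getattr(cls, "speak")
    cache[cls] = method
    return method(obj)

for animal in [Cat(), Cat(), Dog(), Cat(), Dog()]:
    send_speak(animal)

print(lookups)   # {'full': 2, 'cached': 3}
```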


Also dynamic vs. static
Authored by: Anonymous on Friday, May 11 2012 @ 08:41 PM EDT
static process

oxymoron?


OMG another analogy avalanche
Authored by: mexaly on Friday, May 11 2012 @ 10:25 PM EDT
Y'all are preaching to the choir here; how do you think it's going with the
actual jury? Do you know they can't hear you?

---
IANAL, but I watch actors play lawyers on high-definition television.
Thanks to our hosts and the legal experts that make Groklaw great.


Groklaw © Copyright 2003-2013 Pamela Jones.
All trademarks and copyrights on this page are owned by their respective owners.
Comments are owned by the individual posters.

PJ's articles are licensed under a Creative Commons License. ( Details )