In 2009, Norway decided that ODF was to be used, rather than OOXML, but called for further study and input from the IT industry, and I gather Microsoft put its shoulder to the wheel, and guess what? Now there is a new report (in Norwegian) that compared the two and concludes that neither is suitable, and both should be kept "under observation". So, what should the government use as their standard if the report's conclusions are accepted? What's left? Previous legacy binary formats, one assumes?
Wait. No, it didn't. It didn't compare OOXML, the ISO standard, with ODF. It compared *ECMA-376*, which is not an ISO standard, with ODF 1.1, which is, but which no one much uses any more, practically speaking, last I heard, since everyone has moved to 1.2. What's the logic there?
Yes. That's what happened. I kid you not. ECMA-376. Whose idea was that? I mean, if you are going to compare standards, why not compare the two ISO standards? And if it doesn't have to be an ISO or other international standard for the comparison, why not use ODF 1.2? The study itself acknowledges that ODF 1.2 solves the problems he found in 1.1.
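Incidentally, figuring out which version of ODF a given document declares is something anyone can do, since an ODF package is just a ZIP archive of XML. Here is a minimal sketch in Python; the in-memory package it builds is a hypothetical stand-in for a real .odt file, and note that the office:version attribute it reads was optional before ODF 1.2, so older documents may not declare a version at all:

```python
import io
import xml.etree.ElementTree as ET
import zipfile

OFFICE_NS = "urn:oasis:names:tc:opendocument:xmlns:office:1.0"

def odf_version(path_or_file):
    """Report the declared ODF version of a package.

    ODF files are ZIP containers; the root element of content.xml may
    carry an office:version attribute stating which version of the spec
    the document claims to follow (returns None if absent).
    """
    with zipfile.ZipFile(path_or_file) as zf:
        root = ET.fromstring(zf.read("content.xml"))
    return root.get(f"{{{OFFICE_NS}}}version")

# Build a tiny stand-in package in memory just to exercise the function.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("mimetype", "application/vnd.oasis.opendocument.text")
    zf.writestr(
        "content.xml",
        '<office:document-content '
        f'xmlns:office="{OFFICE_NS}" office:version="1.2"/>',
    )

print(odf_version(buf))  # prints "1.2"
```

The same trick works on an OOXML .docx or .xlsx, which is also a ZIP of XML, though the version story there lives in different parts of the package.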
Lars Marius Garshol, working for Bouvet ASA, explains his assignment on his blog:
Now, I was asked to consider two specifications only: ECMA-376:2006, which is the very first OOXML standard (not the one later published by ISO), and ODF 1.1.

For what possible purpose? In order to end up concluding that Norway shouldn't use either one, perchance? Would that not give Microsoft time to get its useless OOXML "standard" into shape, matching up with what its proprietary products actually do? If you read the credits in the report, you might form an opinion.
The study is in Norwegian, of course, but Google Translate renders the Credits section like this:
2.5 Credits

Many people have helped behind the scenes with this report, in various ways.

A hearty thanks to:

- Stian Danenbarger, Bouvet, for answers to questions about the scenarios.
- Doug Mahugh, Microsoft, for answers to questions about versions of OOXML and identification of the OOXML version of the documents.
- Alex Brown, Griffin Brown, for answers to questions about identification of the ODF version and information about the contradictions in ODF and OOXML.
- Knut-Erik Gudim, Software Innovation, for information on the use of OOXML and ODF in fagsystemer [sector-specific case-handling systems].
- Gareth Horton, Datawatch, for information about the implementation of OOXML and ODF in fagsystemer.
- Jesper Lund Stocholm, Ciber, for information on the implementation of OOXML and ODF in fagsystemer, and information on the status of OOXML and ODF in the Danish public sector.
- Stefan Görling, for information about the status of OOXML and ODF in the Swedish public sector.
- MURATA Makoto (last name first), for information on the status of OOXML and ODF in the Japanese public sector.
- Jaeho Lee, University of Seoul, for information on the status of OOXML and ODF in the Korean public sector.
- Tommi Kartavi, the Finnish municipal federation (Suomen Kuntaliitto), for information about the status of OOXML and ODF in the Finnish public sector.
- Morten Haug Frøyen, Education Directorate, for feedback on the scenarios and requirements in the public sector.
- Ellen Weberg, Education Directorate, for feedback on the scenarios and requirements in the public sector.
- Dag Sverre Henriksen, Education Directorate, for feedback on the scenarios and requirements in the public sector.
- Ola Thoen, Government Administration Services, for feedback on the scenarios and requirements in the public sector.
- Torgeir Strype, Ergo Group, for information about the use of OOXML and ODF in the public sector in general and especially in fagsystemer.
- Morten Græsby, Bouvet, for feedback on the methodology and approach, as well as feedback on the second draft.
- Geir Ove Grønmo, Bouvet, for feedback on the methodology and approach.

Notice a decided tilt? Doug Mahugh, Alex Brown, Gareth Horton, and Jesper Lund Stocholm, the usuals pushing OOXML. And Mahugh and Stocholm left cynical comments on Garshol's blog about the study, about what a great study he did, as if it was news to them that he was doing it, as I read their comments. Here's Stocholm's comment:

Hi Lars,

The report you have written is very interesting and for the first time, it seems that the one writing the report has actually looked into each spec and not just copied unsubstantiated claims off the internet.

Thank you for this :-)

The guy just told us his methodology, and it was to read a lot of comments on the Internet and then compare two formats that aren't even at issue. Do these guys realize we already lived through "Get the Facts" and we know how it works?
Here's Microsoft's Mahugh's opening sentence:
Interesting report, Lars. I look forward to reading the final version (in an English translation, of course).

Maybe he should have written, I look forward to *writing* the final version. In fact, he goes on to suggest more worrisome thoughts the guy can fling at ODF, should he be so inclined, in the subtle way true artists employ on behalf of Microsoft:
FYI, past versions of Symphony were indeed based on the OOo 1.1.4 code base, but the version currently in beta is based on the OOo 3.x code used by Go-OO and others. It's aimed at the question of how many tools there are for each format. See for example slide #4 from the Symphony v-Next presentation at last year's OOoCon: http://conference.services.openoffice.org/index.php/ooocon/2009/paper/viewFile/120/114
The report includes a section on a Microsoft-sponsored interoperability study, but never mentions the work of the ODF Interoperability and Conformance TC or the Plugfest work by the OpenDoc Society in the Netherlands.
No one from OASIS or any of the usuals in the ODF Alliance is on the list, I see. And no IBM people that I can make out. I guess the Microsoft contingent was so helpful, he didn't need to look at the other side?
By the way, when New York State was writing up its report on what to use a couple of years ago, I was asked to attend a meeting about it, and someone I long had associated with Microsoft's interests pushed the idea that neither OOXML nor ODF was ready for prime time. So I'm not surprised to see this thought being pushed now. It just took longer than I expected. Until Microsoft's broken-wing OOXML is ready for prime time, meaning it actually works in reality, it's not in its interests for ODF to be accepted.
Here's how the study was done:
Together these two documents run to 6783 pages, which was a bit much for me to digest and consider in the limited number of hours I had at my disposal. I therefore decided to focus on the specification of the specific functionalities in the list above (except the first one), and to look for general reports of problems in the two specifications to get a feel for the quality of each.

Is *that* why the OOXML dudes have been posting comment after comment and blog post after blog post attacking ODF? I was wondering why they thought that would help. The report's author says in a comment, "My main source for the assertion that there are lots of errors is the ODF error database..." He acknowledges that, "I realize that the testing I describe here is very superficial..."
Well, yes, yes it is. It comes across to me like Microsoft dumped a pile of talking points on his desk, and he wrote them up.
He asks for feedback, though, for the next version of the report. Here's his conclusion, after saying he couldn't recommend either:
Having said that, ODF 1.2 looks like it will satisfy nearly all the shortcomings with ODF 1.1 that my report identifies. Similarly, it looks like the next OOXML version (ISO/IEC 29500:2008 amendment 1) will solve most of the OOXML issues. If the implementors follow up and improve their converters, things will look much brighter. Unfortunately, this is going to take a couple of years.

Poor Norway. So what should they use for the next two years? Stick with Microsoft's proprietary software, perchance? Is Norway not interested in sticking to ISO or other international standards?
So my conclusion in the report is that both standards should be listed as "under observation" for all usage areas.
Here's Rob Weir's comment on the blog:
Hi Lars,

Maybe there is more to back this up in your full report, but there appears to be a "logical leap" from your implementation tests to your conclusions about the standards. For example, OOXML has 120 pages on change tracking, which you say is "solid". But then you report that it was only interoperable with NeoOffice. Does your report explain why this is so? And why you conclude that the spec has problems, when that appears to contradict what you said earlier about that section of the spec being "solid"?

The Kesan and Shah study a few years ago had a similar approach, and that approach also caused their paper to fall short of illuminating what causes situations like this. Unless you are willing to grapple with what exactly is causing a specific interoperability flaw in an implementation, you are unable to distinguish between the cases of:

A) Implementations that are unable to implement a feature interoperably.

B) Implementations that are unwilling to implement a feature interoperably, or even to implement it at all.

Historically, we've seen A avoided even without standards, such as the level of interoperability with legacy 1-2-3 wk1 files or Office XP-era doc files, based on vendor file format disclosures. And B is still possible even in the presence of mathematically perfect specifications.

I'm not saying that it is impossible to identify an interoperability defect and trace it directly to a defect in the standard. But it is incorrect to assume that 100% of interoperability defects are caused by errors in the standard, and 0% are caused by implementation defects and 0% are caused by intentional efforts to reduce interoperability. To really understand what is going on, you need to talk to the vendors, and ask them why a particular feature does not work. In my experience working with ODF implementers, A is almost never the reason.

It is worth noting that standards like HTML achieved their highest levels of interoperability, not by publishing amendments and corrigenda, but by encouraging vendors to implement the standards fully. In other words, interoperability was only achieved by a change in attitude by the vendors, not by a change in the standard. Often we'll see a desire for interoperability occur first with the vendors, then get worked out technically, and only then standardized. That's how we got EcmaScript. In many -- perhaps most -- cases the standard does not drive interoperability, but documents the interoperability that is already being achieved. To take it in the other direction is like saying a marriage license makes a couple fall in love.

If you look at the ODF story, we went from Microsoft refusing to implement it to Microsoft committed to implement it for at least the next 10 years. None of this involved the change of even a single line in the ODF standard.

In any case, you might also want to look at the W3C Note "Variability in Specification", which gives a much-needed framework to discuss the relationship between specification conformance and interoperability.

Also, I wonder, as you took into account "recent developments in the field", did you find room to mention the several multi-vendor ODF Plugfests we've had? Or the work of the OASIS ODF Interoperability and Conformance TC? (We have a recent report on the topic you should read, if you have not already.) I think these activities are very relevant. I don't read Norwegian, but I see your report cites the Microsoft-sponsored work at Fraunhofer, yet I didn't see any mention of any interop work at OASIS or the OpenDoc Society's Plugfests. I hope this is an oversight.

For example, "The State of ODF Interoperability" report here:

In any case, if you have indeed identified "lots of errors" in ODF 1.1, I hope you will submit a list of them, either to the OASIS ODF TC's comment list directly, or to SC34/WG6. Although I think the impact on interoperability is minor, we're always pleased to fix reported errors.

As it happens, Stéphane Rodriguez, on his 'OOXML is defective by design' blog, just tested application-level interoperability using Office 2010. One must admire his patience and persistence:

Just out of generosity I'll be doing tests with Office 2010, the latest version of their crappy product suite, allowing them to take the time (no less than 3 years) worth of improvements for features that are supposed to be part of Office 2007 already (which RTM'ed in Nov 2006). Specifically I'll be focusing on Excel 2010 since I've already mentioned a lack of application-level interoperability in my article about content controls in Word (pointing out that so-called XML features like this actually require running Word instances to work, contrary to the open standard pledge).

A good example, when it comes to application-level interoperability, is what is stored in the Windows clipboard when you copy/paste content to or from an Excel spreadsheet. Indeed, Microsoft Office has been known to store binary, i.e. proprietary, formats in the clipboard, thereby limiting interoperability across applications. Scenarios like this include the ability to copy/paste content with high fidelity between a running instance of Excel and a running instance of another application (related to spreadsheets or not, assuming it's an OOXML client). Therefore a good measure of whether or not Microsoft has improved interoperability in the OOXML timeframe is whether Excel 2010 actually stores OOXML in the clipboard. Let's see if that is the case....

Of course, it isn't the case, and the graphics will make you laugh, in the painful way Microsoft makes us laugh, watching them at work. He concludes:
The conclusion is that there is actually no way for an OOXML consumer application to rely on the "standard" OOXML to interoperate with Excel 2010 (and Excel 2007). It's all back to binary formats even though, by registering proprietary formats in the clipboard, the Excel team had years to implement this opportunity to store real OOXML in there. What is a "standard" good for if it's not used by the one and only reference application out there? This is lousy engineering at its worst. Or, for what matters, lock-in strategy.

And sadly, that is the conclusion I reach as well: that the lack of interoperability is deliberate, that it will continue, and that Microsoft and its little helpers will use it to say ODF shouldn't be used "either". Then, once they finally get OOXML working with itself and Microsoft's proprietary products, they'll tell us we should all use it, if we want a seamless experience.
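Rodriguez's clipboard test is one anyone on Windows can reproduce: the Win32 API lets you enumerate exactly which formats an application registered when you copied. Here's a rough sketch in Python using ctypes; the 256-character name buffer and the short table of predefined formats are my own illustrative choices, and on non-Windows platforms the function simply returns an empty list, since there is no Win32 clipboard there:

```python
import sys

def clipboard_format_names():
    """List the names of the formats currently on the Windows clipboard.

    After copying cells in Excel, scanning this list shows which
    formats Excel registered, and whether any of them is a standard
    OOXML payload or only proprietary binary ones. Returns [] on
    non-Windows platforms.
    """
    if sys.platform != "win32":
        return []

    import ctypes
    user32 = ctypes.windll.user32

    # A few predefined Win32 clipboard formats: these have fixed IDs
    # and no registered name, so we label them ourselves.
    predefined = {1: "CF_TEXT", 2: "CF_BITMAP", 3: "CF_METAFILEPICT",
                  8: "CF_DIB", 13: "CF_UNICODETEXT"}

    names = []
    if not user32.OpenClipboard(None):
        return names
    try:
        fmt = 0
        while True:
            fmt = user32.EnumClipboardFormats(fmt)
            if fmt == 0:
                break  # end of the format chain (or error); stop either way
            if fmt in predefined:
                names.append(predefined[fmt])
            else:
                buf = ctypes.create_unicode_buffer(256)
                got = user32.GetClipboardFormatNameW(fmt, buf, 256)
                names.append(buf.value if got else hex(fmt))
    finally:
        user32.CloseClipboard()
    return names

if __name__ == "__main__":
    # Copy a range in Excel first, then run this and check whether any
    # OOXML-related format name appears among what Excel registered.
    print(clipboard_format_names())
```

Whatever names come back for Excel is exactly the menu an interoperating application has to order from, which is Rodriguez's point.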
You think I'm being cynical? They are already saying things just like that about Google Docs:
"They are claiming that an organization can use both (Office and Google Docs) seamlessly," writes Payne in response to Google's post. "This just isn't the case." As evidence, he offers up the accompanying video, showing a loss of features such as watermarks in the transfer from Office to Google Docs and back again. Microsoft says it has built Office Web Apps to preserve that "fidelity" when making the shift from desktop to online collaboration and back again.

Would you like to see Google's response? It says it all:

It says a lot about Microsoft's approach to customer lock-in that the company touts its proprietary document formats, which only Microsoft software can render with true fidelity, as the reason to avoid using other products.

At Google, we have a dedicated team of engineers (they call themselves the Data Liberation Front) who work to make it easy to get your information out of Google products. Of course, the real task for the Docs team is not interoperability with Office, it's to improve team collaboration to increase productivity and reinvent the way people work.

And so there you are. Microsoft won't change, I don't believe, but no company, not even Microsoft, lasts forever, so what will you do when they go belly up and you or your grandkids have nothing to use to legally open and read those Microsoft-created documents? That, not just interoperability this minute, is the real issue for governments, of course. The Norway study didn't even think of that, to the extent that I have information about it. If any of you read Norwegian, perhaps you can read the report and give us a few more specifics. Meanwhile, Google Translate does a clear enough job to catch the drift. Here's the info on where to send your feedback:
You can leave comments on his blog, too, but this is the report page for comments. Please be respectful and kind, but clear, if you comment. I'm sure no one enjoys getting rude emails or comments. I know I don't. So if you wish to provide feedback, stick to provable facts, technical issues, or respectful suggestions. The idea is to be helpful.

We ask for feedback on the report, and in particular on the following points:

1. Does the set of standards assessed in the report constitute a representative selection of standards for editable document formats?

2. If not, describe the standard(s) you find missing and explain why they should have been included.

3. Do the usage scenarios in the report provide a representative sample of scenarios for exchanging editable documents?

4. Do you agree with the assessments made, and are there other factors that should be considered?

Short comments can be posted in the comments below. Further comments and formal inquiries can be sent by e-mail to kristin.kopland@difi.no.
The truth is, no one can interoperate with Microsoft's software, unless they let us. And so far, they actually don't let us past a point. Standards, however, are supposed to be usable equally by everyone, now and forever. Is that so hard to understand?