a 'mooh' point

clearly an IBM drone

IBM: Thumbs up for OOXML!

Today news broke that ANSI (the US national standardisation guys) recently voted on the subject of approving OOXML as an "American National Standard".

The text of the ballot was:

Approval to Adopt the International Standards listed below as American National Standards:

  • ISO/IEC 29500-1:2008 (...) Part 1: Fundamentals and Markup Language Reference
  • ISO/IEC 29500-2:2008 (...) Part 2: Open Packaging Conventions
  • ISO/IEC 29500-3:2008 (...) Part 3: Markup Compatibility and Extensibility
  • ISO/IEC 29500-4:2008 (...) Part 4: Transitional Migration Features

A total of 18 organisations/entities were balloted and the result was

  • Approve: 12
  • No: 0
  • Abstain: 2
  • Not voted: 4

The details are here:

Date         Organization                      Vote
TOTAL                                          12 Yes / 0 No / 2 Abstain / 4 Not yet voted
03/16/2009   Adobe Systems                     Not yet voted
04/13/2009   Apple Inc                         Yes
04/15/2009   Department of Homeland Security   Yes
03/16/2009   DMTF                              Not yet voted
04/09/2009   Electronic Industries Alliance    Yes
03/16/2009   EMC                               Yes
03/16/2009   Farance, Incorporated             Not yet voted
03/16/2009   Google                            Not yet voted
04/15/2009   GS1 US                            Abstain (comments)
04/13/2009   Hewlett Packard Co                Yes
03/24/2009   IBM Corp                          Yes
04/15/2009   IEEE                              Abstain (comments)
04/08/2009   Intel                             Yes
03/18/2009   Lexmark International             Yes
03/17/2009   Microsoft                         Yes
03/16/2009   NIST                              Yes
03/19/2009   Oracle                            Yes
03/16/2009   US Department of Defense          Yes

An interesting vote here is naturally the vote of "International Business Machines Corp", otherwise known as IBM. It seems they now support OOXML - good for them.

I think it is an extremely positive move from IBM and I salute them for finally getting their act together and supporting OOXML. I also hope IBM will follow in the footsteps of Microsoft in terms of TC-participation and join us in SC34/WG4 to contribute to the work we do. I think it is positive for the industry that Microsoft finally joined the OASIS ODF TC last summer, and I hope IBM will do the same with SC34/WG4 - we need other vendors besides Microsoft at the table. I also hope this means that IBM will speed up support for OOXML in Lotus Symphony or OpenOffice.org. The support for OOXML in applications other than Microsoft Office 2007 is ridiculously low.

Thank you, IBM - you really made my day.

:-)

PS: I apologize for the formatting of the table above

 

Lo(o)sing data the silent way - all the rest of it

Ok - this post is going to be soooo different than what I had envisioned. I had prepared documents for "object embedding" and "document protection" but when I started testing them, I soon realized that only Microsoft Office 2007 implemented these features - at least amongst the applications I had access to. These were:

  • Microsoft Office 2007 SP2
  • OpenOffice.org 3.0.1 (Windows)
  • OpenOffice.org 3.0.1 (Mac OS X)
  • NeoOffice (Mac)
  • iWorks 09 (Mac)

The reason?

  • OOo3 doesn't fully support object embedding
  • OOo3 doesn't support document protection
  • iWorks doesn't support object embedding at all
  • iWorks doesn't support document protection

So I'll just give you one example of what will happen when strict documents come into play - when applied to document protection.

Document protection is the feature that allows an application to have a user enter a password; unless another user knows this password, he or she cannot open the document in, say, "write-mode". There is no real security to it, though - it is simply a hashed password that gets stored in the document.

This data is stored in the "settings.xml"-file in the document, and this was rather drastically changed during the ISO-process.

If you use Microsoft Office 2007 to protect your document, it will result in an XML-fragment like this:

[code:xml]<w:documentProtection
  w:edit="readOnly"
  w:enforcement="1"
  w:cryptProviderType="rsaFull"
  w:cryptAlgorithmClass="hash"
  w:cryptAlgorithmType="typeAny"
  w:cryptAlgorithmSid="4"
  w:cryptSpinCount="100000"
  w:hash="XbDzpXCrrK+zmGGBk++64G99GG4="
  w:salt="aX4wmQT0Kx6oAqUmX6RwGQ=="/>[/code]

You will have to look into the specification to figure out what it says, but basically it tells you that it created the hash using the weak algorithm specified in ECMA-376.

But as I said, this was changed during the BRM. Quite a few of the attributes are now gone for the strict schemas, and my take on a transformation of the above to the new, strict edition is this:

[code:xml]<w:documentProtection
  w:edit="readOnly"
  w:enforcement="1"
  w:algorithmName="typeAny"
  w:spinCount="100000"
  w:hashValue="XbDzpXCrrK+zmGGBk++64G99GG4="
  w:saltValue="aX4wmQT0Kx6oAqUmX6RwGQ=="/>[/code] 

The only thing I am a bit unsure about is the value for the attribute "algorithmName", but I guess it would be "typeAny". The result? Microsoft Office 2007 detects that the document has been protected, but it cannot remove the protection again - presumably due to the new attributes added to the schemas. I thought about creating new values using e.g. SHA-256 as specified in the spec, but the chances that Microsoft Office 2007 would detect this in unknown attribute values are next to none, so I didn't bother doing this. Feel free to play around with it yourself.
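If you do want to play with it, my guess at what a strict protection fragment based on SHA-256 could look like is shown below. This is only a sketch - the algorithm name is my guess and the hash and salt values are placeholders, so check the spec before relying on it:

[code:xml]<w:documentProtection
  w:edit="readOnly"
  w:enforcement="1"
  w:algorithmName="SHA-256"
  w:spinCount="100000"
  w:hashValue="(base64-encoded hash goes here)"
  w:saltValue="(base64-encoded salt goes here)"/>[/code]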

The Chase

We need a namespace change for the strict schemas - and I am thinking about ALL of the strict schemas, including OPC. If we don't do it this way, my estimate is that we will lose all kinds of data - and the existing applications will not (as they behave currently) inform their users of it. Making existing applications break is a tough call, but I value data/information integrity more than vendors needing to update a bit of their code.

And as for the conformance attribute? Well, the current suggestion is to enlarge the range of allowed values of this attribute, and I think that makes sense. The values could be one of

  • strict
  • transitional
  • ecma-376

or something similar. Then when we make a new revision at some point in the future, we can add version numbers to them at that time. Changing the namespaces will also make it possible to use MCE to take advantage of new features of IS29500 while maintaining compatibility with existing applications supporting only ECMA-376 1ed. (more about this later)
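To give an idea of what I mean by using MCE (ISO/IEC 29500-3): a document could carry content in a new strict namespace inside an mc:AlternateContent block, with a fallback expressed in the namespace that existing ECMA-376 applications already understand. This is only a sketch - I am using the "iso" namespace candidate mentioned below, and I have not verified that MCE processing would be quite this simple in practice:

[code:xml]<mc:AlternateContent
  xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
  xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main"
  xmlns:wiso="http://schemas.openxmlformats.org/wordprocessingml/iso/main">
  <mc:Choice Requires="wiso">
    <!-- content expressed in the new strict namespace -->
  </mc:Choice>
  <mc:Fallback>
    <!-- equivalent content expressed in the ECMA-376 1st Ed. namespace -->
  </mc:Fallback>
</mc:AlternateContent>[/code]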

And what should the schemas be named?

Well, they are currently like "http://schemas.openxmlformats.org/wordprocessingml/2006/main" . So an obvious choice would be "http://schemas.openxmlformats.org/wordprocessingml/JLUNDSTOCHOLM/main"

:-)

... or maybe simply "http://schemas.openxmlformats.org/wordprocessingml/main" would be better? Of course that would be an easy source of errors for developers, so maybe "http://schemas.openxmlformats.org/wordprocessingml/iso/main" would be even better?

Losing data the silent way - ISO8601-dates

In Prague we spent quite some time discussing how to deal with the fact that applications supporting ECMA-376 1st Ed. do not necessarily support ISO/IEC 29500:2008 strict as well. Our talks revolved primarily around how major implementations dealt with the modified functionality of the elements <c> and <v> in SpreadsheetML now that ISO-dates are allowed as content of the <v>-element. But “dates in spreadsheets” is not the only place where changes occurred. Changes were also made to other areas, including

  • Object embedding
  • Comments in spreadsheets
  • Hash-functions for document protection

This will be the first post in a series of posts revolving around how IS29500 differs from ECMA-376 and how existing applications behave when encountering a document with new content. What I will do here is to create some sample documents and load them in the applications I have access to that support OOXML best. In my case these are Microsoft Office 2007 SP2, OpenOffice.org 3.0.1 and NeoOffice for Mac and Apple iWorks. If you want to contribute and you have access to other applications, please let me know the result and I’ll update the article with your findings. If you have access to Microsoft Office 2007 SP1, I'd really like to know. When the series is done I’ll post a bit about MCE and how it might help overcome some of the problems I have highlighted (if we’ll get to change the namespace for the strict edition of the IS29500 schemas).

I should also note that as the series progresses, the examples I make will increase in complexity. A consequence of this will be that my examples will be more of a “magic-8-ball-type prediction” than “simple examples of IS29500-strict documents”. Since there is not a single application out there supporting IS29500-strict, the examples will be my “qualified guesses” to how applications might interpret IS29500-strict when they implement it.

ISO-8601 dates in SpreadsheetML

Let me first touch upon the problem with dates in SpreadsheetML since this was the problem we talked about the most. Gareth Horton from the UK national body hand-crafted a spreadsheet document with these new dates. I have modified his example a bit to better illustrate the point. Files are found at the bottom of this post.

In the original submission to ISO dates were persisted in SpreadsheetML as “Julian numbers” (serial representation) and subsequently formatted as dates using number format styles.

[code=xml]<sheetData>
  <row r="1">
    <c r="A1" s="1">
      <v>39904</v>
    </c>
  </row>
  <row r="2">
    <c r="A2" s="1">
      <v>39905</v>
    </c>
  </row>
(…)
  <row r="10">
    <c r="A10" s="1">
      <v>39913</v>
    </c>
  </row>
</sheetData>[/code]

So the above would create a column with 10 rows displaying the dates from April 1st to April 10th.
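For reference, the s="1" attribute points to a cell format that applies a date number format to the serial value. In styles.xml this looks roughly like the following (a simplified sketch - the actual format indices and format code in the sample files may differ):

[code=xml]<numFmts count="1">
  <numFmt numFmtId="164" formatCode="dd/mm/yyyy"/>
</numFmts>
<cellXfs count="2">
  <xf numFmtId="0" fontId="0" fillId="0" borderId="0" xfId="0"/>
  <xf numFmtId="164" fontId="0" fillId="0" borderId="0" xfId="0" applyNumberFormat="1"/>
</cellXfs>[/code]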

Let’s change one of the cells to contain a date persisted in ISO-8601 format.

[code=xml]<row r="9">
  <c r="A9" s="1" t="d">
    <v>2009-04-09T01:02:03.04Z</v>
  </c>
</row>[/code]

So the cell contains an ISO-8601 date and it is formatted using the same number format as the other cells. I have added a bit of additional data to the spreadsheet to illustrate the problem with using formulas on these values.

Result

The interesting thing to investigate is what happens when this cell is loaded in a popular OOXML-supporting application. Note here that the existing corpus of implementations supporting OOXML supports the initial edition of OOXML, ECMA-376 1st Ed. So they would have no way to look into the specification and see what to do with a cell containing an ISO 8601 date value.

Microsoft Excel 2007 SP2

As you can see Excel 2007 screws up the content of the cell. And on top of that, should you try to manipulate the content of the cells with formulas, they are also basically useless. The trouble? Well, you are not notified that Excel 2007 does not know how to handle the content of the cell, so chances are that you’ll never find out – until you find yourself in a position where there are real consequences to the faulty data and kittens are killed.

OpenOffice 3.0.1 Calc

[screenshot: OpenOffice.org 3.0.1 Calc]

The result here is almost the same. Data is lost and the user is not notified.

NeoOffice for Mac

[screenshot: NeoOffice]

Again we see the same result. This is not so strange, since the latest version of NeoOffice shares the same code base as OOo 3.0.1, so behaviour should be the same.

iWorks 09 Numbers

[screenshot: iWorks 09 Numbers]

Wow, so for iWorks on the Mac, the user is actually warned that something went wrong. Only trouble is - it does not warn you that the content of the cell is not valid - it informs you that the system cannot find the font "Calibri".

Conclusion

It is pretty hard to conclude anything but "this sucks!". None of the applications warn the user that they have lost data - and they all do exactly that - lose data.

Original file: Book1.xlsx (8.82 kb)

Modified file: book2.xlsx (8.22 kb)

The actual work we did in Prague

I thought I’d try to outline a bit what we actually did and what constituted our work in Prague.

The agenda framing our work throughout these three days was this:

  1. Opening 2009-03-24 09:00
  2. Roll call of delegates
  3. Adoption of the agenda
  4. Schedule for publication of reprints or Technical Corrigenda
  5. Defect reports
  6. Future meeting (face-to-face and teleconferences)
  7. Any other business
  8. Extension proposals from member bodies and liaisons
  9. Conformance testing
  10. Closing

The vast majority of our work was in item number 5 on the agenda and each and every single minute was used discussing the defect reports – including in lavatories, on our way to work, on our way back from work, during lunch, dinner, breaks and drinks … in short – we discussed DRs 24/7. This was as it was supposed to be – this was really the reason for all of us being in Prague.

The initial list of DRs we discussed was this (just to give you an idea of what we talked about):

08-0001 — DML, FRAMEWORK: REMOVAL OF ST_PERCENTAGEDECIMAL FROM THE STRICT SCHEMA
08-0002 — PRIMER: FORMAT OF ST_POSITIVEPERCENTAGE VALUES IN STRICT MODE EXAMPLES
08-0003 — DML, MAIN: FORMAT OF ST_POSITIVEPERCENTAGE VALUES IN STRICT MODE EXAMPLES
08-0004 — DML, DIAGRAMS: TYPE FOR PRSET ATTRIBUTES
08-0005 — PML, ANIMATION: DESCRIPTION OF HSL ATTRIBUTES LIGHTNESS AND SATURATION
08-0006 — PML, ANIMATION: DESCRIPTION OF RGB ATTRIBUTES BLUE, GREEN AND RED
08-0007 — DML, MAIN: FORMAT OF ST_TEXTBULLETSIZEPERCENT PERCENTAGE
08-0008 — DML, MAIN: FORMAT OF BUSZPCT PERCENTAGE VALUES IN STRICT MODE EXAMPLE
08-0009 — WML, FIELDS: INCONSISTENCY BETWEEN FILESIZE BEHAVIOUR AND EXAMPLE
08-0010 — WML: USE OF TRANSITIONAL ATTRIBUTE IN TBLLOOK STRICT MODE EXAMPLES
08-0011 — WML: USE OF TRANSITIONAL ATTRIBUTE IN CNFSTYLE STRICT MODE EXAMPLE
08-0012 — SCHEMAS: SUPPOSEDLY INCORRECT SCHEMA NAMESPACE NAMES

I think it’d be fair to say that we have come a long way since the time we were discussing if it was possible to use XSLT to simulate bit-switching or if an OOXML-file was “proper XML”.

For each of the DRs we covered we discussed if the DR was a technical defect or an editorial defect, what the possible implications of the DR would be to existing documents and existing implementations and if the DR belonged in a corrigendum (COR) or if it was an amendment (AMD). It was quite tedious work, but we managed to cover quite a lot of ground in the three days.

Corrigendum or amendment?

One of the first things to accept when working in ISO is that there are quite a number of rules to comply with. As it turns out, it is not our prerogative to decide if a DR goes into “the COR bucket” or if it goes into “the AMD bucket” – there are rules for this. The ISO directives, section 2.10.2, state that

A technical corrigendum is issued to correct [...] a technical error or ambiguity in an International Standard, a Technical Specification, a Publicly Available Specification or a Technical Report, inadvertently introduced either in drafting or in printing and which could lead to incorrect or unsafe application of the publication

If the above is not the case, the modification should be handled as an amendment.

Still, there are quite a lot of DRs that fall into the more gray outskirts of this definition. So to facilitate our work we made some guiding principles, and these principles were discussed at the SC34 plenary in Prague:

[…] in the interest of resolving minor omissions in a timely fashion, WG4 plans to apply the following criteria for deciding that the unintentional omission or restriction of a feature may be resolved by Corrigendum rather than by Amendment. All of the following criteria should be met for the defect to be resolved by Corrigendum:

  1. WG 4 agrees that the defect is an unintentional drafting error.
  2. WG 4 agrees that the defect can be resolved without the theoretical possibility of breaking existing conformant implementations of the standard.
  3. WG 4 agrees that the defect can be resolved without introducing any significant new feature.

Unless all the above criteria are met, the defect should be resolved by Amendment.

Of course we will still have to do an assessment for each and every DR we look at, but it is our view that these principles will help us quite a bit along the way and give us a more expeditious workflow. Notice also the wording “WG4 agrees”. Only a very small number of DRs fall clearly into the COR- or AMD-bucket, so it is not possible to regard these principles as a mere algorithm with a deterministic result. The principles require WG4 to agree to the categorization of DRs, so we’ll actually have to sit down and talk everything through.

On the first day (or was it the second?) we also touched briefly upon the subject of modifying decisions made at the BRM. The delegates at the BRM were nothing but normal people, and due to the short timeframe of the meeting, errors likely occurred. At some point or another, someone will discover we made a mistake and put a DR on our table. At that point we will have to figure out whether we think the decisions made at the BRM are now cast in stone or whether they should be treated by the same criteria as the other DRs we receive. As I said, we just touched upon the subject and didn’t reach any conclusions. If you have any thoughts regarding this, please let me (and us) know. My personal opinion on this subject is that we in WG4, at this point in time, should be extremely careful when thinking about reversing decisions made at the BRM.

And finally, I thought I’d give you some pointers about what is in the pipeline of blog entries (I don’t have a sophisticated system like some people do), so I’ll just give you a small list of topics at the top of my mind these days:

  • Markup Compatibility and Extensibility
  • Conformance class whatnots
  • Namespace changes and the considerations about doing it or not
  • Why should we care about XPS?
  • Why I like the ISO model
  • Maintenance of IS26300 in ISO

 

Maintenance of IS26300 in SC34

The streets of Prague are buzzing with rumours coming out of the working groups of SC34 and out of SC34 itself, which is currently holding its plenary meeting in Prague.

It seems that SC34 has done the only clever thing to do - to create an Ad Hoc Group (AHG) with responsibility for maintaining IS26300. I applaud the decision to do so, and it has in my view been a long time coming.

The details and scope of the group are yet to be seen, but I am glad that SC34 has chosen to create it. There is only one entity responsible for maintaining ISO standards, and that is ISO. Maintenance of IS26300 has fallen between two chairs at the moment, where WG1 was initially responsible for it, but it has been preoccupied with other tasks. Also, I think the maintenance agreement of IS26300 has mentally prohibited any work from being done.

The upside of this is that there is now a group in SC34 responsible for receiving defect reports submitted by NBs. One group is responsible for preparing reports to OASIS and for getting the responses back into the ISO system.

This is a clear improvement and it is a sign and a statement that we believe that IS26300 is too important to not have a group responsible for its maintenance in ISO.

:-)

WG4 meetings in Prague

Wow – this has been a tough week. I arrived at the hotel here in Prague (I am currently waiting in Prague Airport for my flight back to Copenhagen) at around 21:00. I met Doug in Copenhagen and flew with him to Prague and in the airport we ran into Kimmo. After 15 minutes in my hotel room I went down to the bar to get a “welcome to Prague”-beer. After another 15 minutes I crawled back to my room completely devastated due to a flu I hadn’t been able to get rid of. 5 seconds later Florian called and ordered me to get my ass down in the basement wine-bar where he was having drinks with Doug and Megan. I went back to my room when the bar closed at around half past midnight, did some last-minute updates/tweets and almost cried myself to sleep because of near-death-like fatigue.

… and the meetings hadn’t actually started yet.

The next morning the meetings started with a joint session between WG4 and WG5 at the Czech Standardisation Institute. A total of 31 delegates attended this initial meeting. Apart from the SC34 officers (SC34 chair, SC34 secretariat, WG4 convener), there were delegates from Canada, China, Czech Republic, Denmark, ECMA, Finland, France, Germany, Korea, Norway, South Africa, UK and USA. We had quite a lot of work on our table for these three days, and we immediately got to work after the initial pleasantries. A rough list of categories to be dealt with was “Defect reports”, “Rules of engagement” (or “Prime directives”), “Future work”, “Roadmap for future editions/corrections” and “Planning of future meetings and tele-conferences”.

If you’ve been following my twitter-feed (and the ones of Alex, Doug and Inigo) you’ll already have a notion of the insanely interesting things we talked about. But for those not following me (and you should!!!) we talked about sexy things like whether “named ranges” in spreadsheets were defined on the workbook-level or the worksheet-level, whether a reference to Unicode 5 implied dependencies on XML 1.1, whether xml:space applied to whitespace-only nodes or just to trailing and leading whitespace in element content, whether font-substitution algorithms in OOXML had a bias for Panose-fonts, whether “Panose” really meant “Panose1”, and the subtle differences between the Panose-edition of Hewlett-Packard and the one of Microsoft (as far as I understood it, anyway).

Can you imagine all the fun we had?

And you know what? We didn’t stop talking about it during lunch, dinner or breaks. As Doug noted in one of his tweets, the only difference between sessions and breaks was that during sessions, only one person talked at any given time.

Well, apart from all this fun, we made an enormous amount of progress. A total of about 169 defect reports had been submitted to us up to this point, and we processed almost all of them. We didn’t close all of them, but we managed to process the most important ones and prepare ourselves for our first teleconference in mid April. We laid down some ground principles upon which we will make decisions in the future and we talked about a set of “Prime directives” to form a mental basis for our work (think: The Three Laws of Robotics).

In short – it was a good week. I’ll post a series of blog posts in the next weeks outlining the results we achieved (and did not achieve) including both the extremely boring ones as well as the more controversial ones. So Watch this space …

PS: I almost forgot. Microsoft sponsored a dinner/buffet for the participating experts on Wednesday. But what was even cooler was that they had lined up a bunch of Ferraris and Lamborghinis for us outside the restaurant, and we could just take a pick to choose a car to take home. Mine was red! Is that wicked or what?

To the nitwits from <no>ooxml.org: Take it home, boys!

OASIS to JTC1: Bye, bye ...

Ever since the hoopla about the OOXML approval there has been quite some discontent in the ISO community regarding how the ODF TC has fulfilled its obligations after the IS26300 approval. A few meetings have taken place to "mend the harsh feelings" and now some preliminary results have been sent to the NBs for consideration. For those with ISO privileges the documents [1], [2] can be found in the SC34 document repository.

There has been a lot of debate as to where maintenance of ODF should take place, be it in OASIS via ODF TC or via some construction as with OOXML, where the originating TC is included (assimilated) into SC34 and maintenance and development takes place there. I really don't care where these activities take place. I just want the best qualified people to do it.

Now, the documents deal with a definition of principles and a more specific definition of "who takes care of what?"-items. When reading through the documents, I couldn't help getting the feeling that what OASIS was essentially telling JTC1 was "It's my way or the highway".

JTC1 and OASIS have come to the following agreement around maintenance: 

  • OASIS ODF TC takes care of maintenance and development of ODF. 
  • National body participation in this work is encouraged to take place in ODF TC by either direct membership, via the "Comment mail list" or via TC Liaison (I didn't know JTC1/SC34 had one of those in ODF TC)
  • OASIS will submit each approved edition of ODF to JTC1/SC34 for approval to make sure that the approved standards are equivalent.

I completely agree on items 1) and 3) above, but item 2)? In the paper there is not a single sentence on how the procedures in JTC1 fit into all this. Why is there no wording regarding voting procedures in SC34? If the ODF TC comes up with something new and "substantially different", it should be submitted using the "PAS submitter status" of OASIS (similar to the Fast track procedure ECMA used with OOXML). But a PAS submission requires voting in SC34 and if the vote fails (or substantial concern is raised), a BRM is scheduled. If the comments are fixed, the result of the BRM will be an "errata-sheet" and a new vote takes place.

Suppose the post-BRM vote approves the submitted ODF edition

  • what will OASIS do with the errata-sheet?
  • what are the principles for getting them back into the OASIS-approved edition of ODF?
  • what is the time frame?

Is the truth really, that OASIS doesn’t want JTC1/SC34 to do anything to ODF but rubber-stamp it when it comes our way?

When the original ODF 1.0 was submitted to JTC1, a maintenance plan was agreed upon. It had two small but really important words in it: "as is". The maintenance agreement said (AFAIR) that JTC1/SC34 was expected to approve future editions of ODF "as is". In other words, what OASIS managed to get JTC1 to agree to was essentially: "Don't look at it, don't open it, don't flip through it, just - don't touch it. Get a hold of the ISO-approval stamp, stamp it and send it back to us".

The only possible conclusion is that OASIS does not want any direct ISO-involvement in development of ODF.

That is fine - the ODF TC should do what they find best. But I am wondering if that also means, that OASIS will not send future editions of ODF to JTC1 for approval? Surely, OASIS can't live with the reputation of having their standards simply rubber-stamped by ISO? 

You may also ask why it is not good enough for JTC1-members to contribute to ODF through ISO. Well, OASIS is a vendor-consortium and the interests of the vendors seem to be somewhat different from the interests of the national bodies. If you look at the contributions of Murata Makoto and Alex Brown through the ODF Comment list, it is clear that their interest in the quality of the schemas, the constructs and the specification itself was not prioritized in the TC at all. To me a mix of vendor interests and national bodies is the best way to ensure high quality in any specification, but the proposed agreement between JTC1 and OASIS seems to cut out the national bodies acting as "national bodies".

I think it is a good idea to ISO-approve ODF in the future. But JTC1 needs to send a clear signal to OASIS saying that it is fine that they want the “Seal of ISO” and we welcome them. But in order to have the cake, OASIS must eat it too. The ISO package must come with two items: 1) the ISO quality stamp and 2) national body involvement. You cannot just have the stamp! It should be emphasized that it is the prerogative of the national bodies to process the standards that come their way, and that cutting them off and having them do nothing but rubber-stamp the specification is completely unacceptable.

The proposed maintenance agreement will be discussed at the JTC1/SC34 plenary in Prague on Friday, and I hope all national bodies have understood the ramifications of approving it. I suggest the plenary responds by saying to JTC1/OASIS: "Thank you for your suggestion for a maintenance plan for ODF, but come back again when we as national bodies have a solidly founded role in the maintenance of the specification".

Struck by the Wrath of Roy "Kahn" Schestowitz

As the real work of maintaining OOXML in ISO has begun, I have had some time to ponder over events throughout the last year - starting with the BRM in Geneva in February 2008.

Being in Geneva was really hard work, negotiating all day in a 120-seat plenum while in the evening preparing suggestions in cooperation with other delegates from other countries. It was fun, but hard, nevertheless. I remember sitting on my bed in the hotel room trying to sort out everything while trying to keep up with the debates happening outside our meeting room (a de facto radio silence had been initiated voluntarily by the more prominent bloggers around the world, so no information was being released to the people desperate for the slightest amount of information).

One of the tools I used was to keep track of the sites referring to my blog and one evening as I sat eating Swiss chocolate on my bed in the hotel, I noticed a new referral from Google Groups.

link

link

link

Versioning of OOXML (thank you for all the fish)

One of the most pressing matters we had to deal with in Okinawa was a question raised by quite a few people including members of the national body of Switzerland as well as hAl on the blogs of Alex Brown, Doug Mahugh and yours truly:

How can you tell if a document is generated using the original set of schemas or the new (improved) ones?

The truth is: you can’t.

Well, at least not at the moment. You can get a hint from sniffing at various parts of the document, but there is no definitive way to do it. We all agreed that we had to come up with a solution, and we discussed (at length in session as well as during breaks, dinners and sight-seeing) what to do.

Roughly speaking, there are a few ways we could do it, including

  • Changing the namespace-name of the schemas
  • Expanding the conformance attribute to indicate the version of OOXML
  • Adding an optional version attribute to the root elements of the documents (WordprocessingML, SpreadsheetML and PresentationML) defaulting to the original edition of ECMA-376.

Version attribute

Let me start with the last option, since it is the easiest one to explain and understand.

ODF has a “version”-attribute in the root element of ODF-documents. It is defined in the urn:oasis:names:tc:opendocument:xmlns:office:1.0-namespace, so when creating e.g. an ODF spreadsheet using OOo 3, you will see the following xml-fragment:

[code:xml]<?xml version="1.0" encoding="UTF-8"?>
<office:document-content
  xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0" (...)
  office:version="1.2">
</office:document-content>[/code]

The above tells you that the document uses version 1.2 of the ODF-spec – currently being drafted by OASIS.

We could do a similar thing with OOXML, that is, having an optional version-attribute with the version number of the applied flavor of OOXML. This approach would have some clear advantages. First and foremost it would allow all the existing applications supporting OOXML to do absolutely nothing to their existing code base to continue to be able to read and process OOXML-files in ECMA-376 1st Ed format. It would also enable them to use any existing schema-validation of content and all existing files in ECMA-376 would still be perfectly valid.
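As a sketch of the idea (the attribute does not exist today, and both its name and placement are purely my own illustration), a WordprocessingML root element carrying such a version attribute could look like this, with the absence of the attribute meaning ECMA-376 1st Ed.:

[code:xml]<w:document
  xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main"
  w:version="1.1">
  (...)
</w:document>[/code]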

Expanding the conformance attribute

Another thing to do would be to expand the new conformance attribute. At the BRM in Geneva a new conformance attribute was added to the root elements to display to which version of OOXML the document conforms. You will perhaps recognize this XML-fragment
[code:xml]<w:document conformance="strict">
</w:document>[/code]
We could also use this attribute and add version information to it. A way to do it would be
[code:xml]<w:document conformance="transitional-1.0">
</w:document>[/code]
for the ECMA-376 1st Ed and something else for any subsequent versions.

Fixing or solving?

The problem with the two alternatives mentioned above is that they provide an immediate fix, but they are in no way panaceas for the issue of versioning. In Geneva we split up OOXML into 4 distinct parts and tried the best we could to make sure that they were “islands” within themselves. So in the original submission’s Part 2 dealing with OPC, there were dependencies on WordprocessingML (AFAIK) and these were removed. The result is that you can now refer to ISO/IEC 29500-2 should you in your implementation need a packaging format where OPC suits your needs. The basic idea was exactly this: to provide a way for other standards to be able to “plug in” to OOXML and reuse specific parts of it.

The two fixes described above provide a fix for the problem with versioning of “the document stuff”; text documents, spreadsheets and presentations – but they do nothing for Part 2 and Part 3 (under the assumption that Part 4 will not change). The trouble is - this is not only a theoretical problem. ECMA TC46, working with XPS (XML Paper Specification), has based the package format for XPS on OPC. But it is difficult for them to refer to ISO/IEC 29500-2 OPC since its namespace name cannot be distinguished from that of its predecessor, ECMA-376 1st Ed. So unless we figure out a solution, they will have to refer to ECMA-376 1st Ed (and it was my impression that they’d prefer to refer to ISO OPC instead).

This is kind of annoying or maybe even embarrassing. We (the ISO process) chose to split up OOXML to allow reuse – but the first time someone knocks on our door and wishes to do exactly that – we (unless we find a solution to this problem) will have to say: “Well, we didn’t actually mean it”.

Change the namespace-name

An entirely different approach would be to change the namespace name(s) of IS29500. The original names were along the lines of

http://schemas.openxmlformats.org/package/2006/content-types
http://schemas.openxmlformats.org/package/2006/relationships
http://schemas.openxmlformats.org/spreadsheetml/2006/main
(…)

So an alternative solution would be to change the values of the namespace name. The names above could be changed to

http://schemas.openxmlformats.org/package/IS29500-2008/content-types
http://schemas.openxmlformats.org/package/IS29500-2008/relationships
http://schemas.openxmlformats.org/spreadsheetml/IS29500-2008/main

(I would have liked to use a colon as separator between the ISO project number and year, but according to http://www.w3.org/TR/REC-xml/#sec-common-syn, it seems colons are not allowed in namespace names.)

What would be the consequence of this?

The up-side

Basically, changing the namespace name would solve the problem with distinguishing between ECMA-376 1st Ed and IS29500:2008. It would be trivial to distinguish content based on either standard and it would apply to all parts of the specification. Actually, it would apply to all schemas in the specification, so it would enable someone to create a document based on ECMA-376 OPC, IS29500 WordprocessingML and ECMA-376 DrawingML (even though this is permitted in the current version of OOXML). It would also give us the chance to have a fresh start with IS29500:2008 and give us a clean slate for our further work.

The down-side

Changing the namespace is sadly not a silver bullet – unfortunately the free lunch comes with nausea as well. The trouble is – by changing the namespace, applications that support ECMA-376 will break if they try to load documents based on IS29500 since the namespace will be foreign to them.

The question is, though: shouldn’t they?

The purpose of XML namespaces is to identify the vocabulary of the elements of an XML-fragment. So the real question could be: are we talking about a new vocabulary when going from ECMA-376 to IS29500:2008? Are the changes from the BRM so drastic that we wouldn’t expect applications supporting ECMA-376 to be able to load documents conforming to IS29500?

Well, it was of importance to ECMA and most of the delegates at the BRM to ensure that whatever we did to change the specification did not render existing documents nonconformant. We succeeded quite well in doing just this, so one could argue that the changes were not that big. However, this just concerns the transitional schemas. If you remember, the changes in schema structure were quite big. We divided one big chunk of schemas into two categories, “strict” and “transitional”, and I would indeed argue that we changed the vocabulary by doing just that. We changed it from defining a vocabulary with a complete mess of legacy-stuff and new stuff into two separate piles with one “going-forward-vocabulary” and one “going-backwards-vocabulary”. Isn’t that big enough to change the namespace name?

Do it right the first time

At the WG4-meeting I was actually advocating for a simple addition of a version attribute, leaving the bigger namespace problem to be solved at a later time in a revision of OOXML, but the more I think about it, the more I am convinced this is the wrong way. We are in a position right now where there are no applications out there supporting the full set of IS29500. Not changing the namespace name will not make the problem go away – it will just postpone the issue, and if we wait, the problem will only grow bigger as applications surface with support for IS29500. The problem will be even bigger once there is a long list of supporting applications instead of – as now – not a single one.

The more I think about it, the more I am sure the right way to do it is

  1. Add a new version attribute to the root elements defaulting to “1.0” which would be ECMA-376 1st Ed. IS29500:2008 would have version “1.1”.
  2. Change the namespace name for IS29500 in a manner as outlined above.

Vendors in the process of implementing IS29500 will then have to add some code to their application to support this.
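To make it concrete, a root element could then end up looking something like the sketch below. This is purely my own illustration - neither the attribute names, the default values nor the namespace value have been agreed upon in WG4:

[code:xml]<w:document
  xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/IS29500-2008/main"
  conformance="strict"
  version="1.1">
  (...)
</w:document>[/code]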

But – I am in no way sure I have covered all angles. Am I missing something here?

:-)

Post WG4-meetings in Okinawa

 

Last week (week 4 of 2009) we had the first face-2-face meeting in SC34/WG4 on the Japanese island of Okinawa. Since there is quite a big overlap between the participants of WG4 and those of WG5, the two groups meet at the same time and place to minimize travel costs and time away.

Quite a lot of people had chosen to take the "small" trip to Okinawa, and at roll-call the first day, a total of 22 people sat around the table in the meeting room. Of these, 6 were from ECMA and 14 represented various national bodies (3 of whom were employed by Microsoft).

How's that for full disclosure, eh?

The purpose of the meeting was to get started maintaining OOXML and to discuss what to do in the future. We were also to discuss the already submitted DRs and see what we could do about these.

One of the first things I realized that morning was that by participating in standardization in ISO (and, from what I hear, in most other standardisation organisations as well) you need to accept following a certain number of rules. As it turns out, we are in no way free to fix problems in the spec, we are in no way free to make new editions of the spec, etc. As it turns out, there are rules constraining all of these activities. So the project editor (Rex Jaeschke) took us on a lengthy trip down "ISO-regulation-lane". The idea was to give us all some knowledge of the rules and terms (as in 'nouns') used in the directives so that we would all be on the same page moving forward. The basis for the walk-through was a document prepared by the editor and it is available on WG4's website.

DRs

Quite a lot of DRs were submitted to WG4 before the meeting. I think the total number was about 25-30, and they ranged from fixing spelling errors to clarifications of the text and schema changes. The first thing we discussed was how to categorize the DRs. The "buckets" were "defects" and "amendments", and how to distinguish between editorial defects and technical defects. We quickly agreed that focus should initially be to verify and approve any DRs relating to decisions from Geneva that had not made it into the final text. ECMA also had quite a big batch of DRs submitted before the meetings, but since they were not submitted in time for everyone to look at them, we did not make any decisions about these - ECMA just went through them in detail and we discussed each of them.

Details we discussed were certainly of world-changing importance, such as the difference between the text fragments "nearest thousands of bytes" and "nearest thousand bytes", the allowed content of string-literals and intricate details of the xml:space-attribute in an XML-element based on the XML 1.0 specification. Still, it was quite entertaining and it was delightful to sit back and simply overhear the discussions of people that really know what they were talking about.

Comment collection form

ECMA has set up a comment collection form to submit DRs from interested national bodies. It has already been put to use by the Japanese national body and it seems to serve its purpose just fine. Hopefully it will enable us to improve the data quality of the incoming DRs. We gave feedback on the application to Doug Mahugh from ECMA and hopefully he will see to it that the suggestions are implemented (especially mine!)

:-)

We discussed at length the concept of "openness" and how we should apply it to our work, and I will cover my feelings for this in detail in a top-post a bit later.

Last minute impressions

This was my second trip to Japan and I must say that I am getting more and more excited about it with every trip. The culture is fantastic and it is a good challenge to be in a part of the world where you don't speak the language and are incapable of reading almost any signs. I did get a bit of a "Lost in Translation"-feeling on my trip back (+40 hrs!), but it was really a good trip. Two thumbs up for the convener, Murata-san, who showed us how a splendid host acts and shows his guests a great time.

All in all I also think we had some productive days on Okinawa. We managed to deal with quite a few DRs and to set up work-processes for the future, and I am sure we will benefit in the near future from the work we did. It was also interesting to watch the "arm-wrestling" between the national bodies and ECMA. We were on the same page in most cases, but it was interesting to be part of the discussions where we were not. It will be interesting to see how this will evolve in the future. ISO is a bit different than, say, OASIS because of the involvement of national bodies. Where the basis for most of the groups in OASIS is "vendors", it is quite orthogonal to this in ISO where this concept does not really exist. Some of you may remember Martin Bryan's angry words at the plenary in Kyoto about vendor participation and "positions" vs. "opinions", and I am looking forward to taking part in these discussions in WG4 as well as here.

 


Additional resources

Below are a couple of links that might be of interest to you

SC34 WG4 public website

SC34 website

(and for Okinawa-related activities)

Alex Brown's write-up about day 0, 1, 2 and 3-4 of the meetings

Doug Mahugh's summary of what took place

Pictures taken by the secretariat

Picture-stream from Doug Mahugh

Picture stream from Alex Brown

Picture stream from Jesper Lund Stocholm (me!)

Twitter stream from Doug Mahugh

Twitter stream from Alex Brown (notice the l33t-speek Twitter-tag Alex uses!)

Twitter stream from Jesper Lund Stocholm

Bonus for those of you waiting for the credits at the end of the movie:

The day I arrived I was met by Murata-san and Alex Brown in the lobby of the hotel. They were on their way to dinner at a restaurant called "Kalahaai" in the "American Village" of Naha. The dinner took place in a restaurant with live Japanese music from a group called "Tink Tink". Their music was really amazing. The last evening we went there again, and Shawn and I listened completely baffled to the music and on-stage talk of the performers. It was an amazing experience to sit in the restaurant not understanding a single word they said - and still not being able to stop listening to them.



(courtesy of Doug Mahugh)

And look at this picture. Thanks to Doug's tele/wide/fish-eye-whatever-lens on his camera, I look like an absolutely mad/maniac man! No girls were hurt during this, I should point out.


(courtesy of Doug Mahugh)

:-)