intertwingly

It’s just data

WHATWG/W3C Collaboration

I’ve been having fun working on the URL Living Standard. The first change I landed was to convert the spec from Anolis to Bikeshed. Here’s the before and after. And just for fun, here is the beginning of 2014 and the beginning of 2013. The point being that arbitrary snapshots of living standards do exist.

Along the way, I’ve been named by my employer’s AC member to be a member of the W3C WebApps Working Group, and invited to become a member of the WHATWG organization on GitHub. I’ve been named as co-editor of the spec in both organizations, and at that point the fun abruptly stopped. Apparently, the larger political issues that I had successfully avoided in the past moved front and center.

Here’s what I said in September:

While I am optimistic that at some point in the future the W3C will feel comfortable referencing stable and consensus driven specifications produced by the WHATWG, it is likely that some changes will be required to one or both organizations for this to occur; meanwhile I encourage the W3C to continue on the path of standardizing a snapshot version of the WHATWG URL specification, and for HTML5 to reference the W3C version of the specification.

Now it is time for me to spell out how I see that happening.

I’ll start out by saying that I continue to want the WebApps WG to follow through on its charter obligation to continue to publish updates to the URL Working Draft. And once updates resume, I want to work on making doing so entirely unnecessary. While this may sound puzzling, there is a method to my madness. I want to establish an environment where an open discussion of this matter can be held without anybody feeling that there are options that are closed to them or that there is a gun to their head.

Next I’ll state an observable fact: there exist people who value the output of the W3C process. The fact that there are people who don’t doesn’t make the first set of people go away or become any less important. Note that I said the output of the W3C process. People who value that don’t necessarily (or even generally) want to observe or participate in the making of the sausage.

What they value instead is regular releases and making the bleeding edge publicly available. And for releases, what they care most about are the items that are covered during a W3C Transition (example). In particular, they are interested in evidence of wide review, evidence that issues have been addressed, evidence that there are implementations, and the IPR commitments that are captured along the way.

Some have argued (and still do) that these needs can be met in other ways. Not everybody is convinced of this. I’m not. In particular, the existence of a bugzilla database with numerous bugs closed as WORKS4ME without explanation doesn’t satisfy me.

To date, those needs have intentionally not been met by the WHATWG. And an uneasy arrangement has been created where specs have been republished at the W3C with additional editors listed, in many cases in name only. Those copies were then shepherded through the W3C process. Many are not happy with this process. I personally can live with it, but I’d rather not.

I said that this will require changes by one or both organizations. I will now say that I expect this to require cooperation and changes by both. I’ll start by describing the changes I feel are needed by the WHATWG, of which there are three.

  1. Agree to the production of planned snapshots. And by that I mean byte-for-byte copies. As part of this, that would mean identifying "items at risk" at early stages of the process, and potentially removing those items later in the process. These snapshots will need to meet the needs of the W3C, primarily pubrules, and link only to W3C approved references. Even though it should go without saying, apparently it needs to be said: those specs need to be snark free. Finally, I'll go further and suggest that those snapshots be hosted by the W3C, much in the way that the W3C hosts the WHATWG's bugzilla database and mailing list archives.

  2. Participation in the production of Transition Requests. That would involve providing evidence of wide review and evidence that issues are addressed. It could also include, but doesn't necessarily require, direct participation in the transition calls.

  3. Understanding and internalizing the notion that the combination of an open license coupled with being unwilling or unable to address a perceived need by others is a valid reason for a fork. Yes, I know that the W3C hasn't adopted an open license themselves, and I believe that is wrong too. But that doesn't change the fact that an open license plus an unmet need is sufficient justification for a fork.

I’ll close my discussion on the WHATWG changes I envision with a statement that participation in the W3C process (to the extent described by #1 and #2 above) is optional and will likely be done on a spec by spec basis. Editors of some WHATWG specs may choose not to participate in this process, and that’s OK; I simply ask that those who don’t participate recognize the implications of this choice (specifically #3 above).

Responsibility for advancing specs for which the WHATWG editors voluntarily elect to participate in the process would fall to a sponsoring W3C Working Group. Starting to sponsor, ceasing to sponsor, and forking a spec would require explicit W3C Working Group decisions. As a general rule, Working Groups should only consider sponsoring focused, modular specifications.

Here’s what sponsoring would (and most importantly, would not) involve:

  1. No editing. As suggested above, snapshots produced by the WHATWG would be archived, but these archives would be byte-for-byte identical beyond the changes involved in archiving itself (example: updating stylesheet links to point to captured snapshots of stylesheets). The one possible exception to this would be the updating of normative references, but this would only be done with the concurrence of the WHATWG editors.

  2. Participation would be limited to the production of Transition Requests. This would include providing evidence of wide review, evidence that issues are formally addressed, recording and reporting of Formal Objections, collecting patent disclosures, etc.

That’s it. Of course, the process will remain the same for documents that are copied and shepherded instead, but I see no reason why the WebApps WG couldn't sponsor the WHATWG URL standard through this process, why the HTML WG couldn't do the same for the DOM standard, why the I18N WG couldn't do the same for the Encoding standard, etc.

While everybody may come into a sponsorship collaboration with the best intentions, we need to realize that things may not always go as planned. There may be disagreements. It has been known to happen. When that happens:

  1. Everyone involved should work very hard to resolve the dispute as the consequence of breakage is very bad all around.

  2. If no agreement can be reached, the W3C Working Group will likely stop the sponsorship of the specific spec involved in the dispute.

  3. If a Working Group stops sponsoring a spec, the Working Group could still fork that spec - but that would be a suboptimal solution for both W3C and WHATWG. It would also re-inflame the debates between organizations.

  4. Nonetheless, since each organization has different criteria, we must recognize that this could happen, especially for large, broad, complex specs. Accordingly it makes sense for both organizations to continue the trend towards smaller and more modular specifications.

I have no idea if others are willing to go along with this, but I hope that this concrete proposal helps anchor this discussion. I invite others that are inclined to do so to suggest revisions or to create proposals of their own. As an example, since the above describes an environment of collaboration and sharing of work, perhaps co-branding may be worth exploring?

This clearly will take time. As an editor of the URL specification, I’d like to propose that it be the first test of this proposal. In the meanwhile, I plan to spend my time coding.

For those that wish to dig further, a few links:


pegurl.js

pegurl.js is the result of two days' worth of work.  While it is undoubtedly buggy and incomplete, it does pass 255 out of 256 tests, and that last test is wrong.  For comparison: results from other user agents.

Current work products and future work

...


WHATWG URL vs IETF URI

I’ve been looking into differences between the WHATWG URL Living Standard and the combination of RFC 3986 and RFC 3987.  I’ve come up with an indirect but effective way to identify the differences using urltestdata.txt and addressable.
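
Roughly, the approach looks like the following Ruby sketch (the parsing of urltestdata.txt shown here is deliberately simplified and the field layout is my assumption; the real file carries more information per test):

# Compare Addressable's resolution of each test case against the
# expected result recorded in urltestdata.txt.
require 'addressable/uri'

File.readlines('urltestdata.txt').each do |line|
  next if line.strip.empty? || line.start_with?('#')
  input, base, expected = line.split(/\s+/, 3)   # assumed field layout
  next if expected.nil?
  begin
    actual = Addressable::URI.parse(base).join(input).to_s
    puts "#{input}: #{actual} != #{expected.strip}" unless actual == expected.strip
  rescue Addressable::URI::InvalidURIError
    puts "#{input}: not parseable by addressable"
  end
end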

...


Dreamhost upgrade

Dreamhost upgraded my server to Ubuntu 12.04.  I noticed things breaking in preparation for the move, and things that broke after the move.  If you see something not working correctly, please let me know.


The URL Mess

tl;dr: shipping is a feature; getting the URL feature well-defined should not block HTML5 given the nature of the HTML5 reference to the URL spec.

This is a subject desperately in need of an elevator pitch.  From my perspective, here are the top three things that need to be understood.

...


New Toy

New laptop for work: MBP 15.4/2.6/16GB/1TB Flash.  First time I ever went the Apple route.  I did so as I figured that with those specs, I could run multiple operating systems simultaneously.  So far, so good.  I’m using VirtualBox to do so.

Notes for Mac OS X 10.9, Ubuntu 14.04, Windows 8.1, and Red Hat Enterprise Linux 6.5.

...


Travisizing My Projects.

Today, I got a pull request from Ryan Grove to make nokogumbo work on Ruby 2.1 and add Travis support.  Very cool.  I was surprised how easy it was to set up.

A few hours later I got ruby2js to work on Ruby 2.0 and 2.1 and added Travis support.  Wunderbar worked right out of the box.


Frameworks as Stepping Stones

Joe Gregorio: But something else has happened over the past ten years; browsers got better. Their support for standards improved, and now there are evergreen browsers: automatically updating browsers, each version more capable and standards compliant than the last. With newer standards like HTML Imports, Object.observe, Promises, and HTML Templates I think it’s time to rethink the model of JS frameworks. There’s no need to invent yet another way to do something, just use HTML+CSS+JS.

I’m curious as to where Joe believes that these features came from.

...


Technology behind Whimsy.apache.org


Slides for my ApacheCon talk.  Right/left goes to the next/previous section, up/down for navigating within a section.

The demo is unfortunately only available to ASF committers (for privacy reasons, as it exposes email addresses).


Angular.rb example

Tim Bray: If hating this is wrong, I don’t want to be right.

Perhaps you would like this better?  :-)

...


Time Warner Rate Hike

Backdrop:

With that context, today I got notification in the mail that my rates are set to go up by 60% as my “Promotional” rates (Seriously?  A twenty-two year long promotion?) will be expiring.  After spoofing my User Agent, as the chat function doesn’t recognize my browser/operating system combination, I verified that this is indeed the plan with “Veronica”.  I was then provided a transcript and directed to an online survey, which promptly logged me off without submitting my feedback once I had completed it.

I plan to follow up with @TWC_Help.


Wunderbar JQuery filter


I got a suggestion to look into React.js, a JavaScript library focused on the problem space that Angular.js’s directives address.

One of the ways React.js facilitates the creation of web components is via JSX which mixes “XML” with JavaScript.  The “XML” is “desugared” into React.DOM calls.

Based on this idea, I created a Wunderbar jquery filter to “desugar” Wunderbar calls into JQuery calls.  The tests show some of the conversions.  I also updated my Bootstrap modal dialog directive to make use of this: before => after.
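
To give a rough feel for the transformation, here is an illustration of my own (not output copied from the filter; the exact jQuery it emits may well differ):

# Hypothetical Wunderbar-style input (Ruby):
_ul.phones do
  phones.each do |phone|
    _li phone.name
  end
end

# ...which a jQuery filter could plausibly desugar into something like:
#   var $ul = $('<ul>', {class: 'phones'});
#   phones.forEach(function(phone) {
#     $ul.append($('<li>').text(phone.name));
#   });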


Ruby2js += underscore.js

When compared to Ruby, JavaScript doesn’t have as much functional support built in.  Underscore.js fills that gap for many.  Underscore.js, in turn, was inspired by Ruby’s Enumerable module.  An underscore filter (tests) completes the mapping.

In many cases, the resulting JavaScript is formed by applying a number of filter rules.
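
As a hedged example of the kind of mapping involved (the require path, the filters option, and the expected output below are my assumptions rather than anything copied from the project):

require 'ruby2js'
require 'ruby2js/filter/underscore'   # assumed require path

# Convert a small piece of idiomatic Ruby; the filters: option is assumed.
puts Ruby2JS.convert('people.sort_by {|person| person.age}.first(3)',
                     filters: [Ruby2JS::Filter::Underscore])

# One would expect output along these lines:
#   _.first(_.sortBy(people, function(person) { return person.age }), 3)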

...


Ruby2JS Attribute=>Property support

Ruby2JS now maps Ruby attributes to JavaScript properties.
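
By way of a hedged illustration (my own sketch of the idea, not output copied from Ruby2JS):

# Ruby input: an attribute defined with attr_accessor...
class Person
  attr_accessor :name
end

person = Person.new
person.name = 'Sam'
puts person.name

# ...with the reads and writes mapping onto JavaScript property access
# (roughly person.name = 'Sam'; console.log(person.name);) rather than
# getName()/setName() style method calls; the attr_accessor itself would
# presumably become something like an Object.defineProperty getter/setter.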

...


HTML5 Mode Links

Based on a suggestion by Tim Bray, I converted my board agenda Angular.js application to use html5 mode.  The process was straightforward:

1) add the following to your application configuration:

$locationProvider.html5Mode(true).hashPrefix('!')

2) Add a <base> element to my generated HTML, indicating which part of my path was “owned” by the server.

3) Convert my relative links.  Based on how my application was structured:

I’ve not yet tested it with Internet Explorer <= 9, but the Angular.js docs indicate that it should work there too.


Software in 2014

Tim Bray: We’re at an inflection point in the practice of constructing software. Our tools are good, our server developers are happy, but when it comes to building client-side software, we really don’t know where we’re going or how to get there.

While I agree with much of this post, I really don’t think the conclusion is as bad as Tim portrays things. I agree that there are good server side frameworks, and doing things like MVC is the way to go.

I just happen to believe that this is true on the client too – including MVC. Not perfect, perhaps, but more than workable. And full disclosure, I’m firmly on the HTML5-rocks side of the fence.

...


Wunderbar Tutorial


I’ve begun work on a Wunderbar tutorial.

Feedback welcome.


Angular.rb update

It does indeed turn out that language macros can reduce the amount of Angular.js boilerplate configuration to a minimum.  In the process I’ve spun off ruby2js as a standalone supporting library.


Angular.rb

I’m looking into what it would take to make it easier to produce Angular.JS client applications using a server coded in Ruby.  The approach I’m taking is to convert idiomatic Ruby into idiomatic AngularJS JavaScript.
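
To make that concrete, here is a rough sketch of the shape such Ruby might take, using data from the Angular phonecat tutorial (the DSL names here are illustrative guesses rather than angular.rb’s documented API):

module Angular::PhonecatApp
  controller :PhoneListCtrl do
    $scope.phones = [
      {name: 'Nexus S', snippet: 'Fast just got faster.'},
      {name: 'Motorola XOOM', snippet: 'The Next, Next Generation tablet.'}
    ]
    $scope.orderProp = 'name'
  end
end

# ...with the intent that it comes out the other side as familiar AngularJS:
#   angular.module('PhonecatApp', []).controller('PhoneListCtrl',
#     function($scope) {
#       $scope.phones = [ ... ];
#       $scope.orderProp = 'name';
#     });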

Demo.  Corresponds roughly to tutorial step 4.  Example output.  Specs.


Mavericks

Did a clean install of Mavericks on my test mac-mini.  Things to be aware of for next time:

xcode-select --install
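# installs the Xcode command line tools (compilers and headers needed for building native extensions)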

sudo ln -s /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/ /Applications/Xcode.app/Contents/Developer/Toolchains/OSX10.9.xctoolchain
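# gives the OSX10.9.xctoolchain name a target (the default toolchain); a workaround some native builds reportedly expect on Mavericks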

sudo mkdir -p /usr/local/lib; sudo ln -s /usr/local/mysql/lib/libmysql* /usr/local/lib
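# makes the MySQL client libraries visible under /usr/local/lib (e.g. so the mysql2 gem can link against them)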