The many lives of JavaScript

I recently worked out that I’ve learned, on average, a new programming language every two to three years. These many languages have been part of my toolbox for somewhere between four and six years before falling away to make room for new technologies. However, there is one programming language that has been a major part of my programming repertoire for almost 22 years now – and that is JavaScript.

My JavaScript programming skills have recently undergone a major renaissance as I’ve adopted ECMAScript 6 (aka ECMAScript 2015) for most of my coding. The way I write code today is nothing like the code I wrote just one year ago – and I’ve gone back and largely modernised all my active code to be consistent. Today’s programming style uses modern frameworks and is far more object oriented and asynchronous. Many new features have completely changed how I write code. Proper (while still limited) classes with mixins have replaced the ugly prototype mechanism I used to use for object orientation. Let and const declarations have caught latent bugs that were hidden in my code. Arrow functions (aka lambda expressions) and promises have streamlined code that once was quite clunky. The list goes on…
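
To give a flavour of what I mean, here is a small, self-contained sketch rather than anything from our codebase; the class and mixin names are invented for illustration:

    // A mixin is just a function from a base class to an extended class.
    const Printable = (Base) => class extends Base {
      describe() { return `${this.constructor.name}: ${this.title}`; }
    };

    // Proper classes replace the hand-rolled prototype chains of old.
    class LegalDoc {
      constructor(title) {
        this.title = title;
      }
    }

    class Bill extends Printable(LegalDoc) {
      constructor(title, sections = []) {   // default parameters are new too
        super(title);
        this.sections = sections;
      }
    }

    // const and let are block-scoped, so accidental redeclarations and
    // reassignments surface immediately instead of hiding as latent bugs.
    const bill = new Bill('An example act', ['Section 1', 'Section 2']);

    // Arrow functions keep the lexical `this` and read far more cleanly.
    const headings = bill.sections.map(s => s.toUpperCase());
    console.log(bill.describe(), headings);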

Even my tools have changed. Microsoft’s surprisingly excellent Visual Studio Code has replaced the hodgepodge of tools I once used. We’re in the process of integrating Jasmine and Karma into our workflow. JavaScript Semistandard Style (no, I still like semicolons) has ensured a very clean code base – as well as catching a multitude of errors and sins.


All this change got me thinking about the four lives of JavaScript that I have worked through. Way back when, JavaScript had an awkward birth at the hands of Netscape as the lesser stepchild of Sun’s new Java programming language, which was taking all the attention. JavaScript was just a way to glue Java applets together in the browser. The problem was, Java applets really sucked.

Microsoft quickly saw the value of JavaScript though, and launched their own effort to steal Netscape’s baby. And so, JavaScript was stolen, renamed JScript, and made to be the adopted sibling of Microsoft’s other scripting language, VBScript. One good bit of progress that Microsoft made was to sponsor the standardisation of the language, although the resulting name of ECMAScript was another in a long string of unfortunate names the language has had to endure.

As JScript, JavaScript was to become an integral part of Microsoft’s entire ActiveX strategy. A lot of really cool technologies (yes, really) came of this, allowing JScript to go beyond the browser. As an application extension language, it found its way into the XMetaL XML editor as the customisation technology. We used it and many of the ActiveX technologies to great effect when we implemented California’s bill drafting system. However, it didn’t just end there. We were able to use it on the server side through Classic ASP and as a shell scripting language through the Windows Script Host. For a Microsoft-centric programmer, this era of JavaScript was a glorious one.

However, ActiveX was seriously flawed. It was entirely proprietary and riddled with problems. Microsoft abandoned it almost as quickly as they had adopted it – moving on to .Net where JScript.net was a non-starter. As Microsoft’s interest in ActiveX and even Internet Explorer waned in the early 2000s, life as a JavaScript programmer became ever gloomier. While the capabilities were awesome, there was obviously no future.

At this point, we made the somewhat painful decision to move away from Microsoft’s outmoded view of the Internet and go back to the basics. While it meant giving up a lot of capability, in the end it was an excellent decision for it pointed to the future. One tiny aspect of Microsoft’s ActiveX vision, the XMLHttpRequest object, escaped from Microsoft and gave rise to a whole different way of programming – Asynchronous JavaScript and XML (AJAX). This development and the emergence of new browsers, first Firefox and then Google’s Chrome with its V8 JavaScript engine, breathed new life into JavaScript.

Freed from Microsoft’s grip, JavaScript has flourished. The past decade has seen a plethora of new technologies. Isomorphic JavaScript (or Universal JavaScript) blurs the distinction between coding for the server and the client. In fact, technologies like Electron turn web-based application development back to the desktop where you can get the best of both worlds.

When I look back on the code I wrote during the ActiveX era (yes, we still support it), it looks prehistoric. Modern JavaScript is so much more capable and flexible than the clunky rendition we had back when COM-based ActiveX was supposed to change the world. As I mentioned earlier, how I program now is completely different – asynchronous programming is a difficult but very worthwhile skill to acquire.
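
To show the kind of streamlining I mean, here is a generic sketch (not code from our products) of the old XMLHttpRequest wrapped in a promise so that asynchronous steps chain instead of nest; the URLs are placeholders:

    // Wrap XMLHttpRequest in a promise so callers can chain rather than nest.
    function fetchText(url) {
      return new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest();
        xhr.open('GET', url);
        xhr.onload = () => xhr.status === 200
          ? resolve(xhr.responseText)
          : reject(new Error(xhr.statusText));
        xhr.onerror = () => reject(new Error('network error'));
        xhr.send();
      });
    }

    // Asynchronous steps now read top to bottom instead of as a pyramid of callbacks.
    fetchText('/documents/index.json')
      .then(text => JSON.parse(text))
      .then(index => fetchText(index.firstDocument))
      .then(doc => console.log('loaded', doc.length, 'characters'))
      .catch(err => console.error('failed:', err));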

Looking to the future, I see three paths. On one side is a mature but polarising platform that is dominated by Oracle. Oracle’s dominance ensures stability but also deters innovation. Looking to the other side, one finds another mature but polarising platform that is dominated by Microsoft. Here too, Microsoft’s dominance ensures stability but also deters innovation. The result, it seems, is that both paths have now had their heyday. You don’t hear very much aspirational news from either technology path anymore — rather like what it must have felt like to be programming a mainframe in COBOL at the height of the C/C++ era.

The third path seems to be the path of the future – staking out a middle ground that neither technology giant can stomp on. Sure, Google is a technology giant that plays a strong role, but they’re still reasonably well regarded by the development community at large (for now). It is this middle ground that has been the most fertile for new technologies – and JavaScript is right in the thick of it. There are so many new technologies it’s hard to keep track of them all — AngularJS, Node.js, React, Express.js, to name but a few. While this third path can play well with both of the other two, for me it is the path that truly points to the future.

This brings us to the fourth life for JavaScript – building on the momentum of the past decade to mount a credible challenge for enterprise apps. While I initially dismissed many of the new features of the language as mere syntactic sugar, my experience with them has shown them to be much more. I now write much better code. I believe we’re on the verge of an explosion in JavaScript-enabled applications that will blur the distinction between the platforms, between the desktop and the browser, and between the server and the client. This is truly an exciting time, once more, to be developing in JavaScript.

It goes without saying, but stay tuned for more…


Implementing an Akoma Ntoso Editor

Yes, we’ve now built a full real-world legislative drafting editor using the final release of the new OASIS standard for legislative XML known as Akoma Ntoso. No, it wasn’t easy, but drafting tools never are. While our project is not yet a finished implementation, it shows that Akoma Ntoso is adaptable to some of the most challenging demands it will face as a world-wide standard for digital legislation.

Akoma Ntoso is a very ambitious standard. It strives to anticipate all the possible needs that jurisdictions around the world will have while also planning for a wide range of useful applications that can be built on top of the data. The result is a sophisticated schema with many more features than any one implementation will ever need.

The trick is being able to mould Akoma Ntoso to fit the unique needs of a jurisdiction while also providing a user experience that is natural and fits the problem space exactly. This was the challenge that led us to develop a custom web-based XML editor. After surveying the available market of web-based editors, we quickly found that none would be sufficiently adaptable to allow Akoma Ntoso to realize its true potential.

There are two aspects of building an Akoma Ntoso editor that have required particular attention:

  1. Adapting Akoma Ntoso to fit the jurisdiction’s documents
    If you’ve taken a look at Akoma Ntoso, you know that it’s jam-packed full of tags and features, far more than are ever necessary in a single implementation. Trying to create a single comprehensive implementation of it all, a one-size-fits-all approach, will only yield an overly complicated and unusable tool that suits nobody. At the same time, despite Akoma Ntoso’s efforts to cover all possible scenarios, there are still gaps in the schema where details specific to individual jurisdictions are not covered. Akoma Ntoso anticipates this shortcoming by providing a pattern-centric mechanism for extending a set of generic elements to fill in the gaps (a small configuration sketch follows this list).
    An authoring tool needs to hide or omit the unused parts of Akoma Ntoso, adapt the parts that are being used to fit the specific requirements of a jurisdiction, and allow Akoma Ntoso to be extended, using the generic mechanism, in such a way that these extensions appear seamless. As it turns out, almost a third of the elements we’ve implemented are extension elements. The result is an editor that allows a fully compliant Akoma Ntoso document to be drafted (correct by construction), while at the same time ensuring that the document fully complies with the jurisdiction’s model for how that document should be represented.
  2. Adapting the editor to fit the jurisdiction’s documents
    XML authoring tools don’t just work out of the box. Rather, they’re toolkits that allow documents conforming to a specific schema or model to be authored. How much flexibility the toolkit provides dictates the type of documents that can be authored. Sadly, it’s difficult for any editor to provide infinite flexibility in any dimension – so very careful consideration is necessary to understand whether or not the editor can be adapted to the need.

    When we at Xcential implemented California’s bill drafting system a decade ago, we used XMetaL because it provided an extensive customisation capability. Unfortunately, at the outset we failed to realize that XMetaL’s change tracking capabilities were limited and not customisable. When the full challenge of redlining became clear to us well into the project, we realized we were using an editor that couldn’t do the job. Thankfully, the project was able to get (and pay for) the necessary extensions to XMetaL without too much delay.

    One way to understand this problem is with the diagram below. On the left is the intrinsic capability offered by the authoring tool. On the right is a jurisdiction’s requirement. As XML authoring tools are toolkits, there is always a gap between the intrinsic capabilities on the left and the requirements on the right – and this gap must be closed one way or another. One way is to use any programming API offered to add customisations (shown as A). Another way is to limit the jurisdiction’s requirements (shown as B) to better suit the capabilities of the tool. Usually, it takes a combination of both to arrive at a suitable outcome. If the gap cannot be closed (shown as C), then the project is likely doomed to disappointment or even failure.

    EffortVsCapability.png

    One thing we learned early on is that, when it comes to legislative documents, there really isn’t a lot of wiggle-room in the requirements. The form of the documents is often dictated by long established traditions and good luck trying to change that. This is one case where the expression “It will take an Act of Congress” can be quite literally true.

    This means that the gap will have to be closed through customization and the effort (and risk) to do so will be quite substantial. XMetaL, way back in 2002, provided an extensive set of programmatic APIs to work from, and that very nearly wasn’t enough. Unfortunately, the newer web-based editors haven’t, for many reasons, come close to matching XMetaL’s level of customisability.
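
Coming back to the first point above, here is a purely hypothetical sketch of the kind of configuration an editor might use to present Akoma Ntoso’s generic elements as jurisdiction-specific constructs. Only hcontainer and its name attribute come from Akoma Ntoso itself; the construct names, labels, and helper function are invented for illustration:

    // Hypothetical jurisdiction model: constructs with no dedicated Akoma Ntoso
    // element are expressed through the generic elements, keyed by a name attribute.
    const jurisdictionModel = {
      // A native Akoma Ntoso element, used as-is.
      section: { element: 'section', label: 'Section' },

      // Invented jurisdiction-specific constructs mapped onto the generic
      // hierarchical container.
      counselDigest: { element: 'hcontainer', name: 'counselDigest', label: 'Counsel Digest' },
      jointRule:     { element: 'hcontainer', name: 'jointRule',     label: 'Joint Rule' }
    };

    // The drafter picks "Joint Rule" from a menu; the markup that gets emitted
    // is still plain Akoma Ntoso underneath.
    function emptyElementFor(construct) {
      const def = jurisdictionModel[construct];
      return def.name ? `<${def.element} name="${def.name}"/>` : `<${def.element}/>`;
    }

    console.log(emptyElementFor('jointRule'));   // <hcontainer name="jointRule"/>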

Building our own authoring tool

Understanding the challenges of Akoma Ntoso, our customer’s demanding requirements, and the limitations of the state-of-the-art in web-based authoring tools, we embarked on a project several years ago to build our own XML authoring tool. The result is now used in a number of applications. It’s been quite a challenge – and that’s an understatement. Building a highly configurable web-based XML authoring tool that is truly a step ahead of the old desktop editors of twenty years ago has required us to truly harness every aspect of modern web technologies and methodologies.

The result is an XML authoring tool especially adapted to the needs of Akoma Ntoso. However, it’s not just an Akoma Ntoso editor. It’s an XML authoring tool, capable of adapting to any reasonable XML schema — for the legislative field, the regulatory field, or any similar field where the demands of structured documents require a sophisticated level of customization.

If you want to see our tool in action in a bespoke implementation, here’s an early peek:

https://www.youtube.com/watch?v=CTAad2E-9Y4&feature=youtu.be

(This link shows a dated version at this point. It shows the editor as it was around December of 2016. We’ve advanced quite a bit since then — in both the intrinsic capabilities of the editor and in the capabilities built into the bespoke customisation.)


Connected Information

As a proponent of XML for legislation, I’m often asked why an XML approach is better than a more traditional approach using a word processor. The answer is simple – it’s all about connected information.

The digital end point in a legislative system can no longer be publication of PDFs. PDFs are nothing but a kludgy way to digitize paper — a way to preserve the old traditions and avoid the future. Try reading a PDF on a cell phone and you see the problem. Try clicking on a citation in a PDF and you see the problem. Try and scrape the information out of a PDF to make it computer readable and you see the problem. The only useful function that PDFs serve is as a bridge to the past.

The future is all about connected information — breaking the physical bounds of what we think of as a document and allowing the nuggets of information found within them to be connected, interrelated, and acted upon. This is the real reason why the future lies with XML and its related technologies.

In my blog last week I provided a brief glimpse into how our future amending tools will work. I explored how legislation could be managed much as software is managed with GitHub. This is an example of how useful connected information becomes. Rather than producing bills and amendments as paper documents, the information is stored in a way that it can be processed efficiently and accurately by automation — and made available to the public in a computer-readable way.

At Xcential, we’re building our new web-based authoring system — LegisPro. If you take a close look at it, you’ll see that it has two main components. Of course, there is a robust XML editor. However, at the system’s very heart is a linking system — something we call a resolver. It’s in this resolver that the true power lies. It’s an HTTP-based system for managing all the linkages that exist in the system. It connects XML repositories, external data sources, and even SQL databases together to form a seamless universe of connected information.
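
To give a feel for what a resolver does, here is a deliberately tiny sketch, not LegisPro’s actual code: an HTTP service that turns a reference into the location of the resource it identifies. Express is used purely for brevity, and the routes and lookup table are invented:

    // A toy resolver: an HTTP service that maps reference-style URLs onto the
    // places where the underlying resources actually live.
    const express = require('express');
    const app = express();

    // In a real system these mappings would come from XML repositories, external
    // data sources, and SQL databases rather than a hard-coded table.
    const locations = {
      '/us/usc/t5/s552': 'http://xml.example.com/usc/title5/section552.xml',
      '/us/bill/2015/hr83': 'http://xml.example.com/bills/2015/hr83.xml'
    };

    // e.g. GET /resolve/us/usc/t5/s552 redirects to the stored XML document.
    app.get('/resolve/*', (req, res) => {
      const ref = '/' + req.params[0];
      const target = locations[ref];
      if (target) {
        res.redirect(target);
      } else {
        res.status(404).send('Unknown reference: ' + ref);
      }
    });

    app.listen(8080, () => console.log('resolver listening on port 8080'));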

We’re working hard to transform how legislation, and indeed all government information, is viewed. It’s not just about connecting laws and legislation together through simple web links. We’re talking about providing rich connections between all government information — tying financial data to laws and legislation, connecting regulatory information together, associating people, places, and things with government data, and on and on. We have barely started to scratch the surface, but it’s clear that the future lies with connected information.

While we position LegisPro today as a bill authoring system, it’s much more than that. It provides some of the fundamental underpinnings necessary to transform the government documents of today into the connected information of tomorrow.


Data Transparency Breakfast, LEX US Summer School 2015, First International Akoma Ntoso Conference, and LegisPro Edit reveal.

Last week was a very good week for my company, Xcential.

We started the week hosting a breakfast put on by the Data Transparency Coalition at the Booz Allen Hamilton facility in Washington, D.C. The topic was Transforming Law and Regulation. Unfortunately, an issue at home kept me away, but I was able to make a brief pre-recorded presentation and my moderating role was played by Mark Stodder, our company President. Thank you, Mark!

Next up was the first U.S. edition of the LEX Summer School from Italy. I have attended this summer school every year since 2010 in Italy, and it’s great to see the same opportunity for an open dialog amongst the legal informatics community finally come to the U.S. The event was put on by Monica Palmirani (@MonicaPalmirani), Fabio Vitali, and Luca Cervone (@lucacervone) of the University of Bologna. The teachers also included Jim Mangiafico (@mangiafico) (the LoC data challenge winner), Veronique Parisse (@VeroParisse) from the European Union, Andrew Weber (@atweber) from the Library of Congress, Kirsten Gullickson (@GullicksonK) from the Office of the Clerk at the U.S. House of Representatives, and myself from Xcential. I flew in for an abbreviated visit covering the last two days of the Summer School, where I showed how the U.S. Code is modeled in Akoma Ntoso and gave the students an opportunity to try out our new bill drafting editor — LegisProedit.

After the Summer School concluded, it was followed on Saturday by the first International Akoma Ntoso Conference, where I spoke about the architecture of our new editor as well as how the USLM schema is a derivative of the Akoma Ntoso schema. We had a good turnout from around the world and a number of interesting speakers.

This week is NCSL in Seattle where we will be discussing our new editor with potential customers and partners. Mark Stodder from Xcential will be in attendance.

In a month, I’ll be in Ravenna once more for the European LEX Summer School — where I’ll be able to show even more progress towards the goal of a full product line of Akoma Ntoso tools. These are interesting times for me.

The editor is coming along nicely and we’re beginning to firm up our QuickStarter beta plans. I’ve already received a number of requests and will be getting in touch with everyone as soon as we’re ready to roll out the program. If you would like to participate as a beta tester — or if you would just like more information, please contact us at info@xcential.com.

I’m really excited about how far we’ve come. Akoma Ntoso is on the verge of being certified as an official OASIS standard, our Akoma Ntoso products are coming into place, and interest around the world is growing. I can’t wait to see where we will be this time next year.


Coming soon!!! A new web-based editor for Akoma Ntoso

I’ve been working hard for a long time — building an all new web-based editor for Akoma Ntoso. We will be showing it for the first time at the upcoming Akoma Ntoso LEX Summer School in Washington D.C.

Unlike our earlier AKN/Editor, this editor is a pure XML editor designed from the ground up using the XML capabilities that modern browsers possess. This editor is much more robust, more precise, and very scalable.
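
By “the XML capabilities that modern browsers possess” I mean things like the native DOM parser and XPath engine. A tiny, generic illustration (simplified, with namespaces omitted; not the editor’s own code):

    // Parse an XML string with the browser's native parser...
    const xml = '<bill><body><section eId="sec_1"><num>1.</num></section></body></bill>';
    const doc = new DOMParser().parseFromString(xml, 'application/xml');

    // ...then query it with the browser's native XPath engine.
    const result = doc.evaluate('//section[@eId="sec_1"]', doc, null,
                                XPathResult.FIRST_ORDERED_NODE_TYPE, null);
    console.log(result.singleNodeValue.getAttribute('eId'));   // "sec_1"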


Basic Features

  1. Configurable XML models — including Akoma Ntoso and USLM
  2. Edit full documents or portions of large documents
  3. Flexible selection and editing regardless of XML structure
  4. Built-in redlining (change tracking) supporting textual AND structural changes
  5. Browse document sources with drag-and-drop
  6. Full undo & redo
  7. Customizable attribute editor
  8. Search and replace
  9. Modular architecture to allow for extensive customization

Underlying Technology

  1. XML-based editing component
    • DOM 4 support
    • XPath Support
    • CSS Styling
    • Sophisticated event model
  2. HTTP-based resolver architecture for retrieving documents (a small sketch follows this list)
    • Interpret citations
    • Dereference URLs
    • WebDAV adaptors to document repositories
    • Query repositories with XQuery or databases with SQL
  3. AngularJS-based User Interface using HTML5
    • Component modules for easy customization
  4. XML repository for storing documents
    • Integrate any XML repository
    • Built-in support for eXist-db
  5. Validation & Publishing
    • XML Schema validator
    • XSL-FO publishing
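
As a rough illustration of how the resolver and repository pieces fit together, here is a hypothetical sketch using Node’s http module. The collection path, port, and reference format are invented, though the /exist/rest interface itself is standard eXist-db:

    // Hypothetical glue between a citation-style reference and an eXist-db
    // repository, using eXist-db's built-in REST interface.
    const http = require('http');

    // Map a simplified reference onto a document stored under the /db/akn collection.
    function documentUrlFor(ref) {
      // e.g. '/us/bill/2015/hr1234' -> 'http://localhost:8080/exist/rest/db/akn/us/bill/2015/hr1234.xml'
      return 'http://localhost:8080/exist/rest/db/akn' + ref + '.xml';
    }

    // Dereference the reference and hand back the stored Akoma Ntoso XML.
    function fetchDocument(ref, callback) {
      http.get(documentUrlFor(ref), (res) => {
        let xml = '';
        res.on('data', (chunk) => { xml += chunk; });
        res.on('end', () => callback(null, xml));
      }).on('error', callback);
    }

    fetchDocument('/us/bill/2015/hr1234', (err, xml) => {
      if (err) return console.error(err);
      console.log('retrieved', xml.length, 'characters of XML');
    });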

We’ll reveal a lot more at the LEX Summer School later this month! If you’re interested in our QuickStart beta program, drop me a note at grant.vergottini@xcential.com.


Upcoming U.S. and European events related to Akoma Ntoso

In my last blog post I covered the public review of the new proposed Akoma Ntoso (LegalDocML) standard for legal documents. Please keep the comments coming. In order to comment, please send email to legaldocml-comment@lists.oasis-open.org. If you wish to subscribe to this mailing list, please follow the instructions at https://www.oasis-open.org/committees/comments/index.php?wg_abbrev=legaldocml

In addition, there are three upcoming events related to Akoma Ntoso which you may wish to participate in (this list comes from Monica Palmirani, the chair of the OASIS LegalDocML technical committee):

1. Akoma Ntoso Summer School, 27-31 July, 2015, George Mason University, Fairfax, Virginia (USA): http://aknschool.cirsfid.unibo.it
Registration fee: http://aknschool.cirsfid.unibo.it/logistics/registrations-and-fees/
Application Form: http://aknschool.cirsfid.unibo.it/wp-content/uploads/2015/05/ApplicationForm.pdf
Brochure:
http://aknschool.cirsfid.unibo.it/wp-content/uploads/2015/05/brochure_2015_US_DEF.pdf
Deadline: end of June, 2015.

2. IANC2015 (First International Akoma Ntoso Conference): August 1st, 2015, George Mason University, Fairfax, Virginia (USA)
Brochure: http://aknschool.cirsfid.unibo.it/wp-content/uploads/2015/05/AKN-CONFERENCE1.pdf
Call for contributions:
http://www.akomantoso.org/akoma-ntoso-conference/call-for-contributions/
Deadline: June 19th, 2015.

3. Summer School LEX2015, 7-15 Sept. 2015, Ravenna, Italy: http://summerschoollex.cirsfid.unibo.it
Registration fee: http://summerschoollex.cirsfid.unibo.it/?page_id=66
Application Form: http://summerschoollex.cirsfid.unibo.it/wp-content/uploads/2010/04/ApplicationForm2.pdf
Brochure:
http://summerschoollex.cirsfid.unibo.it/wp-content/uploads/2015/05/brochure_2015_LEX1.pdf
Deadline: July 15th, 2015.

I have been participating in the European LEX Summer school every year since 2010 and find it to be both inspirational and very valuable. If you’re interested in understanding where the legal informatics field is headed, I encourage you to find a way to attend any of these events. I will be speaking/teaching at all three events.


Akoma Ntoso (LegalDocML) is now available for public review

It’s been many years in the making, but the standardised version of Akoma Ntoso is now finally in public review. You can find the official announcement here. The public review started May 7th and will end on June 5th — which is quite a short time for something so complex.

I would like to encourage everyone to take part in this review process, as short as it is. It’s important that we get good coverage from around the world to ensure that any use cases we missed get due consideration. Instructions for how to comment can be found here.

Akoma Ntoso is a complex standard and it has many parts. If you’re new to Akoma Ntoso, you will probably find it quite overwhelming. To try to cut through that complexity, I’m going to give a bit of an overview of what the documentation covers and what to look for.

There are four primary documents:

  1. Akoma Ntoso Version 1.0 Part 1: XML Vocabulary — This document is the best place to start. It’s an overview of Akoma Ntoso and describes what all the pieces are and how they fit together.
  2. Akoma Ntoso Version 1.0 Part 2: Specifications — This is the reference material. When you want to know something specific about an Akoma Ntoso XML element or attribute, this is the document to go to. It contains very detailed information derived from the schema itself. Also included are the XML schema (or DTD if you’re still inclined to use DTDs) and a good set of examples from around the world.
  3. Akoma Ntoso Naming Convention Version 1.0 — This document describes two very interrelated and important aspects of the proposed standard — how identifiers are assigned to elements and how IRI-based (or URI-based) references are formed. There is a lot of complexity in this topic, and it was the subject of numerous meetings and an interesting debate at the Coco Loco restaurant in Ravenna, Italy, one evening while being eaten by mosquitoes.
  4. Akoma Ntoso Media Type Version 1.0 — This fourth document describes a proposed new media type that will be used when transmitting Akoma Ntoso documents.

This is a lot of information to read and digest in a very short amount of time. In my opinion, the best way to try and evaluate Akoma Ntoso’s applicability to your jurisdiction is as follows:

  • First, look at the basic set of tags used to define the document hierarchy. Is this set of tags adequate? Keep in mind that the terminology might not always perfectly align with your terminology. We had to find a neutral terminology that would allow us to define a super-set of the concepts found throughout the world.
  • If you do find that specific elements you need are missing, consider whether or not that concept is perhaps specific to your jurisdiction. If that is the case, take a look at the basic Akoma Ntoso building blocks that are provided. While we tried to provide a comprehensive set of elements and attributes, there are many situations which are simply too esoteric to justify the additional tag bloat in the basic standard. Can the building blocks be used to model those concepts?
  • Take a look at the identifiers and the referencing specification. These parts are intended to work together to allow you to identify and access any provision in an Akoma Ntoso document. Are all your possible needs met with this? Implicit in this design is a resolver architecture — a component that parses IRI references (think of them as URLs) and maps them to specific provisions (see the sketch after this list). Is this approach workable?
  • Take a look at the basic metadata requirements. Akoma Ntoso has a sophisticated metadata methodology behind it, and this involves quite a bit of indirection at times. Understand what the basic metadata needs are and how you would model your jurisdiction’s metadata using this.
  • Finally, if you have time, take a look at the more advanced aspects of Akoma Ntoso. Consider how information related to the document’s lifecycle and workflow might be modeled within the metadata. Consider your change management needs and whether or not the change management capabilities of Akoma Ntoso could be adapted to fit. If you work with complex composite documents, take a look at the mechanisms Akoma Ntoso provides to assemble composite documents.
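
As an example of the resolver machinery implied by the identifier and referencing bullet above, here is a hypothetical parser for a simplified work-level IRI. The real naming convention is considerably richer than this; treat the pattern as illustrative, not normative:

    // Hypothetical parser for a simplified work-level IRI of the form
    // /akn/{country}/{docType}/{date}/{number}. See the Naming Convention
    // document for the real rules, which cover far more than this.
    function parseWorkIri(iri) {
      const m = iri.match(/^\/akn\/([^/]+)\/([^/]+)\/([^/]+)\/([^/]+)/);
      if (!m) throw new Error('not a recognisable work-level IRI: ' + iri);
      const [, country, docType, date, number] = m;
      return { country, docType, date, number };
    }

    // A resolver would take the parsed pieces, map them to a stored document,
    // and then use a portion identifier (e.g. an eId) to reach a specific provision.
    console.log(parseWorkIri('/akn/us/bill/2015-07-20/hr1234'));
    // -> { country: 'us', docType: 'bill', date: '2015-07-20', number: 'hr1234' }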

Yes, there is a lot to digest in just a few weeks. Please provide whatever feedback you can.

We’re also now in the planning stages for a US LEX Summer School. If you’ve followed my blog over the years, you’ll know that I am a huge fan of the LEX Summer School in Ravenna, Italy — I’ve been every year for the past five years. This year, Kirsten Gullickson and I convinced Monica and Fabio to bring the Summer School to Washington D.C. as well. The Summer School will be held the last week of July 2015 at George Mason University. The class size will be limited to just 30, so be sure to register early once registration opens. If you want to hear me rattle on at length about this subject, this is the place to go — I’ll be one of the teachers. The Summer School will conclude with a one-day Akoma Ntoso Conference on the Saturday. We’ll be looking for papers. I’ll send out a blog post with additional information as soon as it’s finalized.

You may have noticed that I’ve been blogging a lot less lately. Well, that’s because I’ve been heads down for quite some time. We’ll soon be in a position to announce our first full Akoma Ntoso product. It’s an all new web-based XML editor that builds on our experiences with the HTML5 based AKN/Editor (LegisPro Web) that we built before.

This editor is composed of four main parts.

  1. First, there is a full XML editing component that works with pure XML — allowing it to be quite scalable and very XML precise. It implements complex track changes capabilities along with full redo/undo. I’m quite thrilled how it has turned out. I’ve battled for years with XMetaL’s limitations and this was my opportunity to properly engineer a modern XML editor.
  2. Second, there is a sophisticated resolver technology which acts as the middleware, implementing the URI scheme I mentioned earlier — and interfacing with local and remote document resources. All local document resources are managed within an eXist-db repository.
  3. Third, there is the Akoma Ntoso model. The XML editing component is quite schema/model independent. This allows it to be used with a wide variety of structured documents. The Akoma Ntoso model adapts the editor for use with Akoma Ntoso documents.
  4. And finally, there is a very componentised application which ties all the pieces together. This application is written as an AngularJS-based single page application (SPA). In an upcoming blog I’ll detail the trials and tribulations of learning AngularJS. While learning AngularJS has left me thinking I’m quite stupid at times, the goal has been to build an application that can easily be extended to fit a wide variety of structured editing needs. It’s important that all the pieces be defined as modules that can either be swapped out for bespoke implementations or complemented with additional capabilities.
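
A very small sketch of what that modularity looks like in AngularJS terms (the module names are invented, not our actual layout):

    // Each capability lives in its own AngularJS module so that a bespoke build
    // can swap modules out or layer extra ones on top.
    angular.module('editor.core', []);
    angular.module('editor.resolver', []);
    angular.module('editor.aknModel', ['editor.core']);

    // The application is simply the composition of the modules it needs.
    angular.module('editorApp', ['editor.core', 'editor.resolver', 'editor.aknModel'])
      .controller('MainCtrl', ['$scope', function ($scope) {
        $scope.title = 'LegisPro';
      }]);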

Our current aim is to have the beta version of this new editor available in time for the Summer School and Akoma Ntoso conference — so I’ll be very heads down through most of the summer.
