Using the <hcontainer> Element Properly

When I started my blog five years ago, I said I would try not to get too technical. Overall, I’ve stuck to that. However, with Akoma Ntoso now essentially standardised, I think it is time to start covering some areas of it in a little more technical detail. So, from time to time, I’m going to delve into a little technical mumbo jumbo to cover some subjects that come up frequently.

In this blog, I want to cover the proper use of the <hcontainer> element. Akoma Ntoso has rich support for hierarchical documents, as legal documents tend to be strongly hierarchical. Consequently, there is a large selection of element tags to choose from. During the standardisation effort, we tried to identify as many hierarchical constructs as we could find in legal documents, but it was impossible to identify every single construct in every single jurisdiction around the world. Indeed, we sometimes decided that some hierarchical levels were just too unique to a specific jurisdiction to warrant inclusion in a standard intended for worldwide adoption. Sometimes, having too many tags is worse than not having enough, especially when there is a way to handle the outlier cases.

So, what is the proper way to use the <hcontainer>? The <hcontainer>, or hierarchical container, is a generic element intended to be used to invent an element that is needed but not found among the existing Akoma Ntoso hierarchical elements. The @name attribute defines the name of the new element you’re inventing. For this reason, the value of @name should be consistent with the element naming convention of Akoma Ntoso (a short sketch follows this list):

  • The name should be lowerCamelCase.
  • The name should be in British English rather than another variant of English or another language. (Yes, we have two exceptions to this rule in Akoma Ntoso, one because the English form didn’t exist and one because we didn’t notice a spelling variation.)
  • The name should not already exist in Akoma Ntoso.
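For instance, suppose a jurisdiction needs a level that Akoma Ntoso doesn’t provide. Here is a minimal sketch; “motionBlock” is a hypothetical name chosen purely for illustration, not part of the Akoma Ntoso vocabulary:

    <!-- A hypothetical invented level, following the naming rules above -->
    <hcontainer name="motionBlock">
      <num>1.</num>
      <heading>Opening motions</heading>
      <content>
        <p>…</p>
      </content>
    </hcontainer>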

One question that comes up from time to time is whether an <hcontainer> can be used to define an element that already exists, but in another language. For instance, could I define <hcontainer name="artículo"> to define a Spanish article rather than use <article>? While there is nothing that prevents this practice, it would not be in the spirit of Akoma Ntoso. A large part of the motivation of Akoma Ntoso is to promote both data and tool interoperability. Localising the element tags completely undermines Akoma Ntoso as a standard. You might as well simply use your own schema. Please consider the consumers of your data when facing this question, not just the producers.

[We use an alternate mechanism, provided by our tools, to present a localized term to the user rather than the element name.]
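To make the contrast concrete, here is a hedged sketch of the two approaches (the localised label shown to the drafter is tool configuration, not markup):

    <!-- Discouraged: localising the tag itself undermines interoperability -->
    <hcontainer name="artículo">…</hcontainer>

    <!-- Preferred: use the standard element and let the editor display
         a localised label such as "Artículo" to the drafter -->
    <article>
      <num>1.</num>
      <content>
        <p>…</p>
      </content>
    </article>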

Another question I’ve been asked has to do with hierarchical levels that might not have a formalised name at all. I’ve come across this a number of times, in a number of ways. First, it’s often an issue with very old documents where the document hierarchy was either not formalised or not explicitly stated, and conversion involves some degree of guessing. Second, there are sometimes lower levels, for instance below the section level, where the level names have simply not been formalised or are used inconsistently. Third, I’ve come across a case where the upper levels, above the section, were not named because the corresponding concepts didn’t really exist in the language used in that jurisdiction. For these cases, we use the <level> element.
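As a sketch, an unnamed level below a section might be captured like this (the numbering is illustrative):

    <section>
      <num>12.</num>
      <level>
        <num>(a)</num>
        <content>
          <p>…</p>
        </content>
      </level>
    </section>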

The <hcontainer> is a very useful element in Akoma Ntoso. It’s a key part of the design of the schema that allows it to be easily adapted to any legislative tradition. However, it should be used judiciously — only when there isn’t already an alternative.


LEX Summer School 2017

For the past two weeks I’ve been in Italy attending the LEX Summer School and Akoma Ntoso Developer’s Workshop at the Ravenna campus of the University of Bologna. This is my eighth summer school in Ravenna and my tenth overall LEX Summer School including the two U.S. editions. It’s always one of the highlights of my year.

With Akoma Ntoso as a standard now all but completed, a product about to debut, and a couple of Akoma Ntoso projects to our name, I thought it would be a good time to reflect on how far we have come. Bill Gates once said “Most people overestimate what they can do in one year and underestimate what they can do in 10 years.” This is a case of that. At times, the progress is frustratingly slow and arduous, but when you look back at how far we’ve come in eight years, we’ve made pretty good progress.

When I arrived at the first summer school I attended back in 2010, I had never heard of Akoma Ntoso — let alone learned how to pronounce it. A lot of the discussion still revolved around whether using purpose-built XML tools or re-purposing office productivity software was the way to go. Did the world really need Akoma Ntoso, or were Open Office’s XML formats adequate? What about Microsoft’s Office Open XML? Was it an alternative?

We don’t discuss that anymore — the answer is obvious. As Luca Cervone commented to me, all of a sudden the other approaches look so old-fashioned. In fact, the presentations that did still use that approach were apologetic that their decisions dated back to the early 2000s when the answer was less clear.

What we now see is the value of putting data first and paper second. Making paper take the back seat in order to take advantage of the inherent power of treating legislation as data is now clearly the way to go. We see this in all the innovative capabilities that were on display — from the advanced amending tools we’ve worked with the UK and Scottish Parliaments to develop, to the rich ontology support tools being developed in several projects, to the various comparison and analysis capabilities that were on show. XML enables all of these capabilities, in ways that other approaches simply cannot.

Another change in the eight years is the extent to which Akoma Ntoso has been embraced, particularly in Europe:

  • In April of this year, the Chief Executive Board of the United Nations approved the use of Akoma Ntoso as the documentation standard throughout the entire system after a detailed analysis. (Akoma Ntoso began as a project of the UN Department of Economic and Social Affairs (UN/DESA) a decade ago).
  • Numerous projects at both the European Parliament and European Commission are now based on Akoma Ntoso, although perhaps in a somewhat disjointed manner.
  • The project I’ve devoted a lot of my life to over the past two years at the U.K. and Scottish Parliaments is committed to Akoma Ntoso. You can watch a video of an early version here.
  • The Italian Senate is adopting Akoma Ntoso to some extent, and the Italian Chamber of Deputies is considering following suit.
  • There are projects underway in Switzerland and South America to adopt Akoma Ntoso.
  • Even the U.S. House of Representatives has a prior commitment to support Akoma Ntoso in some way.

This is all very good progress and much more is simmering in the background.

One of my goals at this LEX Summer School was to start planting the seeds for an open framework API that would allow interoperable plugins to be developed that work with all Akoma Ntoso-based platforms. Here, Luca surprised me by showing the new open source Akomando toolkit. This is a JavaScript toolkit, to be made available via NPM, GitHub, and other means shortly, that will provide the basic utilities one needs to easily process XML. As the LIME editor and Xcential’s LegisPro are largely aligned on modern and open web technologies, this toolkit is a natural fit for both applications. I think this is a very exciting development and one we plan to take advantage of as soon as possible.

So, all in all, not bad. Now it’s time to start building on that momentum. We have lots of ideas percolating that will be revealed in the months to come. I’m looking forward to doing another retrospective at the ten year mark.


The Sun is Rising on Akoma Ntoso — and LegisPro too!

Two great pieces of news this week! First, the documentation for Akoma Ntoso has now been officially released by OASIS. Second, we’re announcing the latest version of our LegisPro drafting platform for Akoma Ntoso, codenamed “Sunrise”.

After several years of hard work, we’ve made a giant step towards our goal of setting an international XML standard for legal documents. You can find the documents at the OASIS LegalDocML website. A special thanks to Monica Palmirani and Fabio Vitali at the University of Bologna for their leadership in this endeavour.

Later this week, Xcential will be announcing and showing the new “Sunrise” version of LegisPro at both NALIT in Annapolis, Maryland, and the LEX Summer School in Ravenna, Italy. This new version represents a long-planned change to Xcential’s business model. While we have a thriving enterprise business, we’re now focusing on also providing more affordable solutions for smaller governments.

Part of our plan is to foster an open community of providers around the Akoma Ntoso standard for legislative XML. With Akoma Ntoso now in place as a standard, we’re looking for ways to provide open interfaces such that cooperative tools and technologies can be developed. One of my goals at this year’s summer school in Ravenna is to begin outlining the open APIs that will enable this vision.

LegisPro

The new edition of LegisPro will be all about providing the very best options:

  1.  It will provide the word processor-like drafting capability your drafters demand — along with the real capabilities you need:
    • We’re not talking about merely providing a way to style a word processing document to look like legislation.
    • We’re talking about providing easy ways to define the constructs you need for your legislative traditions, such as–
      • a configurable hierarchy,
      • configurable tagging of important information,
      • configurable numbering rules,
      • configurable metadata,
      • oh, and configurable styles too.
    • We’re talking about truly understanding your amending traditions and providing the mechanisms to support them, such as–
      • configurable track changes, because we understand that a word processor’s track changes are not enough,
      • as-published page and line markers, because we understand your real need for accurate page and line numbers, and that a word processor’s page and line numbering is not that,
      • robust typography, because we know there’s quite a difference between the casual correspondence a word processor is geared for and the precision demanded in documents that represent laws and regulations.
  2. It will be as capable as we can make it — for real-world use rather than just a good demo:
    • We’re not talking about trying to sell you a cobbled together suite of tools we built for other customers.
    • We’re talking about working with specialists in all the sub-fields of legal informatics to provide best-of-breed options that work with our tools.
    • We’re talking about making as many options available to you as we can, because we know there is no one-size-fits-all answer in this field.
    • We’re talking about an extensible architecture that will support on-board plug-ins as well as server-side web-services.
    • We’re talking about providing a platform of choices rather than a box of pieces.
  3. It will be as affordable as we can possibly make it:
    • We’re talking about developing technologies that have been designed to be easily configured to meet a wide variety of needs.
    • We’re talking about using a carefully chosen set of technologies to minimize both your upfront cost and downstream support challenges.
    • We’re talking about providing a range of purchasing options to meet your budgetary constraints as best we can.
    • We’re talking about finding a business model that allows us to remain profitable — and spreads the costs of developing the complex technologies required by this field as widely and fairly as possible.
  4. It will be as future-proof as we can possibly make it:
    • We’re not talking about trying to sell you on a proprietary office suite.
    • We’re talking about using a carefully curated set of technologies that have been selected as they represent the future of application development — not the past — including:
      • GitHub’s Electron, which allows us to provide both a desktop and a web-based option (this is the same technology used by Slack, WordPress, Microsoft’s Visual Studio Code, and hundreds of other modern applications),
      • Node.js which allows us to unify client-side and server-side application development with “isomorphic JavaScript,”
      • JavaScript 6 (ECMAScript 2015) which allows us to provide a truly modern, unified, and object-oriented programming environment,
      • Angular and other application frameworks that allow us to focus on the pieces and not how they will work together,
      • CSS3 and LESS, which allow us to provide state-of-the-art styling technologies for the presentation of XML documents,
      • the entire XML technology stack that is critical for enabling an information-centric rather than document-centric system as is appropriate for the 21st century,
      • and, of course, using the Akoma Ntoso schema for legislative XML to provide the best model for sharing data, information, tools, and other technologies. It’s truly a platform to build an industry on.
  5. It will be as open as we can possibly make it:
    • We’re not talking about merely using an API published by a vendor attempting to create a perception of openness by putting “open” in the name.
    • We’re talking about building on a full suite of open source tools and technologies coming from vendors such as Google, GitHub, and even Microsoft.
    • We’re talking about using non-proprietary protocols such as HTTP and WebDAV.
    • We’re talking about providing an open API to our tools that will also work with tools of other vendors that support Akoma Ntoso.
    • And, while we must continue to be a profitable product vendor, we will still provide the option of open access to our GitHub repositories to our customers and partners. (We’ll even accept pull requests.)

Our goal is to be the very best vendor in the legislative and regulatory space, providing modern software that helps make government more efficient, more transparent and more responsive. We want to provide you with options that are affordable, capable, and planned for the future. We want to do whatever we can to allay your fears of vendor lock-in by supporting open standards, open APIs, and open technologies. We want to foster an Akoma Ntoso-based industry of cooperative tools and technologies as we know that doing so will be in the best interests of everyone — customers, product vendors, service providers, and the people who support them. As someone once told me many years ago, if you focus on making the pie as large as you can, the crumbs left on the knife will be plenty enough for you.

Either come by our table at NALIT in Annapolis or join us for the Akoma Ntoso Developer’s Conference in Ravenna at the conclusion of the LEX Summer School to learn more. If neither of these options will work for you, you can always learn more at Xcential.com or by sending email to info@xcential.com.


Escaping a Technology Eddy

Do you need to escape a technology eddy? In fluid dynamics, an eddy is the swirling of a fluid that causes a reverse current against a downstream flow. It often forms behind a major obstacle. The swirling motion of an eddy creates resistance to forward motion by creating a backward force. Eddies are also seen in air and electromagnetic systems.

I see a similar phenomenon in my work, for which I’m going to coin the term technology eddy. A technology eddy forms in organisations that are risk averse, have restricted budgets, or are simply more focused on maintaining a major system than on developing new software. Large enterprises, in particular, often find their IT organisations trapped in a technology eddy. Rather than going with the flow of technological change, the organisation drifts into a comfortable period where change is largely restricted to the older technologies they are most familiar with.


As time goes by, an organisation trapped in a technology eddy adds to the problem by building more and more systems within the eddy — making it ever more difficult to escape the eddy when the need arises.

I sometimes buy my clothing at Macy’s. It’s no secret that Macy’s, like Sears, is currently struggling against the onslaught of technological change. Recently, when paying for an item, I noticed that their point-of-sale systems still run on Windows 7 (or was that Windows Vista?). Last week, on the way to the airport, I realised I had forgotten to pack a tie. So, I stopped in to Macy’s only to find that they had just experienced a 10-minute power outage. Their ancient system, what looked to be an old Visual Basic Active Directory app, was struggling to reboot. I ended up going to another store — for all the other stores in the mall were up and running quite quickly. The mall’s 10-minute power outage cost Macy’s an hour’s worth of sales because of old technology. The technology eddy Macy’s is trapped in is not only costing them sales in the short term, it’s killing them in the grand scheme of things. But I digress…

I come across organisations trapped in technology eddies all the time. IT organisations in government are particularly susceptible to this phenomenon. In fact, even Xcential got trapped in a technology eddy. With a small handful of customers and a focus on maintenance over development for a few years, we had become too comfortable with the technologies that we knew and the way in which we built software.

It was shocking to me when I came to realise just how out-of-date we had become. Not only were we unaware of the latest technologies, we were unaware of modern concepts in software development, modern tools, and even modern programming styles. We had become complacent, assuming that technology from the dawn of the Millennium was still relevant.

I hear a lot of excuses for staying in a technology eddy. “It works”, “all our systems are built on this technology”, “it’s what we know how to build”, “newer technologies are too risky”, and so on. But there is a downside. All technologies rise up, have a surprisingly brief heyday, and then slowly fade away. Choosing to continue within a technology eddy using increasingly dated technology ensures that sooner or later, an operating system change or a hardware failure of an irreplaceable part will create an urgent crisis to replace a not-all-that-old system with something more modern. At that point, escaping the eddy will be of paramount importance and you’ll have to paddle at double speed just to catch up. This struggle becomes the time when the price for earlier risk mitigation will be paid — for now the risks will compound.

So how do you avoid the traps of a technology eddy? For me, the need to escape our eddy became most apparent as we got exposed to people, technologies, and ideas that were beyond the comfortable zone in which our company existed. Hearing new ideas from developers beyond our sphere of influence and being exposed to requirements from new customers made us quickly realize that we had become quite old-fashioned in our ways. To stay relevant you must get out and learn — constantly. Go to events that challenge your thinking rather than reinforce it.

Today we are once more a state-of-the-art company. We’ve adopted modern development techniques, upgraded our tools, upgraded our technologies, and upgraded our coding skills. These changes allow us to compete worldwide and build software for multiple customers in a fully distributed way that spans companies, continents, and time zones.

I hope we’ll remember this lesson and focus more on continuous improvement rather than having to endure a crash course of change every few years.


The many lives of JavaScript

I recently worked out that I’ve learned, on average, a new programming language every two to three years. These many languages have been part of my toolbox for somewhere between four and six years before falling away to make room for new technologies. However, there is one programming language that has been a major part of my programming repertoire for almost 22 years now – and that is JavaScript.

My JavaScript programming skills have recently undergone a major renaissance as I’ve adopted JavaScript 6 (aka ECMAScript 2015) for most of my coding. The way I write code today is nothing like the code I wrote just one year ago – and I’ve gone back and largely modernised all active code to be consistent. Today’s programming style uses modern frameworks and is far more object oriented and asynchronous. There are many new features which have totally updated how I write code. Proper (while still limited) classes with mixins have replaced the ugly prototype mechanism I used to use for object orientation. Let and const declarations have caught latent bugs that were hidden in my code. Arrow functions (aka lambda expressions) and promises have streamlined code that once was quite clunky. The list goes on…
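As a hedged illustration of the difference (the names here are hypothetical, not from any real codebase):

    // A mixin is just a function from a base class to an extended class;
    // no more manual fiddling with prototype chains.
    const Serializable = (Base) => class extends Base {
      toJSON () { return { name: this.name }; }
    };

    class Fragment {
      constructor (name) {
        this.name = name; // no Fragment.prototype boilerplate required
      }
    }

    class Clause extends Serializable(Fragment) {
      load () {
        // const is block-scoped, catching bugs that var silently allowed
        const url = `/fragments/${this.name}`; // template literal
        // a promise flattens what would once have been nested callbacks
        return new Promise((resolve) => {
          setTimeout(() => resolve(`<clause name="${this.name}" src="${url}"/>`), 10);
        });
      }
    }

    // arrow functions keep `this` lexical and streamline the call site
    new Clause('preamble').load().then((xml) => console.log(xml));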

Even my tools have changed. Microsoft’s surprisingly excellent Visual Studio Code has replaced the hodgepodge of tools I once used. We’re in the process of integrating Jasmine and Karma into our workflow. JavaScript Semistandard Style (no, I still like semicolons) has ensured a very clean code base – as well as catching a multitude of errors and sins.


All this change got me thinking about the four lives of JavaScript that I have worked through. Way back when, JavaScript had an awkward birth at the hands of Netscape as the lesser stepchild of the new Java programming language from Sun that was taking away all the attention. JavaScript was just a way to glue Java applets together in the browser. The problem is, Java applets really sucked.

Microsoft quickly saw the value of JavaScript though, and launched their own effort to steal Netscape’s baby. And so, JavaScript was stolen, renamed JScript, and made the adopted sibling of Microsoft’s other scripting language, VBScript. One good bit of progress that Microsoft made was to sponsor the standardisation of the language, although the resulting name of ECMAScript was another in a long string of unfortunate names the language has had to endure.

As JScript, JavaScript was to become an integral part of Microsoft’s entire ActiveX strategy. A lot of really cool technologies (yes, really) came of this allowing JScript to go beyond the browser. As an application extension language, it found its way into the XMetaL XML editor as the customisation technology. We used it and many of the ActiveX technologies to great effect when we implemented California’s bill drafting system. However, it didn’t just end there. We were able to use it on the server-side through Classic ASP and as a shell scripting language through the Windows Script Host. For a Microsoft-centric programmer, this era of JavaScript was a glorious one.

However, ActiveX was seriously flawed. It was entirely proprietary and riddled with problems. Microsoft abandoned it almost as quickly as they had adopted it – moving on to .Net where JScript.net was a non-starter. As Microsoft’s interest in ActiveX and even Internet Explorer waned in the early 2000s, life as a JavaScript programmer became ever gloomier. While the capabilities were awesome, there was obviously no future.

At this point, we made the somewhat painful decision to move away from Microsoft’s outmoded view of the Internet and go back to the basics. While it meant giving up a lot of capability, in the end it was an excellent decision for it pointed to the future. One tiny aspect of Microsoft’s ActiveX vision, the XMLHttpRequest object, escaped from Microsoft and gave rise to a whole different way of programming – Asynchronous JavaScript and XML (AJAX). This development and the emergence of new browsers, first Firefox and then Google’s Chrome with its V8 JavaScript engine, breathed new life into JavaScript.

Freed from Microsoft’s grip, JavaScript has flourished. The past decade has seen a plethora of new technologies. Isomorphic JavaScript (or Universal JavaScript) blurs the distinction between coding for the server and the client. In fact, technologies like Electron turn web-based application development back to the desktop where you can get the best of both worlds.

When I look back on the code I wrote during the ActiveX era (yes, we still support it), it looks prehistoric. Modern JavaScript is so much more capable and flexible than the clunky rendition we had back when COM-based ActiveX was supposed to change the world. As I mentioned earlier, how I program now is completely different – asynchronous programming is a difficult but very worthwhile skill to acquire.

Looking to the future, I see three paths. On one side is a mature but polarising platform that is dominated by Oracle. Oracle’s dominance ensures stability but also deters innovation. Looking to the other side, one finds another mature but polarising platform that is dominated by Microsoft. Here too, Microsoft’s dominance ensures stability but also deters innovation. The result is that both paths seem to have had their heyday. You don’t hear very much aspirational news from either technology path anymore — rather like what it must have felt like to program a mainframe in COBOL at the height of the C/C++ era.

The third path seems to be the path of the future – staking out a middle ground that neither technology giant can stomp on. Sure, Google is a technology giant that plays a strong role, but they’re still reasonably well regarded by the development community at large (for now). It is this middle ground that has been the most fertile for new technologies – and JavaScript is right in the thick of it. There are so many new technologies it’s hard to keep track of them all — AngularJS, Node.js, React, Express.js, to name but a few. While this third path can play well with both of the other two, for me it is the path that truly points to the future.

This brings us to the fourth life for JavaScript – building on the momentum of the past decade to mount a credible challenge for enterprise apps. While I initially dismissed many of the new features of the language as mere syntactic sugar, my experience with it has shown it to be more. I now write much better code. I believe we’re on the verge of an explosion in JavaScript-enabled applications that will blur the distinction between the platforms, between the desktop and the browser, and between the server and the client. This is truly an exciting time, once more, to be developing in JavaScript.

It goes without saying, but stay tuned for more…


Implementing an Akoma Ntoso Editor

Yes, we’ve now built a full real-world legislative drafting editor using the final release of the new OASIS standard for legislative XML known as Akoma Ntoso. No, it wasn’t easy, but drafting tools never are. While our project is not yet a finished implementation, it shows that Akoma Ntoso is adaptable to some of the most challenging demands it will face as a world-wide standard for digital legislation.

Akoma Ntoso is a very ambitious standard. It strives to anticipate all the possible needs that jurisdictions around the world will have while also planning for a wide range of useful applications that can be built on top of the data. The result is a sophisticated schema with many more features than any one implementation will ever need.

The trick is being able to mould Akoma Ntoso to fit the unique needs of a jurisdiction while also providing a user experience that is natural and fits the problem space exactly. This was the challenge that led us to develop a custom web-based XML editor. After surveying the available market of web-based editors, we quickly found that none would be sufficiently adaptable to allow Akoma Ntoso to realize its true potential.

There are two aspects of building an Akoma Ntoso editor that have required particular attention:

  1. Adapting Akoma Ntoso to fit the jurisdiction’s documents
    If you’ve taken a look at Akoma Ntoso, you know that it’s jam-packed full of tags and features, far more than are ever necessary in a single implementation. Trying to create a single comprehensive implementation of it all, a one-size-fits-all approach, will only yield an overly complicated and unusable tool that will be suitable to nobody. At the same time, despite Akoma Ntoso’s efforts to cover all possible scenarios, there are still gaps in the schema where details specific to individual jurisdictions are not covered. Akoma Ntoso anticipates this shortcoming by providing a pattern-centric mechanism for extending a set of generic elements to fill in the gaps (see the sketch following this list).
    An authoring tool needs to hide or omit the unused parts of Akoma Ntoso, adapt the parts that are being used to fit the specific requirements of a jurisdiction, and allow for extension of Akoma Ntoso using the generic mechanism in such a way that these extensions appear to be seamless. As it turns out, almost a third of the elements we’ve implemented are extension elements. The result is an editor that allows a fully compliant Akoma Ntoso document to be drafted (correct by construction), while at the same time ensuring that the document fully complies with the jurisdiction’s model for how that document should be represented.
  2. Adapting the editor to fit the jurisdiction’s documents
    XML authoring tools don’t just work out of the box. Rather, they’re toolkits that allow documents that conform to a specific schema or model to be authored. How much flexibility this toolkit provides dictates the type of documents that can be authored. Sadly, it’s difficult for any editor to provide infinite flexibility in any dimension – so very careful consideration is necessary to understand whether or not the editor can be adapted to the need.

    When we at Xcential implemented California’s bill drafting system a decade ago, we used XMetaL because it provided an extensive customisation capability. Unfortunately, at the outset we failed to realize that XMetaL’s change tracking capabilities were limited and not customisable. When the full challenge of redlining became clear to us well into the project, we realized we were using an editor that couldn’t do the job. Thankfully, the project was able to get (and pay for) the necessary extensions to XMetaL without too much delay.

    One way to understand this problem is in the diagram below. On the left is the intrinsic capability offered by the authoring tool. On the right is a jurisdiction’s requirement. As XML authoring tools are toolkits, there is always a gap between the intrinsic capabilities on the left and the requirements on the right – and this gap must be closed one way or another. One way is to use whatever programming API is offered to add customisations (shown as A). Another way is to limit the jurisdiction’s requirements (shown as B) to better suit the capabilities of the tool. Usually, it takes a combination of both to arrive at a suitable outcome. If the gap cannot be closed (shown as C), then the project is likely doomed to disappointment or even failure.

    [Diagram: the authoring tool’s intrinsic capability vs. the jurisdiction’s requirements]

    One thing we learned early on is that, when it comes to legislative documents, there really isn’t a lot of wiggle-room in the requirements. The form of the documents is often dictated by long established traditions and good luck trying to change that. This is one case where the expression “It will take an Act of Congress” can be quite literally true.

    This means that the gap will have to be closed through customization and the effort (and risk) to do so will be quite substantial. XMetaL, way back in 2002, provided an extensive set of programmatic APIs to work from, and that very nearly wasn’t enough. Unfortunately, the newer web-based editors haven’t, for many reasons, come close to matching XMetaL’s level of customisability.
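Returning to the extension mechanism mentioned in point 1, here is a hedged sketch of what a seamless extension can look like: a jurisdiction-specific “proviso” level (a hypothetical construct, chosen for illustration) drafted alongside standard elements:

    <section>
      <num>5.</num>
      <heading>Appropriations</heading>
      <subsection>
        <num>(1)</num>
        <content>
          <p>…</p>
        </content>
      </subsection>
      <!-- An extension element; the editor presents it as a first-class level -->
      <hcontainer name="proviso">
        <content>
          <p>Provided that…</p>
        </content>
      </hcontainer>
    </section>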

Building our own authoring tool

Understanding the challenges of Akoma Ntoso, our customer’s demanding requirements, and the limitations of the state-of-the-art in web-based authoring tools, we embarked on a project several years ago to build our own XML authoring tool. The result is now used in a number of applications. It’s been quite a challenge – and that’s an understatement. Building a highly configurable web-based XML authoring tool that is truly a step ahead of the old desktop editors of twenty years ago has required us to truly harness every aspect of modern web technologies and methodologies.

The result is an XML authoring tool especially adapted to the needs of Akoma Ntoso. However, it’s not just an Akoma Ntoso editor. It’s an XML authoring tool, capable of adapting to any reasonable XML schema — for the legislative field, regulatory field, or any similar field where the demands of structured documents require a sophisticated level of customization.

If you want to see our tool in action in a bespoke implementation, here’s an early peek:

https://www.youtube.com/watch?v=CTAad2E-9Y4&feature=youtu.be

(This link shows a dated version at this point. It shows the editor as it was around December of 2016. We’ve advanced quite a bit since then — in both the intrinsic capabilities of the editor and in the capabilities built into the bespoke customisation.)


Becoming Agile

Lately we’ve become quite Agile. More and more, our government customers have started to impose Agile methodologies on us. While I’ve always thought of our existing methodologies as being quite nimble, adopting Agile and Scrum methodologies has required some adaptation on my part.

Early in the game, I started to find Agile to be more of a hindrance than a help. The drumbeat of each sprint was wearing me out – and I started to feel the inevitable effects of burnout creeping into my every thought.

But then a remarkable thing happened. I found myself not only defending Agile, but advocating it for our other projects. I was quite surprised to find myself having become such a big supporter. So what changed?

Early on, Agile was new for all of us. Our team was new, geographically distributed in three different parts of the world, all 8 hours apart. That team consisted of representatives from a set of customers and several partners all learning to work together to build a challenging solution. We adopted the Scrum methodology and planned out a long series of two week sprints. Each sprint had a set of stories assigned to it as we set off to build the most awesome bill drafting system of all time.


The problem was that the pace was too aggressive. In a software development project, you need to manage two different aspects – making forward progress by adding features while ensuring a sound implementation through refinement. Agile methodologies lean away from lots of up-front design. This makes it possible to show lots of forward momentum early, but the trade-off is that the design will need to be refactored often as new requirements are uncovered and added to the picture. We were too focused on the forward momentum and were leaving a trail of unfinished “programming debt” in our wake. This debt was causing me increasing anxiety as time marched on.

There is an important concept in Agile Scrum called the retrospective. It’s all about continuous improvement of the process. As we’ve grown as a team, we’ve become better at implementing retrospectives. These led to the most important change we’ve made – moving from a two-week to a three-week sprint. We didn’t just add time to our sprints, we fundamentally changed the structure of a sprint. We still schedule two weeks’ worth of tasks for each sprint, but rather than just assuming that everything will work out perfectly, we leave a week open for integration, testing, and development slack to be taken up by any refactoring that may have become necessary.


This third week, while arguably slowing us down, ends up helping by allowing us to emerge from each sprint in far better development shape to begin the next sprint. We just have to be disciplined enough not to try to squeeze regular development tasks into that third week. By working down programming debt continuously, subsequent sprints become more predictable. For various reasons, we temporarily returned to two-week sprints and the problem of accumulating programming debt returned. The lesson learned is that you can’t build a complex system on top of a rickety foundation – you must continuously work to ensure a robust base upon which you are building. Without this balance, Agile just becomes a way to expedite a project at the expense of good development practices.

Another key change has been in how we use tools that help to do our work. As I mentioned earlier, our development teams are very distributed – around the world. It’s important that we be able to communicate very effectively despite the distance. Daily stand-ups with the entire team are not possible although we do ensure at least two meetings each sprint with the whole team. We use four primary tools – GitHub as our source code repository, AWS for our development and test servers, Slack for casual day-to-day conversation, and JIRA for managing the stories and tasks. It is the use of JIRA that has taken the most adaptation. Our original methodology was quite clumsy, but with each sprint we refine our usage to the point that it has become a very effective tool. Now, a dashboard presents me with a very clear picture of each sprint’s goals and everyone can monitor the progress towards those goals as the sprint progresses – there are no surprises.

Agile and Scrum are allowing a disparate group of customers and vendors to become a very highly performing software development team. We’re far from perfect, but with every sprint we learn more, make changes, and emerge as a better team than before.
