Extensible Markup Language was once heralded as the lingua franca of e-government. More than six years down the road, however, it is still more of a regional dialect.
It was with some pride that in 2002 the Australian Prudential Regulation Authority (APRA) announced that it would be among the first such bodies in the world to support eXtensible Business Reporting Language (XBRL). As the government body responsible for supervision of Australia's financial services and insurance industries, APRA believed the standard's support for consistent financial reporting structures would make it extremely popular among the 10,000 institutions required to report regularly to APRA.
Built around the eXtensible Markup Language (XML) standard, XBRL seemed like a dead easy business case. The other alternative, after all, was to stick with what the companies were already doing: manually pulling many types of data from spreadsheets, enterprise systems and even manual ledgers, then collating the information and dutifully reporting to APRA. By configuring their systems to output XBRL documents and send them straight to APRA, these organizations could save considerable amounts of time, money and effort.
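The business case rested on tagging each reported figure with machine-readable meaning rather than leaving it buried in a spreadsheet cell. A minimal sketch of the idea, built with Python's standard library, is below; the element names and attributes are invented for illustration and are not drawn from any real APRA or XBRL taxonomy.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified XBRL-style instance document. Element and
# attribute names here are illustrative only, not a real taxonomy.
report = ET.Element("report", {"entity": "Example Bank Ltd",
                               "period": "2004-Q2"})

# Each reported figure becomes a tagged "fact" carrying its own
# context (unit, precision) instead of an anonymous spreadsheet cell.
fact = ET.SubElement(report, "totalAssets", {"unit": "AUD", "decimals": "0"})
fact.text = "1250000000"

xml_bytes = ET.tostring(report, encoding="unicode")
print(xml_bytes)
```

A reporting institution could generate such a document straight from its back-end systems and submit it electronically, which is precisely the manual collation step APRA hoped to eliminate.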
Two years later, though, most organizations still prefer the old-fashioned way. Other priorities have taken APRA's attention, pushing XBRL - and the related @APRA and Direct2APRA interfaces it built to support the expected rush of XBRL reports - into the background. Although a few large banks have invested heavily in XBRL, they represent just a drop in the bucket compared with APRA's overall reporting load.
"It's fair to say the take-up hasn't been as enthusiastic as we had perhaps hoped," says Colin Cayless, CIO at APRA. "We've done a fair bit of development on this, and were hoping that the marketplace and consulting firms would drive this harder. Using XBRL tags, they can link their existing systems directly into our application. But if you say 'XBRL' to people in most environments, people don't know what it is - and a lot of the institutions' underlying applications aren't easily amenable to applying XBRL."
APRA has not by any means given up on XBRL, but Cayless says the challenge in front of it now is not a technical one but an educational one - something reported by similar organizations overseas. A form redesign slated for next year may smooth the XBRL reporting process considerably, paving the way for a renewed educational effort focused on promoting the standard's benefits to stakeholders.
Spoken by Chance, Not Design
Improving communications between departments was one of the main reasons why governments around the world began showing early interest in XML, which emerged in 1998 as an official World Wide Web Consortium standard that built on the runaway success of its simpler cousin, HyperText Markup Language (HTML).
Integration into all manner of products came quickly, and XML's ability to add structure and meaning to data made it a philosophical companion to efforts to open up government data storage formats. Equally popular was XML's enablement of industry- or function-specific markup schemas - XBRL is one, as are Electronic Business XML (ebXML), Financial Products Markup Language (FpML), Mathematical Markup Language (MathML) and myriad others with specific purposes.
Yet XML's flexibility may have muddied its message: even the official World Wide Web Consortium family of XML-based standards has grown, now including XPath, XPointer, XSLT and a handful of others. With so many acronyms floating around - and specific expertise needed to take advantage of each one - it is no wonder that many organizations that might otherwise be interested in XML's benefits have put it into the too-hard basket.
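Of the companion standards just named, XPath is the most approachable: it is a query syntax for pulling specific nodes out of a document. A small sketch using Python's standard library, which supports a useful subset of XPath, gives the flavour; the toy document and field names are invented for illustration.

```python
import xml.etree.ElementTree as ET

# A toy document loosely modelled on a regulatory return. ElementTree's
# find()/findall() accept a limited XPath subset, enough for this demo.
doc = ET.fromstring("""
<agency>
  <form id="D2A-330"><field name="capital">42</field></form>
  <form id="D2A-210"><field name="liabilities">17</field></form>
</agency>""")

# Select the text of every <field> whose name attribute is "capital",
# anywhere in the document.
values = [f.text for f in doc.findall(".//field[@name='capital']")]
print(values)  # -> ['42']
```

Each sibling standard (XSLT for transformation, XPointer for addressing fragments) carries its own learning curve on top of this, which is part of the acronym overload the article describes.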
Little wonder. Research firm Gartner's latest Hype Cycle for XML technologies, published in 2004, is speckled with XML derivatives still climbing the "peak of inflated expectations". Consistent with APRA's experience, XBRL is currently sitting in Gartner's "trough of disillusionment", with productive use expected in the two- to five-year time frame - and it is among the most mature of the lot, trailing ebXML just slightly.
Gartner's Hype Cycle suggests that productive use of most XML standards still remains several years off; the firm noted that strong interest but slow action have also hindered XML's take-up in the insurance, financial services and other industries. It has also forced vendors to reconsider their plans or even their entire businesses, as in the case of the XML database market, which Gartner dismissed as "not compelling enough to fuel a market".
Anecdotal evidence confirms this, suggesting that most government organizations are still in the early phases of XML planning - except where the technology has arrived incidentally, through the introduction of a content management system (CMS) or as a data interchange format accompanying J2EE or .NET application development. XML is supported by newer versions of Microsoft Office and other commonly found applications, although Gartner has warned that this support will be of little use to most users.
Just because workers can create XML documents does not mean they will. XML is an enabler, not a destination, and will offer no real benefit without appropriate policy and framework formulation. Gartner has weighed in heavily in this area. Worse still for the technology's prospects, revelations that vendor positioning is diluting XML's value proposition - and that of Web services more broadly - have serious implications for its future as the lingua franca of the online era. "Vendor politics and user shortcuts could delay the adoption of new standards and destroy some of the strong qualities of Web services," Gartner warned.
Vendor positioning turned out to be a big problem for one state government department, which began looking for ways to leverage XML after investing in an XML-based content management system several years ago. Despite expectations that its XML-based content would be open and portable, however, the department found out that the system had used a proprietary XML schema to model the data, and the vendor would not convert the data unless a fee was paid.
Such off-putting experiences do little to strengthen faith in current XML tools. "We can see the benefits of [XML], but not so much in content management," a spokesperson for the government department offered. "We're not against it, but we just haven't seen significant benefits at this stage to make it something we would want to mandate."
The department is now "kicking around" the idea of a complete XML-based enterprise repository, which would provide a centralized content store to support a broader range of more meaningful and productive applications.
Similar stories abound in government departments that rushed into XML for XML's sake but failed to take an expansive enough view of the technology to make it suit business requirements. This presents philosophical problems for government organizations that were quick to show enthusiasm for the promise of open data exchange across government.
It certainly seemed like a good idea at the time, a point highlighted by NOIE's (now the Australian Government Information Management Office's [AGIMO's]) creation of an XML Clearinghouse that was intended to promote sharing of XML schemas across government. Now subsumed into AGIMO's Integrated Transactions Reference Group (ITRG), the project has become part of the broader Interoperability Technical Framework for the Australian government, which once again lays down the use of XML and a variety of Web services standards for inter-agency interoperability.
Big Dreams, Small Wins
In other words, the government still feels XML is a good thing.
Just how or whether it might be used in practice, however, is up to each individual agency. One application that demonstrates both the viability and unifying power of XML has been live for several years at the Department of Immigration, Multicultural and Indigenous Affairs (DIMIA), which has been partnering with the Department of Education, Science and Training (DEST) to tighten controls over the growing number of foreign students entering Australia to study.
Education has become a major export earner for Australia in recent years, pumping upwards of $4 billion into the country's coffers annually and growing steadily: nearly 200,000 overseas students are reportedly studying here at any given time.
Before each of those students can enter the country to begin studying, they need to register with appropriate educational authorities and complete the paperwork necessary for a student visa into Australia. This is a largely routine task that was, in the past, hobbled by large amounts of paperwork and a need for applicants to physically attend one of a small number of offices to lodge applications.
Once issued, however, the visas were difficult to enforce. Lack of structured communication between departments made it hard to match DEST information about students' course attendance with DIMIA details about their immigration status. Many visitors disappeared off the radar screen altogether, overstaying their visas or not bothering to attend the classes that were the ostensible basis for entry into Australia.
XML proved to be the perfect solution to this problem. In 2001, DIMIA launched eVisa, an online service that let overseas students lodge Australian visa applications online. Information is filled into a Web form, with responses and other relevant details bundled into an XML document that is forwarded, via an IBM MQSeries message queue, to DIMIA's back-end Integrated Client Service Environment (ICSE) application for processing and eventual storage in its mainframe-based DB2 data store.
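The form-to-queue step described above can be sketched in a few lines of Python. The field and element names below are invented for illustration - they are not DIMIA's actual eVisa schema - but the shape of the flow is the same: collect form responses, bundle them into one self-describing XML document, and hand that document to the messaging layer.

```python
import xml.etree.ElementTree as ET

# Sketch of the eVisa-style flow: web-form responses bundled into an
# XML payload ready to drop onto a message queue. All names here are
# hypothetical, invented for illustration only.
def form_to_xml(responses: dict) -> str:
    application = ET.Element("visaApplication")
    for field, value in responses.items():
        ET.SubElement(application, field).text = str(value)
    return ET.tostring(application, encoding="unicode")

payload = form_to_xml({"familyName": "Chen",
                       "passportNumber": "E1234567",
                       "courseCode": "IT-501"})
print(payload)
```

The payload string is what would be placed on the queue; because it is self-describing, the back-end application can validate and route it without knowing anything about the Web form that produced it.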
Storing student data within a structured XML record allowed DIMIA to easily exchange that information with other agencies that needed it. An in-house application called Confirmation of Enrolment (CoE) regularly shares information on student visas and movements with DEST, which checks the details against its own PRISMS database of student enrolments and alerts DIMIA to any inconsistencies or changes.
With the more recent provision of features allowing overseas student sponsors to lodge visa applications on students' behalf, eVisa now handles several hundred thousand applications each year; the system has far exceeded the department's ability to provide similar service levels using manual processes.
Building an XML culture within DIMIA has required both technical and management accommodations: for example, an XML Working Group provides a centre of gravity for XML specialists and related business stakeholders.
Another ongoing issue lies in the disparities between DIMIA's XML schemas, which were built to model the ICSE environment rather than around a vocabulary meaningful to DIMIA's business. But these types of issues are gradually being worked out as the system is improved and XML is introduced into other areas of the department.
XML is a strategic direction for DIMIA over the long term, says deputy CIO Chris Gill. "We've still got a lot of legacy information that uses comma-delimited or set structure data exchanges, but over time we will move towards XML as we enhance our systems," he says. "There is a recognition [within the department] that this is clearly the direction we are going in."
Better Is Bigger
Although XML as defined works effectively, its implementation can be another thing entirely. Size, specifically, can become a major problem as applications struggle to cope with the increasing volumes of information being stored inside XML documents.
DIMIA found that the often unwieldy size of the large XML documents it was creating presented problems for its mainframes, and it is not the only department to have run into issues. Size was a major issue for the Distributed Systems Technology Centre (DSTC), which recently flipped the switch on the fourth Australian pilot site in the nationwide HealthConnect effort.
Designed to provide a single health-care record that can be exchanged between relevant health-care authorities as necessary, DSTC's solution is an XML-based implementation of the European OpenEHR standard that has been developed as a complement to the incumbent HL7 messaging format.
The current pilot, due to run into mid 2005, will evaluate the ease with which a diabetic patient's electronic health record can flow between various health-care providers in both public and private sectors. Success could breed rapid growth: having attracted a $2.9 million investment from the Commonwealth Department of Health and Ageing, in partnership with the General Practice Computing Group and Queensland Health, HealthConnect could potentially be rolled out nationwide across a variety of environments with a huge number of participants.
Record size became an issue early in DSTC's development project, since an appropriate XML schema needed to accommodate a mind-boggling array of different health-care information. Such large documents would, it was well understood, significantly slow down XML document processing and affect the system's long-term operation.
DSTC's solution was to split the XML schema into a two-tiered structure: a generic record management layer provides a medico-legally legitimate way to share records, while a second "archetype" layer defines validation rules or constraints for specific instantiations of information (for example, what types of information might be contained in an MRI report).
"Structuring the health record content was not easy at all," says Andrew Goodchild, senior research scientist with DSTC. "In the clinical domain you have medico-legal issues around record keeping, and you have continuously changing clinical requirements. You have to move away from designing specific schemas that try to do everything and become so big and unmanageable that you'll never be able to implement them. There are lots of little technical challenges in there that you never quite expect."
Many of those challenges revolve around the ever-present issue of business versus technology priorities. Although XML was clearly going to underlie the OpenEHR records, DSTC designed the models first, then shaped the XML to reflect its functional priorities. "Designing straight to XML isn't the way of the future," Goodchild adds.
Spreading the XML Motivation
DSTC's experience in building the HealthConnect infrastructure confirms the unspoken truth about XML: it is one thing to get tools that enable creation of XML documents, but another thing altogether to get users to utilize those tools in meaningful and productive ways. If single, relatively contained projects like HealthConnect and DIMIA's eVisa present challenges, departments looking for broader ways to utilize XML have an even more significant effort on their hands - especially in convincing other stakeholders to play along.
Winning buy-in from customers will be a key goal of government intellectual property management organization IP Australia, which recently kicked off a pilot of an XML-based patent lodgment system designed to wean hundreds of patent attorneys off paper-based applications.
Complex patent applications can consume reams of paper, with each page currently scanned and entered into the system using optical character recognition. Since most of those applications already exist on the attorneys' systems in electronic form, the paper is an unnecessary step that IP Australia hopes can be eliminated using an ebXML-based system that digitally signs the bundle on the client side, then transmits it to IP Australia for processing.
Reflecting the flexibility of ebXML, the records will travel widely: security outsourcer CyberTrust acts as an intermediary to verify the digital signatures on IP Australia's behalf, then forwards confirmed documents to IP Australia. A link to the World Intellectual Property Organization - which is mandating XML as the standard of choice for processing worldwide patent applications - will also allow seamless forwarding of applications seeking protection in other countries.
Strong past support for online trademark lodgment - 54 percent of trademark applications are now lodged online - suggests attorneys will warmly receive the new system. However, Paul Ayers, director of architecture and acting CIO with IP Australia, concedes the organization needs to look well beyond the technology to make the new system work.
"We've got to put this facility into place, but we won't have a really good handle on its uptake until we put it into production," he says. "We'll work with customers to embrace it; the biggest carrot for them is the reduction in costs. But even if their back-end system is in electronic form, they'll need to do some sort of transformation into an agreed XML schema."
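The transformation Ayers describes - mapping an attorney firm's existing electronic records onto an agreed schema - is also the step DIMIA's Gill mentioned for comma-delimited legacy data. A minimal sketch in Python shows the shape of it; the column names and target XML schema below are invented for illustration, not IP Australia's actual lodgment format.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hedged sketch of the "transformation" step: a legacy comma-delimited
# export is mapped onto an agreed (here invented) XML schema before
# electronic lodgment. Column and element names are hypothetical.
LEGACY = "app_no,title\n2004-0917,Widget hinge\n2004-0918,Sprocket guard\n"

def csv_to_xml(text: str) -> str:
    root = ET.Element("patentApplications")
    for row in csv.DictReader(io.StringIO(text)):
        app = ET.SubElement(root, "application", {"number": row["app_no"]})
        ET.SubElement(app, "title").text = row["title"]
    return ET.tostring(root, encoding="unicode")

print(csv_to_xml(LEGACY))
```

In practice the mapping is rarely this mechanical - field names, units and code lists must all be reconciled against the agreed schema - which is exactly why Ayers expects customers to need "some sort of transformation" work of their own.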
Until the distant day when XML becomes a standard file format within government organizations, transformation from legacy formats will be essential to XML's uptake. Even for organizations that don't have XML in place yet, developing a clear migration strategy is an important obligation for any CIO - if only because Australia's official archivists want it to happen.
Perhaps the most enthusiastic adopter of XML within the government, the National Archives of Australia (NAA) will be increasingly encouraging departments to generate native XML data to facilitate ready archiving and retrieval of information it stores on their behalf. Although the NAA has many ways of handling non-XML data, Stephen Ellis, assistant director-general for digital government with NAA, believes XML-based archiving could be mandated for government departments in the not too distant future.
"We would hope to move towards having everything natively stored in a standardized format," he says. "Government agencies are enormously complex, have to be responsive to changes in policy and have a high level of accountability, and have a very real business interest in ensuring that the records they've created are reliable and can be readily retrieved. That's the message we've got to sell to government agencies; if they do all that right, good things will come out of that. XML is an enabling technology that is going to see us through the next decades."