Archive Portal Europe

Thursday 22 December 2011

DLM Forum’s triennial conference 2011: Interoperability – Applied interoperability: Experiences with reusing records management metadata

Applied interoperability: Experiences with reusing records management metadata
Kuldar Aas, Deputy Chief, Digital Preservation Department, National Archives
of Estonia (Estonia)

Since the National Archives of Estonia also use the Tessella Safety Deposit Box, I am always eager to learn from their experiences, challenges and solutions.

I had the honour of giving a presentation last November at the Estonian Digital Deposit Event in Tallinn, so I was already somewhat acquainted with the Estonian approach.

Kuldar Aas started by stating that (manual) pre-ingest and ingest procedures are not sustainable when dealing with digital records. Human intervention can’t cope with the data volume. This means that processes need to be automated and that we should reuse records management metadata as much as possible for archival purposes.

Aas offered a twofold solution: 1) focus on the originating ERMS: define strict protocols for transfer and ingest, Submission Information Package (SIP) standards and quality requirements; 2) choose a centralized solution.
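The first part of that solution, strict SIP standards and quality requirements, lends itself well to automation. A minimal sketch of what an automated quality gate at pre-ingest could look like; the mandatory element set is invented for illustration, a real one would come from the agreed transfer protocol:

```python
# Sketch of an automated SIP quality gate at pre-ingest. The mandatory
# element set below is hypothetical; a real set would be defined by the
# transfer protocol agreed between agency and archive.

MANDATORY = {"title", "creator", "date_created", "function_class"}

def validate_sip(sip_metadata: dict) -> list:
    """Return a sorted list of mandatory elements missing from a SIP."""
    return sorted(MANDATORY - sip_metadata.keys())

# A SIP missing two mandatory elements is rejected with a clear report.
sip = {"title": "Case file 42", "creator": "Ministry of X"}
print(validate_sip(sip))  # ['date_created', 'function_class']
```

Such a check replaces manual inspection of each transfer, which is exactly the kind of human intervention Aas argues cannot scale.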

The centralized solution entails tooling for record creators during the pre-ingest and transfer process: the Universal Archiving Module (UAM).

Taken from the presentation by Kuldar Aas

A similar debate is taking place at the National Archives of the Netherlands: what should we automate and how do we go about it? As at the National Archives of Estonia, tools have been developed to aid both archivists and records creators in creating SIPs for ingest into the e-Depot, and the de facto SIP standard MeDuSa is promoted, but there are many issues this cannot solve.

Aas recognizes this by stating that most work will have to be done at the moment of creation. In his conclusions he stated that quality of description cannot be achieved if it is not already present in records management. He also stated that it might be wiser not to demand replacement of the current records management metadata, but to add something to it instead. Again, a similar position.

There were also some practical conclusions:

  • start small: mandatory metadata set in UAM is currently rather limited
  • many agencies and ERMS vendors do not like the idea of prescriptive metadata schemas

Furthermore, we should look ahead and at new ideas. The European Interoperability Framework provides new opportunities to develop solutions. There is also a real need for clever semantic mapping engines, since, by and large, semantic interoperability remains the larger issue in records and archival management, as was demonstrated by the CRKM Metadata Broker (Clever Recordkeeping Metadata project).
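At its core, such a semantic mapping engine rests on a crosswalk between element sets. A minimal sketch, with entirely hypothetical element names, of translating records management metadata into archival description elements:

```python
# Minimal sketch of a metadata crosswalk, the core of a semantic mapping
# engine. All element names are hypothetical, for illustration only.

# Crosswalk: records management element -> archival description element
CROSSWALK = {
    "rm:title": "archival:unittitle",
    "rm:dateCreated": "archival:unitdate",
    "rm:creatorName": "archival:origination",
    "rm:functionClass": "archival:classification",
}

def map_record(rm_metadata: dict) -> dict:
    """Translate an ERMS metadata record into archival elements.

    Elements without a known mapping are collected under an 'unmapped'
    key so that no information is silently lost.
    """
    mapped, unmapped = {}, {}
    for element, value in rm_metadata.items():
        if element in CROSSWALK:
            mapped[CROSSWALK[element]] = value
        else:
            unmapped[element] = value
    if unmapped:
        mapped["unmapped"] = unmapped
    return mapped

record = {"rm:title": "Building permit 1234",
          "rm:dateCreated": "2011-03-01",
          "rm:internalId": "X-99"}
print(map_record(record))
```

The hard part, of course, is not the lookup but agreeing on the crosswalk itself, which is precisely where semantic interoperability keeps breaking down.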

Aas expressed the hope that MoReq2010 extensions could explore this avenue of semantic interoperability.

DLM Forum’s triennial conference 2011: Interoperability – Metadata for interoperability: practical issues and advice

Metadata for interoperability: practical issues and advice
Marc Fresko, Director, Inforesight (UK)

Marc Fresko treated the audience to a view on metadata: why they are of use and which ones you need.

He also compared the different standards and guidelines on metadata and concluded that they couldn’t be mixed. He even doubted the value of some standards.

I tended to disagree on most of his comparisons, but he did have some valuable remarks on practical issues surrounding metadata:
  • Before you can develop your metadata elements, you need to understand the entities you want to describe;
    • start with an entity model
    • be prepared to let it evolve
  • One also needs
    • controlled vocabularies
    • governance regime
  • The metadata for one organization should
    • relate closely to specific software
    • resist the temptation to develop a metadata model without specific software in mind
  • Use a tabular representation
    • express the schema in XML
    • use software to relate the tables to the XML
  • Only include metadata you truly need
    • don’t include metadata just because you can
    • do include metadata that helps you manage
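Fresko's advice to keep a tabular representation, express the schema in XML, and use software to relate the two can be sketched concisely. The element definitions below are invented; this is only one possible way to derive an XML Schema fragment from a table:

```python
# Sketch: derive an XML Schema fragment from a tabular metadata
# definition, so the table stays the single source of truth.
# Element names and types are hypothetical.
import xml.etree.ElementTree as ET

ELEMENTS = [
    {"name": "title", "type": "xs:string", "required": True},
    {"name": "dateCreated", "type": "xs:date", "required": True},
    {"name": "retentionClass", "type": "xs:string", "required": False},
]

XS = "http://www.w3.org/2001/XMLSchema"

def table_to_schema(rows):
    """Render each table row as an xs:element declaration."""
    ET.register_namespace("xs", XS)
    schema = ET.Element(f"{{{XS}}}schema")
    for row in rows:
        ET.SubElement(schema, f"{{{XS}}}element", {
            "name": row["name"],
            "type": row["type"],
            "minOccurs": "1" if row["required"] else "0",
        })
    return ET.tostring(schema, encoding="unicode")

print(table_to_schema(ELEMENTS))
```

The point of the tabular form is that non-technical stakeholders can review and govern the table, while the XML is generated rather than hand-maintained.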

All very good and very practical points. 

What I really disagreed with, however, was the notion that all engineering should be done by the records managers. Nowhere were business owners or content owners mentioned. I tend to think that metadata is not only about records management, but also, and very much so, about use.

In any event: it was a session that made me think. 

DLM Forum’s triennial conference 2011: Interoperability – Promoting good governance through records management

Promoting good governance through records management

Mikko Eräkaski, Project Manager & Armi Helenius, Senior Research Officer,
National Archives of Finland (Finland)

Mikko Eräkaski and Armi Helenius gave an excellent insight into the Finnish approach towards records management and interoperability.

In Finland there is legislation in place that forces public bodies to consider interoperability. This legal statute states that:

  • Public sector bodies must each plan and specify their own Enterprise Architecture.
  • Business systems must meet interoperability requirements.
  • Public sector bodies are obliged to use common services and information systems.

To support the act, the National Archives have been working on the concept of the Lifecycle Management Plan (LCMP). LCMPs define management requirements for the whole life cycle of all records created or received in all business processes of an organization, along with the metadata required by SÄHKE2 (the national requirements for records management systems and processes).

The most interesting aspect of the LCMP approach, in my opinion, is the use of functional classification schemes as linking elements between the LCMPs and business systems. Function class is a mandatory element in SÄHKE2 and can be compared with the PIVOT approach in the Netherlands.

Taken from the presentation found on the DLM website 

The benefit of this approach is that, if you are able to implement such schemes, less effort needs to be put into cataloguing and describing records: you could even gain real-time information on transactions. The downside, however, is that it takes enormous effort to manage such schemes, and thus issues of scalability and flexibility arise as well.
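The linking role of the function class can be illustrated with a minimal sketch. The class codes and lifecycle rules below are invented; SÄHKE2 defines the real metadata requirements:

```python
# Sketch: a functional classification scheme as the linking element
# between an LCMP and a business system. Codes and rules are invented
# for illustration only.

LCMP_RULES = {
    # function class code -> lifecycle requirements
    "02.01": {"retention_years": 10, "public": True},
    "05.03": {"retention_years": 100, "public": False},
}

def lifecycle_for(record: dict) -> dict:
    """Resolve lifecycle rules for a record via its function class.

    Because the function class is mandatory on every record, the
    business system never needs to store retention rules itself.
    """
    return LCMP_RULES[record["function_class"]]

record = {"id": "A-1", "function_class": "05.03"}
print(lifecycle_for(record))  # {'retention_years': 100, 'public': False}
```

This is also where the management burden shows: every change to the scheme ripples through all records carrying the affected codes.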

More on the LCMP and SÄHKE2.

Tuesday 20 December 2011

DLM Forum’s triennial conference 2011: Interoperability - EAD and METS: a coherent set of metadata for digital data preservation

EAD and METS: a coherent set of metadata for digital data preservation
Yves Lardinois, Analyst, Archives Générales du Royaume (Belgium)

The session by Yves Lardinois was rather disappointing to me, since it did not touch on anything new or exciting. It was, however, a good introduction to both the EAD and METS standards.

He explained why metadata should be applied when dealing with digital records and that, in his opinion, there is no prevailing practical metadata standard that covers both technical and archival needs.

By using EAD and METS as complementary sets, a full metadata format becomes available: EAD provides the metadata for archival management of structure, relations and description, while METS provides the necessary technical metadata.

With this combination one could tackle the three dimensions of interoperability:

  • temporal: interpretation of information over time for instance by using standardized description methods
  • software: reuse of data between applications
  • data: reuse of content by for instance using unique identifiers
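The pairing Lardinois describes can be sketched concretely: METS carries the descriptive EAD fragment inside a dmdSec, while file-level technical information lives in METS' own sections. A minimal sketch using Python's standard library; the content values are invented:

```python
# Minimal sketch of the EAD-inside-METS pairing. Element content is
# invented; the namespaces are the published METS and EAD 2002 ones.
import xml.etree.ElementTree as ET

METS = "http://www.loc.gov/METS/"
EAD = "urn:isbn:1-931666-22-9"  # EAD 2002 namespace
ET.register_namespace("mets", METS)
ET.register_namespace("ead", EAD)

mets = ET.Element(f"{{{METS}}}mets")

# Descriptive (archival) metadata: an EAD fragment wrapped by METS.
dmd = ET.SubElement(mets, f"{{{METS}}}dmdSec", {"ID": "dmd1"})
wrap = ET.SubElement(dmd, f"{{{METS}}}mdWrap", {"MDTYPE": "EAD"})
xmldata = ET.SubElement(wrap, f"{{{METS}}}xmlData")
unittitle = ET.SubElement(xmldata, f"{{{EAD}}}unittitle")
unittitle.text = "Correspondence, 1890-1914"

# Technical metadata and the file inventory use METS' own sections.
filesec = ET.SubElement(mets, f"{{{METS}}}fileSec")
grp = ET.SubElement(filesec, f"{{{METS}}}fileGrp")
ET.SubElement(grp, f"{{{METS}}}file",
              {"ID": "f1", "MIMETYPE": "image/tiff"})

print(ET.tostring(mets, encoding="unicode"))
```

Each standard stays intact and validatable on its own, which is what makes the combination attractive for the temporal and software dimensions above.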

More information on the mentioned sets of metadata for digital data preservation:

And of course our very own set (not a standard): MeDuSa (NL)

Monday 19 December 2011

DLM Forum’s triennial conference 2011: day 2 - Making Intelligent Information Control a Reality in Europe

13 December 2011
Making Intelligent Information Control a Reality in Europe
Rory Staunton, Managing Director, Strategy Partners (UK)

Rory Staunton did not waste time on niceties, but bluntly announced that time was up for the traditional archivist. Old records management seemed outdated to him. We now live in a time of coincidence: a coincidence of forces such as recession, new technology (at new prices) and increasing compliance.

Traditional records management seemed to fight these forces, but according to Staunton we should be clever and use them instead. Secret records management: providing records management without people noticing you are doing it. A stark contrast to the popular image that we should bring records management to the (end) users.

Staunton set about how to use these forces.

Why does one need intelligent control? A question every archivist should be able to answer, in my opinion. But Staunton provided us with some answers drawn from his own experience. Intelligent information control is needed because of the ever-increasing and enduring scrutiny we need to work under. Transparency comes at a price. Next to scrutiny, there is operational safety, for which we also need intelligent control. Humans are humans and we make mistakes. Human dysfunction needs technical checks: it needs intelligent information control. But despite our own shortcomings, people distrust technical solutions. One will trust the paper files stored in the cabinet behind them (it's there, isn't it?), but will distrust digital files. Adding insult to injury: digital information management in some minds leads to too much transparency. We can record everything.

So a paradox seems to emerge: the compliance industry is growing, but the number of implemented records management systems is not. One feels one is losing control over content. Yet archives are still expected to serve compliance needs, while seeing few resources allocated to actually fulfil those needs.

But according to Staunton, successful compliance is not so much technical as cultural. One has to change the organization and the prevailing culture, fighting the three-way chasm: records management altruism, IT departments de facto run by IT vendors, and business run by stockholders.

Records management is done more for the benefit of others (you can certainly find your own records), so records management is hardly done at all. MoReq2010 is considered a game changer by Staunton and Strategy Partners for business managers, consumers and government. MoReq2010 does not ‘fit’ in IT, because it does not consider IT a strategy, but a list of technologies. It is about information compliance and as such is driven by business. Business considers costs, and substitution of technology is costly. MoReq2010 supplies interoperability, which renders substitution less costly. Combining cost reduction with records management, MoReq2010 takes a horizontal instead of a vertical approach to records management. So I think what Staunton is saying is that MoReq2010 frees records management of pure altruism.

IT departments are part of the problem since, due to outsourcing, they are de facto run by IT vendors. IT vendors by definition propagate technology solutions instead of human or customer solutions: the service level agreements you agree on do not concern control. You are very much not in control, since instead of service delivery they are about service performance. But this goes beyond new SLAs. It is about thinking about new organizational formats where legacy thinking is ousted and a duty of care is welcomed: it should be about delivery of results.

By using financial arguments one can push MoReq2010 and records management. In a recession, stockholders and business owners are eager for financial opportunities. The theme of substitution should not be about technology choices; it is a lifestyle choice. Substitution should have a Return On Investment (ROI): MoReq2010 can be used to gain compliance and enables an ROI when substituting technology. MoReq2010 saves money.

Recession and stockholders are never a good combination, but for records managers there are opportunities.

More information on Strategy Partners and MoReq2010.

Wednesday 14 December 2011

DLM Forum’s triennial conference 2011: day 1 - Moreq2010

12 December 2011
Apples and oranges: An in‐depth comparison of MoReq2010 to other
international records management standards
Jon Garde, Author of MoReq2010 & Director of Journal IT (UK) 

Although some might frown upon the fact that conference attendees received a paper version of the MoReq2010 standard, Jon Garde revealed that this edition was in fact version 1.1 of the standard. A version nowhere to be found online, not even on the MoReq website.

Firstly he touched on some specific tools that accompanied the new standard:
  • The MoReq2010 export XML schema that describes how entities are exported, in what order, and the structure of the resulting XML data
  • The MoReq2010 Test Framework for Accreditation, Certification & Testing 
This, however, would not be the end: in the (near) future, extension modules and translations will be developed.

After these announcements he set out to compare the MoReq2010 standard to other records management standards. This comparison should eventually lead to a contrast document.

Important differences according to Jon Garde are:
  • Requirements
    • Coverage: incomplete coverage of the standard is addressed by allowing extensions
    • Cardinality: the level of specification per requirement
  • Process: MoReq2010 relates to parts of the records management process, thus recognizing that some work is already done.
  • Vocabulary & concepts: by using an extended glossary, MoReq2010 can be made interoperable. By contrast, ISO standards such as 23081 and 15489 are not interchangeable on this point.
Garde then made a distinction between so-called ‘hard’ and ‘soft’ standards. Hard standards have a testing regime and are interoperable via technical and semantic requirements: if you do not follow the rules exactly (comply), it does not work. MoReq2010 is such a standard. Soft standards do not have a testing regime, and compliance is a value.

I do not know about the hardness of MoReq2010 and am tempted to say that I would first and foremost like to see that some (instead of no) rules are applied. We’ll work from there.

His presentation can be downloaded from the DLM website.

DLM Forum’s triennial conference 2011: day 1

12 December 2011
Setting the Scene: DLM Forum 2011 in Context
  •  Professor Julie McLeod, Head of Research, Northumbria University and DLM

Julie McLeod described the context of this DLM Conference by presenting the audience with various views on the meaning of ‘intelligent information control’, it being the subtitle of the DLM Forum Conference on Interoperability & MoReq2010.

She did so by asking within Northumbria University what ‘intelligent information control’ should mean. Replies were plentiful and colourful: control, according to one respondent, should actually mean access, thus touching on the delicate issue of privacy when talking about interoperability. To another, intelligent information control should be about providing for individual needs: information when I want it and however I want it. Someone else felt that it is about efficiency and cost reduction, while a different party thought it was a meaningless set of words. Intelligent information control could thus be considered meaningless, but at the same time be defined as a utopian concept. This definitely set the scene for the conference.

After her introduction, McLeod talked about two UK initiatives that involve intelligent information control. First she described the NHS attempt to build a top-down cooperation portal. This initiative, which in many ways seems similar to the Dutch Electronic Patient Dossier, was deemed too complex and is now being dismantled into modular components. That sounded familiar as well.

Secondly, she discussed the JISC digital infrastructure for higher education, aimed at sharing information between researchers. Here again I found that similar projects exist in the Netherlands, such as DANS and the NCDD.

As it turns out, projects that aim to achieve interoperability only do so within a certain context. I guess this is only natural, but widespread implementation of the concepts of MoReq2010 could change this, or at least facilitate such transnational interoperability.

Her presentation can be downloaded from the DLM website.

DLM Forum’s triennial conference 2011

Via this blog I would like to share my experiences during the DLM Forum’s triennial conference 2011 in Brussels.

Depending on my notes, the posts will be longer or shorter, businesslike or informal. I just hope that anybody who reads this gets some valuable information out of it. In any event: it's my first entry in English and an easy and fun way to write a report on the DLM Conference.

If you are just interested in the presentations I very much recommend the DLM Forum website.

First off, we start with the keynotes that kicked off the conference.

12 December 2011
Keynote 1: DIGIT
  • Francisco García Morán, Director General of the European Commission’s Informatics Directorate-General (DG DIGIT)

The mission of DIGIT is ‘to enable the Commission to make effective and efficient use of Information and Communication Technologies in order to achieve its organisational and political objectives’, according to the website.

García Morán looked at the future of Europe and identified ICT as having potential for growth: by using ICT, smart growth is possible. He also saw some barriers:

  •  interoperability
  • trust in digital services
  • fragmented digital markets
  • insufficient R&D
  • the slow implementation of e-government
DIGIT therefore aims to adopt a European interoperability strategy and framework. Implementation of this framework encompasses notions such as semantic interoperability, cataloguing of services, architecture and legislation.

DIGIT is working on an ERMS under the name of eDomec and Hermes Ares Nomcom.

The presentation can be downloaded from the DLM website.

Keynote 2: FEDICT
  • Jan Deprest, Chairman, Federal Agency for ICT Belgium (FEDICT)

FEDICT aims to achieve semantic and technical interoperability within the Belgian federation. It develops and implements standards to achieve this goal.