Archive Portal Europe

Saturday, 9 February 2013

The young researcher on appraisal and selection

It promises to be quite a debate at the upcoming KVAN days. Hopefully it will not only be about retention and destruction, but above all about appraisal and selection. After all, you can always throw things away later.

I have written down some thoughts on the subject here.

Monday, 16 January 2012

DLM Forum’s triennial conference 2011: Re-Using Data: Lascaux 2.0: Retaining meaning & purpose for records in collaborative spaces

13 december

Lascaux 2.0: Retaining meaning & purpose for records in collaborative spaces
‐ Tim Callister, Information Management Consultant, UK National Archives (UK)


Tim Callister invited us to join him in an exploration of collaboration spaces. He employed the perspective of the storyteller by taking us back to the Lascaux cave paintings. Since the dawn of time people have wanted to share information. The trouble, however, is that it is hard to interpret what people want to share: across space and across time.

Collaborative, dare we say 2.0, technology has provided us with new ways to share, but has left us with the same challenge: how do we understand what people share?

In the UK, Callister explained, there are many applications in which sharing takes place: knowledge management, information exchange, transparency and data re-use are performed using SharePoint, Huddle, Civil Pages, LinkedIn and Yammer. Proper interpretation of the data shared through these applications requires knowledge of the original use of the data. We need to know about the context.


He touched on a subject that has already been experienced in the Netherlands: the need to capture implicit metadata. Much of the context is only available implicitly: much of the information on the who, the what and the why is stuck in people's heads. This means that digitization also calls for formalization. In order to answer the questions needed to perform preservation and interpretation over time, metadata should be stored explicitly with the records.
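To make the point concrete, here is a minimal sketch of what storing that implicit context explicitly with a record could look like. The field names (creator, business activity, purpose) are invented for illustration and are not taken from any standard mentioned in the talk.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: capture the "who", "what" and "why" at creation time,
# instead of leaving them in people's heads.
@dataclass
class RecordContext:
    creator: str            # the "who"
    business_activity: str  # the "what"
    purpose: str            # the "why"
    captured_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

@dataclass
class Record:
    content: bytes
    context: RecordContext  # the context travels with the record itself

doc = Record(
    content=b"Draft policy memo ...",
    context=RecordContext(
        creator="j.smith",
        business_activity="policy drafting",
        purpose="internal consultation",
    ),
)
print(doc.context.creator)  # → j.smith
```

The point is not the particular structure but the formalization: each question a future user might ask of the record has an explicit, machine-readable answer attached at the moment of capture.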

The solutions remain the same, according to Callister: one should have good retention schedules, proper FOI/DPA management, access management and knowledge management. If you can't manage these, re-use of data seems impossible.

The UK solution is found in the centralization of certain tasks and responsibilities within Directgov.
Directgov as an organisation does two different things: it provides access to online transactional services and it publishes government information for citizens in one place.

The National Archives (TNA) then provides services for the management of information. It also participates in government ICT strategy. Through these channels the standardization and linkage of information is promoted and executed. Furthermore, TNA manages the Government Web Archive, and in the Secure Web Archiving Project researchers look at archiving intranets and web-based platforms.

Initiatives well worth keeping track of.

His presentation can also be found online.

Thursday, 5 January 2012

DLM Forum’s triennial conference 2011: Keynote: Preserving Communications for the Future / Common Features of Messages for Interoperability and Long Term Preservation

Keynote: Preserving Communications for the Future / Common Features of Messages for Interoperability and Long Term Preservation

Richard Jeffrey‐Cook, Head of Information & Records Management,
In‐Form Consult (UK)

Jeffrey-Cook focused on the challenge of the continuity of information. For him this encompassed all forms of communication transactions: from text messages to social media. Not an uncommon perception for archivists. The challenge can be met by finding common ground between communications. Jeffrey-Cook argues that MoReq2010 provides such common ground.


The world of communication and technology changes at an ever increasing speed. The use, purpose and impact of communication change, as do the instruments we use for communication. A case also made by Angelika Menne-Haritz.

All these instruments use different technology, have different functionality, employ different standards, but all potentially produce records.

Jeffrey-Cook points out that we need to search for common characteristics between all these forms of communication and identify which characteristics are important for interpreting and preserving records. In many ways this confirms the ideas formulated in the InterPARES projects. He also stressed the importance of maintaining the threads between communications. Proper interpretation is derived from the relationships between communications.

The biggest challenge will come from communications where little is standardized. Interaction on social media has little or no standardization. The maintenance of threads will be all-important. We will also face problems when dealing with interactive communications that are not intended to produce records. I think the first case study has already arrived.

Jeffrey-Cook concluded with the following remarks:


  • Distinguish the message and its metadata from the technology
  • Identify the characteristics common to all communications
  • Separate the client application - the user interface into an extension module
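Jeffrey-Cook's first two remarks, together with his earlier point about threads, can be sketched with a toy data model. Here the message and its metadata are stored independently of the client technology, all channels share a common set of characteristics, and an `in_reply_to` link keeps the thread intact; all names are hypothetical.

```python
# Messages from different channels (email, Yammer, ...) reduced to the same
# common characteristics, with the thread preserved via in_reply_to links.
messages = [
    {"id": "m1", "channel": "email",  "sender": "alice", "sent": "2011-12-13T09:00",
     "in_reply_to": None, "body": "Can you review the draft?"},
    {"id": "m2", "channel": "yammer", "sender": "bob",   "sent": "2011-12-13T09:30",
     "in_reply_to": "m1", "body": "Done - two comments inline."},
]

def thread_of(msg_id, msgs):
    """Walk the in_reply_to links back to the start of the conversation."""
    by_id = {m["id"]: m for m in msgs}
    chain = []
    current = by_id.get(msg_id)
    while current:
        chain.append(current["id"])
        current = by_id.get(current["in_reply_to"])
    return list(reversed(chain))

print(thread_of("m2", messages))  # → ['m1', 'm2']
```

Note that nothing here depends on the originating client application: preservation works on the common characteristics, which is exactly the separation Jeffrey-Cook argues for.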

His presentation can be found here.





Thursday, 22 December 2011

DLM Forum’s triennial conference 2011: Interoperability – Applied interoperability: Experiences with reusing records management metadata


Applied interoperability: Experiences with reusing records management
metadata
Kuldar Aas, Deputy Chief, Digital Preservation Department, National Archives
of Estonia (Estonia)

Since the National Archives of Estonia also use the Tessella Safety Deposit Box, I am always eager to learn from their experiences, challenges and solutions.

I had the honor of giving a presentation last November at the Estonian Digital Deposit Event in Tallinn, so I was already a bit acquainted with the Estonian approach.

Kuldar Aas started by stating that (manual) pre-ingest and ingest procedures are not sustainable when dealing with digital records. Human intervention can’t cope with the data volume. This means that processes need to be automated and that we should reuse records management metadata as much as possible for archival purposes.

Aas offered a twofold solution: 1) focus on the originating ERMS: define strict protocols for transfer and ingest, Submission Information Package (SIP) standards and quality requirements; 2) choose a centralized solution.

The centralized solution entails tooling for record creators during the pre-ingest and transfer process: the Universal Archiving Module (UAM).

Taken from the presentation by Kuldar Aas
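As a rough illustration of the kind of pre-ingest check such tooling might perform, here is a minimal sketch of validating a SIP's metadata against a mandatory set. The field names are invented for illustration and are not taken from the Estonian SIP standard.

```python
# Hypothetical mandatory metadata set for a Submission Information Package.
MANDATORY = {"title", "creator", "date", "function_class"}

def validate_sip_metadata(metadata: dict) -> list:
    """Return the mandatory fields missing from a submission's metadata."""
    return sorted(MANDATORY - metadata.keys())

sip = {"title": "Building permits 2010", "creator": "City of Tartu",
       "date": "2010", "function_class": "07.01"}
print(validate_sip_metadata(sip))             # → []
print(validate_sip_metadata({"title": "x"}))  # → ['creator', 'date', 'function_class']
```

Automating even this simple check at the record creator's side removes one piece of manual intervention from ingest, which is exactly the sustainability argument Aas makes.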



A similar debate is taking place at the National Archives of the Netherlands: what should we automate and how do we go about doing it? As in Estonia, tools have been developed to aid both archivist and records creator in creating SIPs for ingest into the e-Depot, and the de facto SIP standard MeDusA is propagated, but there are many issues that cannot be solved by this.

Aas recognizes this by stating that most of the work will have to be done at the moment of creation. In his conclusions he stated that quality of description cannot be achieved if it is not already present in records management. He also stated that it might be wiser not to demand replacement of the current records management metadata, but to add something to it instead. Again a similar position.

There were also some practical conclusions:

  • start small: mandatory metadata set in UAM is currently rather limited
  • many agencies and ERMS vendors do not like the idea of prescriptive metadata schemas


Furthermore, we should look ahead and at new ideas. The European Interoperability Framework provides new opportunities to develop solutions. There is also a real need for clever semantic mapping engines, since, by and large, semantic interoperability remains the larger issue in records and archival management, as was shown by the CRKM Metadata Broker (Clever Recordkeeping Metadata project).
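The core of such a semantic mapping engine is a crosswalk between schemas. The toy version below translates invented records-management field names into invented archival-description terms, and keeps anything it cannot map so nothing is silently lost; a real engine would of course need far more than a lookup table.

```python
# Hypothetical crosswalk: ERMS field names on the left, archival description
# fields on the right. Both vocabularies are invented for illustration.
CROSSWALK = {
    "doc_title": "title",
    "author": "creator",
    "created": "date",
    "dept": "provenance",
}

def map_metadata(erms_record: dict) -> dict:
    """Translate an ERMS metadata record into archival-description terms,
    separating out the fields the crosswalk does not cover."""
    mapped, unmapped = {}, {}
    for key, value in erms_record.items():
        if key in CROSSWALK:
            mapped[CROSSWALK[key]] = value
        else:
            unmapped[key] = value
    return {"mapped": mapped, "unmapped": unmapped}

result = map_metadata({"doc_title": "Annual report", "author": "Finance", "ref": "X-12"})
print(result["mapped"])    # → {'title': 'Annual report', 'creator': 'Finance'}
print(result["unmapped"])  # → {'ref': 'X-12'}
```

The hard part, which the CRKM project ran into, is not the mechanical mapping but deciding when two fields really mean the same thing.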

Aas expressed the hope that MoReq2010 extensions could explore this avenue of semantic interoperability.

DLM Forum’s triennial conference 2011: Interoperability – Metadata for interoperability: practical issues and advice


Metadata for interoperability: practical issues and advice
Marc Fresko, Director, Inforesight (UK)

Marc Fresko treated the audience to a view on metadata: why they are of use and which ones you need.

He also compared the different standards and guidelines on metadata and concluded that they can't be mixed. He even doubted the value of some standards.

I tended to disagree on most of his comparisons, but he did have some valuable remarks on practical issues surrounding metadata:
  • Before you can develop your metadata elements, you need to understand the entities you want to describe;
    • start with an entity model
    • be prepared to let it evolve
  • One also needs
    • controlled vocabularies
    • governance regime
  • The metadata for one organization should
    • relate closely to specific software
    • resist the temptation to develop a metadata model without specific software in mind
  • Use a tabular representation
    • express the schema in XML
    • use software to relate the tables to the XML
  • Only include metadata you truly need
    • don’t include metadata just because you can
    • do include metadata that helps you manage

All very good and very practical points. 

What I really disagreed with, however, was the notion that all engineering should be done by records managers. Business owners and content owners were nowhere mentioned. I tend to think that metadata is not only about records management, but also, and very much so, about use.

In any event: it was a session that made me think. 

DLM Forum’s triennial conference 2011: Interoperability – Promoting good governance through records management


Promoting good governance through records management

Mikko Eräkaski, Project Manager & Armi Helenius, Senior Research Officer,
National Archives of Finland (Finland)

Mikko Eräkaski and Armi Helenius gave an excellent insight into the Finnish approach towards records management and interoperability.

In Finland there is legislation in place that forces public bodies to consider interoperability. This legal statute states that:

  • Public sector bodies must each plan and specify their own Enterprise Architecture.
  • Business systems must meet interoperability requirements.
  • Public sector bodies are obliged to use common services and information systems.



To support the act, the National Archives have been working on the concept of the Lifecycle Management Plan (LCMP). LCMPs define management requirements for the whole life-cycle of all records created or received in all business processes of an organization, along with the metadata required by SÄHKE2 (the national requirements for records management systems and processes).

The most interesting aspect of the LCMP approach, in my opinion, is the use of functional classification schemas as linking elements between the LCMPs and business systems. Function class is a mandatory element in SÄHKE2 and can be compared with the PIVOT approach in the Netherlands.

Taken from the presentation found on the DLM website 

The benefit of this approach is that if you are able to implement such schemas, less effort needs to be put into cataloging and describing records: you could even gain real-time information on transactions. The downside, however, is that it takes enormous effort to manage such schemas, and thus issues of scalability and flexibility arise as well.
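The gain becomes visible in a small sketch: once a record carries a function class code, its life-cycle rules can be looked up from the plan instead of being catalogued per record. The codes and retention periods below are invented for illustration, not taken from SÄHKE2.

```python
# Hypothetical life-cycle rules keyed by function class code.
LIFECYCLE_PLAN = {
    "07.01": {"function": "building permits", "retention_years": None},  # permanent
    "05.03": {"function": "travel claims",    "retention_years": 7},
}

def retention_for(function_class: str):
    """Look up the retention rule for a record via its function class."""
    rule = LIFECYCLE_PLAN.get(function_class)
    if rule is None:
        raise KeyError("no life-cycle rule for class " + function_class)
    return rule["retention_years"]  # None means permanent preservation

print(retention_for("05.03"))  # → 7
```

The flip side, as noted above, is that every change to the classification schema ripples through all records that reference it, which is where the scalability and flexibility issues come from.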

More on the LCMP and SÄHKE2.