
KeepIt course 5: Tools for Assessing Trustworthy Repositories

KeepIt course module 5, Northampton, 30 March 2010
Tools this module: TRAC, DRAMBORA
Tags: find out more about this module (KeepIt course 5) and the full KeepIt course
Presentations and tutorial exercises course 5 (source files)

Our primary tools in KeepIt course 5 are TRAC and DRAMBORA. We’ve seen passing references to these tools in earlier modules: in course 1 (slides 2-3), course 2 (or maybe not, slide 5; yes we did, slide 6), and course 4 (slide 7). This gives a sense of how integral these tools are to a structured approach to digital preservation. In this module we’ll find out why, beginning here with TRAC, Trustworthy Repositories Audit and Certification.

[slideshare id=3664564&doc=keepit-course5-trust-100408050615-phpapp02]

TRAC was designed by committee (slide 8). Its approach is not unique, and there are similar tools such as the nestor Criteria Catalogue. Since these tools have not yet been fully aligned, none stands as a global standard for measuring the trustworthiness of digital repositories.

As the TRAC name suggests, there are two parts to the process: audit, that is, assessing conformance against a set of criteria; and certification, demonstrating conformance to an independent agent.

As with the PREMIS preservation metadata dictionary covered in course 3, TRAC exhibits a clear and logical structure that quickly becomes apparent on inspection, both in the structure of the 84 individual entries (slide 13) and in the structure of the full checklist (slide 14), which is designed to assess three primary areas within TRAC:

  1. Organizational Infrastructure
  2. Digital Object Management
  3. Technologies, Technical Infrastructure, Security
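To make the shape of the checklist concrete, here is a minimal sketch in Python of how the three sections partition the 84 entries. The field names and the sample identifier "A1.1" are illustrative, not the official TRAC wording:

```python
from dataclasses import dataclass
from typing import Optional

# The three top-level areas of the TRAC checklist
TRAC_SECTIONS = {
    "A": "Organizational Infrastructure",
    "B": "Digital Object Management",
    "C": "Technologies, Technical Infrastructure, Security",
}

@dataclass
class TracEntry:
    section: str                      # "A", "B" or "C"
    identifier: str                   # e.g. "A1.1" (illustrative numbering)
    text: str                         # the criterion being assessed
    certified: Optional[bool] = None  # None until the audit reaches this entry

# Record an audit verdict for one hypothetical entry
entry = TracEntry("A", "A1.1", "Repository has a documented mission statement.")
entry.certified = True
```

An audit then amounts to walking a list of such entries and filling in each verdict, which is what the exercise below simulates on a small scale.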

To get a feel for this structure and apply the approach to some TRAC entries, we devised a short exercise that anyone can try. Using the procedure shown in slide 15, decide whether you can certify your repository, or not, for some given entries. To avoid self-selecting entries designed to show your repository in a glowing light, download and use the randomisable spreadsheet which lists all the TRAC entries in order. Notice that column A consists of randomly generated numbers, so if you sort the spreadsheet on this column it will randomise the entries. Before you do this, decide how many entries you wish to apply in the exercise and pre-select this many numbers, from 1-84. These will be the line numbers corresponding to the randomised entries you will use in the exercise.
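The randomisation procedure above can be sketched in a few lines of Python. This is a stand-in for the spreadsheet, not part of the exercise materials: random keys play the role of column A, sorting on them shuffles the 84 entries, and a pre-selected set of line numbers picks out the entries to audit:

```python
import random

NUM_ENTRIES = 84   # TRAC defines 84 entries in total
SAMPLE_SIZE = 5    # decide in advance how many entries to audit

entries = list(range(1, NUM_ENTRIES + 1))        # entry numbers 1-84, in order
keyed = [(random.random(), e) for e in entries]  # column A: random sort keys
keyed.sort()                                     # "sort the spreadsheet on column A"
shuffled = [e for _, e in keyed]

# Pre-selected line numbers (1-84) pick rows from the randomised list
preselected = random.sample(range(1, NUM_ENTRIES + 1), SAMPLE_SIZE)
chosen = [shuffled[n - 1] for n in preselected]
print(f"Audit these TRAC entries: {sorted(chosen)}")
```

Because the line numbers are chosen before the shuffle, the resulting entries are an unbiased sample rather than ones picked to flatter the repository.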

Not all repositories will need to be formally certified for trustworthiness, which can be an intense and detailed procedure, as illustrated by one of the few published case studies, in this case a major e-journal archive, Portico (slides 18-21). Credit to Portico for revealing this example, which shows that an exhaustive process such as TRAC will raise issues even in the best prepared and managed archives today.

Closer to home, one of our KeepIt exemplar repositories, eCrystals, has applied TRAC and reported its initial recommendations:

  • TRAC is open-ended and exploratory, and therefore more suited to repositories with an established long-term archival and preservation mandate.
  • At the current stage of development of the eCrystals data repository we recommend self-assessment using the DRAMBORA toolkit as an instrument.
  • The audit process in many ways is more important than actual certification, since it allows repositories to analyse and respond to their archives’ strengths and weaknesses in a systematic fashion.

We can test these recommendations immediately, because the next session will involve an extensive exercise with DRAMBORA.
