Part 1 | Open Infrastructures: Why they matter for long term access
Part 2 | Open Infrastructures: Control
If you want to go fast, go alone. If you want to go far, go together.
African proverb (allegedly, provenance unclear)
Knowledge is in the Network
Much of the cultural and intellectual meaning of an object does not accrue from attributes intrinsic to the object itself. Instead, when digitised, much of that meaning is derived from an object’s context and its provenance relationships with other entities, captured as metadata. These related entities may be other objects, but are often people, places and events that serve to situate the object within social, cultural and intellectual discourse.
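To make this concrete, here is a minimal, hypothetical sketch of such metadata for a digitised photograph, written as a JSON-LD-style record in Python. The identifiers and vocabulary are illustrative only, not a prescribed schema:

```python
import json

# Hypothetical record for a digitised photograph. The intrinsic attributes
# (format, type) say relatively little; the links to a person, a place and an
# event carry most of the cultural and intellectual meaning.
record = {
    "@context": "https://schema.org",
    "@id": "https://example.org/objects/photo-1923-041",          # illustrative identifier
    "@type": "Photograph",
    "encodingFormat": "image/tiff",
    "creator": {"@id": "https://example.org/people/jane-doe"},     # the photographer
    "contentLocation": {"@id": "https://example.org/places/market-square"},
    "about": {"@id": "https://example.org/events/1923-flood"},     # the depicted event
}

print(json.dumps(record, indent=2))
```

Each of those linked identifiers may well be maintained by a different organisation, which is precisely the point: the knowledge sits in the network of links, not in any single record.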
However, any individual organisation is unlikely to hold authoritative information about all of these other entities. Consequently, the preservation of the knowledge and meaning embedded in this network of relationships becomes a community endeavour. For this to work, each organisation needs to make its content available so that others can use it. This is only likely to work at scale in an open, collaborative environment where organisations get together to build mutual trust that their links will remain open and persistent over time.
A key mechanism for such long-term linking is the use of persistent identifiers (usually expressed as web URLs) for digital objects, together with the associated resolver services that ensure an identifier continues to point to the relevant object even when the infrastructure, or even the organisation responsible for holding the object, changes. With the explosive growth of online technologies, users’ interactions with digital materials have become increasingly machine-mediated. Advances in AI and virtual/augmented reality have driven richer experiences, often developed by institutions themselves. In turn, persistent identifiers increasingly need to be accompanied by standard APIs and metadata to drive these experiences.
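In practice, resolution is usually just an HTTP redirect: a client requests the persistent identifier and is forwarded to wherever the object currently lives. A minimal sketch using only the Python standard library; the DOI shown is simply a well-known, publicly resolvable example:

```python
import urllib.request

def resolve(persistent_id: str) -> str:
    """Follow the resolver's redirect chain and return the current landing URL."""
    with urllib.request.urlopen(persistent_id, timeout=10) as response:
        # urllib follows HTTP redirects automatically; geturl() reports the
        # final location the identifier currently points to.
        return response.geturl()

# A DOI expressed as a web URL: the identifier stays stable even if the
# organisation or system hosting the object changes.
print(resolve("https://doi.org/10.1000/182"))
```

Many resolvers (DOIs, for example) also support content negotiation, so the same identifier can return machine-readable metadata rather than a landing page, which is where the standard APIs mentioned above come in.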
Again, such advances rely on open, community-driven development and adoption, but the provision of resolver services for persistent identifiers takes us into the realm of shared infrastructure. Central services of this kind require community-governed and community-sustained organisations to operate them, with careful provision to ensure that they remain open and cannot be subverted by a commercial takeover or dominated by a minority of organisations. Done properly, persistent identifiers with their associated APIs and metadata can help make the services built over preserved materials more sustainable in their own right.
Open Source Communities
Unlike businesses, memory organisations rarely compete based on operational IT excellence, or indeed at all. Even research or other grant funding is often approached collaboratively. With this in mind, such organisations would appear to be a natural fit for open source development of software, standards and content. Institutional requirements are broadly similar, differing mainly in scale. Everyone is interested in digital preservation, of course, but organisationally we share similar challenges around outreach (to scholars and/or the public), reporting to funders/supporters, and regular library and archive management, as well as standard business operations.
For challenges specific to our sector, collaborating on open solutions would seem to make sense. We must remember that “open” is not free; the benefit lies in the longer-term savings that come from greater control, transparency and standardisation. That being said, larger organisations with access to more expertise, resources and diverse perspectives can support smaller entities. There is equity in this: ensuring the survival of memories should be a shared endeavour, not one entirely dependent on resources.
Participation in open projects should also not be measured purely in terms of financial contributions or technical development, although these are obviously crucial, basic needs. It is often forgotten that a healthy, vibrant community also benefits greatly from other types of contribution: organising meetings and events, preparing documentation and training materials, supporting conferences, and mentioning projects on social media and in publications are all valuable.
As mentioned previously, “openness” is not just about software, and there is certainly no requirement for everything to be open. We at the OPF are obviously advocates for open source software, although our tools can be found in both open and proprietary workflows, either on premises or in the cloud. But we are also becoming involved with open standards such as eArchiving and the Oxford Common File Layout, open training materials, and open registries of information such as COPTR.
While a community usually emerges naturally as you turn to open infrastructure, this is not guaranteed, and sustaining that community is a growing institutional concern. We’ll consider why in the next blog in this series.