*OPF Executive Director Neil Jefferies has found that Open Infrastructure is becoming a more prevalent topic of discussion, with the concept expanding to encompass a much broader definition.*
*Neil has presented on this several times over the course of this year: at the Oxford Forum for Open Scholarship, at ILIDE in Jasna, Slovakia, at FedoraCamp in Vienna and at Heritage Online in Gdansk. We saw this as a great opportunity to bring his expertise together in one place.*
*Over this short blog series, we hope to share the overall considerations from these presentations. We will cover open APIs, open standards and open resources, registries, shared information, persistent identifiers and everything that comes with them, to show that infrastructure has many open components.*
The Changing Digital Infrastructure Marketplace
Digital knowledge infrastructures can be defined as the combination of components used to maintain, create and share knowledge online. With almost 70% of the world having internet access, these are increasingly becoming the channels for cultural and intellectual discourse. Infrastructure discussions are often focussed on technology – hardware, software and networking – but, when considering the persistence of knowledge over the longer term, I would like to consider some more human elements and the importance of trust borne out of openness.
Much of the value of such infrastructures accrues when they persist not just as a channel but also as a record of, and context for, discourse. While memory organisations such as libraries continue to hold much of this material, a significant fraction of knowledge infrastructures (the majority when considered by volume of content) is operated by large commercial organisations. Similarly, with the emergence of the cloud, much of the tooling and storage used by memory organisations is heavily dependent on commercial third parties.
However, as the pace of technological change accelerates, the turnover of commercial technology providers has been increasing. The consultancy firm Innosight has analysed corporate lifespans in the S&P 500, which tracks the largest publicly traded US companies, and identified a clear downward trend driven by what has been termed “creative destruction”. America Online still exists as a shadow of its former self, but GeoCities, Tripod and CompuServe are distant memories, with much of their content lost despite some valiant efforts.
Even when organisations survive, they increasingly have to pivot, changing their business models to maintain profitability. IBM, once primarily a hardware manufacturer, no longer makes PCs and derives over 75% of its revenue from software licensing and consultancy. Meta is increasingly looking to alternatives to Facebook to maintain revenue, which may have contributed to the corporate name change.
At the same time, the concentration of compute and storage resources in cloud hyperscalers, and away from conventional commercial end-users, has changed the storage market in significant ways. When customers move their data into the cloud, overall demand for drives declines: end-user deployments typically use large numbers of small drives that sit mostly empty, whereas cloud providers pool storage across many clients and can therefore run fewer, larger drives at much higher utilisation. This puts pressure on drive suppliers and has led to significant consolidation in the hard drive market, leaving only three manufacturers.
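As a rough illustration of that pooling effect, the sketch below uses purely hypothetical figures (the workload count, drive sizes and fill level are assumptions for illustration, not market data) to show why aggregated cloud storage needs far fewer drives than the same data held on dedicated, mostly empty disks.

```python
import math

# Hypothetical figures, chosen only to illustrate the pooling arithmetic.
workloads = 1_000          # number of end-user systems or departments
avg_data_tb = 2            # average data actually stored per workload (TB)
dedicated_drive_tb = 4     # typical small drive bought per workload (TB)
pooled_drive_tb = 20       # large-capacity drive used by a cloud provider (TB)
pooled_utilisation = 0.85  # fill level the provider aims to run drives at

# Dedicated model: one small drive per workload, each largely empty.
dedicated_drives = workloads
dedicated_utilisation = avg_data_tb / dedicated_drive_tb

# Pooled model: aggregate all the data and fill large drives to the target level.
total_data_tb = workloads * avg_data_tb
pooled_drives = math.ceil(total_data_tb / (pooled_drive_tb * pooled_utilisation))

print(f"Dedicated: {dedicated_drives} drives, each about {dedicated_utilisation:.0%} full")
print(f"Pooled:    {pooled_drives} drives, each about {pooled_utilisation:.0%} full")
```

Even with these generous assumptions for the dedicated case, the pooled model serves the same data with roughly an order of magnitude fewer drives, which is the dynamic squeezing drive suppliers.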
Similarly, backup to the cloud has shrunk the tape market to essentially two vendors still pursuing active development. This greatly reduces the options available to organisations wishing to host their own storage, which now compete with much larger customers for supplier attention.
Thus, not only is content at risk, but so are the tools and services that memory organisations might use to rescue and preserve that content. A successful long-term infrastructure for access to materials must include a significant community component in addition to hardware, software and networking, since it must survive both technological and economic obsolescence.
What I want to examine in the following posts is how “openness”, in several forms, can help memory organisations achieve their goals in this area.
Postscript
Interestingly, two recent, largely unrelated events have served to rejuvenate the “historical” storage media markets:
- The Generative AI-driven demand for vast amounts of cheap storage for large language model training sets, mostly culled from crawling the web, has led to increased demand for hard disks at scale. Hard disks still enjoy a significant unit-cost advantage over flash storage at larger capacities, although they have been largely supplanted in consumer devices (except for those with enormous media or game libraries).
- The increase in ransomware attacks has led to the realisation that backups should not be network-connected (i.e. not “in the cloud”) but ideally disconnected and locked away. This has led to a resurgence in demand for tapes, tape drives and tape libraries, as organisations recognise the value of on-premises storage that can be physically secured and physically accessed.