Vint Cerf, co-designer of the TCP/IP protocols that make the Internet work and Vice-President and Chief Internet Evangelist at Google, warned last month (for example, here, here and here) about an information black hole into which digitised material is lost as we lose access to the programs needed to view it. Somewhat ironically, Google’s own recent priorities seem to have been an increasing withdrawal from information projects that preserved the past: killing off archives, slowing down digitisation activities, removing the Timeline, and increasingly prioritising newness over older, more established sources in search results (Baio 2015).
Responses to the reporting of Cerf’s warnings were mixed. Some seemed relatively complacent: after all, aren’t we already preserving data and information in libraries and archives, and won’t using open file formats mean that bit rot is not a problem? In the process, many seemed to overlook part of Cerf’s argument: that there is a need to preserve old software and hardware so that we retain the ability to read files in their original formats, what he characterised as ‘digital vellum’.