News

Insights from practitioners in Information Management

Issue 57 – Information Management – Digital Divide

The May edition of Information Overload takes a look at the chasm that is the digital divide. As devices get smaller, storage and capabilities get bigger, and costs stay comparable to what we were paying several years ago, is it any wonder digital is seen as the solution to getting rid of the forests of paper sitting in cardboard boxes in our various sheds and disused buildings?

In this Issue we will be looking at:

  • What do we mean by the digital divide?
  • The more things change, the more they stay the same
  • But who pays the price for providing access?
  • Do we have a solution yet?

What do we mean by the Digital Divide?
First of all, I would like to describe what I don't mean by the Digital Divide. This paper is not going to look at those who can and those who cannot afford to buy a PC. Nor are we going to look at who knows where the best information can be found, and who uses the Googles of this world to find everything they think they need in the information world. Rather: how do we maintain access to the information that we create if the media are forever changing?

The more things change, the more they stay the same

When starting the research into the topic of the Digital Divide for this month's e-zine, I happened upon a range of articles on the subject. Delving deeper into the various titles and subjects, the first thing that crossed my mind was that what we were reporting as problems ten years ago are still concerns today. Yes, some of the papers really were that old.

As information professionals (librarians, records managers, archivists and the IT colleagues who keep everything glued together) we have the rather interesting task of ensuring that what our work colleagues produce in the day-to-day running of their businesses remains accessible at some unidentified point in the future. Or will we forever be playing catch-up, hoping against hope that we are never called to account for the information we have created, received, stored, can no longer find, or, if we can find it, can no longer read?

One of the major problems, as I see it, is not that the bits and bytes are getting smaller, but that new software may or (in most cases) may not be compatible with the systems and software we already use. Vista, for example, needs more processing power than most small countries need to run their entire infrastructure.

But for the creators of the new range of programs, the question is usually not whether it is compatible, but whether it can do what they want, faster and more easily; and if it doesn't, they'll create something that does. "What do you mean, what about the records? What about them? Doesn't everyone use MP4 or .wmv files these days?"

All flippancy aside, these are serious issues.

Take YouTube for example. YouTube became possible on the world stage because of leaps in technologies related to digital video. Mobile phones and digital cameras became video cameras, capturing those interesting moments and allowing people to upload them to the Net. But it wasn't until Flash stabilised (around Macromedia Flash 7, I think) that it became possible to watch these miniature works of creativity on our home PCs, on a site that allowed everyone to be their own directors – and they were – giving a global audience the chance to see the product of their labours.

Or what about Joost? For those of you who haven't heard of Joost yet, it is online TV: television mashed together with the Internet to bring you programmes from across the world, 24/7. It is still in beta testing at the moment, and you have to be invited to participate, but the implications are enormous. Not least of which: will the UK still charge a licence fee for owning a television if you watch all your programmes on the Net? (Yes, they really do still charge a licence fee, payable at the post office every year, to own a TV so you can watch the television programmes – only some of which are produced by the BBC, the original argument for the fee.)

Or take Twitter… http://www.twitter.com – Twitter is MSN for grown-ups. Answer a simple question – "What are you doing?" – and the people online can see your response. Sure, you get the strange and the banal, but you also get a range of professionals discussing the latest and greatest in software development, creative projects and a whole host of other things. One of the other interesting things about Twitter is that it has been mashed together with Google Maps to create TwitterVision, allowing you to see where people are posting their messages from. Did the creators of Twitter think about the records they are creating and whether those records will be viewable tomorrow, let alone next year? Of course not: they were too busy struggling to keep up with the storage and bandwidth problems they encountered as zillions of people across the world tuned in and started talking.
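If we did want to treat those posts as records, what would capture even look like? Below is a minimal sketch in Python, assuming a JSON public-timeline feed of the kind Twitter exposes; the URL and its JSON shape are illustrative assumptions, not a documented contract. It simply pulls the feed and writes each post away with a capture timestamp:

    # capture_timeline.py - hedged sketch of capturing public Twitter posts
    # as records. The feed URL and its JSON shape are assumptions, not a
    # documented contract.
    import json
    import time
    import urllib.request

    FEED_URL = "http://twitter.com/statuses/public_timeline.json"  # assumed endpoint

    def capture_once(archive_path="timeline_archive.jsonl"):
        """Pull the public feed once and append each post to a line-per-record file."""
        with urllib.request.urlopen(FEED_URL) as response:
            statuses = json.load(response)
        with open(archive_path, "a", encoding="utf-8") as archive:
            for status in statuses:
                record = {
                    "captured_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
                    "status": status,  # keep the whole post, not just the text
                }
                archive.write(json.dumps(record) + "\n")

    if __name__ == "__main__":
        capture_once()

Even a toy like this raises the real records questions: how often do you poll, what do you miss between polls, and will the stored JSON still be readable in ten years?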

As records and information managers we are still trying to persuade the powers that be that we need to capture emails, let alone MSN records – and if we are to capture them, can we capture all of them and, more importantly, can we retrieve them when we need to produce them in a court of law? All this while we are still trying to educate our colleagues about the perils of deleting messages and documents that should be part of the corporate memory.
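To make the "can we retrieve and produce them" question concrete, here is a minimal sketch (Python, standard library only; the archive layout and manifest format are my own invention) of capturing a mailbox into a store where every message carries a checksum, so its integrity can at least be demonstrated later:

    # archive_mbox.py - sketch of capturing email into a fixity-checked store.
    # The file layout and manifest format are illustrative assumptions.
    import csv
    import hashlib
    import mailbox
    import os

    def archive_mailbox(mbox_path, archive_dir="mail_archive"):
        """Copy every message out of an mbox file, named by its SHA-256 hash."""
        os.makedirs(archive_dir, exist_ok=True)
        manifest_path = os.path.join(archive_dir, "manifest.csv")
        with open(manifest_path, "a", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            for message in mailbox.mbox(mbox_path):
                raw = message.as_bytes()
                digest = hashlib.sha256(raw).hexdigest()  # the fixity value
                with open(os.path.join(archive_dir, digest + ".eml"), "wb") as out:
                    out.write(raw)
                writer.writerow([digest, message.get("Date", ""),
                                 message.get("From", ""), message.get("Subject", "")])

    if __name__ == "__main__":
        archive_mailbox("inbox.mbox")

Naming each message by the hash of its own raw bytes gives a crude integrity check: if a file still hashes to its own name, it has not been altered since capture.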

How on earth do we capture the information contained in the social networking sites that our colleagues have access to – sites we may not even be aware of? Going back to Twitter for just a minute: did you know that you can create a closed network of people? If you don't want the messages you create to go into what is termed the "Public Timeline", you can keep your conversations quiet, and therefore away from the prying eyes of the watchdogs. Imagine if Kenneth Lay and his colleagues at Enron had used Twitter (or similar) for their conversations rather than email. It's an interesting thought, and one that doesn't make our jobs any easier.

But who pays the price for providing access?
Maintaining long-term access to digital material appears to fall to the libraries, archives, records departments and collecting institutions of the world. Yet most organisations do not appear to have the necessary backing and/or funding (from senior management or government) to do this for the long term. In some cases, collecting institutions are having to rely on short-term grants to provide the infrastructure for what is hoped to be a long-term solution.

A report in the May/June 2007 issue of D-Lib Magazine, "Ten Major Issues in Providing a Repository Service in Australian Universities" (http://www.dlib.org/dlib/may07/henty/05henty.html), looks at some of the concerns held by the major universities, including:

Should libraries be responsible for covering the costs of providing a digital repository? And would people notice if there wasn't one?

Of course, the answer to that one is always ironic: in most cases, something only becomes valuable to an institution when it thinks it is either going to make money from the contents or, more importantly, when it knows it will lose money as a result of not being able to prove something. Imagine not keeping the research underpinning a patent held by your company and then trying to prove in a court of law that you did in fact do the work, and you will begin to see what we mean.

Which makes me wonder why organisations are not willing to fund these projects. Is it short-sightedness, or a case of "she'll be right, mate"? Like all forms of insurance: do you buy it and hope you never have to use it, or do you risk not buying it and hope the decision does not come back to bite you later down the track?

Or is it because organisations don’t really know what technological solution to back?

Do we have a solution yet?
As the speed of technological obsolescence has increased, it is harder than ever to keep pace with changing formats. As we discussed way back in 2004 in our original piece on digital archiving, the media underpinning several high-profile examples have long since disappeared into "silicon heaven": "the National Aeronautics and Space Administration (NASA) sent satellite probes into our galaxy to record information. The information from the Viking probes was recorded onto digital tape. The tape recorders are now obsolete and most of the data has been lost."

As recently as 23 May 2007, Computerworld was asked this specific question: "How should I archive files — on CDs, DVDs, floppies, flash drives or a hard drive?" (Gerhard Staufenberg, Gold Canyon, Ariz.)

Their answer was interesting, if predictable:

"A: I wish I could give you a definitive answer, but no one knows how long specific media will last. You need confidence that hardware and software that can read your archive files will be around, too. I'd stick with popular, nonproprietary file formats, such as .bmp, .jpg, .mp3, .htm, .txt and .pdf. And make sure any computer you buy can read your archives before you bag the old system.

Both CDs and DVDs are excellent choices for archival storage. See “Will My CD-R and DVD+R Discs Still Run in 10 Years?” for details.

Floppy disks, though, caught a train to Obsolescentville several years ago.

Flash drives and SD Cards have withstood teething puppies and trips through the wash, but no one expects them to last for more than about 10 years. They support a limited number of writes and connections, depending on the type of memory they use."
(Source: http://www.computerworld.com/action/article.do?command=viewArticleBasic&taxonomyName=Storage&articleId=9020840&taxonomyId=19&intsrc=kc_li_story)
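Computerworld's advice to stick to popular, nonproprietary formats is easy to state and tedious to verify by hand. As a minimal sketch, the following walks a directory tree and flags anything whose extension falls outside the formats their answer names (a real audit list would be longer and would include variants such as .jpeg and .html):

    # format_audit.py - flag files whose extensions fall outside the
    # "popular, nonproprietary" formats named in the Computerworld answer.
    import os

    SAFE_EXTENSIONS = {".bmp", ".jpg", ".mp3", ".htm", ".txt", ".pdf"}

    def audit(root):
        """Return paths of files whose extension is not on the safe list."""
        flagged = []
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                _stem, ext = os.path.splitext(name)
                if ext.lower() not in SAFE_EXTENSIONS:
                    flagged.append(os.path.join(dirpath, name))
        return flagged

    if __name__ == "__main__":
        for path in audit("archive"):
            print("review format of:", path)

Extensions are a blunt instrument, of course; files get renamed, and an extension proves nothing about what is inside. Proper identification has to look at the bytes themselves, which brings us to the maintenance problem at the end of this piece.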

As Jeff Rothenberg said way back in 1995, "it is only slightly facetious to say that digital information lasts forever – or five years, whichever comes first." (Rothenberg, Jeff. "Ensuring the Longevity of Digital Documents", Scientific American, January 1995, p. 42.)

So where does that leave us? A few home truths:

  • We cannot capture what we don't know about (see the reference to Twitter above).
  • We cannot capture documents, emails or files if users are trying to find ways to circumvent the system.
  • We cannot capture anything if we are not given the money and other resources to acquire the software and hardware in the first place, let alone to fund the ongoing upgrades and migrations.
  • We will spend countless hours trying to integrate new technology with the old, and new software with the existing.
  • Organisations will spend more money digitising old records, thinking they can save on the hardcopy costs of storage, only to replace those costs with the ones mentioned above. Search and retrieval will be better, but the overall savings rarely materialise.
  • Software and hardware obsolescence will play a major part in what we are trying to do: even if we decide to run open source software, someone, somewhere still has to tweak the code so that we can capture the newer file formats (see the sketch after this list).
  • Information will continue to be lost, altered, deleted or damaged through ignorance, malicious intent and disasters.
  • There will be lots of changes, but everything will stay the same.
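The point above about tweaking the code is quite literal. Format identification tools are, at heart, tables of "magic" byte signatures, and every new file format means another entry that someone has to write and maintain. A minimal sketch, with a deliberately tiny table:

    # identify_format.py - sketch of signature-based format identification.
    # Every new file format means another entry in this table; that is the
    # ongoing maintenance burden referred to in the list above.
    SIGNATURES = [
        (b"%PDF", "PDF document"),
        (b"\x89PNG\r\n\x1a\n", "PNG image"),
        (b"\xff\xd8\xff", "JPEG image"),
        (b"GIF87a", "GIF image"),
        (b"GIF89a", "GIF image"),
        (b"PK\x03\x04", "ZIP container (also .docx, .odt, ...)"),
    ]

    def identify(path):
        """Read the first bytes of a file and match them against known signatures."""
        with open(path, "rb") as f:
            header = f.read(16)
        for signature, label in SIGNATURES:
            if header.startswith(signature):
                return label
        return "unknown - time to tweak the code"

    if __name__ == "__main__":
        import sys
        print(identify(sys.argv[1]))

When a new format arrives, the table has to grow; multiply that by every tool in the preservation chain and you have the maintenance burden in miniature.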