
Blurred division of the future

March 8, 1996

Daniel Greenstein looks at humanities computing in a digital age. Humanities scholarship was in the past characterised by solo endeavour. In the digital age, collaborative effort is fast becoming the working pattern across disciplines, institutions and national boundaries. How this affects the information, cultural heritage, and educational industries was the theme of a workshop on mixed media held at Glasgow University in January (.uk/mmws/mm_web_1.html).

Unusually for meetings on the "Computers and Blah" circuit, this futurology was grounded in a survey of the methods and applications in current use. Lou Burnard of Oxford University Computing Services gave an online demonstration of the British National Corpus. The corpus's scope - "100 million words . . . of written and spoken language" - and its detail - every word is explicitly identified as a part of speech occurring in a particular type of document of some defined provenance - permit sharply enhanced analyses of modern English usage. Though likely to become a standard reference work, it is unlikely to change fundamentally the nature of corpus linguistics or our understanding of contemporary English.
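To give a flavour of what such word-level tagging involves, the sketch below shows a sentence marked up in the corpus's general style, with each word element carrying a part-of-speech code. It is illustrative only: the BNC's actual SGML markup and its CLAWS tag set differ in detail, and the sentence is invented.

    <!-- Illustrative sketch of BNC-style tagging; the codes follow
         the general pattern (AT0 article, NN1 singular noun,
         NN2 plural noun, VVD past-tense verb, AJ0 adjective,
         PRF "of", PUN punctuation) but the example is not drawn
         from the corpus itself. -->
    <s n="1">
      <w AT0>The <w NN1>corpus <w VVD>enabled <w AJ0>new
      <w NN2>analyses <w PRF>of <w AJ0>modern <w NN1>English<c PUN>.
    </s>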

Similarly, neither art historical nor curatorial practices were about to be overturned by David Saunders's demonstration of how the National Gallery creates computer images of historical art objects and uses these in an academic as well as a curatorial capacity. As surrogates for real works, the images (of about 1.6 gigabytes each) are used to test what effect different restoration or cleaning treatments might have on a given art object.

The images may be manipulated to reveal how an artist turned a sketch on canvas into a finished oil painting, or to compare two works thought to be by the same artist or to have used the same model or subject. Here, computer-based techniques sharpened a tried and tested approach to art criticism and offered a "safer" approach to preservation.

Iain Beavan of Aberdeen University Library showed off a web-accessible database of interlinked text, images, commentary and translation from the library's rare and valuable mediaeval bestiary, an illuminated book of the real and fantastical animals that made up the natural world as known in 12th-century Britain (.ac.uk/library/best.html/). Glasgow University's Hunterian Museum used similar techniques to bring its collections to people living too far away to visit the Glasgow site (/Museum/tour/). Both installations represent a modern approach to a problem with which librarians and museum curators have long grappled: how to make valuable objects accessible to the widest possible public.

New approaches to manuscript preservation also seemed familiar. David Cooper of Oxford University's libraries automation service showed how, with high-quality digital images of rare, valuable, and fragile manuscripts, scholars could distinguish the minutest details of a manuscript's condition, appearance, and contents without reference to the original (.lib.ox.ac.uk:80/libraries/). The British Library has adapted similar techniques to improve access to the important Burney collection of 18th-century newspapers, currently available only on poor-quality microfilm (access/microfilm-digitisation.html).

The management and dissemination of the mass of digital data that our information, education, and heritage industries are now producing led to repeated cries for common, or at least compatible, practices among people and institutions who until five or ten years ago were content to invent and use their own conventions for creating, formatting, describing, and using digital information.

Standards are essential to the long-term preservation of machine-readable information, which will otherwise deteriorate or become corrupted, just as the pages of the Burney newspapers had, by the mid-19th century, already crumbled with age and use. Machine-readable information can be copied from old to new magnetic disks, for example. But in a world without standards, digital information will be developed for use with particular combinations of hardware and software.

If such information outlasts its hardware and/or software environments, it becomes worthless, inaccessible and unreadable. Should we preserve machine-readable information, then, by preserving in working order every hardware and software environment that has ever existed? Or should we endeavour to create digital information with a view to freeing it from any hardware or software dependence?

Given the rate at which the industry goes through hardware and software, the former strategy can only ever achieve partial results at substantial cost. It also introduces huge economic inefficiencies. Does every library, museum, and computing service need to maintain in working order a digital tape reader, a card reader, a Commodore PET, and working copies of WordStar 3.1 and Famulus77? The development of standards to enable digital information to move freely from one processing environment to another is a more promising and less costly solution.

The interoperability that is facilitated by the World Wide Web encourages the same drift toward standards. The web's power is derived from the ability of one site to point at (and thus allow the user to travel to) any other site on the network. The prospects are outstanding. Glasgow University's Whistler studies centre plans a digital edition of Whistler's life and works integrating letters, art works, three-dimensional architectural renderings, and interior designs. Many, though not all, of these objects are available at Glasgow University and may be digitised and mounted there on a local web site. Comprehensiveness, however, may depend on integrating into that site digital materials that are mounted elsewhere. Such integration relies upon common practices being adopted by web-site authors.
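In practice, that pointing rests on nothing more exotic than the web's standard anchor element. The sketch below, in which both addresses are invented for illustration, shows how a page mounted at one site might weave in material served from another:

    <!-- Hypothetical page; the file names and the remote address
         are invented. The anchor (A HREF) element lets a local
         page point at a resource held on any other web server. -->
    <HTML>
    <HEAD><TITLE>Whistler: Life and Works</TITLE></HEAD>
    <BODY>
    <P>Compare the sketch held locally
    (<A HREF="sketches/study12.html">study for the portrait</A>)
    with the finished work's record mounted at another institution
    (<A HREF="http://www.example.ac.uk/whistler/rec42.html">
    catalogue record</A>).</P>
    </BODY>
    </HTML>

So long as both sites honour the same conventions, the reader travels between them without knowing, or caring, whose hardware and software serve each page.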

Susan Hockey of the Center for Electronic Texts in the Humanities at Rutgers University demonstrated how guidelines developed by the Text Encoding Initiative enable scholars to create machine-readable texts that can be freely exchanged across platforms and among users (.edu/ and archive/TEI/teij31.sgm). More promising perhaps was the revelation that all six of the text-based projects presented at the workshop had worked in accordance with, or at least were compatible with, the TEI's guidelines.
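It is this declarative, software-independent markup that makes such texts portable. The sketch below shows the outline shape of a TEI-conformant SGML document; it is illustrative rather than drawn from any of the workshop's projects, and a real TEI header documents a text's source and encoding in far greater detail:

    <!-- A minimal TEI-style SGML document (illustrative sketch).
         The header describes the electronic file and its source;
         the body carries markup that no particular hardware or
         software owns. -->
    <TEI.2>
      <teiHeader>
        <fileDesc>
          <titleStmt><title>A sample electronic text</title></titleStmt>
          <publicationStmt><p>Distributed for demonstration.</p></publicationStmt>
          <sourceDesc><p>Transcribed from a printed original.</p></sourceDesc>
        </fileDesc>
      </teiHeader>
      <text>
        <body>
          <div1 type="chapter" n="1">
            <p>The text itself, encoded once and readable anywhere.</p>
          </div1>
        </body>
      </text>
    </TEI.2>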

There was less agreement, though lively debate, about standard image formats, and the market in digital video formats was further behind still, as George Kirkland of Glasgow University Media Services amply demonstrated.

Organisational (or are they cultural?) constraints on standardisation and interoperability are perhaps more severe. Some were implied by Les Carr of Southampton University, who demonstrated the Microcosm hypermedia system. Microcosm allows the sought-after integration of diverse, dispersed resources (cosm.ecs.soton.ac.uk/). Only traditional approaches to intellectual property, which associate access to information with ownership, impede its implementation across a theoretically unbounded range of networked resources.

The Aberdonian bestiary is missing a handful of pages which exist at the Ashmolean Museum in Oxford. Technically, the missing images could be integrated into the Aberdeen site. But why should the Ashmolean permit Aberdeen to incorporate in its web site digital images of objects that the Ashmolean owns? Why, indeed, should it permit free Internet access to those images, whether that access is gained via Aberdeen's or some other web page?

The questions force a reconsideration of property rights. Though some enthusiasts may argue messianically that network technologies will form an advance guard of socialism, I suspect a rather different developmental trajectory in the making: one where purveyors of information work out new and innovative licensing arrangements with would-be users. Such arrangements will enable a far greater degree of network integration than is presently possible while at the same time preserving the sanctity of intellectual property and of information suppliers' ability to profit from their productive endeavours.

Here, work at the University of Virginia's Institute for Advanced Technology in the Humanities (IATH) and Electronic Text Center (ETC) was particularly instructive. The two agencies collect (and IATH helps to create) digital resources. An impressive list of available scholarly electronic texts and mixed-media installations is available from virginia.edu/etext/ETC.html and edu/home.html. Where IATH and ETC resources are not constrained by copyright, they are made available freely over the Internet. Where such constraints exist, IATH and ETC attempt to place some of what is available in the public domain, so that anyone may browse. More comprehensive access is restricted to fee-paying users.

Experimenting with such licensing mechanisms, IATH and ETC have managed to attract the interest of commercial publishers, who may prove willing to invest in enhancing, or indeed originating, electronic resources that are held at the centres. Established commercial publishers are at present very tentative about investing heavily in such electronic materials.

Scholarly "electronic books", such as those produced by Cambridge University Press and demonstrated at the workshop by Peter Robinson of De Montfort University, are splendid; they are also somewhat exceptional (cf. Chaucer/cptop.html).

A further and final trend emerged from discussion. By encouraging the development and implementation of standard practices and greater integration of electronic resources, network technologies have to some extent blurred the distinctions that have grown up historically between separable communities of scholars, librarians, curators, computing service professionals and publishers. Functional differences are no longer so apparent as they once were. By mounting their web site, the Aberdonian librarians became scholarly authors, museum curators, and of course (electronic) publishers; by adopting standard encoding schemes for the British National Corpus, the scholars and commercial publishers who collaborated in the corpus's development became librarians interested in long-term preservation.

Though such changes could threaten established identities and provoke a degree of territorial infighting, the workshop seemed to demonstrate the ascendancy of a more genuinely collaborative response. Not one of the projects that was demonstrated resulted from the work of a single individual or indeed from within a single academic department, publishing house, library, or museum.

Whereas humanities scholarship was once a solo endeavour, interdisciplinary and cross-institutional effort is becoming the norm. National boundaries too are apparently being lowered. Several of the projects had relied on international collaboration.

In the United Kingdom the Joint Information Systems Committee of the funding councils has responded to the trend by establishing the Arts and Humanities Data Service, which aims, in part, to broker collaboration among providers of electronic information and network services. In the United States the Getty Art History Information Program has established a Museum Educational Site Licensing Project to provide a similar service focussed more narrowly on the museum and art historical communities (mesl/home.html).

These collaborative tendencies were reflected in the complexion of the conference participants, who were drawn in equal measure from arts academic departments, libraries, museums and computing services. They also established that Glasgow University was an appropriate setting for the workshop, for reasons which had little to do with its leading national role in humanities computing.

For the future revealed by the workshop reflected in some large measure Glasgow's Scottish past. The network technologies that are being extended further into our universities, libraries, and museums appear to be making it possible to recreate the community of interdisciplinary and international scholarship that existed 550 years ago when Glasgow's ancient university was founded. As the benefits of these technologies are extended into millions of private homes, it may be possible to establish on a global basis the genuinely democratic educational tradition for which Scotland is so well known.

Daniel Greenstein (d.greenstein@kcl.ac.uk) is director of the Arts and Humanities Data Service Executive based at King's College London.
