

liographic Standards in the British Library. He has taught at library schools in his native Britain and in the United States, most recently as Visiting Professor at the University of California, Berkeley, School of Library and Information Science (summer sessions). He is the first editor of the Anglo-American Cataloguing Rules, second edition (1978) and of the revision of that work (1988). He is the author of The Concise AACR2 (1989); editor of, and contributor to, Technical Services Today and Tomorrow, 2nd edition (1998); and editor of Convergence (proceedings of the 2nd National LITA Conference) and Californien, both published in 1991. Future Libraries: Dreams, Madness, and Reality, co-written with Walt Crawford, was honoured with the 1997 Blackwell's Scholarship Award. His most recent book, published by ALA in 1997, is titled Our Singular Strengths: Meditations for Librarians. Mr. Gorman is the author of more than 100 articles in professional and scholarly journals. He has contributed chapters to a number of books and is the author or editor of other books and monographs. He has given numerous presentations at international, national, and state conferences. Michael Gorman is a fellow of the [British] Library Association, the 1979 recipient of the Margaret Mann Citation, the 1992 recipient of the Melvil Dewey Medal, and the 1997 recipient of Blackwell's Scholarship Award.
(Biodata and photo reproduced with permission from the website of the Library of Congress's Bicentennial Conference on Bibliographic Control for the New Millennium (15-17 November 2000): http://www.loc.gov/catdir/bibcontrol/. Dean Gorman presented the keynote address, 'From Card Catalogues to Web-PACS: Celebrating Cataloguing in the 20th Century', at this conference.)

The great irony of our present situation is that we have reached near-perfection in bibliographic control of 'traditional' library materials at the same time as the advent of electronic resources is seen by some as threatening the very existence of library services, including bibliographic control. Before considering the question of 'cataloguing the Web and the Internet', it is salutary to review the great achievements of the past thirty years; in considering where we are going it is necessary to know where we have been. When the ideal of Universal Bibliographic Control (UBC) was first advanced1 thirty years ago, the international library community was only beginning to discern dimly the possibilities of the interconnection of international standardization and library automation. International standardization was at a very early stage (far closer to an ideal than a reality) and the ideal of each item being catalogued once in its country of origin, the resulting record being made available to the world community, seemed far from practical realization. Records were exchanged between countries (mostly between national libraries), but in the most inefficient manner possible, print on paper, and, since they resulted from different cataloguing codes and practices, were integrated into catalogues with great difficulty.
Soon, however, this was supplemented by the idea that universally used distinctive punctuation, clearly identifying the areas and elements of the ISBD, would not only aid in the understanding of bibliographic data in unfamiliar languages but could also be used in automatic translation of that data into MARC records. It is no coincidence that the areas and elements of the ISBD correspond exactly to the relevant fields and sub-fields of the MARC format.
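The correspondence described above can be sketched in code. The following is a minimal illustration (not any library system's actual API) of how the sub-fields of the MARC 245 field (title statement) map onto the ISBD-prescribed punctuation for the title and statement of responsibility area; the function name and data layout are assumptions for the example.

```python
# ISBD punctuation that precedes each sub-field of the MARC 245 field:
# $a (title proper) opens the area, $b (other title information) is
# preceded by " : ", and $c (statement of responsibility) by " / ".
ISBD_PUNCTUATION = {
    "a": "",
    "b": " : ",
    "c": " / ",
}

def marc245_to_isbd(subfields):
    """Render an ordered list of (code, value) 245 sub-fields as ISBD text."""
    parts = []
    for code, value in subfields:
        parts.append(ISBD_PUNCTUATION.get(code, " ") + value)
    return "".join(parts)

record = [
    ("a", "Future libraries"),
    ("b", "dreams, madness, and reality"),
    ("c", "Walt Crawford and Michael Gorman"),
]
print(marc245_to_isbd(record))
# Future libraries : dreams, madness, and reality / Walt Crawford and Michael Gorman
```

Because the punctuation is keyed one-to-one to the sub-field codes, the mapping also runs in reverse: a program can parse an ISBD description back into MARC sub-fields, which is the 'automatic translation' the text refers to.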
In accordance with the theme of stumbling toward standardization, it should be noted that both MARC and the ISBD were developed initially for books and only later generalized into standards for all types of library material.
The second edition of the Anglo-American Cataloguing Rules (AACR2) is, in fact, nothing of the sort. It was politically expedient at the time to identify this new code as a revision of the previous Anglo-American Catalog[u]ing Rules (1968), but AACR2 is completely different from its predecessors in many important ways. One need only cite the facts that AACR2 is a single text (unlike its predecessors, which came in North American and British versions), is the most complete working out of the ISBD for materials of all kinds, and represents the triumph of Lubetzkyan principles, which the first AACR signally did not. Be that as it may, AACR2 quickly transcended even the historic achievement of being a unitary English-language cataloguing code to become the nearest approach to a world code we have.
In the words of the introduction to the Italian translation of AACR2:8

Le Regole di catalogazione, nella loro seconda edizione, sono il codice più diffuso nel mondo (sono state pubblicate in gran numero di lingue diverse) e l'unico che - di fatto - svolga le funzioni di codice catalografico internazionale. [The Cataloguing rules, in their second edition, are the world's most widely used code (they have been published in a great number of different languages) and the only one that, de facto, performs the functions of an international cataloguing code.]

This state of affairs is partly due, of course, to the dominance of the English language (in its various manifestations) in the modern world. It is also due, in part, to the fact that AACR2 represents the most detailed working out of the principles of author/title cataloguing set forth in the Paris principles and based on the analysis and pioneering work of Seymour Lubetzky;9 and of the application of the ISBD family of standards to all library materials.
Here we stand then, on the brink of Universal Bibliographic Control for all 'traditional' (i.e., non-electronic) materials with a universally accepted format for exchanging bibliographic data, a universally accepted standard for recording descriptive data, and a quasi-universal cataloguing code that is either in use in, or influencing the codes of, most of the countries in the world. Is there any reason in principle why we should not bring electronic documents and resources into this architecture of bibliographic control?
The answer is 'no'. Are there practical reasons why this task is formidable? The answer is 'yes'.

I have written and spoken elsewhere about the problems posed by electronic resources and the proposed 'metadata' approach to bringing them under a form of bibliographic control.10 I will try here to summarize the arguments put forward in those papers and to propose a direction that I advocate for a new age of bibliographic control.
The first issue is that of the electronic resources themselves. Some are closely analogous to print documents; this is hardly surprising, as many electronic documents are derived from print documents. Also, there is an established pattern of new technologies adopting the outward signs and structures of previous technologies: just think of radio news 'headlines' and of television 'magazines' with their 'front pages'. We even refer to elements of websites as 'pages'. Other electronic documents are quite dissimilar and, therefore, do not immediately seem to be amenable to existing bibliographic control structures.
On reflection, however, we can see that there is a commonality between documents that embraces all formats. Electronic documents have titles, dates, texts and illustrations, editions, publishers, relationships to other documents (electronic and otherwise), authors,11 contributors, and corporate bodies associated with them. We know well how to deal with each of these. The strongest support for the notion that electronic documents are exceptional comes from their evanescence and mutability. Those characteristics, which any true librarian deplores, are really the logical outcome of the history of human communication: each format produces more documents than its predecessor, and each is less durable than its predecessor. It takes a long time to make many copies of stones bearing carved messages, but those messages can be read millennia later. You can send an e-mail message from Boston to Addis Ababa in the twinkling of an eye, but that message may be expunged in a second twinkling. Many electronic documents are like those minute particles of matter that are only known because scientists can see where they have been during their micro-milliseconds of existence. Let me pose a deep philosophical question: does an e-mail message exist if it is deleted unopened?
There is another important difference between electronic documents and all the types of library material that preceded them. It centers on how electronic resources come to our notice. Let me tell you a short fable. There is an alternative universe in which there are books but no electronic documents. In that universe librarians have no control over the books that they purchase: no selection, no approval plans, and no collection development criteria. All these have been replaced by several trucks pulling up every hour, day and night, to the library's loading dock and depositing heaps of unordered and unwanted books, mostly from unheard-of publishers, vanity presses, and basement self-publishers. Some of those books might be of interest and use, but which are they, how do librarians and library users find them, and what on earth do they do with all the rest? In that alternative universe, librarianship becomes a much more random, disorganized process than anything on earth. The library would send out squads of trained personnel to root through the piles looking for worthwhile items to be catalogued and shelved.
But wait! This is an alternative universe and, having selected 100 books from the piles and fully catalogued and organized them, librari-

The only uniformly successful commercial enterprises in cyberspace are those of pornographers. Libraries as a whole have never collected commercial information or, with few exceptions, pornography.
Print-derived resources. One of the indisputably valuable sectors of the Net is composed of many documents and sites that are derived from the print industry and are dependent on the success of that industry for their very existence. These do not, by and large, present much of a technical bibliographic control problem. We know, in principle, how to catalogue different format manifestations of texts and graphic publications; extending that knowledge into cyberspace is not a massive intellectual challenge. Further, print-derived electronic resources are far less transient than their purely electronic counterparts.

Electronic journals. Most electronic journals are, of course, based on the products of a flourishing print industry. There have been many forecasts over the last decade that electronic journals will supplant print, but no one has, as yet, produced an economic model for such a major change and there are, at this time, a microscopic number of commercially viable true electronic journals. The problem is, of course, that the whole concept of a journal (serial assemblages of articles which are paid for in advance, whether they are ever read or not) seems inapplicable to the electronic age. Many problems in adapting to technology are caused by simply automating procedures or resources and not re-thinking the whole issue.

Television, that great cultural wasteland, has not been as culturally beneficent as film, but it has given rise to video artists like Nam June Paik. In the same way, there are forecasts of new breeds of creators on the Internet, including hypertext writers, digital artists, cyberpoets, and electronic musicians. When such productions belong to the same families as materials collected and catalogued by libraries (as is the case with hypertexts) they will be collected and catalogued. Other artistic productions in cyberspace will be the province of museologists, videographers, and art collectors.
Obviously, we need a more detailed analysis of the materials available on the Net and the Web than I have offered here and, crucially, we need more quantified analysis if we are to delineate the problem accurately and frame a response to it. Just as a beginning, we need to know which areas of cyberspace we are going to chart and catalogue and, by inference, which areas we are going to leave to search engines and the like. These will not be easy studies, but facts are a far better basis for planning than are the techno-boosterism and hand-waving that characterize most discussions of these topics.
If we reach a point at which we have decided which electronic documents and resources we are to bring under bibliographic control, two important questions will still remain. Which standards shall we use? How is the cataloguing to be organized?
The first question brings me to the topic of metadata. The term means 'data about data', a mostly meaningless concept that, taken literally, would embrace library cataloguing, even though metadata has been explicitly conceived as something that lacks most of the important attributes of cataloguing. The idea behind metadata is that there is some Third Way of organizing and giving access to electronic resources that is approximately half way between cataloguing (expensive and effective) and keyword searching (cheap and ineffective). Further, it is alleged that such low-level bibliographic data can be supplied by authors, Webmasters, publishers, and others lacking any knowledge of cataloguing.
It is entirely possible, since the original concept of 'metadata' did not originate among librarians, that no consideration was given to the use of 'traditional' cataloguing, and, even though librarians are now involved in the projects, the idea that electronic resources cannot be catalogued using existing standards may be firmly entrenched. Be that as it may, the fact is that electronic bibliographic entities have the same attributes as other bibliographic entities. It is perfectly possible to catalogue electronic resources in such a way that the resulting records can be fully integrated into library catalogues.
There is a recent ISBD for electronic resources14 that will form the basis of the revision of Chapter 9 of AACR2; electronic resources have titles and creators (authors) that can be used to provide standard access points, they have subjects that can be expressed in classification numbers and subject headings, and all that data can be incorporated into a MARC record. In short, if one of the justifications for the invention of metadata is that it is needed to facilitate access to electronic resources in the absence of cataloguing standards, that justification is simply wrong.
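The point that electronic resources carry the same attributes as any catalogued entity can be made concrete with a sketch. Below, a hypothetical electronic resource is described first as a minimal Dublin Core-style element set (the kind an author or Webmaster might supply) and then recast as MARC-style (tag, value) pairs. The tag choices are a deliberately simplified assumption for illustration, not the official Library of Congress Dublin Core-to-MARC crosswalk, and the resource itself is invented.

```python
# Author-supplied, Dublin Core-style description of a hypothetical resource.
dublin_core = {
    "title": "Stumbling toward standardization",
    "creator": "Gorman, Michael",
    "subject": "Cataloguing of computer network resources",
    "date": "2000",
    "identifier": "http://example.org/stumbling",  # hypothetical URL
}

# Simplified element-to-tag mapping (assumed for illustration). Note that
# author-supplied names and subjects land in *uncontrolled* fields: there
# is no authority control and no controlled vocabulary behind them.
DC_TO_MARC = {
    "title": "245",       # title statement
    "creator": "720",     # uncontrolled name
    "subject": "653",     # uncontrolled index term
    "date": "260",        # publication details
    "identifier": "856",  # electronic location and access
}

def to_marc_fields(dc):
    """Recast Dublin Core-style elements as MARC-style (tag, value) pairs."""
    return [(DC_TO_MARC[k], v) for k, v in dc.items() if k in DC_TO_MARC]

for tag, value in to_marc_fields(dublin_core):
    print(tag, value)
```

The mapping runs smoothly in this direction precisely because the attributes are the same; what it cannot do is conjure the controlled access points, classification numbers, and subject headings of a full catalogue record out of author-supplied elements, which is the gap between metadata and cataloguing that the surrounding argument describes.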
Perhaps the decision has been made, almost without thinking it through. That decision appears to be that, since 'traditional cataloguing' is too expensive, there must be a compromise, some third way, that will give the benefits of cataloguing without the effort or expense. In the words of the Introduction to the final report of the Nordic Metadata Project:15
Many specialists believe that any metadata is better than no metadata at all; we do not need to stick with the stringent quality requirements and complex formats of library catalogue systems. Instead, it is possible to live with something simple, which will be easily understandable to publishers, authors and other people involved with the publishing of electronic documents. (My emphasis.)

This is one of the few mentions in this long report of the perceived need for, and nature of, metadata as an alternative to cataloguing. It is taken for granted that there is something between 'stringent quality requirements' and no quality at all, and that there is something between 'complex formats' and almost no format at all.