Cataloging & Classification Quarterly

Volume 36, no. 2, 2003





 Sandy Roe, News Editor


Welcome to the news column.  Its purpose is to disseminate information on any aspect of cataloging and classification that may be of interest to the cataloging community.  This column is not intended solely for news items, but also serves to document discussions of interest as well as news concerning you, your research efforts, and your organization.  Please send any pertinent materials, notes, minutes, or reports to: Sandy Roe; Memorial Library; Minnesota State University, Mankato; Mankato, MN 56001-8419 (email: ; phone: 507-389-2155).  News columns will typically be available prior to publication in print from the CCQ website at


We would appreciate receiving items having to do with:


Research and Opinion 





Bibliography of Current Issues in Authority Control (covering January 2000 to July 2001). Prepared by the LITA/ALCTS CCS Authority Control in the Online Environment Interest Group


Adler, Elhanan. “Multilingual and multiscript subject access: the case of Israel.” International Cataloguing and Bibliographic Control 30(2) (April/June 2001): 32-33.


Alexander, Arden and Tracy Meehleib. “The Thesaurus for Graphic Materials: Its history, use, and future.” Cataloging & Classification Quarterly 31(3/4) (2001): 189-212.


Ayres, F. H. “Authority control simply does not work.” Cataloging & Classification Quarterly 32(2) (2001): 49-59.


Bowman, J. H. “The catalog as barrier to retrieval—Part 2: Forms of name.” Cataloging & Classification Quarterly 30(4) (2000): 51-73.


Calvo, Antonio M. “Structuring biographical data in EAD with the Nomen DTD.” OCLC Systems & Services 17(4) (2001):187-199.


Cataloging & Classification Quarterly 29(1/2) (2000) Special Issue: “The LCSH Century: One Hundred Years with the Library of Congress Subject Headings.”


Chan, Kylie, Lily H. Hu and Patrick Lo. “A collaborative project on Chinese name authority control: the HKCAN model.” Journal of East Asian Libraries 120 (Feb. 2000): 1-16.


Chan, Lois Mai. “Exploiting LCSH, LCC, and DDC to Retrieve Networked Resources: Issues and Challenges.” Accessed 2 June 2002 at


Charbonneau, Gary. “Ameritech's Horizon system: part I, cataloging and authority control.” The Serials Librarian 37(4) (2000): 91-111.


Clavel-Merrin, Genevieve. “The need for co-operation in creating and maintaining multilingual subject authority files.” International Cataloguing and Bibliographic Control 29(3) (July/Sept 2000): 43-45.


Defriez, Phil. “Thesaurus development at the Department of Health [part 1].” Catalogue & Index: Periodical of the Library Association Cataloguing and Indexing Group 140 (Summer 2001): 1-3.


Defriez, Phil. “Thesaurus Development at the Department of Health: part 2.” Catalogue & Index: Periodical of the Library Association Cataloguing and Indexing Group 141 (Autumn 2001): 7-10.


DiLauro, Tim, G. Sayeed Choudhury, Mark Patton, and James W. Warner. “Automated name authority control and enhanced searching in the Levy Collection.” D-Lib Magazine 7(4) (April 2001), accessed 1 June 2002 at


Figueroa-Servin, Reynaldo D., and Berta Enciso. “Subject authority control at El Colegio de Mexico's library: the whats and hows of a project.” Cataloging & Classification Quarterly 32(1) (2001): 65-80.


French, James C. and Allison L. Powell. “Using clustering strategies for creating authority files.” Journal of the American Society for Information Science 51(8) (2000): 774-786.


Graham, Margaret E. “The Cataloguing and Indexing of Images: Time for a New Paradigm?” Art Libraries Journal 26(1) (2001): 22-27.


Greenberg, Jane. “Automatic query expansion via lexical-semantic relationships.” Journal of the American Society for Information Science and Technology 52(5) (2001): 402-415.


Hearn, Stephen Scott. “Machine-assisted validation of LC subject headings: implications for authority file structure.” Cataloging & Classification Quarterly 29(1/2) (2000): 107-115.


Heery, Rachel, Leona Carpenter, and Michael Day. “Renardus Project developments and the Wider Digital Library context.” D-Lib Magazine 7(4) (April 2001), accessed 1 June 2002 at


Hu, Jiajian. “Transactional analysis: problems in cataloging Chinese names.” Illinois Libraries 82(4) (Fall 2000): 251-260.


IFLA Section on Cataloguing, Working Group on the Revision of FSCH. “Structures of Corporate Name Headings.” Final Report - November 2000. Compiled and introduced by Ton Heijligers. Accessed 5 December 2002 at


Kasten, Eberhard. “Artists’ names--dates--standards and the Allgemeines Künstlerlexikon project: General Artists’ Directory database; translated version of a paper presented at the German Libraries Conference, May 1999, Freiburg.” Art Libraries Journal 26(1) (2001): 28-32.


Lam, Vinh-The. “Outsourcing authority control: Experience of the University of Saskatchewan Libraries.” Cataloging & Classification Quarterly 32(4) (2001): 53-69.


Landry, Patrice. “The MACS Project: Multilingual Access to Subjects (LCSH, RAMEAU, SWD).” International Cataloguing and Bibliographic Control 30(3) (July/Sept 2001): 46-49.


Lanius, Lorraine. “Why does a library catalog need authority control?” The Unabashed Librarian 115 (2000): 24.


Long, Chris Evin. “Improving subject searching in Web-based OPACs: Evaluation of the problem and guidelines for design.” Journal of Internet Cataloging 2(3/4) (2000): 159-186.


MacEwan, Andrew. “Crossing language barriers in Europe: Linking LCSH to other subject heading languages at the national libraries of France, Germany, Switzerland, and Great Britain.” Cataloging & Classification Quarterly 29(1/2) (2000): 199-207.


McBride, Jerry L. “Faceted subject access for music through USMARC: A case for linked fields.” Cataloging & Classification Quarterly 31(1) (2001): 15-30.


Mann, Thomas. “Is Precoordination Unnecessary in LCSH? Are Web Sites More Important to Catalog than Books? A Reference Librarian's Thoughts on the Future of Bibliographic Control.” Accessed 2 June 2002 at


Nicholson, Dennis and Susannah Neill. “Interoperability in subject terminologies: The HILT Project.” The New Review of Information Networking 7 (2001): 147-158.

Olson, Hope A. and John J. Boll. Subject analysis in online catalogs. 2nd ed. Englewood, CO: Libraries Unlimited, Inc., 2001.


O’Neill, Edward T., Lois Mai Chan, Eric Childress, Rebecca Dean, Lynn M. El-Hoshy, and Diane Vizine-Goetz. “Form subdivisions: Their identification and use in LCSH.” Library Resources & Technical Services 45(4) (Oct. 2001): 187-197.


Ostrove, Geraldine E. “Music subject cataloging and form/genre implementation at the Library of Congress.” Cataloging & Classification Quarterly 32(2) (2001): 91-106.


Quijano-Solis, Alvaro, Pilar Maria Moreno-Jimenex, and Reynaldo D. Figueroa-Servin. “Automated authority files of Spanish-language subject headings.” Cataloging & Classification Quarterly 29(1/2) (2000): 209-223.


Ruan, Lian. “Providing better subject access to nonprint fire emergency materials for Illinois firefighters at the Illinois Fire Service Institute.” Cataloging & Classification Quarterly 31(3/4) (2001): 213-235.


Ruiz-Perez, R. “Consequences of applying cataloguing codes for author entries to the Spanish National Library online catalogs.” Cataloging & Classification Quarterly 32(3) (2001): 31-55.


Russell, Beth M. and Jodi Lynn Spillane. “Using the Web for name authority work.” Library Resources & Technical Services 45(2) (Apr. 2001): 73-79.


Sadowska, Jadwiga. “Dwa jezyki hasel przedmiotowych: KABA and National Library thesauri.” Translated title: “Two subject heading systems.” Bibliotekarz 68(4) (2001): 13-15.


Shiri, Ali Asghar and Crawford Revie. “Thesauri on the web: Current developments and trends.” Online Information Review 24(4) (2000): 273-279.


Snyman, Marieta M. M. and Marietjie Jansen Van Rensburg. “NACO versus ISAN: prospects for name authority control.” The Electronic Library 18(1) (2000): 63-68.


Snyman, Retha M. M. “The standardisation of names to improve quality and co-operation in the development of bibliographic databases: study of the South African National Bibliography.” Libri 50(4) (Dec. 2000): 269-279.


Syren, André-Pierre. “Cartographie des hommes illustres: vers une liste d’autorité des ‘personalia’.” Translated title: “A cartography of illustrious men: towards an authority list of ‘personalia’.” Bulletin des Bibliothèques de France 45(2) (2000): 87-91.


TePaske-King, Bert and Norman Richert. “The identification of authors in the Mathematical Reviews Database.” Issues in Science & Technology Librarianship 31 (Summer 2001). Accessed 28 May 2002 at


Tillett, Barbara B. “Authority control at the international level.” Library Resources & Technical Services 44(3) (July 2000): 168-172.


Tillett, Barbara B. “Authority control on the Web.” Accessed 2 June 2002 at


Vellucci, Sherry L. “Metadata and authority control.” Library Resources & Technical Services 44(1) (January 2000): 33-43.


Vellucci, Sherry L. “Music metadata and authority control in an international context.” Notes (March 2001): 541-554.


Wake, Susannah and Dennis Nicholson. “HILT - High-Level Thesaurus Project: Building consensus for interoperable subject access across communities.” D-Lib Magazine 7(9) (September 2001), accessed 1 June 2002 at


Wang, Yewang. “A look into Chinese persons’ names in bibliography practice.” Cataloging & Classification Quarterly 31(1) (2000): 51-81.


Wheeler, William J., ed. Saving the User’s Time through Subject Access Innovation: Papers in Honor of Pauline Atherton Cochrane. Champaign, IL: Graduate School of Library and Information Science, University of Illinois at Urbana-Champaign, 2000.


Wilk, David, Shlomo Rotenberg, and Sarah Schacham. “Problems in the use of Library of Congress Subject Headings as the basis for Hebrew subject headings in the Bar-Ilan University Library.” International Cataloguing and Bibliographic Control 30(3) (July/Sept 2001): 43-46.


Yee, Martha M. “Two genre and form lists for moving image and broadcast materials: A comparison.” Cataloging & Classification Quarterly 31(3/4) (2001): 237-295.


Zhang, Sha Li. “Planning an authority control project at a medium-sized university library.” College & Research Libraries 62(5) (Sept. 2001): 395-405.





Editor’s note: Recently I received a thought-provoking description of how one institution’s cataloging staff adjusted their workflow during downtime caused by their migration to a new library system. Subsequently, I posted a call on AUTOCAT and received two more, but I suspect that a great variety of other solutions are still out there. Enjoy.


The University of Illinois at Urbana-Champaign

The University of Illinois at Urbana-Champaign Library is migrating from DRA to Voyager during the summer of 2002. One of the problems that any library faces in a similar situation is how to provide access to currently received materials during the migration. The following describes how the cataloging teams of the University Library are dealing with this problem.


From March 25, 2002 to May 6, 2002, the cataloging teams will not be able to add any new records to the current statewide database, DRA, until the new system, Voyager, becomes available. The conversion of records from DRA to Voyager is a complicated and lengthy process. What will the cataloging staff do during the so-called six-week “Gap” period? After analyzing the workflow of the cataloging teams, the team leaders have decided to continue cataloging during the Gap period, since providing the best access for our patrons is our number one goal. There is not enough room to store these materials if they remain uncataloged -- the rapid cataloging team alone expects to process around 9,000 monograph records during this period.


How can we provide access when newly cataloged records will not appear in DRA or Voyager for six weeks? To solve this problem, the Library systems office has created an Access database on the Library File Server. The Gap database is not a substitute for the online catalog, but is a way to record new materials processed by the cataloging staff during the Gap period and for the departmental librarians to have information about recently cataloged materials.


The cataloging staff create records on an input form in the Access database, entering the OCLC number, title, author, ISBN, publication information, location, call number, copy number, volume number, barcode, and added-copy information. The catalogers’ initials are also recorded. We will need to re-key these records into Voyager when it becomes available for cataloging.


The cataloging staff has been working with the Gap database for almost four weeks now and there are already nine thousand records in it. We are doing our normal cataloging work plus adding the newly created records to the Gap database. We have adapted easily to this workflow.


We considered presenting the Gap database information on the web so that the patrons would know what is available. However, we decided not to do this during the conversion process for several reasons. First, due to the tight timeline in the process, we felt there wasn’t sufficient time to create a web interface and adequately notify our patrons. Second, the Gap database does not include all the items cataloged during the Gap period. For example, original cataloging and cataloging from the special units (Slavic, Asian, Music, Map, etc.) are simply being stored in CatME files until our new system becomes available.


We believe that accessing the materials through departmental librarians is the best way to allow patrons access to the information. In addition to a campus-wide announcement and an FAQ section on our Web site, the forty-two departmental libraries of the University Library are using different strategies to let patrons know about the newly cataloged materials. Some departmental libraries are creating lists of their new books to mount on their libraries' home pages while others are preparing printed lists for their patrons. Hence, patrons can borrow newly acquired materials that they need during the Gap period, but might need assistance to find them.


Certainly the Gap database has its limitations. For example, it is increasing the workload for the cataloging teams since they must transfer bibliographic information into the Gap Database. The situation is made even more onerous when diacritics are involved. We expect the workload to be further compounded in May when we transfer the newly cataloged records into Voyager. On the whole, we believe the Gap database is a good solution to help patrons find the materials they need while allowing the cataloging staff to continue working through the Gap period.


Qiang Jin, Social Sciences Cataloger

Original Cataloging Team

University of Illinois at Urbana-Champaign



Binghamton University Libraries: Cataloging during our implementation “freeze” period

Binghamton University (SUNY) completed the migration from NOTIS to Aleph500 (Ex Libris) on January 2, 2002. For the six working weeks from November 7, 2001 to January 2, 2002, we were in a “freeze” period, unable to use either NOTIS or Aleph. Throughout the implementation we knew that we would be facing a certain amount of downtime, and we spent a good deal of time deciding how best to use it.


Shortly before the freeze we received training in OCLC’s CatME product which was scheduled to replace our OCLC Passport software. We had already decided that a good time to change from Passport to CatME would be during the freeze. Changing during this time was advantageous since having neither system available would allow us the luxury of slowly and thoroughly training the staff on CatME.


Soon after the CatME training we realized that CatME would help us limit our cataloging backlog during the freeze. We would train the staff on CatME using the approval books that came into the Library during the freeze. OCLC has a high hit rate for our major English-language approval plan, and using the approval books would give all catalogers (copy, adjusted, and original) the chance to use CatME repeatedly. CatME allows each staff member to save their work to a local bibliographic file on their own hard drive. This enabled us to catalog the approval items and save the records until our new system became available. Then we could export the records to Aleph using one batch mode for local processing and a second for updating our holdings to OCLC.


Once January came and we all returned from vacation, we moved the records we had been saving on each hard drive to our new Aleph system. Using our freeze time this way really paid off. CatME was accepted without a backward glance, primarily due to the time spent on training. Furthermore, we did not experience the downtime in cataloging that we would have had if we had waited to train until the new system was in production. Utilizing CatME this way allowed us to keep up with our approval deliveries and catalog most of the approval items that came in during the six weeks. While the order records still had to be added and the books received on the new system, we were able to move the books much more quickly through Technical Services since the cataloging was already finished and the records were in the new system. By the end of January we had completely caught up.


In retrospect, we are very pleased that we decided to use our freeze period in this way. Training was done, cataloging continued and we did not experience the frustration of having work pile up while we were between systems.


Sandy Card

Acting Assistant Director for Technical Services and Head of Cataloging

Binghamton University Libraries, Binghamton, NY


Morris Library, Southern Illinois University, Carbondale


Before a library goes live with a new ILS, there is an extended period of preparation. Data mapping for the new system is followed by data capture and then by the inevitable—and dreaded—downtime during which any cataloging done in the existing system will not be reflected in the new one. Cataloging departments face the question of whether to continue to work in the old system and re-enter everything in the new or to shut down and resume when the new system comes up. Given that catalogers have become so dependent upon technology to accomplish their work, the inability to use our computers, even for a short period of time, impacts workflow and presents a severe test of resourcefulness.


During 2002, Morris Library converted from DRA to Endeavor’s Voyager system. Although the new system was eagerly anticipated, cataloging operations were shut down for three months during the transition period. Morris Library is part of ILCSO, a statewide consortium of 45 libraries, and the conversion of bibliographic, holdings, circulation, and patron records for all the member libraries was complex and time-consuming. Other libraries might not have to deal with what we considered to be a rather lengthy “gap” period. During the months prior to conversion, regular cataloging continued while we grew busy with extensive data clean-up projects and mapping. We also had to find time to plan for downtime activities. What would staff members do? Data was captured in late March. Although a popular suggestion was to throw daily Cataloging staff parties after that, this idea was ultimately abandoned as too frivolous.


We had stopped using the MicroEnhancer in 1998 when the library migrated to DRA. Copy cataloging was being done directly in DRA, originals were cataloged on OCLC, and our holdings were loaded to OCLC by a batch process. We decided that retraining in that system was not a feasible option.


We decided to limit our cataloging in DRA to rush and reference material, as well as new serials and withdrawal processing in order to keep DRA somewhat current. This decision met with approval from the public services staff. We kept records of everything we handled during the gap period and this information was entered or updated in Voyager as soon as that system became available in June. We continued to use CatME for cataloging our theses and dissertations.  For the bulk of new materials, we chose to revert to the long-abandoned method of cataloging from printouts once DRA was frozen. After the migration to Voyager was complete, Cataloging staff and student assistants updated the new system by working from the edited printouts.


In addition to interim activities for Cataloging staff, graduate assistants and student workers also had to have jobs to do while the system was down. They continued to sort new material that came from Acquisitions and they also helped with some long overdue filing tasks and general clean up. We arranged with the Curator of Manuscripts for our students to type up inventories which could then be marked up and loaded on the library’s website. This, of course, helped both departments.


Voyager provided train-the-trainer workshops for all libraries in the consortium. We then formed a committee of five staff members with responsibility for preparing training materials for local use. In the final month of system downtime, we conducted a series of training sessions and then all Cataloging staff members and continuing student workers spent a large portion of the remaining weeks working in the training database.


Lengthy system downtimes are a serious concern for catalogers. Collection growth statistics are adversely affected during the time it takes to implement a new system, to train staff, and to become familiar with new procedures. This, of course, makes administrators unhappy. Although everyone acknowledges that there is an explanation for the dip in numbers, it still rankles. It is best to be prepared with some form of cataloging activity and other projects during downtimes to assure the administration that no one is idle.


Daren Callahan, Head, Cataloging Department

Katia Roberto, Special Collection Cataloger

Morris Library, Southern Illinois University, Carbondale, IL


Guide to Cataloging DVDs Using AACR2r Chapters 7 and 9


The OnLine Audiovisual Catalogers, Inc. Cataloging Policy Committee DVD Task Force is happy to announce the availability of their primer for cataloging DVDs using AACR2 Chapters 7 and 9. The primer appears in both HTML and PDF formats, and can be accessed at the following URL: Task force members were Francie Mrkich, New York University; Nancy Olson, Minnesota State University at Mankato (Ret.); Sueyoung Park-Primiano, New York University; Scott Piepenburg, Infotrieve Library Services; Verna Urbanski, University of North Florida; and Robert Freeborn (Chair), Pennsylvania State University.


Robert B. Freeborn, Music/AV Cataloger

Pennsylvania State University Libraries


National Library of Medicine Classification, 2002 Edition


The National Library of Medicine is pleased to announce the availability of the 2002 edition of the National Library of Medicine Classification. Beginning with 2002, the NLM Classification is being published annually in electronic form only. Publication of printed editions ceased with the 5th revised edition, 1999.


The Index to the NLM Classification consists primarily of MeSH® concepts used in cataloging. It includes concepts first appearing in the latest edition of MeSH and other older concepts as warranted by literature cataloged. Schedule numbers are added or revised to reflect changes in the biomedical and related sciences.


The new online environment offers many advantages over print, including hyperlinks between terms and the MeSH Browser and class numbers; however, the biggest improvement will be in NLM's ability to keep the Classification current with changes in MeSH.


Christa F.B. Hoffmann, Head,

Cataloging Section, Technical Services Division

National Library of Medicine, Bethesda, MD


Library of Congress Authority Records now available online


Library of Congress authority records are now available online on the Library's Web site at Known as Library of Congress Authorities, the free online service allows users to search, display and download authority records in the MARC 21 format for use in local library systems.

An authority record is a tool used by librarians to establish forms of names (for persons, places, meetings and organizations), titles and subjects used on bibliographic records. Authority records enable librarians to provide uniform access to materials in library catalogs and to provide clear identification of authors and subject headings. Authority records also provide cross references to lead users to the headings used in library catalogs.


The Library of Congress database contains more than 5.5 million authority records. Through the Library of Congress Authorities service, users have access to these records, including 3.8 million personal, 900,000 corporate, 120,000 meeting, and 90,000 geographic name authority records; 265,000 subject authority records; 350,000 series and uniform title authority records; and 340,000 name/title authority records.


The Library is currently working with Endeavor Information Systems to provide access via Z39.50 (an international standard for information retrieval) and other features such as the full MARC 21 character set for display and download of authority data and access to the approximately 2,300 subject subdivision records in the Library of Congress Subject Headings.


This new service was made available on a trial basis on July 1. During the trial period, the Library sought feedback from users worldwide to assist in evaluating the service. User response was overwhelmingly positive. Based on their input, the Library has made improvements to Library of Congress Authorities and decided to offer this free service on a permanent basis.


The Library welcomes comments from users, which should be sent via e-mail to


A PowerPoint presentation on Library of Congress Authorities is available at


Reports from the OnLine Audiovisual Catalogers (OLAC) Conference held in St. Paul, Minnesota, September 27-29, 2002


Media Cataloguer’s Long Journey to the Twenty-First Century: A Keynote Presentation by Jean Weihs

Reported by Verna Urbanski, University of North Florida


When Jean started out in the profession, librarians were mostly single women. Single because, like many other professions in the 1950s, once it became known that a female librarian was married, it was expected that she would want to focus solely on her family. She was expected to surrender her career, or, if continuing to work, she was to be paid less than before because (after all) she had her husband’s income to depend on! These comments set the stage for an enjoyable and informative presentation by one of our best media librarians. Jean’s presentation described the world where she began her journey of discovery. Men were quickly promoted on the job. The public perceived no difference between librarians and library workers – they all just checked out books, right?


In 1966, Jean became a media cataloger for a school board in the Toronto area. In 1967, she began to catalog AV materials, only to discover that there were no definitive written guidelines for cataloging AV. The Anglo-American Cataloging Rules, published in 1967, documented different routines for different media. In later years, C. Sumner Spalding, general editor of the 1967 rules, revealed to Jean that he had locked himself away in his office at the Library of Congress and written Part III of the 1967 AACR on his own in just two weeks. Part III was based on three sets of separate rules used at the Library of Congress for different categories of materials. Part III was “used by few libraries and condemned by many!” Because the Library of Congress was not permitted to acquire kits, LC cataloged the visual item as the dominant medium, e.g., filmstrips, with accompanying materials.


After canvassing her fellow catalogers in hopes of finding some universal methodology at work, Jean found her worst fears confirmed. There were no existing standards for the consistent description of various media materials. There were a variety of local circulating systems with various treatments for different materials, and there were storeroom collections that were uncataloged and, for the most part, unknown and unusable. Extensive consultation with cataloging colleagues made it obvious that they were all in a similar situation. After much trial and error, in 1970 Jean and two colleagues produced a preliminary edition of their now standard text, Nonbook Materials: The Organization of Integrated Collections. This book served as a rallying point to announce the relevance of AV materials in collections and as a focal point for discussions on how to catalog media materials. It introduced the concept of a single record for different formats. The 1973 edition included the first occurrence of the concept of entry under performer. It was primarily through Jean’s leadership and the success of her book that the treatment of media materials became a serious issue in the second edition of AACR and its refinement, AACR2R. With the emergence of the notion that it was good to catalog from the item in hand, the description of media materials began to mature. Today, following the initiatives begun at the 1997 Toronto conference on the principles and future of AACR, media cataloging stands at another turning point. The addition of electronic and digital formats brings new considerations to the forefront.


When a person has a long involvement in a specialty, it is easy to see the same controversies emerge periodically. As colleagues leave the profession or move to other assignments, cataloging issues fade, re-emerge, and are recast. Jean concluded her remarks by emphasizing the importance of developing and adhering to standards. She urged catalogers to resist the temptation to catalog to suit their own patrons and instead to adhere to national standards while speaking out on the problems that they see. Committees and policy-making groups need more than ever to hear from the cataloger on the street. Even though it may be intimidating, it is important to put yourself and your views forward, especially for those working in small libraries. It is all too easy for national rules and policy to be set only in terms of what works in large organizations and libraries. Jean commended OLAC for its excellent work in the field and declared it to be the best library organization available. Jean’s career is a testament to the importance of coming to agreement, nurturing consensus, and standing up to be counted.


IMAGES: A Metadata Sharing Initiative at the University of Minnesota, A plenary presentation by Charles Thomas

Reported by Verna Urbanski, University of North Florida


Chuck Thomas was the second plenary speaker of the conference and described the building of a metadata sharing community at the University of Minnesota. On a campus as large as the University of Minnesota, many collections of data have accumulated without any coordination. Departments and researchers had tried different approaches to storing their computerized information. This led to a lack of documentation about the data and made interoperability nearly impossible. Databases full of interesting and useful information existed in isolation, with heterogeneous content under varying degrees of content control. Since the audiences for the data varied, its organization and presentation also varied. There was a serious need for someone to apply consistent principles of design and to form a metadata sharing community -- one that provides functionality, support, and content control, enabling information and resource discovery in a distributed environment.


In a decentralized environment there are little silos of information that are underutilized if not shared with a larger audience. IMAGES (Images Metadata Aggregator for Enhanced Searching) is intended to link sources of information while providing a consistent, easy to understand format. Since much of this information resides outside the library, the approach and methodologies employed need to be universally appealing to non-librarians, while still fulfilling the rigorous descriptive traditions that make libraries such a success in information handling. Otherwise, there is a real danger that departments will not cooperate with exposing their data to a larger audience.


Since the project is intended to provide a larger audience for isolated but desirable stores of information, one of the main challenges of the IMAGES project is to discover existing information and persuade the current holders to participate in the project. For digital collections to be sustainable, staff need multiple skills and the latest technology. This often is not available out in the departments. Even when holders of the information recognize that they do not have the resources to provide the data the exposure it deserves, it is hard for them to overcome their fear of loss of control.


The IMAGES initiative works to find solutions to this information sharing dilemma. IMAGES staff determine the scope of the information and try to anticipate the needs of its users while serving the dual purpose of both delivery and management. Some of the success of the initiative comes down to personal negotiations. Data must be massaged. IMAGES staff work with academic department staff to train them in the use of record editing software and then continue to serve as consultants to the department when needed. All of these activities lay the foundation for future cooperation across the campus. There is much to be done to promote the IMAGES initiative, especially outreach efforts and training programs to help faculty learn how to build a sustainable collection. As with any complex undertaking, there are unresolved issues, the most important of which is a new role for libraries in hosting this data and training department staff to produce the descriptions. Libraries must also discover how best to integrate this new type of diverse information with their traditional (and sometimes not so traditional) resources.


Advanced Realia Workshop, presented by Bobby Ferguson, East Baton Rouge Parish Library

Reported by Ian Fairclough, Marion (Ohio) Public Library


This workshop was definitely "hands-on" and those present had a challenging array of objects to catalog. These included: a quilted angel, on a heart-shaped base which became the angel's wings (a wig and halo were missing); a corn husk doll made in 1902 as a toy; a lead sinker; a piece of petrified wood; a sand dollar; a musk turtle shell; a fossil of a small minnow embedded in a rock; a hand-cast bronze wombat from Australia; a unicorn, bearing the words "solid brass" on the bottom; a lotus pod with seeds inside; a hand-carved jade dagger (a replica of an ancient sword); an amber ring, embedded in which was an eight-legged bug; the tooth of an unidentified animal from New Guinea; a wasp's nest; a pill maker, consisting of two wooden parts that are rubbed together; a ball-point pen from Kazakhstan, in the shape of a cultural dress; and a necklace, of hand-made sterling silver with malachite inlay.


One object for cataloging was provided by this reporter, a British 50-pence coin, which, in addition to the distinctive rounded heptagonal form, also bears an end-on picture of an open book with the words, "Public Libraries 1850-2000". This item differed from all the others in that it actually bears a title, statement of responsibility, and publication date! An OCLC master record (#50767237) has been prepared for it.


Another item was represented by a "surrogate" photograph--a lion's wig from the movie, The Wizard of Oz. In addition to these items, a copy of OCLC's workform for realia was distributed, along with numerous examples of bibliographic records, some of which may have been duplicates. Since description of realia is dependent so greatly on cataloger-provided description, identification of duplicates is a fine art.


After discussion of the items and of the terms "realia," "artifact," and "replica," the participants worked in small groups to provide draft bibliographic records. Typical of data elements in this format are: the absence of a publication statement, each item usually being one of its kind and therefore unpublished; the need for brackets in almost all cases to indicate that a title was provided by the cataloger; and the importance of providing subject access (if possible, multiple subject headings are recommended) in the absence of other access points.
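To make those conventions concrete, here is an illustrative record fragment for one of the workshop objects, sketched as simple (tag, indicators, content) tuples. The title, physical details, and subject headings are hypothetical and the subfield coding is only approximate; the sketch is meant to show the bracketed supplied title, the absent publication statement, and the multiple subject headings, not to be a finished OCLC record.

```python
# Illustrative realia record fragment for one of the workshop objects.
# Tags are MARC; the values and subfield coding are hypothetical.
realia_record = [
    ("245", "00", "[Corn husk doll] $h [realia]"),  # supplied title in brackets
    # no 260: a one-of-a-kind object is considered unpublished
    ("300", "  ", "1 doll : $b corn husk ; $c 18 cm."),
    ("500", "  ", "Title supplied by cataloger."),
    ("650", " 0", "Corn husk craft."),              # multiple subject headings
    ("650", " 0", "Dolls."),                        # compensate for few access points
]

for tag, indicators, content in realia_record:
    print(tag, indicators, content)
```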


A basic principle for cataloging realia is: "If you know it, put it." This is because, unlike other media, very little information is usually available for description. In conclusion, participants discussed the records they had created, giving opportunity for further input from others present.


Cataloging Digital Sound Files: AACR2 Chapters 6 & 9, presented by Robert Freeborn, Pennsylvania State University

Reported by Gayle Porter, Purdue University Libraries


Robert Freeborn began by defining and offering examples of various types of sound files, including MP3, AAC, RealAudio, and WMA. Freeborn said that MP3 and other digital sound files are popular because they can be compressed, making them easier to send for streaming purposes. He gave examples of various types of MP3 players and explained the purpose and the process of file compression. Other concepts defined included ID3 tags (the sound file’s catalog record); digital automated music (DAM) CDs; pocket or mini PCs, which are similar to a Palm Pilot but run Microsoft software; and “enhanced” CDs, audio CDs that also contain computer-readable content.


Freeborn provided a list of resources about the topic, including sources for direct-access and remote-access files, the web citation for MLA’s Copyright for Music Librarians <> and the citation for Scott Hacker’s book on MP3. He referred to the Variations project, which includes the addition of notes for streaming audio files. He advised librarians to establish a collection development policy for sound files.


Freeborn explained a number of rules and provided interpretations from both Chapters 6 and 9 in coordination with their respective MARC tags. Freeborn advised us to follow standards and to be consistent in our local catalogs. Using nonstandard practices might come back to haunt us later when migrating to another system, and the migration process could hinder the local enhancement of records.


Several examples of cataloging records for both direct-access and remote-access sound recordings were shown. One such example described a remote-access non-sound recording. Freeborn explained that the library bought access to this title, and users were allowed to download it to their own or the Library’s equipment. The 538 field for this item described the type of equipment needed to use the item. A possible local note in the record might be: “Access provided by Library’s MP3 player.”


Freeborn said that people could access files that are available in more than one format and download them from the website. Appropriate notes should be made about this in the catalog record. The Kalamazoo Public Library has an agreement with to display records in their catalog and to circulate MP3 players. Robert Freeborn’s detailed presentation was assisted by his handout that paralleled his PowerPoint presentation on the topic.


Cataloging with AMIM, presented by Jane D. Johnson, UCLA Film and Television Archive

Reported by Mary Huismann, University of Minnesota


Jane Johnson provided a most interesting overview of cataloging with AMIM, which is the acronym for Archival Moving Image Materials: A Cataloging Manual, 2nd ed. (Washington D.C. : Library of Congress, 2000). AMIM was first published in 1984, before the VHS era, by a committee of the Library of Congress Motion Picture, Broadcasting, and Recorded Sound Division staff. It is available from CDS (and also on Cataloger’s Desktop). Updates can be found at the Cataloging Policy and Support Office Website: <>.


AMIM can be compared to other cataloging manuals for specialized materials, such as Descriptive Cataloging of Rare Books and Archives, Personal Papers, and Manuscripts. The rules in AMIM are laid out similarly to AACR2. In addition, AMIM contains six appendices, a glossary, a bibliography, and an index. AMIM refers to AACR2 for many things, including punctuation, abbreviations, GMDs, series, and sound recordings.


AMIM is used by a variety of constituencies, including moving image archives, libraries with archival collections, and archives cataloging current commercial releases.  It works within the general framework of AACR2 Chapter 7. AMIM can be used with all types of moving image materials in any physical format. 


Johnson reviewed a few basic cataloging principles, including Cutter’s objects of the catalog (particularly the finding and collocating functions). A brief overview of the Functional Requirements for Bibliographic Records (FRBR) was given with examples.


Preservation is a priority with AMIM. The inclusion of multiple manifestations (“copies”) on a single record facilitates comparison. There is also an emphasis on provenance, history of works, and the relationship between expressions of a work. Since the description is based on the original expression of a work, the cataloger must be prepared to do research.


AMIM’s strengths include: expression-based cataloging; entry of television programs; detailed rules for series titles (including television); supplied titles; breaking conflicts using uniform titles; guidance on outtakes, trailers, etc., and on the choice and placement of statements of responsibility; what constitutes a new version; notes; special rules for release and broadcast; and detailed physical descriptions. It also addresses ambiguous terms specific to moving images.


Expression-based cataloging, a strength of AMIM, is also a complication. The description of an item is based on the original release, not the item in hand, necessitating research by the cataloger. A change in title alone does not warrant a new record (only a change in content does), and the title on the item is not necessarily recorded in the title area. Examples of problematic re-release and reissue titles were given. Other complications include uniform title rules that do not always lead to a logical index display, the non-use of parallel titles, and the fact that the use of AMIM can be a barrier to shared cataloging. The underlying assumption of AMIM is that the title in hand is unique and that the agency holds the original.


The workshop concluded with a brief cataloging exercise, using AMIM. A handout of the presentation slides and a handout summarizing cataloging principles, differences between AMIM1 and AMIM2, and a list of AMIM features were distributed to participants.


Creating Annotations for Nonbook Materials, presented by Donald Clay Johnson, Ames Library of South Asia

Reported by Jeannette Ho, Texas A&M University


In this workshop, Donald Clay Johnson led the participants in a discussion about the process of creating annotations for nonbook materials. The discussion focused primarily on websites. He began by introducing his role as the South Asia Specialist at the Digital Asia Library <> where he edits annotations for websites to be added to its collection. He then asked participants to critique examples of annotations for websites and books contained on two handouts. At the end of the workshop, he led the participants in creating annotations based on web page printouts contained in a third handout.


It was agreed that annotations need to be succinct, factual, and objective in order to be included as summary notes in catalog records. In addition, annotations should describe resources in enough detail to help readers make informed decisions about whether to use them. In particular, workshop participants noted that the first annotation for an organization’s website merely described the organization without describing what was on the actual site. According to Johnson, an effective annotation for an agency’s website should do the following: allow readers to rapidly identify the type of organization being described; describe what readers will find once they link to the website, including its features; and give them a sense of why they would want to see it.


Johnson stressed that creating effective annotations is an art, not a science. Catalogers should keep the questions, “Who, what, when, where, and why” in mind, even though not all of them may apply to a particular resource. Catalogers should also seek to bring out information that is unique to a geographic region.


Participants discussed how annotations should avoid the use of emotion-based words that express value judgments (e.g., “stupendous,” “horrendous,” etc.) and avoid quoting directly from websites and book jackets, since these sources tend to use excessive emotional language to promote the resources they describe. In addition, they may not present accurate information. For instance, Johnson commented that catalogers should be alert to information on websites that may seem illogical.


Workshop participants also considered ways to make annotations succinct. It was agreed that long, wordy annotations would only increase the number of false hits during keyword searches. Annotations for websites should only report information that is expected to remain stable and omit details that may change over time (e.g., number of publications, number of departments of an agency, etc.). As the content of websites is continually changing, an annotation describes a site only at a single point in time. This fact makes it extremely important to include the date the cataloger viewed the resource in the catalog record. In addition, annotations for fictional works should indicate that they are fiction, but should not contain so much detail as to give away the entire plot.


The pros and cons of including important terms not represented by controlled vocabulary were also discussed. Participants were cautioned to use judgment when deciding to include special terms in annotations, and to consider whether they truly help users or cause them to have greater difficulty with keyword searches. For example, one annotation on the second handout contained the word “Dalit” (an “untouchable” person in Indian society). Participants discussed whether including this term in an annotation for a fictional work would help users looking for novels about Dalits, or merely increase the number of irrelevant results for researchers conducting keyword searches for scholarly materials on this topic. According to Johnson, users are easily overwhelmed by vast numbers of irrelevant results during keyword searches, and often have difficulty evaluating which resources would meet their needs. Thus, it is important to consider one’s audience when deciding whether to include such terms in summary notes.


Finally, Johnson led the audience in creating annotations for two websites on his third handout-- an overview of the “Chipko Movement” in India, and the “about” page and contents pages for the Digital South Asia Library. Each annotation created during the workshop consisted of a brief description of the subject of the particular site, which included important key terms (names of geographic areas and agencies), and a list of features and resources to be found on the websites. Broad and redundant terms (“environment,” and “online”) were avoided, as well as the term “full text,” since libraries often use full text content as a selection criterion. While Johnson initially used complete sentences, workshop participants recommended using phrases.


Overall, I found this workshop to be interesting and informative. It provided an excellent opportunity for librarians to share their insights, observations and strategies for writing effective annotations. In particular, as the instructor was not a cataloger, it was interesting to learn how annotators at the Digital Asia Library approached this task differently from library catalogers. In this way, the workshop provided an opportunity for both sides to learn from each other.


Electronic Resources Workshop, presented by Steve Miller, University of Wisconsin—Milwaukee Libraries

Reported by Kelley McGrath, Ball State University


Steve Miller provided an excellent overview of the 2001 and 2002 changes to AACR2 and how they affect electronic resource cataloging. He began by summarizing the 2001 changes to Chapter 9, such as rule 0.24’s requirement to describe all aspects of an item, the new GMD “electronic resource,” and the option to use conventional terminology in the 300 field.


The bulk of the presentation focused on the much-anticipated new AACR2 Chapter 12 and integrating resources. Miller provided tentative examples of best practices, but warned that the practical details of implementation are still being worked out and said that catalogers need to stay tuned for more definitive versions.


The new category of integrating resources applies only to remote-access electronic resources, so the focus of the discussion was on websites, most of which were previously considered monographic. Some websites will still be considered monographs and e-journals will continue to be treated as serials, but the majority of websites will now fall into the new category of “integrating resources.” Miller provided a useful chart showing the relationships between serials, monographs, and integrating resources. He covered both the rules for creating a new record for an integrating resource and the rules for updating an existing record for an integrating resource when information has changed. Since cataloging departments are unlikely to have the resources to regularly revisit sites to check for changes, updates will most likely take place during the copy cataloging process or as reference librarians, patrons, or link checkers note a change. It was also mentioned that one way to find sites that were cataloged under an old name is to search by URL. Some participants expressed concern about who would be able to update information about integrating resources on bibliographic utilities, OCLC in particular.


Miller also introduced the fixed fields and other MARC fields traditionally associated with serials cataloging (such as 362), which will be used for integrating resource records, but are currently unfamiliar and intimidating for many catalogers with monographic backgrounds. He described some of the problems associated with establishing dates for electronic resources and provided examples of ways of handling common situations, but also stated that further guidance is needed in this area. Since the BLvl fixed field value “i” for integrating resources is not yet available, catalogers should continue to use BLvl “m” for integrating resources until it is authorized. There was also discussion of how many and what kinds of changes make an integrating resource a new work, which would require a new record. There seems to be no definitive answer to this question. Miller also distributed some useful handouts on MARC coding for electronic resources and some practice exercises. This was an informative and timely introduction to important recent changes in electronic resource cataloging.


Graphic Materials, presented by Nancy B. Olson

Reported by Michaela Brenner, Carleton College


Nancy Olson promised that, in these two hours, we would learn all there was to know about cataloging graphic materials, but before we began, we had the pleasure of being among the first to see the brand new AACR2.


The new AACR2 has several sensible improvements, like the new three-hole binding and the promise that amendments can be added easily without overlapping pages. Ms. Olson brought to our attention a couple of well-hidden but significant changes. The former rule 1.4D4 was deleted, and 1.1B1 now instructs catalogers to exclude introductory words and go straight to the title, which had already been the practice for Chapter 7. After “reading it several times” and after “reading all related LCRIs”, Ms. Olson was still rather displeased with the new Chapter 12, which is somewhat overwhelming, and said: “If you don’t understand it, maybe it will make you feel better that I didn’t understand it either, and I have to teach it.”


Chapter 8, graphic materials, includes a very wide range of two-dimensional material. There were no changes made to Chapter 8 at this point. The most important source for guidance is Graphic Materials: Rules for Describing Original Items and Historical Collections by Elisabeth Betz Parker. Unfortunately, this book is out of print, but the current edition is available online at:



Using several example posters, Ms. Olson led us in detail through the fixed and variable fields. Often, there is not much information available regarding title, publisher, and artist. Another difficulty is assigning a GMD that actually makes sense. The only GMD available for a poster, for example, is [picture]. Ms. Olson cautioned us not to use homemade GMDs, but rather to leave them out if doing so would avoid confusion. To make up for the lack of GMDs, there is a long list of SMDs. Notes are also a great way to add all the extra information the user may need.


Ms. Olson then briefly talked about digitizing an art collection and pointed out that setting up guidelines was the most important step. Generally, the original item is cataloged, and fields for electronic resources are added. The pattern is similar to microform cataloging.


There were not many questions left at the end of this clear and detailed presentation. Even for a beginner, it was very easy to follow. Ms. Olson had kept her initial promise.


Map Cataloging Workshop, presented by Mary Lynette Larsgaard, University of California, Santa Barbara

Reported by Allison M. Sleeman, University of Virginia


Ms. Larsgaard began the workshop by distributing a map and an opaque scalefinder (natural scale indicator--a necessary tool for map catalogers) and had participants open their maps, thereby demonstrating the importance of having a large space to catalog maps. She provided some helpful information about where to order materials and locate information useful to map catalogers. This included the address from which to order the plastic natural scale indicators, which we had received courtesy of the vendor Map Link. Natural scale indicators can be ordered from: Charles Conway, Department of Geography, Memorial University of Newfoundland, St. John’s, Newfoundland, A1B 3X9, Canada.


Some useful URLs for map catalogers include: Western Association of Map Libraries’ Map Librarian’s Toolbox; Date Codes for Maps, compiled by Phil Hoehn; DMS Converter (which converts geographic coordinates from degrees, minutes, and seconds to decimal degrees); and the MARC21 description of the 342 field.


Ms. Larsgaard was very familiar with the changes in Chapter 3, having been active in collecting and submitting rule change proposals. There are three basic kinds of changes in Chapter 3: changes necessitated by cartographic materials in electronic form; miscellaneous changes to rules to reflect cataloging practice; and editorial changes. In this workshop she emphasized substantive changes and concentrated on the parts of the bibliographic record that were different for cartographic materials.


Most of the substantive changes occur in Area 3 (the 255 field, mathematical data including scale) and in Area 5 (the 300 field, physical description). Handouts were provided covering the specifics of these fields. Highlights of major changes in Area 3 are: those dealing with scale; those dealing with coordinates; and the raster/vector data and file types that can now be given in field 352 (3.3F, a new rule).


Highlights of changes in Area 5 are: a new list of terms that can be used as specific material designations in the 300 field: atlas, diagram, globe, map, model, profile, remote-sensing image, section, view (3.5B1); clarification of the terms map and sheet (3.5B2); and a list and description of elements that can be used in 300 subfield b (3.5C1-3.5C8).


She then gave some general information and advice about cataloging maps, including three alternatives for cataloging a sheet containing two maps of the same size: 1) use two separate records; 2) consider both maps to be main maps, and put both titles in the 245 field; or 3) decide that one map is the main map, and describe the other map as being on the verso (Larsgaard’s preferred method).


Latitude and longitude were also discussed. Lines of latitude are always parallel; lines of longitude are not, since they converge toward the poles. This was demonstrated on an inflatable globe. When a cataloger is listing the coordinates, it is easiest to remember left then right for longitude; top then bottom for latitude. Ms. Larsgaard most often uses degrees of latitude and statute miles when working with a natural scale indicator. She instructed the class in how to determine the scale of a map using latitude with the scale indicator. The most accurate scale is at the center of the map.
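The latitude method she demonstrated reduces to simple arithmetic: one degree of latitude covers roughly 69 statute miles (about 111 km) on the ground, so dividing that ground distance by the distance the degree occupies on the map gives the scale denominator. The sketch below uses the approximate 111 km figure and invented measurements:

```python
# Estimating a map's natural scale from latitude.
# One degree of latitude spans roughly 111 km (about 69 statute miles)
# on the ground; the exact figure varies slightly with latitude.
KM_PER_DEGREE_LAT = 111.0

def natural_scale(degrees_measured: float, map_cm: float) -> int:
    """Scale denominator (1:n), given how many centimeters on the map
    correspond to the measured span of latitude."""
    ground_cm = degrees_measured * KM_PER_DEGREE_LAT * 100_000  # km -> cm
    return round(ground_cm / map_cm)

# If one degree of latitude measures 5.55 cm near the center of the map:
print(natural_scale(1.0, 5.55))  # 2000000, i.e. a 1:2,000,000 map
```

Measuring near the center of the sheet, as Ms. Larsgaard advised, keeps projection distortion to a minimum.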


The 342 and 343 fields, described in MARC21, are generally used for electronic cartographic materials. They are necessary for letting users know what kind of software is needed. Put this information in the catalog record if it is clearly stated such as in the readme file.


The new Chapter 3 is an improvement over the “old” Chapter 3, reflecting what map catalogers have been doing for the last 20 years.


In closing, Ms. Larsgaard announced that the second edition of Cartographic Materials: A Manual of Interpretation for AACR2, prepared by the Anglo-American Cataloguing Committee for Cartographic Materials, will be published in loose-leaf format by ALA Publications during 2003. The 1982 edition has been out of print for quite some time.


Videorecordings Cataloging Workshop, presented by Jay Weitz, OCLC

Reported by Joan Colquhoun McGorman, Southeastern Baptist Theological Seminary


This was a practical workshop for catalogers who already had knowledge of the basics of visual materials cataloging using AACR2 and the MARC format. Jay Weitz briefly reviewed the background of the rules. Although AACR1 required that motion pictures be entered under title, the rules concerning works of shared responsibility in AACR2 usually lead to the same result.


Certain problem aspects of cataloging videorecordings were discussed in some detail.


There was considerable discussion about variations in information presented in the sources of information for videorecordings and how this has led to multiple records in OCLC for works that are most likely the same bibliographic entity. Catalogers must be alert to differences among information from title frames of the video, the container and labels. The chief source is still the title frame sequence on the film. If any other source--such as the container or label--is used instead, that source must be specified in a note. Differences in titles in the various sources should be recorded and given access using the 246 field.


Names of publishers, distributors, producers, production companies, etc. can often be very confusing. Depending on the cataloger’s interpretation of the information, the same names might appear in field 245 $c in one record and in 260 $b in another. Different catalogers can interpret the same information differently. Even the same cataloger can interpret the same information differently at different times.


Differences that justify a new record include: black and white vs. color vs. colorized, sound vs. silent, dubbed vs. subtitled, different language versions, different formats (VHS vs. Beta vs. DVD, and NTSC vs. PAL color reproduction standards), and significant differences in length which might be due to differences in content, such as the inclusion of restored scenes.


Differences that do not justify a new record include: absence or presence of multiple publishers, distributors, etc., as long as one on the item matches one on the record, and changes in date which relate only to the packaging.


Confusion over interpretation of such differences has resulted in OCLC having many records that are probably duplicates. When in doubt, catalogers should use an existing record whenever possible, and edit it for local use.


Having some knowledge of the history of various formats of videorecordings can help catalogers avoid errors in recording dates. Regardless of when the filming was done, the publication date cannot precede the introduction of the format. Therefore, it is useful to remember that Beta began in May 1975, VHS in September 1977, and DVD in March 1997.


Although some experienced video catalogers seem concerned about cataloging DVDs, Weitz stressed that the same rules and procedures are used for cataloging DVDs as for any other video material.


He gave some guidelines to follow for DVDs. The GMD is [videorecording]. The SMD to be used in the 300 field is videodisc with the size (4 ¾ in.) given in subfield c. Use the System Details note (538) to record “DVD” plus any applicable special characteristics of sound, color, etc. This is also the field to note information about regional restrictions, which are indicated on the DVD by a picture of a globe with a number superimposed on it (1 means the United States and Canada).
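The region numbers inside the globe icon follow a standard industry assignment, which a cataloger might keep at hand as a simple lookup; the note-building helper below is a hypothetical convenience, not OCLC or LC practice:

```python
# Standard DVD region assignments (the number shown inside the globe icon).
# Regions 7 and 8 are rarely encountered: 7 is reserved, 8 covers
# international venues such as aircraft and cruise ships.
DVD_REGIONS = {
    1: "United States and Canada",
    2: "Europe, Japan, the Middle East, and South Africa",
    3: "Southeast Asia",
    4: "Latin America and Oceania",
    5: "Africa, Russia, and the Indian subcontinent",
    6: "China",
}

def region_note(code: int) -> str:
    """Compose a 538-style note fragment for a region-coded disc."""
    return f"DVD, region {code} ({DVD_REGIONS[code]})."

print(region_note(1))  # DVD, region 1 (United States and Canada).
```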


Since DVDs have such an immense capacity, they often have substantially more material (trailers, documentary material, outtakes) than comparable VHS releases and should be given Date Type: s. A note should be included about the date of the original release. DVDs often have various optional sets of subtitles and closed captions, which should be recorded in fields 546 and 041.


The last part of the workshop was devoted to streaming video. This is an Internet data transfer technique that allows the user to see and hear audio and video files without lengthy download times. The “host” or source “streams” small packets of information over the Internet to the user. Few catalogers have experience with streaming video yet. Cataloging this format will require catalogers to use rules and MARC format fields for both videorecordings and computer files. Assistance with learning terminology for this new format can be found at the website:



The following are some guidelines for streaming videos. Use Form s for electronic. Use field 006 for computer file. Use field 007, which is repeatable, for videorecording and for remote access computer file. The GMD is [electronic resource]. Field 300 is not used. Include a 500 note “Streaming video” and, optionally, the duration. The 538 field should specify which streaming video player is required, along with any other requirements, such as modem speed. An additional 538 field is needed to specify the mode of access, e.g., World Wide Web. The 856 field provides the URL needed to access the streaming video being described.
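Collected as a checklist, those guidelines might look like the sketch below, expressed as simple (tag, content) pairs rather than true MARC; the player name, modem speed, duration, and URL are invented placeholders:

```python
# Checklist of the streaming-video guidelines above as (tag, content) pairs.
# Not a complete MARC record; the 500/538/856 values are hypothetical.
streaming_video_fields = [
    ("GMD", "[electronic resource]"),
    ("006", "computer file"),
    ("007", "videorecording"),                 # 007 is repeatable:
    ("007", "remote-access computer file"),    # one per aspect described
    ("500", "Streaming video (24 min.)."),     # duration is optional
    ("538", "System requirements: RealPlayer; 56k modem or faster."),
    ("538", "Mode of access: World Wide Web."),
    ("856", "http://www.example.org/video"),   # note that no 300 field is used
]

for tag, content in streaming_video_fields:
    print(tag, content)
```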


Although new formats present new challenges, Weitz encouraged catalogers to apply existing cataloging rules with confidence and said, "Don’t agonize!"


[Editor’s note: Additional descriptions of this conference as well as links to handouts and PowerPoint presentations for many of the workshops described above are available from the OLAC website.]



Notes from the American Society for Information Science and Technology (ASIST) Annual Meeting, held in Philadelphia, Pennsylvania, Nov. 18-21, 2002


Following are synopses of three sessions that provide insights into issues and questions that should be of interest to readers of CCQ. These include: Subject Metadata from the Other Side, Semantic Web, and The Changing Face of Scientific Communication.


Reported by Abe Crystal, Ph.D. student, UNC-Chapel Hill


Subject Metadata from the Other Side


This session, moderated by Francis Miksa (Professor, Graduate School of Library and Information Science, Univ. of Texas at Austin), addressed the issue of subject metadata from the "other side," that is, metadata created not by professional catalogers but by resource authors or others without training in formal cataloging techniques. Jian Qin (Syracuse) discussed the creation of a workforce development portal using collaborative modeling. The portal attempts to model goals, users and practices using concept structures and concept types, limited by various constraints. This led to some fundamental questions about metadata usage, such as: Which elements are needed to represent a particular domain? What level of terminological complexity and abstraction is appropriate?


Grete Pasch (Univ. of Texas) discussed her study of implicit subject metadata on web pages. Implicit metadata is "discovered" by examining the structure of the page's markup (for example, page anchors, comments and ALT tags), rather than defining explicit metadata fields (e.g. META tags). Implicit metadata is intriguing because it avoids the potentially high cost of metadata creation, but it is difficult to construct metadata records from implicit evidence in a systematic way.


Ed O'Neill (OCLC) introduced the Faceted Application of Subject Terminology (FAST), a faceted reconceptualization of Library of Congress Subject Headings (LCSH). FAST is intended to be simple, intuitive, scalable and interoperable. It is designed to minimize the difficulty of constructing subject headings while still retaining a rich vocabulary and upward compatibility with LCSH. The current schema has eight facets: topical, geographical, genre, time, personal name, institutional name, conference/meeting and uniform title. FAST is still under development, but it promises to be a valuable tool for digital libraries as well as emerging metadata-driven information architectures. Most importantly, its simplicity could allow for more widespread, decentralized metadata creation.
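As a hypothetical illustration (the heading below is chosen for simplicity and is not taken from O'Neill's talk), a precoordinated LCSH string could be decomposed into separate FAST facets:

```
LCSH (precoordinated string):
  Railroads--Mexico--Maps

FAST (separate facets):
  Topical:     Railroads
  Geographic:  Mexico
  Form/Genre:  Maps
```

Because each facet is assigned independently, a metadata creator no longer needs to master the subdivision rules governing how such strings are built.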


For decentralized metadata creation to become a reality, resource authors must be able to contribute useful metadata records. Jane Greenberg (Univ. of North Carolina) reported a baseline study by the Metadata Generation Research Project examining author-generated metadata. Scientists at the National Institute of Environmental Health Sciences (NIEHS) were asked to create metadata records for web pages for which they had created the intellectual content. The study found that authors are a promising source of metadata, with approximately two-thirds of the records receiving acceptable evaluations for both specificity and exhaustivity. Three-fourths of subject keywords were rated acceptable (defined as facilitating resource discovery and accurately representing website content), suggesting that author-generated metadata could contribute to improved information retrieval.


The Semantic Web


Jim Hendler (Univ. of Maryland and W3C) and Eric Miller (W3C) discussed the W3C's current activity in defining and implementing Semantic Web standards. The W3C's goal is to "add layers" to the existing web infrastructure, enabling a gradual, seamless rollout of new capabilities. Current HTML hypertext only supports very limited types of connections--document A links to document B. The Semantic Web promises much richer connections such as "requires," "is based on," or "has author." The core standard here is RDF, and the W3C is working to clarify and improve the abstract model that RDF embodies, as well as to complete the RDF vocabulary descriptions in the RDF Schema standard.
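The richer link types Hendler and Miller described can be expressed as RDF statements. The sketch below uses Turtle syntax with an invented ex: vocabulary for illustration (only the Dublin Core dc:creator property is a real, published term):

```
@prefix dc: <http://purl.org/dc/elements/1.1/> .
@prefix ex: <http://example.org/terms/> .

<http://example.org/docA>
    ex:requires  <http://example.org/docB> ;   # typed link: A requires B
    ex:isBasedOn <http://example.org/docC> ;   # typed link: A is based on C
    dc:creator   "J. Smith" .                  # "has author"
```

Where an HTML anchor records only that two documents are connected, each RDF statement names the kind of connection, which is what makes more precise querying possible.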


The Semantic Web Advanced Development (SWAD) project sponsored by the W3C is developing core software components for building Semantic Web applications. By doing so, they hope to facilitate deployment of Semantic Web technologies, stimulate complementary areas of research, and prepare for future standards creation. Currently, the major initiative is TAP, an application framework for building the Semantic Web. TAP is intended to provide simple tools for rapidly developing Semantic Web applications, in particular "semantic search engines" that allow for more precise queries.


Jim Hendler currently chairs the Semantic Web Ontology Working Group, which is developing OWL (the Web Ontology Language). OWL attempts to extend the thesaurus model to provide semantic restrictions and relationships. Its predecessor, the DAML+OIL combination, has been rapidly gaining adherents, and new tools that make ontology creation accessible to end users are beginning to emerge. For example, one interface allows users to link web markup to terms in an ontology. Ultimately, this might allow such sophisticated searches as retrieving a photo based on a description of its subject's posture (Hendler used the whimsical example of a monkey scratching its head). But as Miller noted, "creating metadata is a daunting, challenging task" and there is much work to be done in enabling the creation of "Semantic Web-ready" content.
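A minimal sketch of what OWL adds beyond a thesaurus's broader/narrower hierarchy might look like the following (Turtle syntax; the ex: classes and property are invented, and the whimsical domain follows Hendler's monkey example):

```
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <http://example.org/zoo#> .

ex:Monkey rdfs:subClassOf ex:Primate .    # hierarchy, much like thesaurus BT/NT
ex:scratches rdfs:domain ex:Animal ;      # restriction: only animals scratch
             rdfs:range  ex:BodyPart .    # ...and only body parts get scratched
ex:Primate owl:disjointWith ex:Bird .     # a relationship a thesaurus cannot state
```

The domain, range, and disjointness axioms are precisely the "semantic restrictions and relationships" that distinguish an ontology from a simple term hierarchy.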


The Changing Face of Scientific Communication: Developing New Models for Scholarly Publishing in the Electronic Environment


K.T.L. Vaughn (NCSU Libraries Scholarly Communication Center) introduced this session with two critical problems in scholarly publishing: cost and copyright. Journal subscription costs have been increasing twice as fast as health care costs. Definitions of copyright and fair use that support both scholarly publishing and the open dissemination of ideas in an online environment are still missing. She argued that new models of scholarly publishing should strive for compatibility with the AAU's publishing ideals: 1) low cost; 2) available e-prints; 3) consistent archiving; 4) peer review; 5) well-defined copyright and fair use; 6) faculty negotiation rights; 7) rapid publication; 8) quality over quantity, especially for tenure evaluation; and 9) privacy.


Julia Blixrud (SPARC) presented a model of a "disaggregated" system in which a variety of publishers--including traditional closed-access for-profit, not-for-profit, and "new business models"--could coexist. She identified four key functions that a scholarly publishing system must implement: registration, certification, awareness and archiving. These functions may be implemented in many different ways, ranging from highly aggregated to highly disaggregated. She argued for a disaggregated system in order to expand access, cope with an increasing volume of research and reduce cost. However, this level of disaggregation relies on the implementation of two key features: institutional repositories and an interoperability layer.


Institutional repositories are essentially institution-specific digital libraries. They maintain a well-organized collection of digital objects which may range from traditional scholarly papers and technical reports to more complex multimedia artifacts. Repositories are costly, but institutions may be motivated to develop and maintain them because of their contribution to visibility and prestige. For repositories to work, they must avoid creating impediments to formal publication, which remains the "coin of the realm" in academia. They also must streamline the contribution process and provide incentives for contribution in order to ensure the participation of faculty. This will likely require collaboration with libraries and other university units. Encouraging the creation of institutional repositories also risks balkanization, as different repositories may employ different metadata schemas and access protocols. To combat this, Blixrud suggested an independent "interoperability layer" to negotiate between repositories, service providers (such as publishers or academic societies) and readers. The nature and technical requirements of this layer remain to be developed.


David Cohn (Journal of Machine Learning Research), a computer scientist who started an open-access, peer-reviewed journal, argued that we need a better understanding of what journals do and how they meet the needs of authors and readers. In particular, we need a more sophisticated understanding of what peer review is and what it accomplishes, as well as how journals function as a system for structuring information. He suggested that e-print services such as astro-ph (astrophysics) and Citeseer are changing the nature of scientific work, enabling rapid feedback and collaboration. But without filtering, information overload soon becomes a problem--astro-ph receives over 100 articles/day. We need "emergent editorial behaviors" that can scale to handle thousands of articles in order for these services to supplant traditional journals and indexes. Bibliometrics and collaborative filtering may help here, allowing these services to form the foundation of a "personal, distributed journal."



1 The University of Illinois at Urbana-Champaign Library is made up of nine divisions, which are disciplinary and administrative sub-units related by common interests. They are the Area Studies Division, Arts & Humanities Division, Central Public Services Division, Life Sciences Division, Physical Sciences & Engineering Division, Social Sciences Division, Special Collections Division and Technical Services Division. These are further divided into forty-two departmental libraries. The Technical Services Division consists of Acquisitions, Original Cataloging, Rapid Cataloging (Copy Cataloging) and Serials Cataloging.


Comments to: Jeffrey Beall at
© Haworth Press, Inc.