Volume 47, no. 2, 2009



 

Columns
Cataloging News, Mary Curran, News Editor

 

Original Articles

From Bibliographic Models to Cataloguing Rules: Remarks on FRBR, ICP, ISBD, and RDA and the Relationships Between Them
Carlo Bianchini and Mauro Guerrini

ABSTRACT: This article discusses the changes that are occurring in the world of cataloguing and argues that these changes need to be coordinated. It also discusses the features of current OPACs, FRBR, the Paris Principles and their proposed replacement (ICP), AACR2 and its proposed replacement (RDA), ISBD, and the relationships between and among these standards. It argues that the syntax of ISBD is an essential component of RDA and of all future international and national cataloguing codes.

KEYWORDS: Cataloguing, ICP, RDA, FRBR, ISBD, Principle of local variation

 

Applying the FRAD Conceptual Model to an Authority File for Manuscripts: Analysis of a Local Implementation
Marielle Veve

ABSTRACT: To date, the library literature has overflowed with articles on the theory and application of the FRBR conceptual model, but little has been written about its counterpart for authorities: the Functional Requirements for Authority Data conceptual model (FRAD). A few discussions of the theory of FRAD have been written by Glenn Patton and members of the FRANAR Working Group, but nothing has been documented yet about its application to real authority files. The following article addresses this gap in the literature by analyzing the FRAD conceptual model, examining its applicability to an authority file for manuscripts, and proposing a way to implement and display this entity-relationship model in a local authority file. The usefulness of this FRAD-based authority file in cataloging manuscripts is evaluated and presented.

KEYWORDS: FRAD, conceptual models, authority data, manuscripts, FRANAR, entity-relationship models, implementation, FRAD-based authority file

 

Metadata Professionals: Roles and Competencies as Reflected in Job Announcements, 2003-2006
Jung-ran Park and Caimei Lu

ABSTRACT: This study presents the current state of the roles and competencies sought from metadata professionals. We conducted a comprehensive content analysis of 107 job descriptions posted on the AUTOCAT listserv from January 2003 through December 2006. Multivariate techniques of cluster and multidimensional scaling analysis were applied to the content analysis. Results show that the principal responsibility expected of metadata professionals concerns metadata creation (73.8%). In addition to metadata creation, electronic resource management, awareness of trends, and digital library development constitute the core areas of demand in the metadata profession. The findings of the study also show that knowledge and skills centering on traditional cataloging and classification standards (60.7%) remain highly relevant to metadata professionals in the digital environment.

KEYWORDS: Metadata professionals, job description analysis, metadata, electronic resources, competencies, digital libraries

 

Single-Record vs. Separate-Record Approaches for Cataloging E-Serials in the OCLC WorldCat Local Environment
Lihong Zhu

ABSTRACT: Libraries have been making decisions on whether to use a single-record or separate-record approach for cataloging e-serials based on what works best in their local integrated library systems and which approach is likely to save cataloging time. However, in the new network-level cataloging and OCLC WorldCat Local environment, we need to make cataloging decisions based on what works best in a network-level platform. This paper discusses the OCLC WorldCat Local environment, the concept of network-level cataloging, and why Washington State University has responded by migrating from a single-record to a separate-record cataloging policy for e-serials.

KEYWORDS: e-serial cataloging; single-record approach; separate-record approach; network-level cataloging; OCLC WorldCat Local

 

Cataloging News

Welcome to the news column. Its purpose is to disseminate information on any aspect of cataloging and classification that may be of interest to the cataloging community. This column is not just intended for news items, but serves to document discussions of interest as well as news concerning you, your research efforts, and your organization. Please send any pertinent materials, notes, minutes, or reports to: Mary Curran, Morisset Library, University of Ottawa, 65 University Ave, Ottawa, ON Canada K1N 9A5 (email: mgcurran(at)uottawa.ca; phone: 613-562-5800 ext. 3590). News columns will typically be available prior to publication in print from the CCQ website at http://catalogingandclassificationquarterly.com/.

We would appreciate receiving items having to do with:

Research and Opinion

Events

People

 

RDA Milestones

The full draft of RDA will be available for Constituency Review during the week of November 3, 2008.
Nathalie Schulz, Secretary, Joint Steering Committee for Development of RDA

 

The Music Library Association's Bibliographic Control Committee (BCC) released its Final Report of the BCC Working Group on Work Records for Music (available at: http://www.musiclibraryassoc.org/BCC/BCC-Historical/BCC2008/BCC2008WGWRM1.pdf). The report was tabled on July 31, 2008, and its availability was noted on the RDA-L listserv on September 9, 2008. The short-term working group, struck in April 2008, was charged with looking at "issues surrounding the question of what elements and attributes of musical works should be included in a work record, once the cataloging community moves beyond AACR2 and MARC21 to a new descriptive model that stores data in a relational or object-oriented database."1 The recommendations will help inform future discussions of music issues in FRBR, FRAD, and RDA and will also provide foundational documentation for the proposed MLA task force to examine next-generation library catalogs. Kathy Glennan, chair of the MLA Bibliographic Control Committee, can be contacted at kglennan(at)umd.edu.

 

RESEARCH AND RESPONSES RELATED TO "ON THE RECORD: THE REPORT OF THE LIBRARY OF CONGRESS WORKING GROUP ON THE FUTURE OF BIBLIOGRAPHIC CONTROL"

Electronic Book Client-driven Acquisition Trial
1.1.1 Make Use of More Bibliographic Data Available Earlier in the Supply Chain

From September 2007 through March 2008, Monash University Library (Melbourne, Australia) carried out a trial of the Ebooks Corporation (EBL) client-driven acquisition model for purchasing electronic books. The model represents a completely new approach to collection development in that the purchase of ebooks on the EBL platform was placed entirely in the hands of library clients based on actual online use of the material.

For the trial, a selection of approximately 50,000 titles deemed relevant to Monash teaching and research from the entire EBL catalogue of over 70,000 titles was made available for client searching, browsing, and purchasing via the local Voyager OPAC. Purchase of titles was based on the number of times a title was used. That is, any title could be browsed for up to five minutes without a use (or loan) being registered. After the five-minute free period, the client was offered the option to continue to read online. Acceptance of this offer was registered as a loan, for which the Library was charged a small usage fee. Once a title had been loaned three times, it was automatically purchased.
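The usage rules described above amount to a simple state machine per title. The following is a minimal Python sketch of those rules as the report describes them; the function and data structure names are hypothetical, and EBL's actual platform logic is of course proprietary:

```python
# Hypothetical sketch of the EBL client-driven acquisition rules
# described in the trial report; not actual EBL code.

FREE_BROWSE_MINUTES = 5    # browsing under this threshold is free
LOANS_BEFORE_PURCHASE = 3  # the third loan triggers an auto-purchase

def record_access(title, minutes_read, accept_loan):
    """Apply the trial's usage rules to one reading session."""
    if minutes_read <= FREE_BROWSE_MINUTES:
        return "free browse"        # no use or loan registered
    if not accept_loan:
        return "declined"           # client chose not to continue
    title["loans"] += 1             # library pays a small usage fee
    if title["loans"] >= LOANS_BEFORE_PURCHASE:
        title["purchased"] = True   # title is automatically purchased
        return "purchased"
    return "loan"
```

For example, a title browsed for three minutes registers nothing, while the third accepted loan flips it to purchased.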

By implementing the EBL client-driven acquisition model, the Library provided immediate access to a large collection of multi-disciplinary material much of which would not otherwise have been readily available. Over the 6 month period of the trial, 14,000 titles were accessed, and of these, 1,600 titles were later purchased. The kind of material that was being used and selected by the clients proved to be highly relevant to the broad academic needs of the University for both research and education. The most popular titles included books on diverse topics like fan culture and the popular media, capital risk management, a survival manual for using the SPSS data analysis program, and law textbooks.

Besides the uniqueness of the model itself (other suppliers are now offering their own versions), there were some interesting aspects of this trial from a technical services perspective. EBL initially provided brief, provisional MARC bibliographic records. Although these records do not contain Library of Congress subject headings, the table of contents data in the records provides rich terms for keyword searching. Full OCLC MARC records were purchased to replace the EBL provisional records for all titles purchased: when an ebook is auto-purchased, a request to purchase an OCLC full record is generated.

Machine processes were put in place for the initial loading and updating of the bibliographic records. Software tools were used to preprocess the records, and the Voyager bulk import program was used to load them into the catalogue. Because Monash already had a significant number of ebooks from other providers that were duplicated in EBL, access queries were developed to de-duplicate the data before loading EBL records into the catalogue. The most reliable matching point was the ISBN, but we encountered problems when trying to match catalogue records with vendor data that contained an array of ISBNs (hardback, paperback, electronic, 10-digit, 13-digit, U.S. ed., U.K. ed., etc.). Further processing was required to minimize duplication, change record visibility in the EBL database, and suppress records towards the end of the trial to limit further expenditure.
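One common way to tame a mixed bag of 10- and 13-digit ISBNs before matching is to normalize everything to ISBN-13 and compare sets. A hedged Python sketch of that general technique follows; it illustrates the approach, not Monash's actual de-duplication queries, and the function names are our own:

```python
# Illustrative ISBN normalization for de-duplication; not the
# actual Monash/EBL matching code.

def to_isbn13(isbn):
    """Normalize a 10- or 13-digit ISBN to a canonical ISBN-13 string."""
    digits = isbn.replace("-", "").replace(" ", "").upper()
    if len(digits) == 13:
        return digits
    if len(digits) == 10:
        core = "978" + digits[:9]  # drop the ISBN-10 check digit
        # recompute the ISBN-13 check digit (alternating 1/3 weights)
        total = sum((1 if i % 2 == 0 else 3) * int(d)
                    for i, d in enumerate(core))
        return core + str((10 - total % 10) % 10)
    raise ValueError(f"not an ISBN: {isbn!r}")

def is_duplicate(vendor_isbns, catalogue_isbns):
    """True if any vendor ISBN matches any catalogue ISBN after normalization."""
    return bool({to_isbn13(i) for i in vendor_isbns} &
                {to_isbn13(i) for i in catalogue_isbns})
```

With this normalization, a vendor record carrying `0-306-40615-2` matches a catalogue record carrying `9780306406157`, since both reduce to the same ISBN-13.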

The main challenge to emerge from the EBL trial, however, was not technical; it was how to control expenditure. Because virtually all purchases are unmediated, there is a potential for large, uncontrolled expenditure if purchasing is not carefully monitored. As the full report on the trial observed, adequate measures to monitor and control expenditure need to be put in place should the model continue to be used by the Library.

While there were some minor, initial problems common to most ebook platforms related to access and printing, it can be concluded, on the basis of usage and a small amount of client feedback, that the trial was a success for library clients. The use of EBL will be continued at Monash in 2009, subject to negotiation with the supplier regarding record loading and de-duplication and the introduction of a range of measures to monitor and control the rate of purchasing and expenditure.

The full trial report is available at http://arrow.monash.edu.au/hdl/1959.1/45804.
The ebook Working Party members:
Joan Gray, David Horne, Winifred Hirst, Michael McLellan,
Robert Stafford (Chair), and Robert Thomas
Monash University Library, Melbourne, Australia

 

RESEARCH AND OPINION

Serco Consulting released its interim report of A UK Research Data Service Feasibility Study for the Higher Education Funding Council for England (HEFCE). The report, which is based on four case-study universities (Bristol, Leeds, Leicester, and Oxford) and describes emerging trends in local and national data repositories in the EU, Australia, and the U.S., is available from http://www.ukrds.ac.uk/UKRDS%20SC%2010%20July%2008%20Item%205%20(2).doc. Building on the interim report, the study has moved into its second phase, devoted to developing the business case.

Conference details for Electronic Resources & Libraries 2009, to be held at the University of California, Los Angeles, CA, February 9-12, 2009, are available online at http://www.electroniclibrarian.org/ocs/index.php/erl/2009. This conference provides a forum for information professionals to explore ideas, trends, and technologies related to electronic resources and digital services.

The IFLA Continuing Professional Development and Workplace Learning Section and the New Professionals Discussion Group invite proposals for presentations—keynote/plenary presentations, research reports, smaller scale interactive round-table discussions, workshops, and posters—for the IFLA Satellite Conference in Bologna, Italy, August 18-20, 2009, Moving In, Moving Up, and Moving On: Strategies for Regenerating the Library and Information Profession. The satellite conference will be held immediately prior to the World Library and Information Congress in Milan, Italy, August 2009. Proposals are requested by November 10, 2008 in order to meet publication deadlines (for inclusion in the published Proceedings distributed at the conference). Not all presentations and papers need to be published as part of the proceedings; the review committee will also accept presentations that are not intended to be considered for publication. Themes and guidelines as well as submission forms are available at http://www.ifla.org/IV/ifla75/satellite-cpdwl-call-en.htm. Proposals should be sent, no later than November 10, 2008, to both of the program conveners: Loida Garcia-Febo, Assistant Coordinator, Special Services, Queens Library, 89-11 Merrick Blvd., Jamaica, NY 11432, USA, email: loidagarciafebo(at)gmail.com, and Roisin Gwyer, Associate University Librarian, The University Library, University of Portsmouth, Cambridge Road, Portsmouth, PO1 2ST, England UK, email: roisin.gwyer(at)port.ac.uk. Successful proposers will be advised of the acceptance of their proposal in early December 2008.

The Canadian Health Libraries Association / Association des bibliothèques de la santé du Canada sent out a call for papers or posters for its annual meeting, to be held in Winnipeg, Manitoba, May 30-June 3, 2009. Papers and posters should describe innovative programs/practices or new research findings and should relate to the overall conference theme: The Sky's the Limit / Horizons illimités. To submit a paper or poster session for consideration and for more information please send an email to Analyn Cohen Baker at Analyn_Baker(at)umanitoba.ca by December 15, 2008 with the following information: title of the paper; short structured abstract (250 words or less); and author(s) name(s), address, email, and work phone number. Structured abstracts should follow JCHLA/JABSC Instruction to Authors at http://pubs.nrc-cnrc.gc.ca/jchla/jchla26/c04-900.pdf. All those submitting abstracts for contributed papers will be contacted by the Program Committee by January 19, 2009. All those submitting abstracts for posters will be contacted by the Poster Committee by January 19, 2009.

A call for papers for the second international Public Knowledge Project (PKP) Conference, to be held July 8-10, 2009 in Vancouver, British Columbia, Canada, was sent out to email groups and blogs on September 17, 2008. The conference is of interest to the PKP community but also to anyone interested in scholarly publishing and communication trends and developments in general. Papers and presentation proposal guidelines are available at http://pkp.sfu.ca/ocs/pkp/index.php/pkp2009/pkp2009/schedConf/cfp-. Submissions are requested by January 15, 2009. The quick link to the PKP Conference website is http://pkp.sfu.ca/ocs/pkp/index.php/pkp2009. Heather Morrison, member of the Second International PKP Scholarly Publishing Conference Planning Team, can be contacted at heatherm(at)eln.bc.ca or 778-782-7001.

New technologies:

Although not new, WorldCat.org continues to add value-added features on a regular basis and deserves another look, since the interface really does "help Web users everywhere find and link to library-owned content and services."2 At http://www.oclc.org/worldcatorg/features/default.htm you will find a list and explanation of WorldCat.org features, including My WorldCat, the social networking list feature allowing users to create private and public lists; the ability to refine your search with faceted browsing; library results tabs showing locally owned materials or geographically close libraries that own the item; and a "Cite this Item" link that launches a pop-up window with the five most common citation styles: APA, Chicago, Harvard, MLA, and Turabian.

Also from OCLC, a WorldCat application developed for the iPhone is available for free download on the Apple apps website at http://iphonetoolbox.com/webapp/worldcat/.

On September 22, 2008, Alex Diaz, Product Manager for Google Book Search, announced the free availability of a book search API that allows libraries to embed book reviews, ratings, and lists, or simply to link to books at Google Book Search. For more information, see Diaz's blog announcement at http://booksearch.blogspot.com/2008/09/book-search-everywhere-with-new.html.

Innovative Interfaces announced in its July 2008 newsletter INN-Brief the release of Content Pro: "a new digital library solution that exposes collections for greater discovery on the web. Content Pro also makes digitization projects easy by providing an easy submission form based on Qualified Dublin CoreSM metadata. Content Pro is an OAI-PMH-compliant data provider that can be harvested by aggregators such as OAIster for greater discovery by outside libraries, content repositories, and even search engines like Google."3 Binghamton University, SUNY (NY), Michigan State University, University of San Diego (CA), University of Western Ontario (Canada), Scottsdale Public Library, and Westerville Public Library (OH) are working with Innovative to test Content Pro and its ability to integrate local digital content into their Encore implementations.

 

 

EVENTS

Meetings:

The Qualitative and Quantitative Methods in Libraries International Conference (QQML2009) to be held in Chania, Crete, Greece, from May 26-29, 2009 is now accepting submissions and proposals for special sessions. The forthcoming QQML 2009 Conference will focus on qualitative and quantitative methods in libraries. General information about the conference is available at http://www.isast.org/home.html, and instructions for submission specifically are available from http://www.isast.org/presentations/abstractpapersubmission.html.

Presentations from the popular ALCTS CMDS program Making the Switch from Print to Online: Why, When and How? are now available at http://tinyurl.com/6j8kgl by scrolling down to the program title Making the Switch. Papers include an introduction by Judy Luther (Informed Strategies and co-author of the ARL white paper "The E-only Tipping Point for Journals: What's Ahead in the Print to Electronic Transition Zone"), Jill Emery (University of Texas at Austin, current President of NASIG), Tim Bucknall (University of North Carolina at Greensboro and developer of Journal Finder), Noella Cohen (Springer Science+Business Media, Academic Licensing Manager), and Kimberly Steinle (Duke University Press, Library Relations).

Semantic and Social Metadata Meet in Berlin:
A Report from the 2008 Dublin Core Metadata Initiative Conference

The Eighth Annual International Conference on Dublin Core and Metadata Applications (DC 2008) met in Berlin, Germany September 22-26, 2008 at Humboldt-Universität zu Berlin. Once known as the "mother of all modern universities," Humboldt was a wonderful setting for the conference. The conference was organized jointly by the Competence Centre for Interoperable Metadata (KIM), the Max Planck Digital Library (MPDL), the Göttingen State and University Library (SUB), the German National Library (DNB), the Humboldt Universität zu Berlin (HU Berlin), and the Dublin Core Metadata Initiative, and supported by Wikimedia Deutschland.

The theme for DC 2008 was "Metadata for Social and Semantic Applications." The program was co-chaired by Jane Greenberg and Wolfgang Klas and was supported by a 53-member program committee. The conference theme drew submissions and participants from a variety of disciplines, including library and information science, archives, government, and education, which made for a rich program filled with papers, project reports, posters, tutorials, workshops, and seminars. A total of 312 attendees, including 28 students, representing 39 countries were present.

The first day of the conference consisted of two sets of simultaneous tutorials: Dublin Core tutorials presented in English and tutorials from FH Potsdam presented in German. The topics of the tutorials in English were Dublin Core History and Basics, given by Jane Greenberg; Dublin Core Key Concepts, presented by Pete Johnston; Dublin Core and Other Metadata Schema, given by Mikael Johnston; and Dublin Core in Practice: Implementation Issues, presented by Marcia Zeng. The simultaneous tutorial sessions delivered in German were Die Dublin Core Konferenz für Einsteiger (The Dublin Core Conference for Beginners), presented by Bernadette Gandaa; Einführung in Semantic Web (Introduction to the Semantic Web), given by Tina Matzat; Einführung in RDA (Resource Description and Access), presented by Dierk Eichel; and Einführung in Social Tagging/Computing (Introduction to Social Tagging/Computing), given by Johannes Hercher. These tutorials, held each year before the conference, are a good way for people new to the Dublin Core community to become familiar with the history and subject content of the conference and the metadata initiative.

At the heart of the Dublin Core conference were the full papers presented on theory and research related to the conference theme. Twelve separate papers were presented in five paper sessions. The session themes were: Dublin Core: Innovation and Moving Forward; Semantic Integration, Linking, and KOS Methods; Metadata Generation: Methods, Profiles, and Models; Metadata Quality; and Tagging and Metadata for Social Network. Papers were presented during the morning session each day. A complete list of the program, including slides from these presentations, can be accessed at http://dc2008.de/programme.

Each section of the core program began with a keynote presentation. Kurt Mehlhorn, Chairman of the Steering Committee from the Max Planck Society on e-Research, launched the program on Tuesday with a keynote on e-Research. Later that afternoon, a second keynote, "Access to art museums on-line: a role for social tagging and folksonomy," was given by Jennifer Trant, a partner in the company Archives & Museum Informatics. Two more keynote addresses were given later in the program. On September 24, Ute Schwens, Permanent Deputy of the Director General of the German National Library and Director of the German National Library in Frankfurt am Main, presented on the modern need for cataloging codes. The last keynote, "Why the Semantic Web Matters," was given on September 25 by Talis's Technology Evangelist, Paul Miller.

The afternoon sessions were filled with project reports, workshops, DCMI Community meetings, forums, and special sessions. Eight reports were given across three different project report sessions. The themes for these sessions were: Toward the Semantic Web; Metadata Scheme Design, Application, and Use; and Vocabulary Integration and Interoperability. The project reports included updates on continuing projects that use Dublin Core metadata, and provided helpful advice and information to those working on or looking to start similar projects. The DCMI Task Groups (Education, Government, DCMI/RDA, and DCMI/IEEE) met to discuss and work through important information and library science issues. Two special sessions, one on Knowledge Organization and the other on Metadata for Scientific Datasets, also gathered to discuss new challenges in both of these areas. The DCMI Communities (Libraries, Social Tagging, Identifiers, Tools, Scholarly Communications, Knowledge Management, Registry, Localization & Internationalization, and Accessibility) also met during this year's conference. These sessions were open to all participants and attendees, and provided insight into issues that the Dublin Core community is addressing.

On the final day of the conference, participants could choose from four full-day seminars: User Generated Metadata; Using the TEI for Documenting and Describing Documents; PREMIS Metadata Tutorial; and Ontology Design and Interoperability. Seminars are intended to engage participants in longer discussions and activities, and they created a satisfying end to a productive conference.

Electronic proceedings for Dublin Core 2008 and previous years will be uploaded for viewing at http://www.dcmipubs.org/ojs/index.php/pubs/index. Below are a few highlights of some of the papers, project reports and workshops presented at this year's conference.

 

LCSH, SKOS, and Linked Data (presented by Ed Summers)

The Simple Knowledge Organization System (SKOS) was a very popular topic at this year's conference, and the paper "LCSH, SKOS and Linked Data," written by Ed Summers, Antoine Isaac, Clay Redding, and Dan Krech, discussed the technique for converting the Library of Congress Subject Headings (LCSH) to SKOS RDF. This paper was presented during Full-Paper Plenary Session 2: Semantic Integration, Linking, and KOS Methods. Summers presented the many problems with the conversion from the MARCXML format to SKOS and noted the importance of, and challenges in, using URIs for conceptual resources. Having the LCSH vocabulary in SKOS provides the opportunity to link to traditionally isolated data sets.
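To make the conversion concrete, here is a minimal, dependency-free Python sketch of the shape of the resulting data: each heading becomes a skos:Concept URI carrying a preferred label, and broader references become links between concept URIs. The concept identifiers (shXXXXXXX, shYYYYYYY) and the particular broader relationship shown are illustrative placeholders, not actual LCSH data; only the namespace URIs are the standard ones:

```python
# Illustrative shape of LCSH-as-SKOS triples; identifiers are placeholders.
SKOS = "http://www.w3.org/2004/02/skos/core#"
LCSH = "http://id.loc.gov/authorities/subjects/"

triples = [
    # Each heading becomes a skos:Concept with a preferred label...
    (LCSH + "shXXXXXXX", SKOS + "prefLabel", '"Cataloging"@en'),
    (LCSH + "shYYYYYYY", SKOS + "prefLabel", '"Documentation"@en'),
    # ...and broader references become links between concept URIs,
    # which is what turns an isolated heading list into linked data.
    (LCSH + "shXXXXXXX", SKOS + "broader", LCSH + "shYYYYYYY"),
]

def broader_of(concept_uri):
    """Follow skos:broader links outward from one concept URI."""
    return [o for s, p, o in triples
            if s == concept_uri and p == SKOS + "broader"]
```

Because every concept is a URI rather than a text string, a link such as the skos:broader triple above can point at vocabulary published anywhere on the Web, which is the linking opportunity the paper describes.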

 

Project Report Session 2: Metadata Scheme Design, Application, and Use
Chaired by Wolfgang Klas

A vital part of the Dublin Core conference is the reporting of projects that use Dublin Core metadata. Project Report Session 2, on Metadata Scheme Design, Application, and Use, contained three project reports, each representing metadata issues of interest to libraries. The first report, "The Dryad Data Repository: A Singapore Framework Metadata Architecture in a DSpace Environment," was presented by Sarah Carrier and Hollie C. White. Dryad is a repository of scientific data used to support publications from certain evolutionary biology journals. The report focused on the attempt to conform Dryad's application profile to the Dublin Core Singapore Framework and on how both are being implemented within DSpace. The Dryad team's report was helpful for those attendees also working in DSpace. The second report, "Applying DCMI Elements to Digital Images and Text in the Archimedes Palimpsest Program," was presented by Michael B. Toth and Doug Emery. This report was an update on the Archimedes digitization project and presented many interesting issues related to the digitization of classical and rare texts. The third report, "Assessing Descriptive Substance in Free-Text Collection-Level Metadata," was given by Oksana Zavalina. This report discussed an analysis of the metadata used for projects listed in IMLS Digital Collections and Content (IMLS DCC). The analysis found that free-text metadata can be as important as controlled-vocabulary metadata in collection-level descriptions.

 

Workshop 16: Metadata for Scientific Datasets
Chaired by Jane Greenberg

The role of libraries and repositories is evolving to include things beyond the scope originally imagined. Understanding that storing, preserving, and describing scientific datasets is a concern for librarians, a special session dedicated to Metadata for Scientific Datasets met for the first time at Dublin Core 2008 and brought together more than forty participants. Lasting only an hour and a half, the meeting raised many questions, such as "Should Dublin Core create elements for scientific data?" and "How do you represent the diversity of scientific data in a consistent way?" At its conclusion, the attendees voted that a group called "Metadata for Scientific Datasets" should be formed to continue work on these issues, and that a request should be submitted to the Dublin Core advisory board to make "Scientific Datasets" an official DCMI group.

Other important strides occurred in Workshop 6, "RDA: Finding Your Place on the Evolutionary Path," led by Diane Hillmann, and in Workshop 3, an open meeting between the DC-Libraries Community and the DCMI Libraries Application Profile Task Group, led by Christine Frodl. Both addressed the potential of RDA, the new metadata content standard replacing AACR2, and spoke specifically to the formal representation of the RDA Elements and Vocabularies that are being developed in parallel with the textual instructions being developed by the Joint Steering Committee for the Development of RDA.

Overall, DC 2008 was a productive and successful conference. Makx Dekker, Managing Director of the Dublin Core Metadata Initiative, summed up the conference best with his closing observations, commenting that Dublin Core 2008 "supported the cross-fertilization of research and practice that will help advance the field to the benefit of all."

Hollie C. White
Dublin Core 2008 Publication Committee Member
Metadata Research Center Doctoral Fellow School of Information and Library Science
University of North Carolina at Chapel Hill

 

Reports from the 2008 OLAC-MOUG Conference, September 26-28, 2008

Access to many of the presentations and supporting materials cited in these reports can be found at: http://www.notsl.org/OLAC-MOUG/Handouts.html.

 

MAP CATALOGING PRE-CONFERENCE
Presented by Paige Andrew, Pennsylvania State University
Reported by Stacey Beach, Southern Methodist University

The focus of this full-day preconference was on original cataloging of single print maps. Each attendee was provided with a binder full of useful information and an extensive bibliography.

Andrew began the workshop by defining a map and pointing out the differences between cataloging maps and cataloging monographs. Major differences include the fixed fields for relief and projection, as well as the variable fields for scale and coordinates. As of August 2008, there were some changes in OCLC input standards for full-level records for cartographic materials. Coordinates are now required if applicable in both the 255 and 034 MARC fields. The 007 field for maps and globes is now required if applicable, while the 052 is optional. We briefly went over the chief and prescribed sources of information and tools that are available to assist map catalogers.

Titles were discussed in depth. This is a particularly troublesome aspect of map cataloging, as often there are multiple titles on the map. The 245 should contain the title that most precisely expresses the area and topic of the map, with preference given to a title printed within the neat line. A title source note is often needed. The 246 field should be used liberally for other titles on the map. We also discussed choosing the main entry when several statements of responsibility are present. Catalogers must consider AACR2 rule 21.1B2 as it applies to cartographic materials. It is helpful to bear in mind whether corporate bodies listed on the map are in the business of cartography when choosing between a corporate or personal main entry.

Considerable time was spent discussing and practicing how to determine scale and coordinates. Participants were each given a Natural Scale Indicator and taught how to use this tool to estimate scale in representative fraction form when only a bar scale is on the map. Andrew also covered how to phrase the 255 in various cases where the scale is not given. The group practiced coding coordinates in the 255 and 034 fields.
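The arithmetic behind estimating a representative fraction from a bar scale is simple: convert the known ground distance into the same units as the measured map distance and divide. A small illustrative sketch follows; the function name and sample figures are hypothetical, not taken from the workshop:

```python
# Illustrative scale arithmetic only; figures are hypothetical.

def representative_fraction(map_cm, ground_km):
    """Denominator of the RF: ground distance / map distance, in the same units."""
    ground_cm = ground_km * 100_000   # 1 km = 100,000 cm
    return round(ground_cm / map_cm)

# A bar scale where 4 cm on the map represents 1 km on the ground gives a
# denominator of 25,000, which a cataloger would record in the 255 as an
# estimated scale, e.g. "Scale [ca. 1:25,000]".
```

The same division works for any unit pair as long as both distances are expressed in the same unit before dividing.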

The proper measuring of maps was discussed and practiced. This is another particularly tricky aspect of map cataloging, as neat lines are often broken or include extraneous area, and some maps are printed on multiple sheets.

There was also discussion about general notes that are commonly used in map cataloging, as well as constructing call numbers using the Library of Congress G classification schedule. The workshop concluded with participants dividing into small groups to practice cataloging sample maps.

 

PLENARY SESSIONS
ROCKING THE METAVERSE: A/V CATALOGING IN A WEB X.0 ENVIRONMENT
Opening Keynote Address presented by Lynne Howarth, University of Toronto
Reported by Michelle Hahn, Southern Methodist University

Lynne Howarth, faculty in Information Studies at the University of Toronto, began with a look back at the keynote address she presented at the 1998 OLAC conference, touching on the pace of new media and its cataloging. At that point in time, there had been an increase in the number of websites, CDs, DVDs, interactive multimedia, digital media, e-commerce, social commerce, and digital libraries. iPods, mp3 players, camera phones, handheld devices, USB sticks, social networking, social tagging, blogs, wikis, RSS, and virtual worlds had not yet emerged. During that same period, the cataloging world was considering the future development of AACR after the Toronto Conference, implementing updates to ISBD, rolling out Dublin Core, and starting to think in terms of "metadata." But RDA, ISBD Consolidated, the International Cataloguing Principles, WorldCat.org, recent versions of OPACs, and social online catalogs had not yet made their debut. Clearly, over the last ten years, media has moved beyond the digital revolution and into the digital mainstream.

In a bit of a segue from cataloging and media into a public service mode, Howarth introduced the crowd to the "metaverse," as described in Neal Stephenson's novel Snow Crash, where humans are represented by avatars and interact in a digital environment meant to replicate the physical world. An example of a metaverse is the online environment Second Life (also called a Multi-User Virtual Environment, or MUVE), where users create avatars, interact with the avatars of other users, and explore a virtual world they create and collectively maintain. More than 400 librarians participate in Second Life, making over 30 libraries visible to its users. This is all well and good, Howarth noted, "But where are the catalogers?"

Revisiting what has happened in the last ten years, the most recent transformation of AACR began at the International Conference on the Principles & Future Development of AACR (commonly known as the Toronto Conference, 1997), where incorporating FRBR concepts dealing with user tasks and entities was considered, as well as carrier vs. content in describing material. The Joint Steering Committee continued to issue updates and revisions to AACR2, and looked toward AACR3 (now, RDA). Expected to be available in draft for review by mid-October 2008, and released in its final state mid to late 2009, RDA has been through many transformations of its own since its inception and now will exist as a document in ten sections, with thirty-seven chapters. This, along with the content vs. carrier discussions and a move to ISBD Consolidated, will provide many avenues for further discussion, consideration, review, and implementation that may affect the cataloging community a great deal.

Howarth began her conclusion by commenting on the fact that, though it did not necessarily seem that way at the time, the trends in formats and cataloging from 1998 were simpler than those seen today. However, this time around, the cataloging community is armed with many more tools, technologies, sources, and collaborators. As she says, "we have places at multiple tables, and even more credibility" in this new era, with a lot more to look forward to as these developments progress.

This progress will not come without hindrances, such as the abundance of A/V materials, especially those born digital, which have grown and will grow exponentially while standards, budgets, and personnel have not. The de-emphasis of cataloging in LIS education may also impede progress. But if catalogers can make themselves flexible and adaptable by continuing to deal with new materials in new formats, as well as putting new standards into context with current and upcoming practices, the future could prove to be great.

Returning to the topic of Web 2.0, Howarth suggested that catalogers find ways to gather user-created metadata to complement standards-derived metadata, create "OPACs for the people (and a little bit by the people)," and determine where to draw the line between professional structure and public involvement. "Melvil's world is changing, but the foundations still pertain."

 

CLOSING ADDRESS
Presented by Janet Swan Hill, University of Colorado at Boulder
Reported by Susan Moore, University of Northern Iowa

Janet Swan Hill began her comments by stating that, in her opinion, catalogers need to be like decathletes: good at many different things. Among other things, we have to know the structure of records, we have to know metadata structures and how they integrate (or not) into other systems, and we have to understand the results and impacts of what we do as reflected in the public catalog.

Giving the conference summary is a strange task: one must gather the threads from the various sessions and try to weave them into a cohesive whole. One common thread was that most sessions gave a bit of history at the beginning. In keeping with that thread, Hill related that her early career involved map cataloging at a time when catalogers of "funny" or non-book formats often had to provide alternate ways to access the collections. The development of rules for non-book materials began with AACR1, and some other formats began to be included in the main catalog. All the formats from AACR1 are still around, and catalogers still hope to learn more about how to provide access to them. Additionally, with the increase in formats (video, electronic, and others), the "funny" formats are becoming more central.

Those of us who deal with non-book materials seem to share certain traits and characteristics. There were, and are, shortcomings in the rules for non-print materials, so flexibility and creativity are qualities we seem to share. On the downside, we can tend to be "downtrodden elitists" and not reach out to other communities. We tend to be very organized, with a love for detail.

Lists of resources to help catalog various formats were a part of every session. We have so many tools because we are mid-stream in many different areas. We need to remember that cataloging rules aren't intuitive because things aren't simple. Despite this, non-catalogers seem to want to be catalogers, with things like social tagging and websites like del.icio.us, so those outside the field are learning about the need for consistency and uniformity.

The poster sessions tended to be graphic rather than oral, the projects presented were either small-scale or not quite finished, and the presenters often wanted closer interaction with the audience.

There are major changes looming on the horizon, not only with the standards, but in the philosophy. If we start getting more information from others, we could apply resources to provide access to materials not currently well served. The last time a major shift like this happened, cataloging departments gave up positions, which was unwise. People doing non-book cataloging have always had to be flexible, adaptable, and creative. We need to be involved in the revision of the standards. There's no point in waiting to see what develops because things still need to be done.

 

WORKSHOPS AND SEMINARS

BASICS OF SCORES CATALOGING
Presented by Margaret Kaus, Kansas State University
Reported by Ellen Symons, Queen's University Library

Margaret Kaus presented a revised version of the workshop given by Ralph Papakhian, who was unable to attend the conference. The workshop was intended for catalogers who are new to music cataloging or catalog music infrequently, so it focused on practical applications rather than concentrating on AACR2 rules and MARC coding. Kaus illustrated the discussion with many useful examples.

The focus of the workshop was published printed music. Kaus explained when a music score is coded as a score and when it is coded as a book, and discussed the coding needed for each format. If the item contains music, with or without words, it is coded ‘c’ for score, but a collection of song texts without music, or an opera libretto without the music, would be coded ‘a’ because they are considered to be books.

Kaus then went on to discuss searching for music scores in OCLC. An efficient way to do this is by using the Publisher Number index for publisher or plate numbers. She suggested that catalogers use a double dash when searching a range of numbers, since the number itself often has a hyphen in it. She used the example in the handout, which also illustrates the use of * for truncation: 12357* finds 12357, 12357-3, 123578, 12354--12367. Plate numbers, if present, are found at the bottom of each page of the score. These numbers can end with a hyphen plus a number which is an indication of the page number, but the hyphen plus number is not always entered in the 028 field, making truncation particularly useful. Later in the workshop, Kaus talked about inputting the 028, telling the group that all spaces and punctuation should be transcribed, and that plate numbers come before publisher's numbers.
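As a sketch of the kind of 028 coding discussed, a hypothetical plate number might be entered as follows (the number, publisher name, and indicator values here are illustrative only):

```
028 22 $a 12357-3 $b Edition Peters
```

Here the first indicator identifies the number as a plate number, and the spaces and punctuation of the number are transcribed exactly as they appear on the item.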

There was a discussion of when to input a new record into OCLC. For example, when the OCLC record is for the score and parts but the cataloger has only one or the other, a new record should be made. If the cataloger has a miniature score (music shrunk down to a smaller format, e.g., originally 45 cm. but published as 28 cm.), a new record can be made. If the item is issued in a new series, a new bibliographic record can be made, but a new record is not justified if there is a change of series among issues or parts of a serial or multipart item.

The rest of the workshop concentrated on different parts of the bibliographic description. Kaus discussed title pages and titles, spending quite a bit of time looking at examples illustrating the difference between generic titles, which are the name of a type of composition (Sonata, Symphony, Concerto) and distinctive titles, which refer to a specific work by a particular composer (A Midsummer night's dream) and how to transcribe them. For the Edition area, she discussed LCRI 5.2B2 where the cataloger is instructed to use the voice range as the edition statement. In the Dates area, Kaus emphasized LCRI 1.4F6—"Ignore copyright renewal dates for works first copyrighted before 1978." Kaus also discussed how to determine if a score is a copy or a new edition of an earlier manifestation. A new edition would have this explicitly stated on the item, or there would be a change in the title or statement of responsibility area, edition area, the extent statement of the physical description area and the series area, and a new record would be required. However, if there is a difference in the publication/distribution area, in the printing or copyright date when a publication date is present, in the ISBN, or if the binding is different, then the item is considered to be a copy of the earlier manifestation and an existing record can be used and edited.

Although the workshop did not cover subject headings, there was a question about the subject heading Popular music $z United States, and why there was no $v Scores. Kaus said that this subdivision cannot be used for vocal music, popular music, country and jazz, or for scores for individual instruments. She followed up this discussion with an email explaining that Subject Cataloging Manual H1160 states "Do not use the subdivisions in List 1 [includes $v Scores] under ... [h]eadings for music of particular seasons, occasions or styles, etc. that neither state nor imply medium of performance." The examples given in H1160 include country music, which is a style similar to popular music, so that 650 _0 $a Popular music $z United States is a valid subject heading for the score of a piece of popular music. The email also included an addendum to the workshop handout that contained information about music subject headings and uniform titles, and more examples, which were a valuable addition to this informative workshop.

 

ADVANCED SCORES CATALOGING
Presented by Paul Cauthen, University of Cincinnati
Reported by Mary Huismann, University of Minnesota

Paul Cauthen, Assistant Music Librarian at the University of Cincinnati, began his presentation with a reminder of some essential music cataloging resources: Richard Smiraglia's Describing Music Materials, 3rd ed. (Soldier Creek Press, 1997), Michelle Koth's Uniform Titles for Music (Scarecrow Press, 2008) and the website "Music Cataloging at Yale." Other helpful resources mentioned include Universal-Handbuch der Musikliteratur compiled by Franz Pazdirek (for nineteenth century music), and Types of Compositions for Use in Music Uniform Titles, 2nd ed. compiled by the Music Library Association Working Group on Types of Compositions (for uniform titles). Cauthen also gave a brief overview of his scores cataloging "basic decision trees" for uniform titles and subject headings. The bibliography, decision trees, a compendium of stock phrases for notes, score and MARC record examples and topical index handouts are available from the conference website.

With the "basics" out of the way, Cauthen continued the presentation with multiple examples of sticky scores cataloging problems. He used MARC records to illustrate the various solutions to these problems. Among the many situations covered in this session were handwritten scores, sets of parts for large ensembles where the number of parts acquired may vary, transcription of non-Roman alphabet information, graphical notation, differences between teaching pieces and instructive editions, orchestral excerpts, identification of editions and printings, various elements of uniform titles, proper use of the LCSH subdivisions "Songs and music" and "Musical settings," monologues with music, edition statements for vocal music, tablature, and use of vendor records.

 

BASIC SOUND RECORDINGS
Presented by Mark Scharff, Washington University in St. Louis
Reported by Nathan B. Putman, George Mason University

Mark Scharff is the Music Cataloger at the Gaylord Music Library of Washington University in St. Louis. In a small and packed room, Scharff presented basic sound recording cataloging, which included two handouts: the presentation slides and the examples (which included disc labels, container information, additional notes, etc.). Because of time constraints, Scharff stated that he would cover published compact discs containing musical works and describe the sources of information, the title proper, dates and numbers associated with sound recordings, information on performers and content notes, choice of entry, and added entries.

After the introduction, Scharff jumped into the presentation by describing the sources of information, noting that the term "label" is a leftover from the vinyl era. He described the places one might look for a title and, with regard to collective titles, stated: "We like collective titles. They make our lives much simpler." Suggestions followed on what to do when there is no collective title. His examples included steps to find the title proper, with two slides showing his thought process. These slides (titled "Omissions Testing") listed all possible titles from an item and walked through a step-by-step elimination process (described as taking away and giving back), with explanation, until he was left with the title proper.

Before moving on to dates and numbers, Scharff briefly described the use of the general material designation (GMD), stressing that you can do whatever you want in your local catalog (following local procedures, of course), but shared standards should go in the shared database. He also described the publication area as "painful," especially for older items. He suggested using the Internet to find publisher/manufacturer information and being aware of differences between sound recording cataloging and print cataloging.

Scharff said that the basic truth about dates and sound recordings is that sound recordings rarely, if ever, have a publication date. Sound recordings use a phonogram copyright date, which is the copyright date of the recorded sound (preceded by "p" in the MARC record), and may also include a copyright date for liner notes, etc. (preceded by "c"). This is due to the separation of copyright for recorded sound and carrier/accompanying materials. He suggested that an item having only a "c" date really has a clumsily presented phonogram date, which should be presented in the MARC record with square brackets and possibly a question mark. Scharff continued with reissue dates, dates of various formats, and composition dates, and later with standard numbers such as the stock/issue/label number and their variant forms.
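A sketch of how these dates might appear in an AACR2 publication area (the publisher and years are invented): the first line shows a disc with a true phonogram date, the second a disc bearing only a "c" date, supplied in brackets along the lines Scharff suggested:

```
260    $a New York : $b Nonesuch, $c p1998.
260    $a New York : $b Nonesuch, $c [1998?]
```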

Next came information for the MARC field 511 performer note. Scharff added that for some works, such as anthologies where different groups are responsible for each item, the performer or group may be better suited to the statement of responsibility in a 505 contents note. Although there are no guidelines for the order of performers, Scharff suggested using score order or listing local performers first. He stated that ISBD punctuation for the 511 (space semi-colon space) was the rule but is no longer required. He continues to use the "Compact disc" note, stating that it is in the rules even though some people see it as unnecessary.

At this point the session's time had run out (all too soon, as this was an in-depth and interesting presentation). The topics of "Choice of Entry" and "Added Entries" were covered briefly. Scharff stated that if we took anything away from these last two sections, it should be regarding named groups and named performers: if a group or an ensemble is named, do not include the individual names of the performers, with the exception of jazz ensembles.

 

ADVANCED SOUND RECORDINGS
Presented by Robert Freeborn, Pennsylvania State University
Reported by Vickie Brueck, Akron-Summit County Public Library

Robert Freeborn covered non-standard compact disc recordings, both music and one example of a non-music recording. He began by listing the problem areas for music sound recordings, which are the GMD, Physical description (300), Note fields, and the MARC fields: Type (Leader 06), 006, and 007. His last several slides in the presentation were a summary of the contents of these troublesome fields, a chart listing the preferred order of notes, and a listing of additional resources.

The bulk of Freeborn's presentation was spent describing unusual CD formats and the changes to the MARC record that these formats require. For non-music recordings, the main difference is in three fixed fields, which should be coded as follows: Type: i, Comp: nn, and LTxt: (coded for the content of the spoken book, such as f for fiction). For enhanced CDs (which include a video portion requiring a computer to view), an additional 007 and an 006 field must both be added to bring out the computer and video aspects of the CD, as well as some additional notes describing the video features.
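In OCLC's mnemonic fixed-field display, the coding Freeborn described for a fiction audiobook CD, and the leading bytes of the fields added for an enhanced CD, might look roughly like this (a sketch; other fixed-field values and trailing bytes omitted):

```
Type: i    Comp: nn    LTxt: f       (spoken-word fiction CD)
006    m ...                         (enhanced CD: computer-file aspects)
007    c $b o ...                    (enhanced CD: electronic resource on optical disc)
```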

Another problem type of CD involves those where one side of the disc is a DVD and the other side of the disc is an audio CD. Freeborn suggested cataloging the work based on the packaging. If the publisher issued it as a DVD with a bonus side of an audio CD, then it should be cataloged as a videorecording. However, if the publisher issues the work as an audio CD with a bonus DVD side, then it should be cataloged as a sound recording. Extra notes and 007 and 006 fields will need to be added to describe the video or audio portion of the work.

To prevent illegal copying, some publishers are encrypting their CDs. This information should be included in the record as a note. The publisher will include statements such as "Content protected compact disc" or "This CD is copy protected." Publishers also convey this information with a symbol:

[right-pointing triangle with a capital C in the center]

Super audio compact discs are strictly audio in nature, but they have a clearer sound than regular CDs. For these, add a note stating that they are Super audio compact discs (SACD) and change the value of the 007 $e to z. DVD audio discs will frequently have video content as well as audio content and the video content will need to be described with notes and 007, 006 fields.

Another category of sound recordings is those that are born digital, such as remote databases like the Naxos Music Library, iTunes, and podcasts/RSS files. These will all be cataloged on the sound recordings format, but the GMD will be electronic resource. MARC fields 006 and 007 will be added to describe the computer nature of these resources. With a recent change to the rules, a 300 field can be added to these resources whenever it makes sense to do so, although it is not required. An example would be: 300 __ $a 12 sound files : $b digital, MP3 files.
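The added fields for such a born-digital resource might be sketched as follows (a hypothetical fragment; trailing 006 bytes omitted):

```
300    $a 12 sound files : $b digital, MP3 files
006    m ...                         (computer-file characteristics)
007    c $b r                        (electronic resource, remote access)
```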

There are also stand-alone devices such as Playaways, which first appeared in 2005. For these, use the GMD electronic resource and the SMD sound media player, for example: 300 __ $a 1 sound media player (ca. 4 hr.) : $b digital ; $c 3 3/8 x 2 1/8 in. The 006 and 007 fields describing the electronic nature of the work will also need to be added.

There will always be new formats developed by publishers and collected by libraries so that catalogers will need to stay informed about the music publishing industry and be proactive in developing new standards to describe these new formats.

 

BASIC VIDEORECORDINGS CATALOGING
Presented by Jay Weitz, OCLC
Reported by Lucas Mak, Michigan State University Libraries

Jay Weitz began by emphasizing that the workshop would focus on the current rules, i.e. AACR2 and corresponding LCRIs, and then provided a brief overview of the history of cataloging videorecordings. According to Weitz, early rules for motion pictures actually were for "film" films, since it was the era before mass production of videorecordings. Since the concept of an integrated catalog came into being, AACR2 has been trying to use similar processes to catalog all materials, including motion pictures. Because intellectual responsibility is diverse for a motion picture or videorecording, it is usually entered under title.

Unlike videotapes, which require a VCR to play, DVDs can be played on computers. The dominance of DVDs in library video collections means almost all catalogers can catalog videorecordings from the title frames without leaving their seats. Besides title frames, videodisc and videocassette labels are also considered chief sources of information. Although the container is considered a secondary source, catalogers usually get system requirements and other useful information from it.

Weitz then provided a list of criteria for deciding whether or not to input a new record. Besides consulting OCLC's guidelines (http://www.oclc.org/bibformats/en/input/default.shtm), Weitz also recommended Differences Between, Changes Within: Guidelines on When to Create a New Record, published by the Association for Library Collections and Technical Services (ALCTS) in 2004. Weitz highlighted two criteria from the list: significantly different length and changes in publication dates. The former requires a meaningful difference in duration that reflects a different cut or version. Since it is not uncommon to see different durations listed in different places on an item, e.g., container, label, and the time read by the machine, catalogers need to pay close attention to these differences when trying to match an OCLC record against the piece in hand. For changes in publication dates, Weitz used an analogy of hardcover vs. paperback: if a change in date on the container merely indicates a redesign of the packaging without significant change to the work, that change does not warrant a new record.

In response to a question on statement of responsibility, Weitz said that 245 $c normally only includes producer(s), director(s) and writer(s) who have overall responsibilities for the work. However, under special circumstances, it is legitimate to put other entities in 245 $c, e.g., animator(s) for animation films. Narrator(s) or people who do voice over for animated characters should be recorded in 511. However, persons responsible for voiceovers of translations should be recorded in 508.

Since this was a joint conference of OLAC and MOUG (Music OCLC Users Group), Weitz spent a significant amount of time on music videos. In the broadest sense, "music videos" applies to all music moving images, including but not limited to concert films, operas, and video singles. Common practices and some LCRIs for music videos are based on recommendations in the Music Library Association's Working Group on Bibliographic Control of Music Video Material Report, published in 1996. According to LCRI 21.23C, when a work with a collective title has principal performer(s), the work is entered under the principal performer if there is only one, and under the first named if there are two or three. The rule of three applies when there are four or more principal performers. Principal performers are "those given prominence (by wording or layout) in the chief source of information of the item being cataloged" (AACR2 21.23A1, footnote 5). LCRI 21.23C essentially brings music video cataloging in line with sound recordings. If a work has no collective title but has principal performer(s), the "popular" and "serious" idioms come into play. When the principal performer(s) go beyond mere performance or execution of a work, choice of entry follows the same decision path as in LCRI 21.23C. This category is commonly known as the "popular" idiom because most music videos in this realm are popular, jazz, or rock music. On the other hand, if the principal performer(s) do not go beyond mere performance or execution, a work is entered under title. Since this decision most often applies to classical and other "serious" music videos, it is commonly referred to as the "serious" idiom. The same consideration applies to principal performers that are corporate bodies, meaning that when a music video is entered under a corporate body, that corporate body has to be responsible to a major degree for the artistic content of the work, in line with AACR2 21.1B2 Category E.
The Working Group emphasized that entities merely performing technical functions, e.g., producers and directors, are never considered for choice of entry. When a music video is of mixed responsibility, e.g., videos of operas (staged or partially staged), musicals, or ballets, it is entered under title because the responsibility is typically broad and diverse.

Throughout the workshop, Weitz repeatedly emphasized that catalogers should not agonize. In Weitz's words, "publishers of videorecordings are not well-behaved," so catalogers need to "be flexible" and "use your own judgment," since there are always ambiguities and exceptions not covered by rules or examples.

 

ADVANCED VIDEORECORDINGS CATALOGING
Presented by Jay Weitz, OCLC
Reported by Scott M. Dutkiewicz, Clemson University Libraries

Jay Weitz, Senior Consulting Database Specialist, OCLC, conducted this workshop which provided an opportunity for experienced catalogers of videorecordings to fine-tune their understanding of this format. For the most part, Weitz used the time to respond to questions from the participants, and at certain points provided mini-lessons on certain aspects, supported by his overheads, extensive PowerPoint document, and a set of sample bibliographic records. This summary will reference sections in the PowerPoint or records when applicable.

The first question from the participants revolved around the applicability of edition statements to terminology such as "colorized," "letterbox," and "widescreen." Weitz reviewed the aspect-ratio technology that gave rise to these terms. Letterbox is the term for the reduction of the theatrical screen ratio (1.5:1 or larger) down to the nearly square television screen; the resulting aspect ratio is usually 1.33:1. As entertainment systems developed the capability to project the same ratios as the theater, widescreen versions became more prevalent. Whether such designations are edition statements, entered in the 250 field, hinges on whether the statement includes terms such as version or edition, and also involves the cataloger's evaluation of the prominence of the statement on the container. See section 6.

There are a number of possible dates that appear on the resource and Weitz provided his recommendations for sorting them out. Section 12 provides a list of date sources and bibliographic events. Bibliographic events are the date of the original production, the original release as motion picture, the release as video, and the copyright of design or accompanying material. Date sources include the video image, the container, the cassette label, and accompanying material. One could construct a matrix of 16 or more possibilities from these variables! Since few of us would wish to do that, the most important fact to focus on is the date on the moving image itself.

This date must be evaluated in light of the history of the format. Weitz announced his only absolute rule of the afternoon by stating that a DVD release cannot be dated before 1997, the origin of the format. This rule may be generalized: the publication date of a resource cannot predate the existence of its carrier. A participant asked about the dating of Blu-ray discs. Another informed participant stated that Blu-ray began in 2002.

The original date of release should be included in a note, thereby serving users who want to know about the history of the work. Another participant pointed out that the release date can be crucial for formulating the uniform title. This is true, although the example provided may have also involved American and British versions with different valid release dates. Another participant lamented the habit of a publisher that apparently uses missing or ambiguous dating to suggest to purchasers that the product is perpetually new.

The issue of first date/second date in field 008/07-14 came up. The single date ("s") approach is used for a resource that contains substantial new material accompanying the feature presentation, regardless of its release history. The dual date ("p") approach is used for resources that simply rerelease the film (or the film and its original trailer). VHS recordings usually only had the capacity to contain the film and trailer; DVDs offer ample room for the film and a variety of extras. When Weitz was asked about a cutoff point for the amount of new material it takes to move from "p" to "s," he advised participants to use judgment and to ask how the resource presents itself. Subtitling and multiple language options tend to suggest new materials, so an "s" date type is recommended.
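In OCLC's fixed-field display, the two approaches might be coded as follows (the years here are invented for illustration):

```
DtSt: p    Dates: 2005, 1942            (plain rerelease of a 1942 film)
DtSt: s    Dates: 2005                  (rerelease with substantial new material)
```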

The topic of languages transitioned naturally to the application of the 041 Language Code field. Weitz described the current condition of the 041 as a "mess," due in large part to the double-duty use of $b for both subtitles of moving images and summaries of books. Thanks to an OLAC recommendation, a new subfield, $j, defined specifically for subtitles, was approved in October 2007, but it is not yet implemented in OCLC WorldCat.
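A sketch of the before-and-after coding for a hypothetical English-language film with French and Spanish subtitles:

```
041 1  $a eng $b fre $b spa             (current practice: subtitle languages in $b)
041 1  $a eng $j fre $j spa             (once $j is implemented)
```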

Weitz took this occasion to review the history of how we got to the DVD, using the materials in Section 9. Two disc formats, the CED and the Laser Optical Disc (itself in two varieties, CAV and CLV), existed until the advent of the DVD in 1997.

Returning to the matter of language, the class was alerted to treat claims made on containers with skepticism. Weitz shared that the menu screen is the final arbiter. Language options are expressed verbally in field 546 and coded in field 041. Weitz recommended OLAC's video language best practices document for assistance in this area.

As an added hint of coming attractions, the 007 coding of videodiscs in $e (for Blu-ray) will be implemented in the next year.

The group wanted to discuss region coding. Normally encountered as a globe/number logo, these codes are to be transcribed as found on the item in field 538. Weitz shared some background on the codes. There are Regions 1-8. Region 0 is technically not a "code" but expresses all regions. Region 8 is interesting since it provides for "special international venues" such as on airlines and cruise ships. When statements made on regional materials say they "cannot be played," this means they cannot be played on players commonly sold in the region. Players that can play "any code" may be purchased. Differences in regional coding, as well as color system considerations, do call for a new record in OCLC WorldCat.

Similarly, the three color systems (NTSC, PAL, SECAM) are also recorded in the 538 field. When digital television becomes the standard in the United States in February 2009, ATSC (Advanced Television Systems Committee) will join the color system list. Recording the acronym only is permissible. There is no designated order to follow in the 538 field in video formats. Weitz again recommended judgment, referring to Record Example 7.
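Taken together, the region and color-system information might appear in hypothetical 538 notes such as these (invented examples):

```
538    $a DVD, region 1, NTSC.
538    $a DVD, region 2, PAL.
```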

A participant asked whether to be concerned about recording the subspecies of DVDs, such as DVD-R, DVD-/+, and DVD-9. These variations should be indicated as such in 538. If other system requirements information is not stated, the cataloger should not guess or invent data.

There was an inquiry about situations in which information on the resource contradicted actual playability. Weitz recommended that the resource should be cataloged as it presents itself and local notes be added about actual playability. Participants were reminded that playability on a DVD player may be different from the response one obtains on a computer-driven DVD drive. Catalogers were encouraged to test questionable items on other players and unaltered equipment that the borrower is likely to use.

The workshop concluded with a brief mention of new recording formats, such as streaming video. Weitz provided an example of such (Record 19) and recommended continued use of the standard note for mode of access, although this appears obvious to the well-informed.

The wide-ranging discussion concluded with Weitz's recommendation of Library of Congress's "New Sound Recording Formats—Library of Congress Practice." Although the title does not suggest it, this document includes helpful notes on video formats the cataloger might encounter.

While the workshop did not reveal any radically new material, this reviewer appreciates the best practice reminders that contribute to better cataloging performance for this format. The expertise of both presenter and attendees was evident.

Sources:
"Guide to Cataloging DVD and Blu-ray Discs Using AACR2r and MARC 21" (at http://www.olacinc.org/new/index.html)
"Video Language Coding Best Practices Task Force Draft Recommendations" (http://www.olacinc.org/capc/langcodedraft1.html)
"New Sound Recording Formats—Library of Congress Practice" (http://www.loc.gov/catdir/cpso/soundrec.pdf)

 

ELECTRONIC RESOURCES CATALOGING
Presented by Bobby Bothmann, Minnesota State University, Mankato
Reported by Jan Mayo, East Carolina University

Bobby Bothmann, Electronic Access/Catalog Librarian at Minnesota State University, Mankato, gave a thorough and informative session on electronic resources cataloging, so much so that it was not apparent that he was a late substitution for the original presenter. His presentation style was relaxed and easy to follow, and he took questions from the audience as he went along, which helped to clarify the more difficult to understand portions of his material.

He began by giving an overview of what he planned to present, followed by a list of links to resources for electronic cataloging, explaining a little about each one. In defining the term "electronic resources," he made the point that, to be an electronic resource, an item must require a computer to be played. Playaways are a point of contention, but for the sake of national standards, they should be given the GMD "electronic resource"; however, for local catalogs, the use of "sound recording" or even "playaway" as the GMD could be acceptable.

The next concept Bothmann covered was the nature and content of the resource. A convenient list of what kinds of materials can be an electronic resource followed. There are two types of access: direct, which requires a physical carrier, and remote, which uses computer networks. To determine which chapters of AACR2r to use when cataloging, first determine the primary content of the resource, and then apply the corresponding chapter in conjunction with the Chapter 9 (Electronic Resources) cataloging rules.

What is being cataloged must be considered. Is it a discrete or a component resource? Is it monographic, serial, or integrating? Bothmann provided a chart that clearly illustrates finite vs. continuing resources. Using the appropriate Type of Record and Bibliographic level is also important.

Formerly, all electronic resources were Record Type "m." Now, this is only used for computer files, but should also be used when you are unsure if what you have is a computer file or not. If the Record Type is "m," be sure to use the appropriate File Type.
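
The Record Type guidance above might be sketched as follows (the mapping of content types to codes here is illustrative, not an official table):

```python
# A minimal sketch of the Record Type logic described above: "m" is now
# used only for computer files, and also serves as the fallback when the
# cataloger is unsure. The mapping below is an assumed illustration.

def record_type(primary_content):
    types = {
        "computer file": "m",    # software, numeric data, etc.
        "text": "a",             # e.g., an e-book cataloged as language material
        "cartographic": "e",     # e.g., a digital map
        "projected medium": "g", # e.g., streaming video
    }
    return types.get(primary_content, "m")  # unsure -> default to "m"

print(record_type("text"))            # a
print(record_type("unknown content")) # m
```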

Bothmann reviewed the fixed field elements and many of the areas of the bibliographic record, highlighting the aspects that pertain to electronic resources. This included the assigning of the 006 and 007 fields; how to determine the chief source of information and elements of Areas 1, 2, 4, 5, and 7; as well as the 856 field.

He touched briefly on the use of form subdivisions and finished his presentation by displaying sample records for an e-book, a digital map, a digital image, and a blog or RSS feed, applying the rules and interpretations he had just shared with us.

 

FORM/GENRE HEADINGS
Presented by Janis L. Young, Library of Congress
Reported by Beth Flood, Harvard University

Janis Young discussed the ongoing implementation of genre/form headings by the Library of Congress. Two main objectives of the genre/form project as a whole are: 1) to assist retrieval by creating access points for genres and forms of expressions, and 2) to have a system of authority records that permit future development and maintenance and that support automatic validation of headings. LC began the genre/form project with headings for moving images and radio programs. These areas were chosen in order to identify issues and determine policies in the context of a relatively small group of headings.

An important distinction made during this presentation is the conceptual difference between genre/form headings and subject headings. LC considers genre/form headings to be headings which describe what a work actually is, rather than the subject of the work. An implication of this decision is that a record can contain both topical subject headings (MARC field 650) and genre/form headings (MARC field 655).

The preferred approach for establishing genre/form terms in the authority file is to create separate records for the genre/form heading and the term as a subject heading. MARBI originally considered a proposal for new fixed field (008) coding indicating whether the term would be appropriate as a topical and/or genre/form term. This was rejected in favor of the two record approach. Topical authority records will be coded as MARC field 150 for the authorized term; form/genre records will be coded as MARC field 155. Both records may contain the same see references (4XX fields) and broader terms (5XX fields). Subject terms used in bibliographic records (MARC field 650) which are also used as genre/form terms are now required to include a subdivision, indicating they are subject terms. For example, the term "War films" used as a subject now should include the subdivision "History and criticism" to make it clearly distinct from the genre term "War films [no subdivision]".
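
The two-record approach and the subdivision convention might be sketched like this (hypothetical data structures, not LC's actual authority format):

```python
# Sketch of the two-record approach described above, using invented dict
# structures: the same string yields a 150 topical authority record and a
# 155 genre/form authority record, while bibliographic use as a subject
# (650) carries a subdivision to distinguish it from genre use (655).

def authority_records(term):
    return [
        {"tag": "150", "a": term},  # topical authority record
        {"tag": "155", "a": term},  # genre/form authority record
    ]

def bib_fields(term, subdivision):
    return [
        {"tag": "650", "a": term, "x": subdivision},  # work is ABOUT the genre
        {"tag": "655", "a": term},                    # work IS the genre
    ]

recs = authority_records("War films")
bib = bib_fields("War films", "History and criticism")
print(bib[0])  # {'tag': '650', 'a': 'War films', 'x': 'History and criticism'}
print(bib[1])  # {'tag': '655', 'a': 'War films'}
```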

When a scope note indicates the term stands for a type of work, rather than the subject of a work, it is currently permissible to use LCSH topical headings as genre/form headings. If no scope note is present, catalogers should use their own judgment to determine if a term represents a genre or form. For example, the headings "Cantatas (Equal voices)," "Detective and mystery stories," and "Nautical charts" can be correct as genre/form terms, but the heading "Human figure in art" is not correct and can only be used as a topical term. Headings which are not already established as genre/form terms but can be used as such should currently be coded as local headings: 655 _7 $a [heading] $2 local

A pilot project is currently underway in which two libraries are contributing new and revised genre/form authority records through SACO and are testing a web-based fill-in form and workflow. After the project is completed, LC will begin accepting genre/form proposals from all SACO libraries. In the next few months, LC will begin using moving image and radio program headings in their cataloging. LC is also formulating timeline recommendations for implementation in two new areas, music and law.

To assist in the implementation of genre/form terms, a subcommittee has been formed by the ALCTS-CCS Subject Analysis Committee. This subcommittee is charged with facilitating communication between LC and cataloging communities interested in genre/form implementation.

 

INTEGRATING RESOURCES
Presented by Joseph Hinger, St. John's University
Reported by Amy Pennington, Saint Louis University

This workshop was a condensed version of the longer SCCTP Integrating Resources Cataloging Workshop that Hinger has given in various locations.

Hinger began by giving a brief background of the development of cataloging rules, guidelines, and codes relating to integrating resources, due to the changing "bibliographic landscape." These new AACR2 rules, LCRIs, and Leader Bibliographic level code "i" were implemented in 2002. He explained that AACR2 Ch. 12 (Continuing Resources) now has two parts for each rule: one that relates to serials and the other to integrating resources. In addition, there are two types of integrating resources: print (updates are integrated into the original base volume), and electronic (updating Web site). He made the point that just because a print resource "has holes" and lives in a binder does not make it an integrating resource; you have to look at the content and intent. The concept of "updating" is central to the definition of an integrating resource.

Hinger also spent some time explaining some of the differences between monographs and continuing resources (including both serials and integrating resources), and how to tell them apart (LCRI 1.0). Continuing resources have no predetermined conclusion, but the various parts or updates may remain discrete (serials) or not (integrating resources). A monograph, on the other hand, is either complete in one part or a finite number of separate parts. He went on to explain, however, that even a finite updating Web site (a conference Web site, for example) is still an integrating resource, and that online and loose-leaf format resources may be monographic, serial, or integrating. A CD-ROM or any other direct access e-resource cannot be an integrating resource. In terms of remote access resources, if you can access the earlier iterations you probably have a serial or multi-part monographic item; if you cannot access the earlier iterations, you have an integrating resource. If you truly cannot determine what it is, consider it an integrating resource.
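
The remote-access heuristic above can be sketched as a small decision function (a paraphrase of the prose, not an SCCTP flowchart):

```python
# Rough encoding of the heuristic for remote-access resources described
# above: accessible earlier iterations suggest a serial or multipart
# monograph; inaccessible earlier iterations suggest an integrating
# resource; and in genuine doubt, integrating wins.

def remote_issuance(earlier_iterations_accessible, determinable=True):
    if not determinable:
        return "integrating resource"  # when you truly cannot tell
    if earlier_iterations_accessible:
        return "serial or multipart monograph"
    return "integrating resource"

print(remote_issuance(True))                     # serial or multipart monograph
print(remote_issuance(False))                    # integrating resource
print(remote_issuance(True, determinable=False)) # integrating resource
```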

The first steps in original cataloging of an integrating resource include: determining the aspect of the resource that your bibliographic record will represent, the type of issuance, the primary content (which affects the Type of Record and 008 / OCLC workform you will use), and the iteration you have (which affects how you record dates of publication). He went on to describe in more detail the MARC leader and control fields that are used for these resources.

The next part of the workshop dealt with bibliographic description (using AACR2 12.0B1b). Those areas that are based on the current iteration include: title and statement of responsibility; edition; publication, distribution, etc. (except dates); physical description (optional for e-resources); and series. Areas based on the first and/or last iteration(s) include: dates of publication, distribution, etc. Areas based on all iterations and any other source include: notes; standard number and terms of availability. One change since the 2004 update of AACR2 is that one is no longer required to use the 516 field (type and extent of resource); rule 9.3 was deleted with this update.

An important point made concerning publication information is that square brackets are not needed as long as the information comes from anywhere on/in the resource.

When discussing publication dates, Hinger emphasized that the DtSt field is extremely important and that getting something in the Date 1 field is much better than nothing (even if it is just 199u). If you have the publication date of the first iteration (unlikely), it can be put in the 260 field. If no explicit statement of the publication date of the first iteration appears, put an estimated date (or range of possible dates) in the 362 field. It was also pointed out that a copyright date should not be considered an explicit statement of date of publication, although if a range of copyright dates appears, one can probably assume a correspondence with publication dates.
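
This date guidance might be sketched as follows (a hypothetical helper with made-up field labels, following the practice described above):

```python
# Sketch of the date guidance above: something always goes in Date 1,
# even a partial date like "199u"; an explicit first-iteration publication
# date goes in 260, while an estimate belongs in a 362 note. The field
# labels and note wording here are illustrative assumptions.

def date_fields(first_iteration_date, explicit):
    fields = {"Date 1": first_iteration_date}  # even "199u" beats nothing
    if explicit:
        fields["260 $c"] = first_iteration_date + "-"
    else:
        fields["362 1_"] = "Began in " + first_iteration_date + "?"
    return fields

f = date_fields("199u", explicit=False)
print(f["Date 1"])   # 199u
print(f["362 1_"])   # Began in 199u?
```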

Concerning note fields, Hinger mentioned that he does not generally use a system requirement note about Adobe Acrobat Reader being required, or a mode of access note that specifies "World Wide Web." He thinks those are obvious in this day and age. A source of title proper note is absolutely required.

Hinger also discussed some concerns with 856 fields. One important thing to remember is that the URL used in the 856 must match the granular level of the description (a link to the home page if the home page is being described, for example). He recommended not using $z (Public Note) for link text (or for explaining restrictions, etc.) in OCLC records. Obviously you can do what you want or have to do to make things display properly in your local system.

A discussion ensued about the use of classification in records for electronic integrating resources, since it is not required. The point was made that a patron browsing by call number would not find a potentially useful resource if a classification number was not provided or indexed. Some catalogers put only the class number portion without additional Cutter(s) or dates in the call number field for these resources, so that it will at least appear in a browsed call number index.

When it comes to updating integrating resource records, anything can change (just like serials), but all the changes must be reflected in the same bibliographic record.

One last important thing that was discussed was the use of the 247 and 547 fields with integrating resources. MARC field 247 subfield $a is used for the title proper when it changes, and subfield $f is used for the corresponding dates, if known. The 245 field always reflects the current title proper, and all former titles go in 247 fields. The 547 field is a complexity note that accompanies the 247 field(s) when further information about them is needed.
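
The title-change handling described here might be sketched like this (a hypothetical record structure with invented titles, not an actual MARC implementation):

```python
# Sketch of the title-change handling for integrating resources described
# above: the 245 always holds the current title proper, and the superseded
# title moves into a 247 entry along with its dates, if known.

def change_title(record, new_title, former_title_dates):
    former = record["245"]
    record.setdefault("247", []).append(
        {"title": former, "dates": former_title_dates}  # former title + dates
    )
    record["245"] = new_title  # 245 now reflects the current title
    return record

rec = {"245": "Facts on file"}          # invented example titles
change_title(rec, "Facts online", "1999-2005")
print(rec["245"])               # Facts online
print(rec["247"][0]["title"])   # Facts on file
```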

Hinger kindly provided copies of his full SCCTP workshop presentation slides as a handout, and even though we did not quite make it through each slide, everyone was extremely pleased with the amount of quality information and guidance received about cataloging these tricky resources.

 

METADATA FOR AUDIOVISUAL MATERIALS AND ITS ROLE IN DIGITAL PROJECTS
Presented by Jenn Riley, Indiana University, Bloomington
Reported by Lauren K. Marshall, John Carroll University

Jenn Riley took her audience on a "whirlwind tour" of a representative sample of metadata standards compatible for use with images, audio, and video. The primary focus was on those standards used by cultural heritage institutions, e.g., libraries, archives, museums. She emphasized the importance of finding the right fit between one's needs and an appropriate metadata format. Also significant was the idea that metadata standards reflect the values of those who created them to serve specific needs in describing, managing, and/or providing access to their resources. Objectives of the workshop were to lessen apprehension about metadata formats and to help participants know what questions to ask when making metadata decisions for digital projects.

The workshop began with an introduction to XML (eXtensible Markup Language), which is used to encode many metadata formats. The use of XML as a background encoding for metadata formats enhances the shareability/interoperability of formats across systems and environments. Riley described four general types of metadata: descriptive, administrative, structural, and markup languages. Descriptive metadata serve to describe properties of resources, such as title, dates, publishers, etc. Administrative metadata help manage aspects of resources, such as preservation information, usage rights, or technical information. Structural metadata help the user navigate within a resource or between related resources, e.g., within a digitized set of 10 audio CDs, organizing information related to the order and navigation of the CDs, tracks, and related text. Markup languages are not technically metadata, but are XML coding that "marks up" the full content of a resource with metadata, e.g., "header," "paragraph," etc., within a text document.
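
A minimal example of descriptive metadata encoded in XML, built with Python's standard library (the element names follow the Dublin Core element set; the record content itself is invented for illustration):

```python
# Building a tiny Dublin Core descriptive record in XML. Element names
# come from the Dublin Core Metadata Element Set; the values are invented.
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

record = ET.Element("record")
for element, value in [
    ("title", "Oral History Interview, Tape 1"),  # descriptive properties
    ("creator", "Smith, Jane"),
    ("date", "2008"),
    ("type", "MovingImage"),
]:
    ET.SubElement(record, "{%s}%s" % (DC, element)).text = value

xml_out = ET.tostring(record, encoding="unicode")
print(xml_out)
```

Serializing with a registered prefix yields markup such as `<dc:title>...</dc:title>` inside the record element, which is the kind of shareable, system-neutral encoding Riley described.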

The next part of the workshop was a barrage of metadata schema examples (only a few of which are mentioned here), with information about their properties, interoperability, and usage. First, general descriptive metadata schema, e.g., MARC, Dublin Core, were covered. These are intended for use with a variety of media/resource types and tend to be bibliographic in nature. Media-specific descriptive metadata formats were discussed next. These standards reflect specific needs related to the description and access of a particular media type (still images, music, artworks, video, etc.) and do not work well for generalization to other types of resources. Media-specific administrative metadata formats emphasize technical information involved in the creation, storage, and access of resources, e.g., file type and size, or camera/audio equipment settings at time of creation, and are often created by machine directly from digital file information. The primary structural metadata format discussed was METS (Metadata Encoding and Transmission Standard), which Riley termed a "wrapper" for packaging many types of metadata for a resource together, connecting descriptive and technical metadata with content, for example. METS documents would be generated by software tools, not people.

Riley concluded the workshop by presenting several scenarios and possible choices for implementation of metadata standards to meet the needs of those situations. She emphasized that in order to implement any metadata format, there must be tools and systems available to utilize it, and it must address the needs of the users and resources. Decisions about metadata implementation need not be constrained to the formats currently available, and Riley encouraged participation and leadership from the cataloging and metadata specialist community to contribute to the creation of useful metadata formats and the tools/systems needed to implement them. Overall, despite the rapid pace of the presentation, Riley succeeded in imparting a level of understanding that should increase comfort levels of working with and making decisions about metadata formats and their uses.

 

WORLDCAT LOCAL
Presented by Cathy Gerhart, University of Washington
Reported by Debbie Ryszka, University of Delaware

Cathy Gerhart, Music/Media Cataloger at the University of Washington Libraries, presented an overview of their implementation of WorldCat Local. She likened WorldCat Local, a new search and discovery tool developed by OCLC, to Google, saying that it is a Google-like interface to an online catalog, and used a live feed to their online catalog to demonstrate searches, displays, and product features.

University of Washington Libraries, serving approximately 60,000 on-campus users, installed WorldCat Local in a beta-test mode in 2007. The Libraries have been using this as the interface to their online catalog since then. On the University of Washington Libraries web site, WorldCat Local is prominently featured as a search box entitled "Search UW Libraries and Beyond." It offers streamlined searching and discovery for users of the University of Washington Libraries online catalog.

Gerhart explained the many reasons why the Libraries decided to install WorldCat Local, among them: one interface for everyone who uses the University of Washington Libraries web site, one search box for many catalogs, a single form for Interlibrary Loan users to fill out, and an easy mechanism for teaching how to search and navigate WorldCat Local and the Libraries' online catalog. Throughout her presentation, Gerhart reiterated that searching WorldCat Local is just like searching Google—just put something in the box. Users of their online catalog find it easy to use and seem pleased with the product. For the foreseeable future, WorldCat Local will be the way that users enter the University of Washington Libraries online catalog. To date, feedback from comments left by users has been overwhelmingly positive. Because WorldCat Local is still in a pilot phase, changes are constant. Recent additions to WorldCat Local include records for articles from major databases, such as ERIC and MEDLINE.

On the downside, Gerhart noted, a search in WorldCat Local does not give users access to everything in the University of Washington Libraries collections, but OCLC and staff at the Libraries are working to remedy that situation. Materials not included in WorldCat Local searches include on-order and in-process materials, records for works that have not been retrospectively converted by the Libraries, licensed third-party record sets such as EEBO and ECCO, and some microform sets. When users want research materials like these, they are encouraged to ask librarians for assistance.

Gerhart remarked that WorldCat Local may not be for users or scholars doing research on an in-depth level. Sophisticated researchers may not find WorldCat Local as useful as undergraduates and others seeking quick discovery. In situations such as these, researchers and scholars need to know to go elsewhere to meet their detailed information needs. When consulted, the librarians on the University of Washington Libraries staff direct these users to the right places to begin and conduct their research. Frequent and savvy users of the media and music collections at the Libraries are being encouraged to use the online catalog directly and to bypass WorldCat Local.

Gerhart showed those in attendance exactly how WorldCat Local functioned by performing specific searches. We were able to see how searches worked in WorldCat Local and how holdings for the University of Washington Libraries automatically floated to the top of search results. Gerhart navigated through specific displays by using many of the features and enhancements available in WorldCat Local. She pointed out which information in MARC records is being displayed in WorldCat Local records currently and which fields are being ignored. For media and music materials, fields 508 and 511 do not display presently, and Gerhart thought that they should. Uniform titles, relator codes, and genre headings also do not display currently. A MOUG committee is looking at these issues with representatives from OCLC.

To view the University of Washington Libraries installation of OCLC's WorldCat Local, see: http://www.lib.washington.edu/. For a more detailed description of the University of Washington Libraries implementation of WorldCat Local, consult:
Jennifer L. Ward, Steve Shadle, and Pam Mofjeld. "WorldCat Local at the University of Washington Libraries." Library Technology Reports, v. 44, no. 6 (2008).

 

RDA PROGRAM
Presented by Glenn Patton, OCLC, and Heidi Hoerman, University of South Carolina
Reported by Robert Ellett, San Jose State University

Glenn Patton, Director of WorldCat Quality Management at OCLC, discussed the history of RDA and the current and future state of development of the proposed cataloging code. As a caveat, he stated that some of his projections looked more than 18 months into the future. The RDA prospectus indicates that while RDA was built on the foundations of the Anglo-American Cataloguing Rules, 2nd edition, revised (AACR2r), and was originally called AACR3, its broader scope includes not only libraries but also other metadata communities such as archives, museums, and publishers. The constituent organizations responsible for the development of RDA include U.S., U.K., Canadian, and Australian library organizations, among them the Library of Congress and the British Library.

RDA has its roots in AACR2, the Paris Principles (1961), the International Standard Bibliographic Description (ISBD), the Functional Requirements for Bibliographic Records (FRBR) and Functional Requirements for Authority Data (FRAD), the growth of electronic and digital resources with the proliferation of the Internet, the University of Toronto Conference (1997), and the International Meeting of Experts on an International Cataloguing Code. RDA includes element sets which encompass FRBR attributes and relationships. Mappings to the encoding standards of ISBD, MARC 21, and Dublin Core will also be included, as well as standardized terminologies known as RDA vocabularies. These vocabularies will make distinctions between content type, carrier type, media type, and relationship designators. Patton introduced the concept of an element set, such as title, which includes the sub-elements title proper, parallel title, and other title information.

RDA's core elements are influenced by the FRBR user tasks of find, identify, select, and obtain and the FRAD user tasks of find, identify, contextualize, and justify. Patton then discussed the FRBR Group 1 entities: work, expression, manifestation, and item.
The outline of RDA will consist of a general introduction, two main parts on recording attributes and recording relationships, and a number of appendices. Other communities such as publishers are working on a framework with RDA and ONIX data. A draft of RDA is projected to be available in late October with the initial release as an electronic document in the third quarter of 2009. Lastly, Patton discussed implementation issues such as testing and training.

Heidi Hoerman, Instructor at the University of South Carolina's School of Library and Information Science, gave a very humorous presentation entitled "How Should I Prepare for RDA?, Should I Prepare for RDA?" Being a cataloging instructor, Hoerman stated clearly that she "didn't have a horse in this race." Her best guesses about RDA were derived from reading, poking informants, and thinking about the process. Hoerman predicted that due to time constraints and economic downturns, RDA will not be published; instead, an AACR2/2010- would be published with some underlying RDA principles. RDA's goals of shedding AACR2 baggage, being more global, and solving the multiple-versions problem are too drastic a change for the cataloging community. Hoerman indicated that the goals for RDA conflict: it is meant both to break from the past and to remain compatible with AACR2. Hoerman stated that there are several nails in the RDA coffin, including the Library of Congress Working Group on Bibliographic Control's recommendation to suspend work on RDA, and the national libraries' reluctance to implement it prior to extensive testing by the national libraries and cooperative partners. Hoerman insisted that while the value and merit of RDA are being debated, the cataloging community still needs to update its existing cataloging rules.

 

POSTER SESSIONS
Reported by Rebecca Belford, University of Oregon

The well-attended poster session featured eleven posters. The presenters displayed a range of projects and developments in media cataloging and metadata: digital collections, moving image metadata schemes, cataloging tools and decisions, workflows for specific formats, and new discovery mechanisms for music.

Collaboration on digital projects was the focus of two of the posters. Kate James (Illinois State University) presented a collaboration of the Milner Library and the School of Art in "The Art of Collaboration: Creating an Effective Metadata Workflow for a Digital Project." James demonstrated the collaborative workflow for digitized art images by using a flowchart to illustrate the multiple locations for metadata assignment and review: the slide library, the digitization center, and the metadata unit. Quality control in the project occurs at multiple levels, involving review and approval first by the metadata librarian, then by the slide library manager, and final review and approval by the metadata librarian. The growing collection is available online through a CONTENTdm interface on the library's Website.

Harris Burkhalter (Minnesota State University Mankato/Westonka Historical Society) presented a collection resulting from collaboration on a statewide scale in "Metadata Use at the Minnesota Digital Library and User Research." Burkhalter presented the development of metadata practices and guidelines for the "Minnesota Reflections" collection, the first project of a coalition of museums, libraries, and colleges across Minnesota. Dublin Core—with a few modifications and additions—was chosen to organize the collection, based on the simplicity and extensibility that allow both non-cataloger volunteers and catalogers to enter metadata. The collection of over 30,000 digitized historical images and documents is available online, offering both easy and advanced search capabilities as well as a social element in permitting user comments.

PBCore, a specialized metadata standard, was the topic of "PBCore: A Dynamic Metadata Standard for Motion Media" by Tom Adamich (Visiting Librarian Service). Based on the Dublin Core metadata standard, PBCore is used to describe media created by the Public Broadcasting community. Adamich profiled the creation and structure of PBCore, addressed display with XSLT and HTML, and cited related resources. Accompanying screenshots illustrated the project's home page and the search fields available in the Educator Search mode.

Three posters addressed workflows and ideas related to cataloging specific formats: spoken-word recordings, video games, and screencast tutorials. Lucas Mak (Michigan State University) outlined an economical solution to cataloging spoken-word recordings in "Using Student Employees in Cataloging Digital Spoken Word Recordings." The MSU Vincent Voice Library contains over 40,000 hours of spoken-word material. Most of this material is not accompanied by abstracts and requires complete listening to construct accurate summaries. The library has hired students to perform the time-intensive work of listening to the recordings, checking audio quality, and writing summaries. Students create brief database and OCLC Connexion template-based MARC records for each recording. The records are later reviewed and enhanced by a catalog librarian. The presentation addressed some of the drawbacks of this method, including issues of typographic accuracy, bias in summaries, and difficulties with subject analysis.

Video games were the topic of "Video Games PWN the Library" by Megan Dazey (University of Oregon). Dazey included talking points for recommending a video game collection to an academic library, noting that video games account for 15% of all circulation at the University of Oregon science branch library and that students use the collection for social events publicized through Facebook. A complete MARC record and cataloging tips demonstrated the cataloging decisions made in this project. Issues in creating a collection development policy and circulating complex items like console sets were also addressed. ("Pwn" is gamer slang for the domination of a rival, derived from the word "own".)

Marcy A. Strong (Binghamton University), in "Cataloging Screen Cast Tutorials in Dublin Core and MARC," addressed the history and workflow of cataloging tutorials created by subject librarians using the Camtasia software for research instruction. Subject librarians catalog the tutorials in Dublin Core upon creation using a feature in Camtasia. Working from a screen capture of the Dublin Core record, catalogers later catalog the tutorials as electronic resources in MARC format under the title of the resource being taught and collocate them with a consistent tracing in the MARC 793 field. In response to faculty and teaching assistants' preferences for easy access to the tutorials, records are added to the library catalog with direct links to the tutorials.

In "Use of a Series Title to Track Named Collections," Valarie Adams (University of Tennessee at Chattanooga) presented a poster rich with both MARC and OPAC examples of the Lupton Library's approach to tracing named collections with a series title using MARC field 830. In part a response to donors' desire for named collections to be kept together, the series tracing allows virtual access to a named collection without housing the collection together physically. The series titles are also used to add title access and browsing for electronic journals, audiobooks, and other formats that would be otherwise difficult to retrieve as a set.

Tools that increase efficiency and functionality in cataloging were the focus of two posters. Teressa Keenan and Leslie Rieger (University of Montana) outlined the four major phases in their library's adoption of the Macro Express utility in "All Aboard the Macro Express." The phases were discovery, which involved research into the product, cost, and training; implementation of the macros for OCLC downloading, holdings and item information, and purchase orders; sharing within Mansfield Library; and future possibilities and evaluation. In a poster rich with advice and supporting statistical evidence on reductions in time spent on specific workflows and in repetitive keystrokes, Keenan and Rieger demonstrated the increased efficiency gained at their library through the use of Macro Express.

Susannah Benedetti and Gary Moore (University of North Carolina-Wilmington) also demonstrated helpful utilities for catalogers in "Catalog 2.0: Implementing Browser Tools for Customized Searching." A set of "2.0" utilities was compiled for catalogers at their library: a search box in the library toolbar, embedded search boxes, tutorials, and ISBN searches. Catalogers can use these tools to access the library catalog directly without first navigating to the OPAC and to access external resources like Classification Web, OCLC's Bibliographic Formats and Standards, and local resources. While the tools are of high value to catalogers, many also enhance search efficiency for public users.

Addressing the practical need to track library collections, Gayle Porter (Chicago State University) offered information and advice in "Lessons from Using RFID on Media: A Case Study of RFID Implementation at Chicago State University." Porter discussed RFID technology, retrospective conversion issues, pros and cons of use for media, and best practices for RFID use on various media types. Numerous examples of fully processed media items supplemented the information in the poster and provided a forum for audience questions.

Departing from traditional cataloging and metadata, Susannah Cleveland and Gwen Evans (Bowling Green State University) presented "Moody Blues: The Social Web, Tagging, and Non-Textual Discovery Tools for Music." The HueTunes project, in an early phase, grew out of conversations about the needs of the graphic design department in locating album cover art. Currently, users tag musical selections by selecting a color from a palette. Phase 2 will see increased data collection and analysis. The project aims to reduce language barriers, reach non-text-based learners, reduce the dependence on expert knowledge in interpreting catalog records and finding music, and examine the relationship between music and mood or color.

The posters as a group represent the diversity of activity in audiovisual and multimedia cataloging across a variety of libraries. Innovations in traditional workflows coexist with collaborative digital collections, unique metadata schemes, non-textual discovery, and "2.0" features. The session demonstrated that traditional AV cataloging is thriving while moving in new directions.

Publications:

After extensive consultation, UKSG announced the release of Version 2.0 of the Transfer Code of Practice, available at: http://www.uksg.org/Transfer/Code. Publishers are being asked to publicly endorse and follow the Transfer Code of Practice, a set of voluntary "best practices" for the industry, by emailing marketing(at)uksg.org and stating that they endorse the Code. Publishers endorsing Transfer will be listed on the Publisher Endorsement page at: http://www.uksg.org/Transfer/Transfer_Publishers. If the Code is widely adopted, the Transfer Working Group may consider some next steps, such as establishing a more formal international committee to consider issues including practical guidelines for implementers and the development of a simple alerting service at a centralized location. Questions may be addressed to Ed Pentz, Chair of the Working Group, at epentz(at)crossref.org.

Following a survey to solicit subscriber feedback about Cataloger's Desktop, CDS acted on the popular recommendation to integrate the Cataloging Service Bulletin (CSB) into Cataloger's Desktop. Plans are to include the CSBs from issue 84 (Spring 1999) to the present. Issues 1-83 will not be scanned and included in Desktop because they would not be searchable. Questions can be addressed to Bruce Johnson at bjoh(at)loc.gov.


Projects:

Indiana University received a National Leadership Grant from the Institute of Museum and Library Services for a project entitled "Variations as a Test-bed for the FRBR Conceptual Model," beginning October 1, 2008, which builds on Indiana University's expertise in digital music libraries and the Variations digital music library system (http://www.dlib.indiana.edu/projects/variations3/index.html). The library will provide a concrete test-bed for the Functional Requirements for Bibliographic Records (FRBR) conceptual model and produce code and system designs that can be re-used by others interested in FRBR (http://www.imls.gov/news/2008/091008a_list.shtm#IN). Jenn Riley elaborated on the project's primary activities in an announcement sent to the Autocat listserv on September 12, 2008. They include:

The project expects to have the following concrete work products:

This study should serve the cataloging community well.


Online and Public Training of Interest to Catalogers:

Aslib, The Association for Information Management, based in London, England, offers selective open learning courses (http://www.aslib.com/training/openlearning/index.htm) and a number of public Library and Information Management Skills courses of interest to cataloguers, such as Abstracting and Summarising, Basic Cataloguing & Indexing, Building and Deploying a Corporate Taxonomy, Cataloguing Practice, Classification Practice, and Constructing a Thesaurus (http://www.aslib.com/training/section4.html). For a calendar of public training sessions, see http://www.aslib.com/training/calendar/calendar.html.

Innovative Interfaces announced Innovative University in its July 2008 Innovative News: INN Brief. Innovative University consists of various two-hour WebEx-based training webinars of general interest to Innovative sites seeking a cost-effective way to offer training on Millennium. Some of the webinars offered in the near future include: Millennium Acquisitions - Fiscal Close, Millennium Acquisitions - EDIFACT Invoicing, Millennium Create Lists - Advanced, Customizing Millennium Print Templates, and Global Update Basics.

The American Library Association (ALA) now has two 25-seat virtual rooms, one 50-seat virtual room, and one 100-seat virtual room available for use by any ALA group or affiliated group at any time. Rooms are run through OPAL (Online Programming for All Libraries) but are booked by ALA staff. Groups should contact their ALA staff liaison for reservations.

Footnotes

1 Kathy Glennan, "Final Report available: MLA's BCC Working Group on Work Records for Music," message posted to the RDA-L listserv, Sept. 9, 2008.
2 WorldCat.org interface features website at http://www.oclc.org/worldcatorg/features/default.htm (viewed Sept. 28, 2008).
3 "Introducing Content Pro," Innovative News: INN Brief, July 2008, email newsletter to Innovative subscribers.
4 Jenn Riley, "Variations/FRBR project funded," email message sent to the AUTOCAT distribution list, September 12, 2008.


©Taylor & Francis