Cataloging & Classification Quarterly

Volume 38, no. 2, 2004





 Sandy Roe, News Editor


Welcome to the news column.  Its purpose is to disseminate information on any aspect of cataloging and classification that may be of interest to the cataloging community.  This column is not just intended for news items, but serves to document discussions of interest as well as news concerning you, your research efforts, and your organization.  Please send any pertinent materials, notes, minutes, or reports to: Sandy Roe, Milner Library, Illinois State University, Normal, IL 61790-8900 (email:, phone: 309-438-5039).  News columns will typically be available prior to publication in print from the CCQ website at


We would appreciate receiving items having to do with:


Research and Opinion 




Meeting Minutes of the ALCTS Directors of Technical Services of Large Research Libraries (“Big Heads”) annual meeting, held during the American Library Association Annual Conference, Toronto, Ontario, June 20, 2003

For the text of the Round Robin on issues of concern to these institutions see

Welcome and introductions, announcements (Chair)

Bob Wolven (Columbia) was elected Chair Elect; he will serve as Chair in 2004/05. Sally Sinn (NAL; outgoing Chair) thanked Arno Kastner for his efforts as Vice-Chair.

Report on University of California Collection Management Initiative (UC/CMI)
(Brian Schottlaender, UC San Diego)

Brian noted that when Big Heads heard the last update on the Collection Management Initiative six months ago (see ), the Initiative was on the verge of starting its second phase: a qualitative survey of user preferences. They are now in talks with College & Research Libraries about publishing the findings of that survey.

In the first phase the Initiative analyzed the use of some 300 electronic titles which had been apportioned among four categories: Arts and Humanities (22 titles), Life and Health Sciences (130 titles), Physical Sciences and Engineering (102 titles), and Social Sciences (26 titles). The titles had good representation across 12 publishers. For each title, one campus (experimental) sent all print copies of the title to storage while another campus (control) kept the print and monitored physical use through reshelving counts. Conclusions: while use of print journals was higher on the control campuses (6,044 uses) than on the experimental ones (201 requests to recall print from storage), on both types of campuses digital use was much greater than use of print versions (160,180 uses on experimental campuses and 97,493 uses on control campuses). They also studied journal use at the campuses prior to the start of the study and found that use of electronic journals on the experimental campuses had already been high.

The second phase consisted of a two-month survey of 20,000 faculty, staff, and students. They received more than 7,000 responses, for a response rate of over 30%. Fifty-four percent of the respondents were graduate students and 23% were faculty; the remainder were health care professionals, researchers and post-docs, undergraduates, University of California staff, and a few others.

Both faculty and the total respondent pool said their research required both electronic and print journals; however, more said their research depended on the availability of electronic versions than said it depended on print. The frequency of electronic use was lowest among Arts and Humanities users, but even so, 40% of them had used an electronic journal within the previous week.

Fewer than 20% of the respondents agreed with the statement that print was more reliable (defined as available when wanted); while over 70% of faculty (and 80% of total respondents) agreed that e-journals are a suitable alternative to print. Over 80% of the total respondents agreed with the statement that electronic journals were accessible.

Almost half of all respondents (and one-third of the faculty) liked electronic journals for browsing current issues. Nearly 50% of faculty liked e-journals to keep current in and out of their fields; they also liked e-journals for comparing and contrasting articles. However, use of e-journals in course assignments was less than 50%. Respondents from the Arts and Humanities were least likely to use e-journals in course assignments.

Among the advantages given for electronic journals were that there was no need to go to the library for them and that they were always available (over 90% of all respondents).

Among the “availability of content” barriers to electronic journal use was unavailability of older issues (given by 90% of all respondents) and of recent issues (over 50% of all respondents). In terms of “ease of use” reading on screen was considered a barrier by almost 70% of respondents (over 70% of faculty). Other “ease of use” barriers were annotation limitations (many of the respondents didn’t know how to annotate or highlight electronic text), difficulty of moving between sections of articles, etc. There were also “computing equipment” barriers related to such things as authentication of off-campus users, speed of home Internet connections, etc. Faculty were more likely than other respondents to admit that deficiencies in their own computer skills were a barrier to use (almost 40% vs. about 25% of all respondents).

Summary: the unavailability of backfiles was considered the greatest barrier to use of electronic journals.

The demographic variables differed in degree, not kind. The strongest variable was affiliation (defined as whether you were a student (either graduate or undergraduate) or faculty (including health science professionals and post-docs)), which was followed by discipline, age, gender, and campus. One’s affiliation had a significant effect on how one answered 67 questions, while discipline mattered for only 37 questions.

Karen Calhoun (Cornell) asked how decisions based on the Initiative’s findings will be made. Brian said that the individual campuses will use the findings as they deem appropriate. The University of California System will mine the study’s findings for use in license negotiations. It has agreed to test the Counter (Counting Online Usage of NeTworked Electronic Resources) standard, an emerging standard on how electronic journal providers will supply usage data to customers; UC is insisting that all electronic journal suppliers be Counter compliant. Cindy Shelton said UCLA has used Counter and has pretty well decided to do away with multiple (print and electronic) journal subscriptions; however, the Social Sciences and Humanities still need print for monographs. Brian added that UC San Diego will use the study results to cancel about $400,000 worth of print journals. During the journal use phase, UCSD only had to bring 4 journals back from storage; all were mathematics journals.

Judi Nadler (University of Chicago) asked, “If this is so, so what?” At Chicago they need to cancel print journals. Given that one will have to honor differences by discipline, do you still have across-the-board allocations and cuts, or do you make each discipline responsible for raising its own funds? Brian said that at UCSD funding is discipline based. Duane Arenales (NLM) commented that some users have expressed concern about the stability of content of electronic journals and asked if the UC/CMI was looking further at that. Brian said they were: the California Digital Library (CDL) has started a program for the care and feeding of digital content over the long term; in fact, digital content is the CDL’s highest priority.

Bob Wolven (Columbia) said he had noticed that printing charges were considered to be a low barrier to the use of electronic journals and wondered why that was so. Brian answered that much of the printing cost was subsidized by grant funding.

Report on Batchloading Survey and OCLC Response
Big Heads Task Group: Arno Kastner, Lee Leighton, Joan Swanekamp (Glenn Patton, OCLC)

Lee Leighton (Berkeley) reported that there had been an 89% response rate to the survey (24 responses from 27 Big Heads members). Fourteen of the libraries (58%) batch load original cataloging and they identified 9 categories of records that are excluded from batch loading or that did not load successfully. These are:

Mixed materials – 8 libraries
Computer files – 4 libraries
Serials – 5 libraries
Non-books – 1 library
Government documents – 1 library
E-resources – 1 library
Wade/Giles transliterations – 1 library
Original cataloging – 1 library
PCC records – 1 library

Still another problem identified in the survey was the fact that only the Library of Congress (LC) and the National Library of Medicine (NLM) send batches of records in format-specific loads; all other libraries mix formats in a single load. Another type of problem, identified at the Midwinter 2003 Big Heads meeting, was ‘group loading’, where an institution sends a file of records that includes more than one OCLC holdings symbol.

After e-mail discussion of the Task Group report in April 2003 the Task Group had been charged to draft a letter to OCLC raising the Big Heads’ concerns about the records in the excluded and unsuccessfully loaded categories. The letter explicitly noted the 600,000 retrospective conversion records from the University of Minnesota and the 250,000 original cataloging records from Yale University that were not loaded.

Glenn Patton then gave the OCLC response to the Task Group’s letter and report. He started with some general comments about OCLC’s batchloading policies and then commented specifically about several of the points in the report. [I have quoted liberally from the written version of Glenn Patton’s response. JH]

OCLC’s batchloading policies: General comments

Service level agreements:
“OCLC has service-level agreements with its Regional Service Providers to set expectations about services which OCLC provides. In the case of Batchloading, OCLC’s commitment is to complete the evaluation and setup processes for new batchload projects within 90 days and, once a project becomes an ongoing one, to process newly received files within 7 days. Over the 18 month period from July 2001 through January 2003, OCLC completed 258 new setups with an average turnaround time of 56 days. During that same period, average turnaround time for processing new files for existing setups was 1 day with an average of between 600 and 700 files per month.”

Adding holdings versus adding original records:
“… Given the requirements of many of the large state-wide or regional projects that we work on, we have had to give a greater emphasis to adding holdings [to existing records over adding original records]. In addition, [he noted that] it’s important to understand that batchloading institutions are not automatically set up to add records. That requires a separate evaluation process and is done only at the explicit request of the institution or the Regional Service Provider.”

Evaluation process for adding records:
“Files of original records to be considered for addition to WorldCat go through an evaluation process that is both more rigorous than the one used for records used to set holdings and much more time and labor intensive. The goal of the evaluation is to insure that the records are of sufficient quality and that the duplication rate is less than 20%. The ‘quality’ aspect is admittedly somewhat subjective but MARC validation is also an important part of the process in order that records added to WorldCat do not contain errors that other member libraries will be required to fix.”

“Frequently in the evaluation process, OCLC staff identify changes in the records that will improve our ability to process the records. Sometimes that involves updating MARC tagging and subfielding. In other cases, we may be able to supply coded data based on the content of the record. Over the years, we have developed a large repertoire of software modules that we can use in this cleanup. These modules are invoked as part of the setup each time we process a file from a particular institution. As an example, the setup used to add original records for one of the Big Head institutions involves some 40 separate modules. Some of these are from our repertoire but others are developed specifically for that institution.”

MARC validation:
“… there is a wide variation in the level of MARC validation (are the fields, subfields and indicators valid, are fields with prescribed content correctly formed, etc.) in local systems. Over the years, OCLC staff have frequently heard that local systems depend on OCLC’s validation routines. That worked when the primary record flow was from OCLC into the local database but works less effectively as workflows have shifted in the other direction.”

Effects of local system migration:
“Another reality of batchloading is that MARC output is almost completely dependent on the capabilities of an institution’s local system and, when the institution migrates to a new local system, files must be evaluated again to insure that a library’s existing setup will still work when OCLC receives data output from the new local system.”

Comments on the specific problems resulting from the survey:

“Records for serials are indeed among the problem areas. Currently, OCLC adds serials records via Batchload in only a very few cases. The considerable variations in serials cataloging practice (including, for example, consolidating multiple physical format[s] into the record for the print, successive versus latest entry, etc.) make accurate matching very difficult. As a result, the risk of adding duplicate records is greater than we’re willing to tolerate.”

Other types of materials:
“OCLC currently does not have matching algorithms for computer files and for mixed materials. These are being worked on as part of current development activity to move batchload processing into OCLC’s new Oracle-based platform. In addition, as part of that effort, we are attempting to develop the capability to load non-MARC records, which will be an important part of dealing with electronic resources in the future.”

“In addition, our ability to match accurately for various non-book formats is less effective than we would like.” [Glenn interjected that one reason is the lack of standard numbers for many of these materials.] “As part of the development work mentioned above, we have been experimenting with using additional matching points and that looks promising.”

“The report also mentions concerns with loading government documents and Wade-Giles transliterations. We were not aware of concerns in this area and would welcome the chance to explore these issues further with the institutions involved.”

PCC records:
“Until June 2002, OCLC was not able to process batchloaded PCC records correctly. At that time, we implemented software that allows PCC records to be separated from mixed files of records and processed through a separate job stream. We are currently processing PCC records for 4 of the Big Heads libraries with Batchload definitions in process for 2 more.”

Batchloading for Groups:
“Although OCLC does not set holdings for multi-institution groups, it is true that we have not been able to add original, non-matching records from these groups. [Glenn noted orally that often the 040 field is lacking in these records]. However, we are currently in the process of developing software to allow adding records for groups. The details of this should be forthcoming in the next couple of months.”

Separating large files by format or holdings:
OCLC was perplexed by that problem since they “routinely segment files by record type or holdings. Please let us know if we misunderstood your concerns in this regard.”

Concerns of the University of Minnesota, Yale University, and Penn State University:
“OCLC is currently working with those libraries’ Networks to clarify and address their concerns.”

Sally Sinn (NAL) thanked the task group for their survey and summary report. She also expressed the Big Heads’ appreciation to Glenn Patton for coming and responding to its concerns, expressing the hope that this dialogue would lead to constructive action. She noted that OCLC’s response reveals that priorities for addressing our batchloading concerns are influenced by two types of considerations: technological solutions and OCLC policies as reflected in agreements with regional groups. The latter have favored setting holdings over adding unique records. The Big Heads believe that adding original records is a greater contribution than adding holdings. She asked if Glenn had any feeling about trends; i.e., whether there would continue to be an emphasis on adding holdings for interlibrary loan use over increasing the number of records representing unique titles in our collections. Glenn Patton (OCLC) said he saw something of a shift towards regional groups with an emphasis on resource sharing; now there was greater interest in showing the entire holdings of a group rather than facilitating the sharing of widely held materials.

On a positive note, Judi Nadler commented that the University of Chicago had had thousands of records that would not load: archival materials, recon, etc. After OCLC had studied the Task Group report, those records had been loaded. However, she tempered enthusiasm with caution, noting that this Task Group report was actually a follow-up to an earlier report [the 2001 Big Heads Survey on Copy Acceptance Policies (Arno Kastner, Chair; Beacher Wiggins; Judi Nadler; Katherine Farrell; and Barbara Henigman), in light of whose results it had been recommended to engage the utilities in a dialog on copy availability]. She thought that the biggest problem was that OCLC did not bring to the attention of the sending library the records it was not able to load; instead, OCLC just put them in some local file. She also expressed concerns about the sustainability of the loading of unique materials in all formats.

Arno Kastner (NYU) said he was not clear about who makes adding holdings a higher priority over adding unique titles. Glenn Patton said that decision is often based on contractual agreement with regional groups. Merging, de-duping and adding records take more resources than does adding holdings. Sally Sinn (NAL) asked if that translated into favoring adding holdings over original titles; he said Yes.

Joan Swanekamp said that Yale still has a healthy cataloging staff and lots of their records are unique: maps, etc. She thought that resource sharing is furthered by adding unique materials rather than one additional holding to a record that already has many holdings.

Lee Leighton (Berkeley) echoed Judi Nadler’s congratulations over OCLC’s loading of PCC and non-book records. He commented that he understood OCLC's emphasis on checking coding because it could easily be done by program, and it is much more difficult to verify the content of MARC fields; now many vendor records are being added with good MARC coding but poor content.

Carol Diedrichs (OSU) commented that she had learned as a member of the OCLC Members Council that OCLC is trying to balance the needs of different communities: they are hearing strongly from the ILL community about wanting holdings added as a priority, while international members are concerned about American libraries overlaying their records. She encouraged the other Big Heads to serve on the OCLC Members Council as a good way to influence OCLC policy. Bob Wolven (Columbia) agreed with her, telling Glenn Patton that OCLC is trying to balance competing needs, and that people are learning, through the Members Council and groups such as Big Heads, that this balancing needs to be done.

Beth Picknally Camden (University of Virginia) asked what OCLC does to inform libraries that have MARC coding problems. Glenn said such information is provided on reports; whether or not libraries download those reports he didn’t know. When OCLC evaluates an institution they are in contact with that library’s representative. OCLC also has an active program with different vendors about local system features but he thought they probably don’t work with vendors enough. Beth Picknally Camden (University of Virginia) said that reports are often not clear enough about specifics of the problems.

Electronic Resources Management Metadata
(Tim Jewell, University of Washington)

Tim Jewell noted that Big Heads has been interested in this topic for some years and has heard several reports on it (see for the most recent report). Several years ago in San Francisco (2001) he had reported on what was available at that time ( Recently the project has worked on a revision of the Entity/Relationship model based on the Functional Requirements for Bibliographic Records (FRBR) model ( In process is a roadmap, or overview, of existing or proposed standards and their relationship to the work being undertaken. Also in process is an XML schema being developed by a group chaired by Adam Chandler (Cornell). Various meetings of people interested in working on e-resources management metadata have been held. He announced that a general update meeting under Big Heads sponsorship would be held that evening at the Sutton Place Hotel from 7:30 pm to 9:30 pm.
Tim Jewell and Adam Chandler have created a web hub for Electronic Resources Management metadata, which was last updated the previous day (June 19, 2003). Go to In May 2002 an e-resources metadata standards workshop was held (see At the workshop they discussed problems and presented draft documents on such topics as descriptive metadata, licensing, and access and administration metadata. Sally Sinn (NAL) asked about the relationship between the work of this group and portals such as SFX. Tim Jewell said their work deals with resource discovery and the portability of information about electronic resources to portals, gateways, etc. Cindy Shelton said that UCLA has been developing its own homegrown management system and asked if he knew of any vendor systems that might be available. Mr. Jewell said that Innovative Interfaces, Inc. would have a system in beta testing within a month or so and that Ex Libris is looking to develop such a system.

Report of Cornell Benchmarking Analysis
(Karen Calhoun, Cornell University)

The purpose of the benchmarking analysis was to gather statistics on technical services staffing in those ARL libraries that are members of Big Heads: staff sizes and trends, definitions of technical services, the proportion of professional to non-professional staff, etc. The percentage of technical services staff to total staff ranged from a low of 11% to a high of 33%, with an average of 23%. The overwhelming impression is that the size of technical services staffs is declining. The proportion of professional to non-professional staff ranged from 16% to 36%, with an average of 25%. The responding libraries define technical services in various ways. There is a core cluster of activities included by all respondents: Cataloging, Serials control, Catalog maintenance, Retrospective conversion, Metadata, and Physical processing. All but one library includes Acquisitions. Other activities included by some libraries as part of technical services are: Electronic Resources (10 Y, 1 N), Gifts and Exchanges (9 Y, 2 N), Binding (8 Y, 3 N), Preservation (6 Y, 5 N), and Stacks (1 Y, 10 N); the same library that did not include Acquisitions did include Electronic Resources. Some respondents reported only data related to central technical services; others reported all technical services centers across a campus. Some included archivists, others did not.

Judi Nadler (University of Chicago) commented that some of us employ a large number of students, others a small number; she asked Karen if she had a sense of how many students her respondents employ. Karen Calhoun (Cornell) said she did not and suggested that if others did a more thorough survey of technical services, they should do a more detailed count of staff.

Joan Swanekamp (Yale) asked what kind of standards Karen had developed for counting decentralized technical services areas. Karen replied that in support of a library workforce planning initiative Cornell recently did a survey of all 453 library staff across the entire campus. If any other library wants to see the Cornell survey Karen would be happy to provide copies.

Duane Arenales (NLM) encouraged ARL to use some term other than “non-professional,” which many members of her staff find demeaning. Joyce Ogburn (University of Washington) said we need to expend the effort to determine what outcomes we are achieving with our existing staff, decreasing though it probably is. Karen said that one of the findings of the workforce review of technical services was that Cornell’s technical processing centers do not have commonly shared, outcome-based measures of service quality, and that further research into this topic is warranted.

Digital Archiving: LOCKSS participants’ discussion

Sally Sinn (NAL) noted that some of the libraries represented at Big Heads are members of LOCKSS [Lots of Copies Keep Stuff Safe, a model for creating low-cost, persistent digital “caches” of authoritative versions of http-delivered content; see] and asked what people are doing or intending to do with it. NAL is a participant but has not yet resolved problems with getting the software to run on its server because of firewall constraints. Bob Wolven said that Columbia is an active participant in the sense that they have a server running and load every version, but nothing beyond that; it is more a test system than real archiving of data. Judi Nadler said that the University of Chicago also has a server but is not actively testing the software. Cynthia Clark reported that the New York Public Library is also participating and has loaded the latest version of the software; they have targeted dance journals as a test collection to be archived. Many of the journals are newsletters emanating from mom-and-pop operations, and NYPL is taking time to make arrangements with the various groups that put them out.

Someone asked, “What do you mean by ‘archiving’?” Bob Wolven said that Columbia’s interest is in ensuring long-term access. The real question is administrative: getting a group of libraries to agree about what content they are going to maintain. None of the structures are in place to ensure that multiple libraries are caching the same titles. What happens if a library decides to cancel subscriptions? Libraries need to verify where access is available: at a publisher’s site or on a local server. Judi Nadler (Chicago) suggested discussing the next steps at a future Big Heads meeting, asking, “What is the economic model that will support this structure?”

Changes in Big Heads representation

Sally Sinn (NAL), on behalf of the Big Heads, bid farewell to Carol Diedrichs who is leaving OSU to become the Director of Libraries at the University of Kentucky; she also welcomed Sally Rogers who will be the new Ohio State University representative and Phelix Hanible, the new University of Michigan representative.

Judith Hopkins
State University of New York at Buffalo


ALCTS announces Margaret Mann Citation recipient

Barbara Tillett, Chief, Cataloging Policy and Support Office at the Library of Congress, is the recipient of the 2004 Margaret Mann Citation presented by the Association for Library Collections & Technical Services (ALCTS) Cataloging and Classification Section (CCS) of the American Library Association (ALA).

The award is a citation and a $2,000 scholarship donated in the recipient's honor by OCLC Online Computer Library Center, Inc., to the library school of the winner's choice. It recognizes outstanding professional achievement in cataloging or classification either through publication of significant professional literature, participation in professional cataloging associations, demonstrated excellence in teaching cataloging, or valuable contributions to the technical improvement of cataloging and classification and/or the introduction of a new technique of recognized importance.

The Margaret Mann Citation Committee is pleased to present this award to Barbara B. Tillett for her extraordinary contributions to both the theory and the practice of cataloging. In particular, the Committee notes her work developing and explaining IFLA's Functional Requirements for Bibliographic Records (FRBR), implementing the Library of Congress's first integrated library system, spearheading work on the Virtual International Authority File, leading IFLA's efforts to develop an international cataloging code, and contributing substantively to a new edition of the Anglo-American Cataloguing Rules (AACR). Her achievements in all four of the award's criteria have given shape and direction to the work of many others in our profession, catalogers and non-catalogers alike.

The FRBR conceptual model represents a significant advance in cataloging theory and is now being used as the foundation of much theoretical and practical work to improve user access to materials in libraries and in other, non-library collections. FRBR concepts, as well as reshaping cataloging rules and practice, are now being extended to authority records and are being applied practically in increasing numbers of library and information management systems.

The Virtual International Authority File (VIAF) is a conceptual and technological building block for both a powerful cataloging tool and for the practical development of the semantic Web. The VIAF project attempts to link authority records from around the world and make them available via the Internet, allowing national or regional variations in authorized forms to co-exist and supporting worldwide users' needs to see names in their preferred language, script, and spelling. As Edward T. O'Neill, Consulting Research Scientist at OCLC noted, Dr. Tillett "distinguished herself from the other visionaries by tirelessly campaigning to convince others of the merits of her idea and obtaining their commitments to pursue the vision."

At a time when many are challenging the relevance and utility of cataloging, Dr. Tillett has breathed new life into the field, centering her work on meeting the needs of the library user through innovations in library technology and cataloging concepts. Her work represents an essential bridge between the world of traditional catalogers and that of metadata librarians, enabling each to understand and appreciate the importance of the other. "Her work has greatly contributed to the concepts that are now transforming the conceptual basis of cataloging, cataloging practices from rule making to OPAC design, and the application of cataloging concepts and practices to metadata for digital materials on the Internet," said Matthew Beacom, Catalog Librarian for Networked Information Resources at Yale University. "The combination of innovative conceptual breakthrough with the insight to see their practical applications typifies ... Dr. Tillett's greatest strength," wrote John C. Attig, Authority Control Librarian at Penn State. Attig further noted that, "Because of this ability ... Dr. Tillett provides inspired leadership, encouragement of the creative work going on ... and a commitment to redefining the conceptual foundations of cataloging in new and exciting ways."

Dr. Tillett holds a master's degree in library science from the University of Hawaii, Honolulu and a Ph.D. from the University of California, Los Angeles.

The Margaret Mann Citation will be presented on Sunday, June 27, 2004, at the ALCTS Awards Ceremony during the ALA Annual Conference in Orlando.

Margaret Mann Citation Committee
