Cataloging & Classification Quarterly
Volume 32, no. 3, 2001
Sandy Roe, News Editor
Welcome to the news column. Its purpose is
to disseminate information on any aspect of cataloging and classification that
may be of interest to the cataloging community.
This column is not just intended for news items, but serves to document
discussions of interest as well as news concerning you, your research efforts,
and your organization. Please send
any pertinent materials, notes, minutes, or reports to: Sandy Roe; Memorial
Library; Minnesota State University, Mankato; Mankato, MN 56001-8419 (email:
firstname.lastname@example.org; phone: 507-389-2155).
News columns will typically be available at the CCQ website (http://catalogingandclassificationquarterly.com/) linked from their
corresponding volume and issue numbers prior to their appearance in print.
We would appreciate receiving items having to do with:
Abstracts or reports of on-going or unpublished research
Bibliographies of materials available on specific subjects
Analysis or description of new technologies
Call for papers
Comments or opinions on the art of cataloging
Notes, minutes, or summaries of meetings, etc. of interest to catalogers
Description of grants
Description of projects
Announcements of changes in personnel
Announcements of honors, offices, etc.
An electronic discussion on XML and its use in libraries began on Feb.
12, 2001. The Extensible Markup
Language (XML) is being used by libraries for a variety of purposes. The purpose
of this electronic discussion is to assist library staff in learning about XML
and how to apply it to library problems and opportunities.
To subscribe, send the message "subscribe XML4Lib YOUR NAME" to the list address. More
information, including the discussion archive, can be found at http://sunsite.berkeley.edu/XML4Lib/.
This list is hosted by the UC Berkeley Library; Roy Tennant is the list owner.
Minutes of the ALCTS Technical Services Directors of Large Research Libraries
Discussion Group ("Big Heads"), held during the Midwinter American
Library Association Meeting, Washington, DC, January 12, 2001.
Welcome and introductions (Lee Leighton, Chair)
Big Heads/ALCTS Research Initiative (Nadler)
Nadler reported on the progress of an initiative that started at the Chicago ALA
annual conference (summer 2000) with a discussion of BIBCO records and records
from external sources in general. At the suggestion of Karen Muller (ALCTS
Executive Director) Big Heads proposed that the ALCTS Board authorize a task
force to work with Big Heads on a research project that will help Directors of
Technical Services improve operations and access and learn more about how our
patrons make use of what technical services librarians do. The Board commended
Big Heads for its initiative and authorized the establishment of a task force to
define the research questions, obtain funding for the research, and facilitate
the research. The Task Force consists of Judith Nadler representing Big Heads,
Karen Schmidt from the ALCTS Board, and representatives from the ALCTS Research
and Statistics Committee.
The Task Force has drafted 3 questions:
What is the percentage of overlap among large library collections (the
Big Heads Group, for instance) for monographs and serials?
Is there any evidence that the percentage of overlap for monographs and
serials is growing or shrinking?
What are the record acceptance policies of the large libraries for
printed monographs and serials and how do these policies affect cataloging
output? What are the record
acceptance policies of the large libraries for electronic resources, and how do
these policies affect cataloging output? In these two questions the Task Force is wondering about the use of
records from outside sources (BIBCO records, vendor records, etc.) and the
required standards for these records to make them acceptable to libraries
without additional local work.
Are the standard subject analysis policies and practices of the large
libraries (LC classification, Dewey Decimal Classification, LC subject headings)
adequate for contemporary library patrons who must search library catalogs as
well as the web in conducting their research? What level of uniformity is
needed? In question 3 the Task Force is trying to get at user needs and expectations in an
increasingly mixed environment of controlled and uncontrolled resources.
Question 2 seems to generate the most interest. Karen Schmidt had reminded Judith Nadler
that the CIC (Committee on Institutional Cooperation) used the last North
American Title Count (NATC) data to look at overlap.
Question 3 is the most complex.
Nadler pointed out that the questions have the potential to generate valuable
information but responses will be only as good as the questions asked.
Wolven (Columbia) asked for clarification on question 3. Is its emphasis whether what we
are doing is enough, or whether we should do more? Judith Nadler said it is asking whether what we are doing is adequate.
Wiggins (LC) added: If what we are doing isn't sufficient, why isn't it? What is
missing? And what are respondents basing their responses on: anecdotal evidence,
what reference staffs have said?
Rogers (Univ. of Pennsylvania) asked at whom the survey questions would be
aimed: staff or non-staff library users. Judith
Nadler said she hoped to get input both from users and from those who see the
records internally and can comment knowledgeably.
Sinn (NAL) said the questions seem a mix of true research and those relating to
work procedures. At NAL the cost center is moving from dealing with print
materials to dealing with electronic products.
She felt a need for information on overlap of collections. She expressed
support for questions 1-2.
Pitts Diedrichs (Ohio State) referred to some 1990s studies by Anna Perrault:
Perrault, Anna H. "The Shrinking National Collection: A Study of the Effects of the
Diversion of Funds from Monographs to Serials on the Monograph Collections of
Research Libraries." Library Acquisitions: Practice & Theory 18 (Spring 1994).
Perrault, Anna H. "The Printed Book: Still in Need of CCD." Collection
Management 24, no. 1/2 (1994-95): 119-136. Also available at http://wwwcrl.uchicago.edu/info/awccconf/awpapersgenl.htm.
Swanekamp (Yale) expressed interest in overlap in backlogs since they are a
large component in many large libraries' cataloging activity. She thought that
the answers to question 3 will govern what we decide about number 2.
Shelton (UCLA) said she was interested in questions 1-2. She
asked if the purpose of the research survey was to collect facts or a wish list.
Judith Nadler said they were interested in collecting facts.
Arenales (NLM) said the first question was interesting and thought it would be
good to get someone from Big Heads of Collection Development involved. Some
aspects of the questions are interesting for the world as we know it, a world
that is shifting under our feet, and we need to focus on the future. The
questions seem focused on the current world; it might therefore be useful to
address data we can obtain from vendors now.
Nadler said that learning about the record acceptance policies of libraries
should give us information on what libraries need.
Alford (University of North Carolina at Chapel Hill) said that the Program for
Cooperative Cataloging (PCC) would probably support this research.
There is already work going on related to question 2.
Kurth (Cornell) said that the goals of various existing studies are different from
the goals of this proposed study.
Tierney (Stanford) said we could devise questions related to numbers 1 and 2 but
that the focus of our need is number 3. We need to have an understanding of that
because it impacts the business of running a library.
Wolven said that as we accept masses of outside records for our collections we
need to know why they are acceptable or if they are not, why not.
Nadler suggested that a research survey be done that addressed questions 1 and 2
while Big Heads made plans to study number 3.
Arenales agreed with Catherine Tierney that number 3 is where the future is. An
interesting question would be how many students are using anything but keyword searching.
Leighton (University of California at Berkeley) commented that questions 1-2 are
aimed at catalogers; number 3 is aimed at users.
Alford said that many of the topics discussed at the LC Bicentennial Conference
in November 2000 are implicit in number 3.
Catherine Tierney said the conference was LC-focused, that ALCTS needs a
broad survey, which will help LC as well as others.
Nadler said that information gathered would be valuable not only for
administrators but also for catalogers who need to see how their work is used.
Membership rules (Diedrichs and the subcommittee)
Pitts Diedrichs reported on behalf of the Membership Task Force that the
University of Virginia and Pennsylvania State University will be asked to join
Big Heads as of the annual conference in San Francisco in 2001 and that
Northwestern University will leave after the 2002 annual conference. Big Heads
agreed to continue using the 1998/1999 ARL criteria index ranking in the Spring
2001 membership re-assessment instead of recalculating based on the 1999/2000
criteria index ranking.
Multiple versions, one record or several (Leighton and group)
Prior to the Midwinter Meeting the Big Heads had exchanged information via their
electronic discussion list about their bibliographic treatment of multiple
versions (electronic, CD-ROM, microforms, print) of books and journals. Barbara
Stelmasik (University of Minnesota) had summarized the responses in a
spreadsheet with accompanying comments made by the various respondents.
Leighton summarized the responses by saying that no library was completely
consistent. Of the 22 reporting libraries many (17) use the single record
approach for electronic versions of serials while 14 use multiple records for
electronic versions of books. All the libraries that reported on their treatment
of CD-ROM versions use multiple records for both serials and books. For
microform versions the treatment was predominantly in favor of using the same
approach for both serials and books. From the comments on staff attitudes toward
records it seems that technical services leaned toward multiple records while
public services fairly strongly endorsed single records in almost all cases.
Nadler suggested discussing both the local and national implications of our
decisions, pointing out that Michael Kaplan in his LC Bicentennial Conference
paper had suggested having single records locally and multiple records globally.
One participant said we need a way to take individual (multiple) records from vendors and make
them display as single records to the public.
Nadler pointed out that the wide variability even within individual libraries is
based on such things as historical influences and lack of money. We can't insist
on uniformity either within libraries or among libraries; we should focus more
on access than on changing cataloging policies.
Kastner (New York University) asked if everyone was contributing these records
to the utilities. Lee Leighton said he was unsure what Berkeley did.
Arenales said that current CONSER policy, which allows users to choose their own
approach, serves our needs today.
Sinn asked: Needs for what? It is not a question of our rules for bibliographic
control that is at issue; rather it is our system's abilities to display
information and to communicate our different solutions. She expressed the belief
that the future lies in the direction of using separate records for multiple versions.
Wolven commented that the LC Bicentennial Conference had said that the future
lies in post-concatenation of multiple records for display.
Nadler said we shouldn't even try to agree on a uniform policy.
Ogburn (University of Washington) said that user expectations are that as soon
as material arrives in the library, records for them will be in the catalog.
Swanekamp said that Yale loads EBSCO records in batch mode and replaces them as
a batch. [See summary of Accessing Full Text e-Journals in Aggregation Databases
through the OPAC by Matthew Beacom later in this column.-Editor's note]
Arenales stressed the importance of making it clear to users that we have both
print and electronic versions.
Leighton commented that we seem to be victims of technology; we scramble to keep
up with new technologies.
Kastner noted that the overhead of managing this piecemeal approach is high.
Arenales said that this is not a problem that technical services staff can solve
by ourselves; we must work with public services groups.
Wolven commented that as complexity increases (more materials and more
approaches to them) there seems to be less uniformity among public services
staff in favor of the single record approach.
Nadler wondered if questions on the user approach to single versus multiple
records could be folded into the research agenda.
Arenales suggested that the Library of Congress call a new Airlie House
Conference with both technical services and public services staff, as well as
systems staff and vendors, participating.
Wiggins noted that the LC bicentennial conference had covered many of these
issues and that we should first examine the recommendations of the conference to
see which might be relevant to this question.
Leighton said that it seems we are looking towards technology that doesn't exist
yet to get us out of this state of confusion.
Swanekamp said she had heard that the CCS Committee on Cataloging: Description
and Access (CC:DA) is going to appoint another task force to study multiple versions.
Another participant said we need to start discussions with vendors.
Nadler asked Beacher Wiggins if he had a sense of what were the most essential
recommendations from the LC Bicentennial Conference. He replied that there were
a number of recommendations, which LC needs to compartmentalize over the next 2
months. One criterion will be to determine which recommended changes could
provide quick results. Another criterion will be the resources that would be
needed for implementation. Other questions that will be asked are what do we
want to do, and which things can't we do and thus should be moved elsewhere?
Sinn said we need to re-look at the Airlie House recommendations and determine
if the 3 level record model they proposed is currently feasible. If it is not,
then we need to devise a different model.
Nadler said Michael Kaplan's approach is more a matter of local systems taking
action and thus is more doable than the Airlie House approach which looked to
outside development that would require change of all databases.
Pitts Diedrichs asked if we could identify a group of Big Heads that have a
relationship with the major Integrated Library System vendors and who could
start a conversation saying this is what we discussed: we have this problem, can
technology provide a solution?
Wolven said that as individuals we are likely to have such conversations with
vendors. If they are likely to hear the same thing from all of us it will
promote such development.
He also encouraged those investigating new systems to share these concerns with their vendors.
Nadler summarized the discussion by saying that the national implication of the
multiple versions question is asking whether sharing of records is an issue.
Which option is more conducive to sharing? Until we know the answer to those
questions we are not ready for solutions.
Electronic licenses (Tim Jewell, Head of Collection Management Services,
University of Washington)
Jewell began by saying that his position involves reviewing and signing off on all of
the University of Washington's Library's electronic licenses. As a result of
this experience he has come to believe that their current system of managing
licenses leaves a lot to be desired. He has 3 concerns:
To prevent violation of license terms by users and staff.
To encourage ILL staff to make use of electronic resources whenever they can
legitimately do so.
Licenses usually say that the library will make a good faith effort to inform
users of licensing restrictions; usually, however, nothing is done.
Jewell is writing a paper for the Digital Library Federation that is tentatively titled
"Selection and Presentation of Commercially Available Electronic
Resources"; Abby Smith from CLIR is writing a related paper for the DLF
dealing with local digitization, and Lou Pitschmann from Wisconsin is doing one
on selection of free web resources.
He has had discussions with Adam Chandler of Cornell, who is taking a more metadata
approach. They have established a web site that will shortly be used to discuss
data elements, definitions, etc. that are involved in licensing. The URL is:
After this introduction he said the remainder of his presentation would be about
functions and data elements, and that he would then talk about the future.
1. Paper form used by the University of Washington Library to analyze each
license. The form asks such questions as: Who are the users authorized to use
this resource? What limits on access are required? What may be downloaded? What
uses are prohibited? Can an electronic version be used for ILL? Can the library
make a paper copy and use it for ILL? What rights does the vendor have to make
changes in the terms of the license? What liability issues are there? Is there a
warranty in the license? Is indemnification of the publisher required if
licensing restrictions are violated? What are the termination and cancellation provisions?
2. Web form devised by the University of Notre Dame Library to gather
appropriate information for the acquisition of electronic resources. Description
of the product, Costs and fund codes, Order information, Access information,
Processing information, and Routing information are among the data elements gathered.
3. This slide shows the more sophisticated VERA (Virtual Electronic Resources
Access) system introduced by MIT a year ago. Clicking on an icon can bring up
the full text of a license.
Sinn asked if the form is used to replace catalog records? The answer was No,
but that in the future data from catalog records, license forms, and the web
would be combined to form a new entity.
Alford said that the licenses signed by the University of North Carolina at
Chapel Hill don't require that they inform users of licensing restrictions.
Vendors are willing to accept that contract provision if the library insists. He
agreed to share the wording they used (devised by North Carolina State University).
ERLIC reports from Penn State are available on the web. Among the data elements
displayed are access mode, follow-up needed, full-text title lists generated,
index /abstract lists generated. Lists of titles in databases and lists of
electronic journals are derived from ERLIC and made available to users.
Another slide showed a site that uses a list of databases and electronic journals. For each database there is a
clickable line which leads to a page which shows the permitted uses of that database.
"Functions and data elements for managing electronic resources" (handout and slide of
a spreadsheet) shows which institutions are using various data elements. The
University of Virginia needs to be added to it. The spreadsheet is an appendix
to a DLF report.
Arenales asked Mr. Jewell where this work was going. Are you trying to
standardize so that ILS vendors can come up with a solution? Tim Jewell said the
answer was Yes, that Adam Chandler is trying to develop a standardized schema
that vendors and individual institutions can use.
Shelton asked whether those working in this area had discussed questions such as
Who is authorized to work in these databases? What elements are mandatory?
Wolven asked which elements are there for presentation? For access?
One participant said it looks like the beginning of a new module in Integrated Library Systems.
Nadler said that Chicago is trying to develop a similar project: a centralized
product to be used decentrally.
BIBCO update and record acceptance policies (Larry Alford)
Alford said he hoped for a discussion of the BIBCO component of the PCC. The
NACO, SACO, and CONSER parts of PCC are very successful projects.
BIBCO is also successful with more than 60,000 records produced each
year. It has not, however, met its original goals.
He asked what are the barriers that are causing libraries to do original
cataloging but not to make the resulting records BIBCO records.
At Chapel Hill one barrier is the requirement for a call number based on
a standard classification scheme to be included in BIBCO records; many of their
original cataloging records are for items in special collections which are given
only local call numbers.
Kurth described a PCC plan to conduct phone interviews with 20 catalogers and 20
catalog managers about their views of Core records in order to try to discover
why there is low use/creation of Core records, their views about quality, why
more libraries are choosing not to use Core, etc. They are soliciting
participants from among BIBCO members.
Rogers suggested including reference staff in the survey. At Penn the
reference staff doesn't think Core records provide enough information. Martin
Kurth replied that that would be a different study.
Alford noted that the Core level was intended to be a base standard that could
be enriched at a cataloger's judgment. There has come to be a misunderstanding
that Core level means that only the minimum should be provided.
Arenales expressed concern that self-selection of respondents in the survey
could skew the results.
Kurth said that the committee had considered that objection but had decided that
the results would be acceptable.
In answer to a question about how the 20 would be chosen from among the volunteers, Martin
Kurth said that all volunteers had to fill out a questionnaire: the respondents
would be chosen to provide a variety based on such criteria as type of library
(public or academic), their size, and their length of experience with Core level records.
Nadler suggested asking what libraries do with Core records created by other
libraries. Her staff considers Core
level records to be "kernel" records that need to be upgraded. By the
time you do a good Core record you aren't far from creating a full record. Lee
Leighton agreed. He noted that the acceptance of Core level records as given
depends also on the level of staff authorized to accept Core records; copy
catalogers are more likely to accept them as OK while original catalogers are
more likely to notice the lack of subject headings.
Sinn said that Core level records are perfectly acceptable as created but are
susceptible to being enhanced. There
is, however, a perception that Core level records are not quite adequate.
Kastner wondered how much time is saved by creating or using Core level records?
He suggested that they might be adequate for backlogged collections which you
want to treat consistently, but that having to choose, on a title-by-title
basis, which level to use for current receipts is time-consuming. The general
response was that it is not time-consuming if the Core level record is your default.
Alford noted that only about 25 percent of BIBCO records are Core level; what
about the other 75 percent? What use is made of them? Is there wide acceptance
of them? Why are libraries that create full records not contributing them as BIBCO records?
Swanekamp commented that implementing BIBCO at large libraries is difficult.
There are old local practices that institutions are not willing to give up,
e.g., a practice of not assigning call numbers to rare books. There is also a
problem with batch loading in OCLC, which needs to provide a way for locally
created full records that are batch-loaded to overlay brief records already in the database.
Arenales wondered if the requirement for Core level records to include a call
number based on a standard classification scheme should be re-examined.
Nadler said she would not deny the importance of accepting otherwise good
records that lack call numbers but that at her institution the inclusion of a
standard call number is important because records with call numbers can go into
a fast stream.
Clark (NYPL) noted that none of the NYPL records have standard call numbers.
A question was raised about the expense of taking an item out of the regular work
stream to give it to a high level cataloger to classify. The consensus was that
any side streaming is expensive.
Wolven said that Columbia adds LC class numbers to Avery Architecture Library
materials even though Avery itself doesn't use them; they spend more money
adding class numbers to existing records lacking them than they do by providing
this service to other libraries.
Swanekamp spoke positively of the fact that PCC records have the authority work
done for all access points. A library can pass PCC records along without further
thinking about whether or not local action is needed for authority work.
Alford concluded the discussion by noting that the idea behind BIBCO was that we
should all be able to trust the work done by and the professional judgment of
catalogers in other libraries.
There were no suggestions for the next agenda. The
meeting was adjourned at 12:28 p.m.
State University of New York at Buffalo
Accessing Full Text e-Journals in Aggregation Databases through the OPAC, CCS-Research
Discussion Group Meeting, Midwinter American Library Association Meeting,
Washington, DC, January 13, 2001.
John Riemer (UCLA) presented a summary of the work done to date by the
PCC Standing Committee on Automation (SCA) Task Group on Journals in Aggregator
Databases. This committee's
challenge was to provide access to serial titles within aggregators - both title
and holdings data. Their first
charge was to recommend record content, mount a demonstration project, evaluate
and determine next steps. One outcome was EBSCO's creation of a record set for
Academic Search Elite. EBSCO
derived bibliographic records that were based experimentally on the task group's
instructions and the corresponding records for the print journal found in the
CONSER database. Bell + Howell has
also made bibliographic records available for all their ProQuest titles.
In their evaluation of record creation strategies, the task force
recommends (from best to worst) human-created analytics, machine-derived
analytics, machine-generated analytics which rely on defaults, scripted creation
of minimal records, and a single combined coverage index.
In conclusion, Riemer addressed maintenance issues that included the
delivery options for the record set (whole set each time versus just the
changes), added and dropped titles, changes in volume coverage or completeness
of content, URL maintenance, conventional title changes, and finally, the
possibility of cancellation or change of subscription.
Matthew Beacom (Yale) then described how the PCC/EBSCO project was
implemented at Yale Library. In his
introduction, he reminded his audience that libraries have always been
aggregators; our collections are our own aggregations, and catalogs are our tools for
searching them. Our challenge
remains to provide our patrons with clear paths to our collections.
Yale initially loaded 2200 vendor-supplied records on April 30, 2000 to
provide access to titles in the EBSCO aggregations, Academic Search Elite and
Business Source Premier. Catalogers
collaborated with reference and systems staff, wrote specifications, and
evaluated the results. Beacom
encouraged the audience to keep in mind the temporary nature of access to titles
in aggregators and emphasized the need to have a sound exit strategy that can
remove all traces of these records before loading anything.
Beacom addressed changes and additions that Yale made to these records,
workflow issues, and problems encountered.
Benefits included added visibility to titles in aggregations through the
OPAC, easy removal of the records should the subscription be cancelled, and
frequent updates which make it possible to keep the OPAC current with the
aggregator's changes in titles and coverage.
Title access through the OPAC for aggregators raises expectations for
access to titles in other aggregations. Reference
librarians see the OPAC as central to what they do.
Yale experienced a 2-3 fold increase in full text retrievals after
loading the PCC/EBSCO records.
Beacom's presentation is available at http://www.library.yale.edu/~mbeacom/talk/aggregator/.
More information about the PCC SCA Task Group on e-journals in aggregations can be
found at http://lcweb.loc.gov/catdir/pcc/automation.html, including the Final
report of the 1st Task Group at http://lcweb.loc.gov/catdir/pcc/aggfinal.html
and the Interim report of the 2nd Task Group at http://lcweb.loc.gov/catdir/pcc/aggtg2interimrpt.html.
Best of Cataloging & Classification Quarterly, vol. 29
Best of CCQ for volume 29 has been awarded to Dr. Elaine Svenonius for her
article "LCSH: Semantics, Syntax, and Specificity."
Here is the statement from the Task Force that did the selection:
"The Task Force to select the best article of Volume 29, Cataloging &
Classification Quarterly has chosen "LCSH: Semantics, Syntax and
Specificity" by Dr. Elaine Svenonius. Dr. Svenonius provides a perceptive
analysis of changes to LCSH in three core areas over the last hundred years. In
doing so, she evaluates the key decisions including why they were made and what
effects they have had on subject retrieval. Finally, she offers her predictions
for the future. The Task Force selected this article by Dr. Svenonius for her
identification of key concepts, for her insightful commentary, and for her
brilliant synthesis of major developments in LCSH. Members of the Task Force
were Robert P. Holley (chair), John R. James, and Barbara B. Tillett."
Note that Dorothy McGarry's interview with Dr. Svenonius appears in CCQ 29(4).
Authority Tools for Audiovisual Catalogers
The Subcommittee on Authority Tools, Cataloging Policy Committee, OnLine Audiovisual
Catalogers, Inc. announces that a new list, Authority Tools for Audio-visual
Catalogers, is available for use on the Web at http://ublib.buffalo.edu/libraries/units/cts/olac/capc/authtools.html.
The content is quite varied, and will be expanded in the future.
Library, University of Akron
Joint Steering Committee for Revision of Anglo-American Cataloguing Rules
The address of the new JSC page is: http://www.nlc-bnc.ca/jsc.
Content includes news and notes, a description of the rule revision
process, a list of the committee members, current activities, and selected