
Libri, Vol. 60, pp. 38–56, March 2010. Copyright by Walter de Gruyter, Berlin / New York. DOI 10.1515/libr.2009.004
A Korean Study on Verifying Evaluation Indicators and
Developing Evaluation Scales for Electronic Resources Using
a Pilot Evaluation


Younghee Noh
Associate Professor, Department of Library &
Information Science, Konkuk University, Chungju,
Korea.
Email: irs4u@kku.ac.kr


Abstract
The relationship between physical resources and
their usability has been measured quantitatively and
qualitatively in evaluations of libraries abroad as
well as in Korea, with the objectives of improving
service levels and management results. As the tech-
niques used by libraries and the items collected by
libraries change over time, the criteria to select items
for library evaluations should change accordingly.
This study intends to enhance the significance of
e-resources which have been increasingly purchased
and used in the digital library environment but under-
represented in library evaluations. To this end, the
study identified items for evaluations, classified
them, and verified them using experts and a pilot
evaluation. First, the study verified three evaluation
sectors, 11 indexes and 22 indicators with a Delphi
method. Second, specialists were surveyed to de-
termine the appropriate weight for each index and
indicator. Third, by conducting a pilot evaluation on
college libraries, the investigator identified evalua-
tion scales. The findings of the study are expected to
contribute to the world's library evaluation practices
in the digital environment.
Introduction: Rationale of the study
Globally, the proportion of e-resources in library re-
sources has been growing, as have the proportion of
the budget dedicated to acquiring them and the pro-
portion of library usage due to their availability. The
following statistics offer evidence in support of this
reality.
The National Center for Education Statistics
(NCES), which has one of the most comprehensive
databases of statistics relating to U.S. academic li-
braries, shows that e-resource purchase costs ac-
counted for approximately 33% of the total resource
dollars spent in 2006 (NCES 2006). According to
the annual statistics of the Association for College &
Research Libraries, expenditures on e-resources averaged 44% of resource spending in 2007 (Shim 2009, 8). According
to the statistics of the Association of Research Li-
braries, the e-resources subscription budgets of their
member libraries increased from 3.5% in 1992–93 to 12.88% in 1999–2000, roughly a 3.7-fold rise; and to 40.95% in 2005–06, a further 3.2-fold rise (ARL 2008).
According to the Korea Education & Research In-
formation Service (KERIS), Korean college libraries
spend about 33% of the total resource budget in this
way (Shim 2009). This increased expenditure on e-resources can be explained by increased user demand for them.
Meanwhile, the Committee on Library and In-
formation Policy [1] conducted a pilot evaluation of
libraries by type in 2007, based on the evaluation
indicators developed in 2006 by the Korean National
Library for public libraries (Committee on Library
and Information Policy 2007) and the evaluation in-
dicators developed in 2005 by KERIS for university
libraries. During this evaluation, evaluated institu-
tions and staff members pointed out some problems
regarding e-resources as follows: 1) not only the
number of kinds of Web DB (Database) subscrip-
tions but also subscription fees per Web DB should
be considered in qualitative evaluation; 2) there is a
need to differentiate weights on the methods of use
since Web DB use statistics have been based on the
number of sessions, the number of hits, the number
of abstract downloads, and the number of full text
downloads [2]; 3) in the case of e-journals, use statistics need to be included even when the printed version has not been read, and the cost per journal should be included for qualitative evaluation;
4) for e-journals, statistics such as the number of
sessions should be included as for Web DB; 5) for
e-books, there should be differentiated statistics on
packages and separate volumes, and also for qualita-
tive evaluation the cost should be included; 6) statis-
tics for e-book loans and the number of downloads
need to be added to the statistics for the number of
regular loans; and 7) an index which can evaluate the
use of non-publication items needs to be developed.
In brief, there are many problems in the existing evaluation methods for Web DBs, e-journals, e-books, and non-publications (Committee on Library and Information Policy 2007).
From the above, it is clear that the significance of
e-resources in library evaluations is undervalued.
Given the issues with statistics and the problems
pointed out, there is a critical need for evaluation indicators and indices that can be classified and verified, with the ultimate goal of enhancing the reliability of overall library evaluation.
Objectives of the study
This study aims to improve the current practice of li-
brary evaluation of e-resources. While the use of
resources in libraries such as Web DB, e-books,
e-journals, and other e-resources including CD-
ROMs, DVDs, and micro materials has been grow-
ing disproportionately, there has not been enough
consideration of these e-resources when evaluating
libraries. At the same time, evaluation indicators for
these e-resources have not been clear. This paper
suggests evaluation indicators and indices to meas-
ure e-resource acquisition and usage accurately, with the following five objectives:
1) It aims to compare and analyze Korean and
overseas evaluation indicators to trace the
changes in e-resources acquisition and use, in
terms of their significance as a proportion of
the overall evaluation of college libraries.
2) It suggests evaluation indicators and indices re-
solving various problems as stated above in
order to reinforce the previously downplayed
role of e-resources in college library evalua-
tions.
3) It revises and verifies the evaluation indicators
suggested in this study using a Delphi
method. A survey is conducted on a 10-mem-
ber Delphi panel consisting of library evalua-
tion specialists to verify the adequacy of the
indicators.
4) It collects opinions from the Delphi panel on
the weight for each indicator and index veri-
fied through the Delphi method to determine
appropriate weights.
5) It applies these weighted evaluation criteria to
university libraries in a pilot study to deter-
mine evaluation scales.
Methodology
This study aims to verify the evaluation criteria for
e-resources, allocate the appropriate weight to each
evaluation criterion, and determine a scale for each
indicator. It was conducted in accordance with the
methodology and procedures shown in Figure 1.
Based on an analysis and comparison of studies of library evaluation indicators, mostly from OECD countries, new evaluation indices and indicators were drafted and later revised using a Delphi method. Initially, the study sought opinions
from the specialists, consisting of professors, spe-
cialized e-resource librarians with more than ten
years in the profession and researchers working for
information and communication institutions; using
these opinions, initial weights were calculated.
Evaluation scales were developed to calculate
evaluation scores of libraries in an actual evaluation.
In this study, a pilot evaluation was also conducted
to establish a realistic scale.
Literature review on university library evaluation indicators
This paper aims to carry out an in-depth study of e-
resource-focused indicators in university library
evaluation in the context of the digital library environment. Accordingly, this section examines existing
studies on 1) university library evaluation indica-
tors, 2) indicators as applied to digital library cir-
cumstances, and 3) specific indicators pertaining to
academic information only. By exploring studies
solely analyzing academic information evaluation
indicators, this study can design a methodology for
specified indicators on e-resources.
Researchers such as Suh (1996), Chang (1997) and
Yoon (2001a) conducted studies related to the de-
velopment of university library evaluation indicators.
Suh (1996) reviewed current academic accreditation
and its evaluation system. She pointed out problems
such as the weight given to the library sector and
evaluation indicators in academic accreditation stand-
ards by the University Education Council. In another
study, Suh (2004) analyzed the American Academic
Accreditation System and reviewed recent trends
in university library evaluation standards. Chang
(1997) developed a model of qualitative evaluation
of university library services based on an analysis
of existing studies. In the model, Chang established
25 indicators in 7 categories. Yoon (2001a) con-
ducted a study to compare and analyze university li-
brary evaluation indicators in Korea and abroad, and
attempted to develop an evaluation model which can
measure and evaluate printed materials and digital
information, physical collections and remote ses-
sions, human-centered reference services and tech-
nical information services, and information storage
and information gateways. Based on a review of the
literature, Yoon (2001b) proposed an overall evalua-
tion model including rate, index, indicators, weight,
and criteria.
Kwak and Lee (2002) argued that it is necessary
to develop new evaluation indicators which can re-
flect recent digital environments for library operation
and service. They attempted to develop evaluation
indicators for varying internal and external informa-
tional and operational environments for university li-
braries in the context of the digital environment. Cho
(2003) discussed the development of an evaluation
system of inter-library research information resource
sharing, and argued that it is necessary to activate
inter-library research information resource sharing.
As part of the Making of America (MoA) Project,
Mead and Gay (1995) conducted a study proposing
a concept-mapping method to evaluate digital librar-
ies, complementing two other proposed evaluation
standards forwarded by the Interactive Multimedia
Group (IMG) and the Digital Library Working
Group (DLWG). Lancaster (1997) argued that digital
library evaluation is different from traditional
evaluation of libraries and suggested appropriate
evaluation standards and methods for digital library
circumstances. Similarly, Shim and Kantor (1999) in-
sisted that traditional evaluation methods need to
change as printed materials become digitized. They
argued that it is necessary to describe how library
inputs have changed to library outputs in the form of
services in order to explain the evolution of library
evaluation.
Figure 1. Methodology for the Study.



In terms of digital library evaluation, Saracevic
(2000) provided evaluation elements, evaluation level
selection, evaluation standards, indexes, and methods as requirements for digital library evaluation.
In addition, Lilly (2001) usefully distinguished three
distinct aspects of evaluation for virtual library in-
formation resources: intellectual, physical and tech-
nical.

Library evaluation indicators in Korea and abroad
The International Organization for Standardization
(ISO) announced ISO 11620 in April 1998 as an in-
ternational standard for library performance evalua-
tion of all kinds of libraries, and ISO/TR 20983 in
2003 as performance indicators for e-libraries (see
Table 1).
The Management Information for the Electronic
Library (MIEL) 2 Project emphasizes the change of
information use from library resources to network
access due to the establishment of e-libraries. The
MIEL 2 Project comprehensively deals with five
main issues: integration, user satisfaction, delivery,
efficiency, and economy (see Table 2) (Brophy &
Wynne 1997).
The EQUINOX project is a study on library per-
formance indexes supported by the European Com-
mission. This project aims to develop indexes for
newly emerged e-libraries (Brophy et al. 2000). The
indexes developed in this project are as follows:
1) Percentage of the population reached by elec-
tronic library services
2) Number of sessions on each electronic library
service per member of the target population
3) Number of remote sessions on electronic li-
brary services per member of the population
to be served
4) Number of documents and entries (records)
viewed per session for each electronic library
service
5) Cost per session for each electronic library
service
6) Cost per document or entry (record) viewed
for each electronic library service
7) Percentage of information requests submitted
electronically
8) Library computer workstation use rate
9) Number of library computer workstation hours
available per member of the population to be
served
10) Rejected sessions as a percentage of total
attempted sessions
11) Percentage of total acquisitions expenditure
spent on acquisition of electronic library ser-
vices
12) Number of attendances at formal electronic li-
brary service training lessons per member of
the population to be served
13) Library staff developing, managing and pro-
viding ELS and user training as a percentage
of total library staff
14) User satisfaction with electronic library ser-
vices.

The Standards for College Libraries (SCL) pre-
pared under the auspices of Association of College
and Research Libraries (ACRL) of the American Li-
brary Association provides quantitative and qualita-
tive methods to evaluate libraries' efficiency. At the
same time, there are 11 input sections and 6 output
sections as comparative elements between university
libraries (ACRL 2004). The Academic Library Stand-
ards (ALS) developed by the National Center for
Education Statistics (NCES) provides for the evalua-
tion of service, collections, staff, budget, and e-serv-
ice in order to best meet the current conditions and
needs of university libraries (Brophy & Wynne
1997). The Korean Library Association has categorized its evaluation standards into six sectors; more weight is given to the user service and resources sectors, and quantitative and qualitative measurements are combined in a comprehensive evaluation (see Table 3) (Korean Library Association 2004).
Evaluation indicators provided by the Korean Council for University Education are similar to those of the Korean Library Association; however, they tend to emphasize quantitative measurement. In terms of resources, materials are categorized into separate volumes, periodicals, and non-publications, and collections are assessed by both quantity and quality. To prevent the evaluation from counting volumes alone, collection quality is evaluated based on purchase cost (see Table 4) (Korean Council for University Education 2004).
Table 1. ISO 20983 Evaluation Indicators

Sector: e-Library Service
- General: e-service user rate; information facility expenses rate on e-information collection; number of publications downloaded per session; cost per database session; failed session rate; remote OPAC session rate
- Information: hits; virtual visits rate out of all users
- Inquiry and Reference Service: information inquiry rate through electronic methods
- User Training: number of attendances at e-service user training per person
- Public Service Facility: number of hours of computer use per person; number of users per public computer; computer use rate

Sector: Human Resources Flexibility and Use
- Staff Training: regular IT-related training attendance rate per person
- Staff Allocation: rate of library staff serving as e-information service providers and developers
Table 2. MIEL 2 Performance Indicators

- Integration: Strategic Cohesiveness / Resourcing Mechanisms / Planning Processes / Service-user Liaison / Assessment and Audit Mechanisms
- User Satisfaction: Overall User Satisfaction / Document Delivery Services / Information Services / Study Facilities / Information Skills Programme / Infrastructure
- Delivery: Meeting service standards over a given period (local measure) / Meeting development targets over a given period (local measure) / Documents delivered per FTE student during a year (national measure) / Enquiries answered per FTE student during a year (national measure) / Proportion of students receiving post-induction instruction in information-handling skills during a year (national measure) / Library study hours per FTE student during a year (national measure) / PC hours used per annum divided by FTE students / Volumes in collection per FTE student at a given date (national measure) / Proportion of JISC datasets available to users / Total major electronic subscriptions
- Efficiency: Items processed per Library Staff FTE / Total Library Expenditure per Item Processed / Documents Delivered per Library Staff FTE / Total Library Expenditure per Document Delivered / Enquiries Answered per Library Staff FTE / Total Library Expenditure per Enquiry Answered / Total Library Expenditure per Study Hour p.a. / Total Library Expenditure per PC Hour Used per Annum / Volumes in Stock per Library Staff FTE / Total Major Subscriptions per FTE Staff / Total Library Expenditure per Volume in Stock / Total Expenditure per Total Major Subscriptions
- Economy: Total Library Expenditure per FTE / Library Staff Expenditure and Operating Costs per FTE / Space per FTE / PC Hours Available per Annum per FTE Student / FTE per Number of Libraries / Acquisition Costs per FTE / FTE per Professional Library Staff / FTE per Seat / FTE Students per Network PC
Table 3. The Korean Library Association's University Library Evaluation Standards

Sector: User Service (weight 14)
- Loan Performance (3): Annual Loans per Registered Student for Last 3 Years
- Inter-Library Loan and Copy Sharing Performance (3): Annual Average of Inter-Library Loan and Copy Sharing Performance for Last 3 Years
- Service Improvement Performance (2): Improvement Performance for Last 3 Years
- User Training Adequacy (2): User Training Method and Process for Last 3 Years
- Community Service (2): Openness to Community for Last 3 Years
- Research or DB Development Performance for Specialization (2): Research or DB Development Performance for Specialization for Last 3 Years

Sector: Information-Oriented (weight 10)
- Information-Oriented Process and Service System Development Adequacy (2): Library Information-Oriented System Development and Efficiency
- DB Development (6): Bibliographic DB Collection Cost compared to Number of Collections; Professional DB Development Performance; Annual Average of Mutual List and DB Development Performance for Last 3 Years
- Network Subscription and Operation Performance (2)

Sector: Resources (weight 14)
- Collection Adequacy (4): Number of Collected Books per Registered Student; Average Number of Collections Added and Average Cost of Purchased Books per Registered Student for Last 3 Years
- Collection Expansion Performance and Average Purchase Cost per Item (3)
- Periodical Subscription Numbers and Average Cost (5): Number of Kinds of Foreign Periodical Subscriptions and Average Cost per Kind by Academic Department for Last 3 Years; Number of Kinds of Domestic Periodicals Collected by Academic Department for Last 3 Years
- Non-Publication Resources Purchase Adequacy (1)
- Disposal Adequacy (1): Disposal Rate compared to Collection Growth for Last 3 Years

Sector: Human Resources (weight 10)
- Staff Headcount Adequacy (3): Number of Collected Books and Number of Registered Students per Staff Member
- Staff Allocation Adequacy (3): Librarian Rate compared to Total Library Staff and Librarian Certificate Evaluation Rank
- Professional Manager and Director Allocation Adequacy (2): Professional Position Rate of Managers and Directors
- Staff Re-training Adequacy (3): Annual Average Staff Re-education Performance for Last 3 Years

Sector: Facility (weight 6)
- Library Area Adequacy (2): Library Area per Registered Student (m²)
- Reading Room Seat Availability Adequacy (1): Number of Students per Reading Room Seat
- Appropriate Environment Maintenance and User Facility (3): Air Conditioning and Heating System and Other User Facilities

Sector: Budget (weight 6)
- Library Budget Adequacy (3): Library Budget Rate compared to University Total Budget for Last 3 Years
- Resources Purchase Cost (3): Annual Average Resource Purchase Cost per Registered Student for Last 3 Years

Total weight: 60; total indexes: 23
Table 4. University Library Evaluation Standards by the Korea Council for University Education

Sector: Resources
- Collection Performance (4): Number of Collections per Registered Student
- Quality of Collections (4): Average Cost of Separate Volumes Purchased for Last 3 Years
- Collection Expansion Performance (3): Average Number of Collections Added per Registered Student for Last 3 Years
- Non-Publication Purchase Performance (4): Average Purchase Cost for Last 3 Years
- Kinds of Periodical Subscriptions (4): Average Collection per Academic Department
- Quality of Periodicals (4)

Sector: Staff
- Role of Library Dean (4): Degree of Participation in the University's Decision-Making Process
- Adequacy of Librarian Recruitment (%) (4): Library and Reading Promotion Law Enforcement Ordinance Standards
- Adequacy of Number of Service Recipients per Librarian (4): Number of Students per Librarian

Sector: Facility
- Adequacy of Reading Room Seat Numbers (4): Number of Registered Students (Undergraduate + Postgraduate) per Seat
- Library Area Adequacy (4): Library Area per Registered Student (m²)

Sector: Computerizing
- Full Text DB Development Level (4): Number of Full Text DB Pages of Collections
- Available User PC Adequacy (4): Number of Registered Students per PC

Sector: User Service
- Aptitude Use Performance (4): Number of Loans per Registered Student
- Inter-Library Loan Performance (4): Number of Inter-Library Loans (Registered, Requested)
- Adequacy of Publication Loans such as Inter-Library Loan (4): Level of IT Method Use
- User Training Level (4): Annual Number of User Training Sessions

Sector: Budget
- Library Budget Adequacy (4): Library Budget compared to University Total Budget
- Material Purchase Cost Adequacy (4): Book Purchase Cost per Registered Student


The KERIS evaluation indicators (see Table 5)
have been developed to measure the research and
education capability of university libraries and to
define the role of university libraries clearly. By pro-
viding appropriate functional evaluation of university libraries, these indicators can improve the library's status within the
university, expand budget support, and foster short-
and long-term planning for university libraries.
Finalised evaluation indicators
Table 6 illustrates 3 evaluation sectors, 11 indexes
and 22 indicators, finalized through three rounds of
the Delphi survey conducted on a 10-member panel
during the course of this study.
It is important to define the range of e-resources
when evaluating them. In this study, e-resources re-
fers to resources that are expressed in electronic for-
mat, ranging from Web DB, e-journals and e-books
to visual resources, LD (laser disc), DVDs, video-
tapes and cassette tapes. The rational for this wide
range of e-resources is as follows. First, the collec-
tions development policy of the American Library
of Congress (1999) defines electronic resources as
anything that can be accessed through computers,
including online materials and all materials in elec-
tronic format, including physical materials such as
Table 5. KERIS University Library Evaluation Standards

1. Library Operation (40), importance 8
- 1.1 Library Plan (3): Existence of Library Plan
- 1.2 User Constitution (1): Existence of User Constitution Establishment and Use
- 1.3 Job Provision (1): Existence of Job Provision Codification
- 1.4 Planning (2): Existence of Planning Team
- 1.5 Library PR (1): Library PR Performance

2. Human Resources (50), importance 10
- 2.1 Staff Rate (6): Number of Service Recipients per Librarian; Permanent Staff Rate out of Total Staff
- 2.2 Staff Training (2): Staff Training Attendance Rate
- 2.3 Head Librarian (2): Adequacy of Head Librarian

3. Information Resources (35), importance 7
- 3.1 Separate Volume (2): Number of Kinds of Separate Volumes per Service Recipient; Number of Separate Volumes Added per Service Recipient
- 3.2 Periodicals (3): Number of Kinds of Periodical Subscriptions per Academic Department; Average Number of Journal Articles Read per Journal; Number of Kinds of Web DB Read per Service Recipient
- 3.3 Full Text DB (1): Number of Pages of Full Text DB Developed per Service Recipient
- 3.4 Disposal (1): Disposal Rate

4. Commodity Resources (90), importance 18
- 4.1 Material Purchase Cost (18): Material Purchase Cost Rate compared to Total University Budget; Material Purchase Cost per Service Recipient

5. Facility Resources (10), importance 2
- 5.1 Library Facility (2): Library Area per Service Recipient; Number of Service Recipients per Reading Room Seat

6. Use of Information Resources (25), importance 5
- 6.1 Use of Collection of Books (2): Average Number of Loans per Service Recipient
- 6.2 Circulation Rate of Collection of Books (2): Loan Rate compared to Books Collected for Last 3 Years
- 6.3 Use of Web DB (1): Use Rate of Academic Web DB per Service Recipient

7. Information Service (80), importance 20
- 7.1 User Training (4): Adequacy of User Training Method; Number of Annual User Training Sessions
- 7.2 Lecture Liaison Service (2): Existence of Lecture Liaison Service
- 7.3 Mobile Service (2): Level of Mobile Service Support
- 7.4 Reference Information Service (2): Number of Reference Information Services per Service Recipient
- 7.5 Library Use Rate (2): Library User Rate among Service Recipients
- 7.6 Service for Disabled (2): Existence of Service for Disabled Users
- 7.7 Community Contribution (2): Adequacy of Library Service for the Community
- 7.8 Cooperative Network (4): Contribution to Academic Information Sharing and Circulation

Total (330)
Table 6. e-Resources Evaluation Indicators

1. e-Resource Acquisition
- 1.1 e-Resource Purchase Performance: Annual Number of Kinds of Web DB Subscriptions per Service Recipient; Annual Cost of Web DB Subscription per Service Recipient (KRW 1,000); Annual Number of Kinds of e-Journal Subscriptions per Service Recipient; Annual Cost of e-Journal Subscription per Service Recipient (KRW 1,000); Annual Number of Kinds of e-Book Purchases per Service Recipient; Annual Cost of e-Book Purchase per Service Recipient (KRW 1,000); Annual Number of Kinds of other e-Resource Purchases per Service Recipient
- 1.2 e-Resource Development Performance: Annual Number of e-Resource Developments per Service Recipient; Cost of Information Facility per Service Recipient for e-Resource Development for Last 3 Years (KRW 1,000)

2. e-Resource Use
- 2.1 Use of Web DB: Annual Number of Sessions to Web DB per Service Recipient; Annual Number of Web DB Hits per Service Recipient; Annual Number of Web DB Downloads per Service Recipient
- 2.2 Use of e-Journal: Annual Number of Sessions to e-Journal per Service Recipient; Annual Number of e-Journal Downloads per Service Recipient
- 2.3 Use of e-Book: Annual Number of Sessions to e-Book per Service Recipient; Annual Number of e-Book Loans per Service Recipient
- 2.4 Use of Internally Developed e-Resources: Annual Number of Uses of Internally Developed e-Resources per Service Recipient
- 2.5 Use of other e-Resources: Annual Number of Uses of other e-Resources per Service Recipient

3. Environment for e-Resource Use
- 3.1 e-Resource Used Programs: Number of e-Resource Used Programs per Service Recipient
- 3.2 Machines for e-Resource Use: Number of Machines for e-Resource Use per Service Recipient
- 3.3 Use of Machines for e-Resource Use: Annual Hours of Use of Machines for e-Resource Use per Service Recipient
- 3.4 Education for e-Resource Use: Annual Number of e-Resource Education Attendances per Service Recipient

Total (100)

Comments
- Number of accesses => number of sessions; number of searches => hits; number of original copy downloads => number of downloads.
- Machines for e-resource use: computer, printer, DVD player, CD-ROM drive, scanner, video player, and so on.
- Service recipients include university staff and students (undergraduate and postgraduate), but exclude lecture event attendees and community members.
- This research evaluated both the number (or number of kinds) per service recipient and the cost, for quantitative and qualitative evaluation respectively.
- e-Resource development cost includes salaries, facility fees, use expenses and other costs.




CD-ROMs. Second, the International Coalition of
Library Consortia (ICOLC) defines electronic re-
sources as anything that is provided electronically,
such as abstracts, indexing services, e-journals and
other materials, and news delivery services. Finally,
there has been a request from librarians in Korea to
clearly define the range of e-resources (Committee
on Library and Information Policy 2007), and statis-
tical sources in Korea have a relatively specific
range of e-resources (Ahn et al. 2007).
Assigning weights to evaluation indexes and indicators
To assign appropriate weights to different indexes
and indicators, this study undertook a survey of the
same Delphi panel involved in finalizing indicators
to maintain the consistency of the study. The panel
was asked about the appropriate weight for each
evaluation sector, index, and indicator; the final results are shown in Table 7.
As illustrated in the table, based on the panel survey this study assigned a weight of 40% to e-resource acquisition, 45% to e-resource use, and 15% to the environment for e-resource use. Standard
deviation for each category was mostly between 1
and 2 except for a few categories. The weight of
40% assigned to e-resource acquisition was divided
into 33% for e-resource purchase and 7% for
e-resource development. While a few universities have recently engaged in robust e-resource development projects, in most cases purchase accounts for a much larger share than development. The weight of 33% given to
e-resource purchase was divided into 12% for Web
DB, another 12% for e-journals, 6% for e-books,
and the remaining 3% for other types of e-resources.
As costs were considered more significant than counts, qualitative (cost-based) indicators were weighted more heavily than quantitative (count-based) ones.
In the category of e-resource use, the 45% weight
was divided into 16% for Web DB use, another 16%
for e-journal use, 6% for e-book use, 4% for use of
other types of e-resources, and 3% for use of internal-
ly developed e-resources. As for Web DB use, the weight of 16% was then divided into 4% for the number of accesses (sessions), another 4% for the number of searches (hits), and the remaining 8% for full-text downloads. The weights for e-journals and e-books were
also similarly assigned. In the e-resource use environment category, weighted at 15%, education for e-resource use took the highest share at 6%, and the other three indicators took 3% each.
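To make the aggregation concrete, the sketch below shows in Python how panel weights of the kind reported in Table 7 could be averaged and rounded into final integer weights. It is a minimal sketch, not the study's code: the ten panel values per category are illustrative stand-ins, not the actual survey responses.

```python
# A minimal sketch (not the study's code) of the aggregation behind
# Table 7: average each panelist's proposed weight, report the standard
# deviation, and round to the final integer weight per category.
# The ten values per category below are illustrative, not survey data.
from statistics import mean, pstdev

panel_weights = {
    "1. e-Resource Acquisition":         [40, 35, 45, 42, 38, 44, 36, 40, 40, 40],
    "2. e-Resource Use":                 [45, 50, 40, 46, 44, 48, 42, 45, 45, 45],
    "3. Environment for e-Resource Use": [15, 15, 15, 12, 18, 13, 17, 15, 15, 15],
}

final = {}
for name, values in panel_weights.items():
    avg, sd = mean(values), pstdev(values)
    final[name] = round(avg)
    print(f"{name}: average={avg:.1f}, sd={sd:.4f}, final={final[name]}")

# The three sector weights must sum to 100, as in Table 8.
assert sum(final.values()) == 100
```

Running the sketch prints the mean and standard deviation per sector and checks that the rounded sector weights sum to 100, mirroring the totals in Table 8.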
E-resource Evaluation Scale (Proposal)
In the previous sections, existing evaluation indica-
tors were analyzed, based upon which new indicators
were drafted; these were reviewed by experts, were
validated through a 3-round Delphi survey, and
weights were assigned to each indicator. This section
proposes evaluation scales for each indicator based
on the result of a pilot evaluation study conducted
on Korean university libraries using the indicators.
Table 8 illustrates the final version of e-resource
evaluation measurement including evaluation sec-
tors and their weights, evaluation indexes and their
weights and evaluation indicators and their weights.
Remarks below the table indicate
1) Number of access => number of sessions; Num-
ber of search => hits; Number of original copy
download => number of downloads.
2) Machines for e-resource refers to the com-
puters, printers, DVD players, CD-ROM play-
ers, scanners, videos, and other types of de-
vices available for e-resource use.
3) Service recipients includes university staff and
students (undergraduates and post-graduates),
but excludes temporary lecture attendees and
residents of the community.
4) This research evaluated both the cost per use
and the number of uses per service recipient for
qualitative and quantitative evaluations re-
spectively.
This section proposes evaluation scales to calculate libraries' evaluation scores for each category (indicator); to develop scales that can be meaningfully used in actual library evaluations, the study conducted a pilot evaluation of 14 college libraries.
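As a rough illustration of how such scales could be applied, the sketch below maps an indicator value onto one of the five scale bands defined in Table 9 and converts the band into a weighted score. The band cutoffs and the indicator weight come from Tables 8 and 9, but the linear band-to-point rule (band k of 5 earning k/5 of the indicator's weight) and the helper names are assumptions made here for illustration; the paper does not spell out the exact conversion.

```python
# A minimal scoring sketch, not the paper's algorithm. The cutoffs and
# weight below come from Tables 8 and 9; the rule that band k (1..5)
# earns k/5 of the indicator's weight is an assumption for illustration.
from bisect import bisect_right

def scale_band(value: float, cutoffs: list[float]) -> int:
    """Map an indicator value to its band 1..5 given four ascending cutoffs."""
    return bisect_right(cutoffs, value) + 1

def indicator_score(value: float, cutoffs: list[float], weight: float) -> float:
    return weight * scale_band(value, cutoffs) / 5  # assumed linear rule

# Indicator 1.1.1 (kinds of Web DB subscriptions per service recipient,
# weight 5): value = kinds / recipients * 100, per the Table 9 definitions.
value = 45 / 15_000 * 100                               # 0.3 -> top band
print(indicator_score(value, [0.01, 0.1, 0.2, 0.3], weight=5))  # 5.0
```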
E-resource acquisition
E-resource acquisition evaluation looks at the e-resource purchase and development performance of the libraries.
Table 7. Weights Assigned to e-Resources Evaluation Indicators
Evaluation Index and Indicators (columns: Average / Standard Deviation / Final Weight)
1 e-Resource Acquisition 40.0 3.8730 40
1.1 e-Resource Purchase Performance 33.2 3.1875 33
1.1.1 Annual Number of Kinds of Web DB Subscription per Service Recipient 5.0 1.7321 5
1.1.2 Annual Cost of Web DB Subscription per Service Recipient (KRW 1,000) 7.0 2.7203 7
1.1.3 Annual Number of Kinds of e-Journal Subscription per Service Recipient 4.7 1.3454 5
1.1.4 Annual Cost of e-Journal Subscription per Service Recipient (KRW 1,000) 7.0 2.7203 7
1.1.5 Annual Number of Kinds of e-Book Purchase per Service Recipient 3.1 1.5133 3
1.1.6 Annual Cost of e-Book Purchase per Service Recipient (KRW 1,000) 3.5 1.4318 3
1.1.7 Annual Number of Kinds of other e-Resource Purchase per Service Recipient 2.9 1.3000 3
1.1.8 Annual Cost of other e-Resource Purchase per Service Recipient (KRW 1,000) (deleted)
1.2 e-Resource Development Performance 6.8 2.2716 7
1.2.1 Annual Cost of e-Resource Development per Service Recipient (KRW 1,000) (deleted)
1.2.2 Annual Number of e-Resource Development per Service Recipient 3.5 1.6279 4
1.2.3 Cost of Information Facility per Service Recipient for e-Resource Development for Last 3 Years 3.3 1.0050 3
2 E-Resource Use 44.5 4.7170 45
2.1 Use of Web DB 16.1 3.7269 16
2.1.1 Annual Number of Web DB Sessions per Service Recipient 3.8 1.5362 4
2.1.2 Annual Number of Web DB Hits per Service Recipient 4.4 1.2000 4
2.1.3 Annual Number of Web DB Downloads per Service Recipient 7.9 1.8138 8
2.2 Use of e-Journals 16.3 4.5618 16
2.2.1 Annual Number of e-Journal Sessions per Service Recipient 5.1 1.8682 5
2.2.2 Annual Number of e-Journal Downloads per Service Recipient 11.2 3.4583 11
2.3 Use of e-Books 6.3 2.3685 6
2.3.1 Annual Number of e-Book Sessions per Service Recipient 2.2 0.6000 2
2.3.2 Annual Number of e-Book Loans per Service Recipient 4.1 1.9209 4
2.4 Use of Internally Developed e-Resources 3.2 1.5362 3
2.4.1 Annual Use of Internally Developed e-Resources per Service Recipient 3.2 1.5362 3
2.5 Use of other e-Resources 3.6 1.3565 4
2.5.1 Annual Use of other e-Resources per Service Recipient 3.6 1.3565 4
3 Environment for e-Resource Use 15.5 4.7170 15
3.1 e-Resource Used Programs 2.7 1.6155 3
3.1.1 Number of e-Resource Used Programs per Service Recipient 2.7 1.6155 3
3.2 Machines for e-Resource Use 3.3 2.0518 3
3.2.1 Number of Machines for e-Resource Use per Service Recipient 3.3 2.0518 3
3.3 Use of Machines for e-Resource Use 3.6 2.4166 3
3.3.1 Annual Hours of Use of Machines for e-Resource Use per Service Recipient 3.6 2.4166 3
3.4 Education for e-Resource Use 5.9 2.4678 6
3.4.1 Annual Number of e-Resource Education Attendances per Service Recipient 5.9 2.4678 6

Table 8. E-resource Evaluation Sectors, Indexes, Indicators and their Weights

1. e-Resource Acquisition (40)
- 1.1 e-Resource Purchase Performance (33):
  - Annual Number of Kinds of Web DB Subscription per Service Recipient (5)
  - Annual Cost of Web DB Subscription per Service Recipient (KRW 1,000) (7)
  - Annual Number of Kinds of e-Journal Subscription per Service Recipient (5)
  - Annual Cost of e-Journal Subscription per Service Recipient (KRW 1,000) (7)
  - Annual Number of Kinds of e-Book Purchase per Service Recipient (3)
  - Annual Cost of e-Book Purchase per Service Recipient (KRW 1,000) (3)
  - Annual Number of Kinds of other e-Resource Purchase per Service Recipient (3)
- 1.2 e-Resource Development Performance (7):
  - Annual Number of e-Resource Developments per Service Recipient (4)
  - Cost of Information Facility per Service Recipient for e-Resource Development for Last 3 Years (3)

2. Use of e-Resources (45)
- 2.1 Use of Web DB (16):
  - Annual Number of Web DB Sessions per Service Recipient (4)
  - Annual Number of Web DB Hits per Service Recipient (4)
  - Annual Number of Web DB Downloads per Service Recipient (8)
- 2.2 Use of e-Journals (16):
  - Annual Number of e-Journal Sessions per Service Recipient (5)
  - Annual Number of e-Journal Downloads per Service Recipient (11)
- 2.3 Use of e-Books (6):
  - Annual Number of e-Book Sessions per Service Recipient (2)
  - Annual Number of e-Book Loans per Service Recipient (4)
- 2.4 Use of Internally Developed e-Resources (3): Annual Use of Internally Developed e-Resources per Service Recipient (3)
- 2.5 Use of other e-Resources (4): Annual Use of other e-Resources per Service Recipient (4)

3. Environment for e-Resource Use (15)
- 3.1 e-Resource Used Programs (3): Number of e-Resource Used Programs per Service Recipient (3)
- 3.2 Machines for e-Resource Use (3): Number of Machines for e-Resource Use per Service Recipient (3)
- 3.3 Use of Machines for e-Resource Use (3): Annual Hours of Use of Machines for e-Resource Use per Service Recipient (3)
- 3.4 Education for e-Resource Use (6): Annual Number of e-Resource Education Attendances per Service Recipient (6)

Total (100)

Remarks
- Web DB: each DB is counted as one regardless of its size or subscription price; a sub-DB is also counted as one.
- E-journal: e-journals included in a Web DB are not counted as e-journals.
- E-book: when e-books are purchased as a package, the number of different kinds of e-books in the package is counted.
- Other e-resources: e-resources other than Web DBs, e-journals, and e-books, such as CD-ROMs (which are not electronic back-up DBs), DVDs, LDs, videotapes, cassette tapes and other media containing content in digital form.
- Number of accesses => number of sessions; number of searches => hits; number of original copy downloads => number of downloads.
- Machines for e-resource use refers to the computers, printers, DVD players, CD-ROM drives, scanners, video players, and other types of devices available for e-resource use.
- Service recipients include university staff and students (undergraduate and postgraduate), but exclude temporary lecture attendees and residents of the community.
- This research evaluated the cost of e-resources per service recipient for qualitative evaluation and the number of e-resources per service recipient for quantitative evaluation.
- The cost of building e-resources includes labor costs, equipment costs, fees for use and other expenditures.

Some libraries purchase back-files of e-journals (from the first issue through issues published before 1995); since libraries spend part of their budgets on these back-files, they are counted. CD-ROMs/DVDs attached to printed books are also counted. As the numbers are calculated per service recipient, a comparative evaluation of college libraries is possible regardless of the size of the college.
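The per-recipient normalization behind this comparability can be sketched as follows, following the definitions given with Table 9: counts of kinds are normalized as count / recipients * 100, while costs (in units of KRW 1,000) are divided by recipients directly. The two sample libraries below are hypothetical, chosen only to show that libraries of different sizes become directly comparable.

```python
# A minimal sketch, assuming the per-service-recipient definitions given
# with Table 9: counts of kinds are scaled as count / recipients * 100;
# costs (in KRW 1,000) are divided by recipients directly.
def kinds_per_recipient(kinds: int, recipients: int) -> float:
    return kinds / recipients * 100

def cost_per_recipient(cost_krw_1000: float, recipients: int) -> float:
    return cost_krw_1000 / recipients

# Two hypothetical libraries of very different sizes.
small = {"web_db_kinds": 20, "web_db_cost": 120_000, "recipients": 8_000}
large = {"web_db_kinds": 60, "web_db_cost": 450_000, "recipients": 30_000}

for lib in (small, large):
    print(kinds_per_recipient(lib["web_db_kinds"], lib["recipients"]),
          cost_per_recipient(lib["web_db_cost"], lib["recipients"]))
# small: 0.25, 15.0   large: 0.2, 15.0 -> comparable despite size difference
```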
E-resource use
E-resource use evaluation looks at use of Web DB,
e-journals, e-books, internally developed e-resources,
and other e-resources. The statistical data on the
number of sessions, hits and downloads provided by
third parties are considered reliable. Internal or external statistics are used to count the use of other e-resources. CD-ROMs/DVDs attached to printed books
are also counted. E-journals provided free of charge
are not counted.
Environment for e-resource use
For the evaluation of the e-resource use environment,
e-resource used programs, machines for e-resource
use, use of machines for e-resource use, and educa-
tion for e-resource use are evaluated. The number of
machines for e-resource use includes the number of
computers, printers, DVD players, CD-ROM players,
scanners, videos and other types of devices enabling
the use of e-resources. Machines for e-resource use
in this evaluation include only machines acquired with the evaluated library's own budget.
Conclusion
This study aims to improve the current status of electronic resource evaluation in libraries. The cost, importance and use of e-resources have increased dramatically in the digital library environment; however, e-resources have not been sufficiently measured when libraries are evaluated. In other words, while the use of Web DBs, e-books, e-journals and other e-resources such as CD-ROMs, DVDs and micro materials in libraries is increasing, these are not sufficiently considered in library evaluations, which may diminish the reliability and validity of the results. In this light, this study proposes an improved and detailed evaluation of these items in e-resource library evaluation, especially for university libraries. Most existing studies have conducted overall evaluations, which limits how well specific areas can be assessed. This study attempts to contribute to the improvement of library evaluation regarding e-resources and their use.
The findings of this study can be summarized as
follows.
1) A 3-round Delphi survey was conducted to veri-
fy the adequacy of revised indexes and in-
dicators with the specialist panel of current
professors, experienced librarians and pro-
fessional researchers in information and communication institutions.
2) Weights for the verified indexes and indicators
were finalized based on the opinions of the
specialized panel with 40% assigned to e-re-
source acquisition, 45% to e-resource use, and
15% to e-resource use environment.
3) A pilot evaluation was conducted on 14 college
libraries to determine appropriate scales to cal-
culate their total evaluation scores. The pilot
evaluation allowed the range of the evaluation
scale of each indicator to be established re-
alistically for a meaningful library evaluation.
Given the increasing number and importance of
e-resources, the findings of this study are expected
to improve the quality and reliability of digital li-
brary evaluations.
Acknowledgement
This work was supported by a Korea Research Foun-
dation Grant, funded by the Korean Government
(MOEHRD, Basic Research Promotion Fund, KRF-2008-332-H00005).

Notes
1. This committee is sponsored by the Ministry of Culture,
Sports and Tourism in Korea.
2. Here, download is also counted when the user reads the
publication.
Table 9. Evaluation Scales Measuring E-resource Acquisition Performance

1.1 e-Resource Purchase Performance
- 1.1.1 Annual Number of Kinds of Web DB Subscription per Service Recipient: less than 0.01 / 0.01 to less than 0.1 / 0.1 to less than 0.2 / 0.2 to less than 0.3 / 0.3 or higher
- 1.1.2 Annual Cost of Web DB Subscription per Service Recipient (KRW 1,000): less than 10 / 10 to less than 15 / 15 to less than 20 / 20 to less than 25 / 25 or higher
- 1.1.3 Annual Number of Kinds of e-Journal Subscription per Service Recipient: less than 10 / 10 to less than 40 / 40 to less than 70 / 70 to less than 100 / 100 or higher
- 1.1.4 Annual Cost of e-Journal Subscription per Service Recipient (KRW 1,000): less than 10 / 10 to less than 30 / 30 to less than 50 / 50 to less than 70 / 70 or higher
- 1.1.5 Annual Number of Kinds of e-Book Purchase per Service Recipient: less than 10 / 10 to less than 30 / 30 to less than 50 / 50 to less than 70 / 70 or higher
- 1.1.6 Annual Cost of e-Book Purchase per Service Recipient (KRW 1,000): less than 0.5 / 0.5 to less than 1 / 1 to less than 1.5 / 1.5 to less than 2 / 2 or higher
- 1.1.7 Annual Number of Kinds of other e-Resource Purchase per Service Recipient: less than 1 / 1 to less than 2 / 2 to less than 3 / 3 to less than 4 / 4 or higher

1.2 e-Resource Development Performance
- 1.2.1 Annual Number of e-Resource Developments per Service Recipient: less than 1.0 / 1.0 to less than 3.0 / 3.0 to less than 5.0 / 5.0 to less than 7.0 / 7.0 or higher
- 1.2.2 Cost of Information Facility Expense per Service Recipient for e-Resource Development for Last 3 Years (KRW 1,000): less than 0.5 / 0.5 to less than 1 / 1 to less than 1.5 / 1.5 to less than 2 / 2 or higher

Definitions
- Annual Number of Kinds of Web DB Subscription per Service Recipient = Annual Number of Kinds of Web DB Subscription / Number of Service Recipients × 100
- Annual Cost of Web DB Subscription per Service Recipient = Annual Cost of Web DB Subscription / Number of Service Recipients
- Annual Number of Kinds of e-Journal Subscription per Service Recipient = Annual Number of Kinds of e-Journal Subscription / Number of Service Recipients × 100
- Annual Cost of e-Journal Subscription per Service Recipient = Annual Cost of e-Journal Subscription / Number of Service Recipients
- Annual Number of Kinds of e-Book Purchase per Service Recipient = Annual Number of Kinds of e-Book Purchase / Number of Service Recipients × 100
- Annual Cost of e-Book Purchase per Service Recipient = Annual Cost of e-Book Purchase / Number of Service Recipients
- Annual Number of Kinds of other e-Resource Purchase per Service Recipient = Annual Number of Kinds of other e-Resource Purchase / Number of Service Recipients × 100
- Annual Cost of Other e-Resource Purchase per Service Recipient = Annual Cost of Other e-Resource Purchase / Number of Service Recipients (deleted)
- Annual Cost of Information Facility Expense per Service Recipient for e-Resource Development = Annual Cost of Information Facility Expense for e-Resource Development / Number of Service Recipients (deleted)
- Annual Number of e-Resource Developments per Service Recipient = Annual Number of e-Resource Developments / Number of Service Recipients
- Cost of Information Facility Expense per Service Recipient for e-Resource Development for Last 3 Years = Total Cost of Information Facility Expense for e-Resource Development for Last 3 Years / Number of Service Recipients

Remarks
- Back-files of e-journals (from the first issue through issues published before 1995) are counted.
- CD-ROMs/DVDs attached to printed books are also counted.

Table 10. Evaluation Scales Measuring Use of E-resources

2.1 Use of Web DB
- 2.1.1 Annual Number of Sessions to Web DB per Service Recipient: less than 5 / 5 to less than 10 / 10 to less than 20 / 20 to less than 30 / 30 or higher
- 2.1.2 Annual Number of Web DB Hits per Service Recipient: less than 5 / 5 to less than 10 / 10 to less than 20 / 20 to less than 30 / 30 or higher
- 2.1.3 Annual Number of Web DB Downloads per Service Recipient: less than 3 / 3 to less than 6 / 6 to less than 9 / 9 to less than 12 / 12 or higher

2.2 Use of e-Journal
- 2.2.1 Annual Number of Sessions to e-Journal per Service Recipient: less than 5 / 5 to less than 10 / 10 to less than 20 / 20 to less than 30 / 30 or higher
- 2.2.2 Annual Number of e-Journal Downloads per Service Recipient: less than 5 / 5 to less than 10 / 10 to less than 20 / 20 to less than 30 / 30 or higher

2.3 Use of e-Book
- 2.3.1 Annual Number of Sessions to e-Book per Service Recipient: less than 0.5 / 0.5 to less than 1.0 / 1.0 to less than 1.5 / 1.5 to less than 2.0 / 2.0 or higher
- 2.3.2 Annual Number of e-Book Loans per Service Recipient: less than 0.5 / 0.5 to less than 1.0 / 1.0 to less than 1.5 / 1.5 to less than 2.0 / 2.0 or higher

2.4 Use of Internally Developed e-Resources
- 2.4.1 Annual Number of Uses of Internally Developed e-Resources per Service Recipient: less than 1.0 / 1.0 to less than 3.0 / 3.0 to less than 5.0 / 5.0 to less than 7.0 / 7.0 or higher

2.5 Use of other e-Resources
- 2.5.1 Annual Number of Uses of other e-Resources per Service Recipient: less than 1.0 / 1.0 to less than 3.0 / 3.0 to less than 5.0 / 5.0 to less than 7.0 / 7.0 or higher

Definitions
- Annual Number of Sessions to Web DB per Service Recipient = Number of Sessions to Web DB / Number of Service Recipients
- Annual Number of Web DB Hits per Service Recipient = Number of Web DB Hits / Number of Service Recipients
- Annual Number of Web DB Downloads per Service Recipient = Number of Web DB Downloads / Number of Service Recipients
- Annual Number of Sessions to e-Journal per Service Recipient = Annual Number of Sessions to e-Journal / Number of Service Recipients
- Annual Number of e-Journal Downloads per Service Recipient = Number of e-Journal Downloads / Number of Service Recipients
- Annual Number of Sessions to e-Book per Service Recipient = Number of Sessions to e-Book / Number of Service Recipients
- Annual Number of e-Book Loans per Service Recipient = Number of e-Book Loans / Number of Service Recipients
- Annual Number of Uses of Internally Developed e-Resources per Service Recipient = Number of Uses of Internally Developed e-Resources / Number of Service Recipients × 100
- Annual Number of Uses of other e-Resources per Service Recipient = Number of Uses of other e-Resources / Number of Service Recipients × 100

Remarks
- Statistical data on the number of sessions, hits and downloads provided by third parties are in principle considered reliable and available.
- Internal or external statistics are used to count other e-resources.
- CD-ROMs/DVDs attached to printed books are also counted.
- E-journals provided free of charge are in principle not counted.

Table 11. Evaluation Scales Measuring E-resource Use Environment

3.1 e-Resource Used Programs
- 3.1.1 Number of e-Resource Used Programs per Service Recipient: less than 0.01 / 0.01 to less than 0.05 / 0.05 to less than 0.1 / 0.1 to less than 0.15 / 0.15 or higher

3.2 Machines for e-Resource Use
- 3.2.1 Number of Machines for e-Resource Use per Service Recipient: less than 0.01 / 0.01 to less than 0.02 / 0.02 to less than 0.03 / 0.03 to less than 0.04 / 0.04 or higher

3.3 Use of Machines for e-Resource Use
- 3.3.1 Annual Hours of Use of Machines for e-Resource Use per Service Recipient: less than 10 / 10 to less than 30 / 30 to less than 60 / 60 to less than 90 / 90 or higher

3.4 Education for e-Resource Use
- 3.4.1 Annual Number of e-Resource Education Attendances per Service Recipient: less than 0.1 / 0.1 to less than 0.2 / 0.2 to less than 0.3 / 0.3 to less than 0.4 / 0.4 or higher

Definitions
- Number of e-Resource Used Programs per Service Recipient = Number of e-Resource Used Programs / Number of Service Recipients × 100
- Number of Machines for e-Resource Use per Service Recipient = Number of Machines for e-Resource Use / Number of Service Recipients
- Machines for e-resource use: computers, printers, DVD players, CD-ROM drives, scanners, video players, and other types of machines enabling e-resource use
- Annual Hours of Use of Machines for e-Resource Use per Service Recipient = Hours of Use of Machines for e-Resource Use / Number of Service Recipients
- Annual Number of e-Resource Education Attendances per Service Recipient = Number of e-Resource Education Attendances / Number of Service Recipients × 100

Remarks
- Machines for e-resource use refer only to machines acquired with the budget of the evaluated libraries.


References

ACRL. 2004. Standards for College Libraries, 2000 Edition.
Chicago: Association of College and Research Libraries.
Ahn, In-ja, et al. 2007. An Outstanding Issue for a New Prac-
tical Model of Korean Library Statistics. Journal of the Ko-
rean Society for Library and Information Science 41(1):
431–451.
ARL. 2008. ARL Statistics 2005–06. Washington, D.C.: As-
sociation of Research Libraries. URL: http://www.arl.org/
stats/annualsurveys/arlstats/arlstats06.shtml [viewed 2 Octo-
ber 2009]
Brophy, Peter & Peter M Wynne. 1997. Management Informa-
tion for the Electronic Library (MIEL) Programme: Final
Report, University of Central Lancashire, Centre for Re-
search in Library & Information Management.
Brophy, P., et al. 2000. EQUINOX: Library Performance Meas-
urement and Quality Management System: Performance
Indicators for Electronic Library Services. URL: http://
equinox.dcu.ie/reports/pilist.html [viewed 20 March 2008].
Chang, HR. 1997. Developing a Quality-based Evaluation Model
for Academic Libraries. Journal of the Korean Society for
Library and Information Science 31(4): 165–186.
Cho, J. 2003. Developing the Evaluation System of University
Library for Activating Inter-library Research Information
Resource Sharing. Journal of the Korean Library and In-
formation Science 34(4): 67–82.
Committee on Library and Information Policy. 2007. Library
Management Evaluation Policy Study Project Report. Seoul:
Committee on Library and Information Policy, sponsored by the Ministry of Culture, Sports and Tourism.
International Coalition of Library Consortia (ICOLC). 1998.
Statement of Current Perspective and Preferred Practices
for the Selection and Purchase of Electronic Information.
URL: http://www.library.yale.edu/consortia/statement.html
[viewed 3 October 2009]
Korea Council for University Education. 2004. University Ac-
creditation Review System. Seoul: Korea Council for Uni-
versity Education.
Korea Education & Research Information Service (KERIS).
2005. A Study on Evaluation System Improvement in Uni-
versity Libraries. Seoul: KERIS.
Korea Library Association. 2004. Korea Library Statistics.
Seoul: Korea Library Association.
Kwak, BH., & DY. Lee. 2002. A Study on Developing Evalua-
tion Indicators of University Libraries. Journal of the Ko-
rean Society for Library and Information Science 19(4):
257–296.
Lancaster, FW. 1997. Evaluation in the Context of the Digital
Library. In Toward a World Library: 19th International
Essen Symposium, 23–26 September 1996, ed. AH. Helal & JW. Weiss, 156–167. Essen: Essen University Library.
Library of Congress. 1999. Electronic resources. URL: http://
www.loc.gov/acq/devpol/electron.html [viewed 3 October
2009]
Lilly, EB. 2001. Evaluation of the Virtual Library Collection.
In: DP. Wallace, & C. Van Fleet, eds. Library Evaluation: A
Casebook and Can-Do Guide. Englewood, NJ: Libraries
Unlimited, Inc.
Mead, J., & Gay, G. 1995. Concept Mapping: An Innovative
Approach to Digital Library Design and Evaluation. URL:
http://www.ideals.illinois.edu/bitstream/handle/2142/2279/
mead.htm [viewed 22 October 2009]
NCES. 1999. Coverage Evaluation of the Academic Library
Survey: Technical Report. U. S. Department of Education.
URL: http://nces.ed.gov/pubs99/1999330.pdf [viewed 22 Oc-
tober 2009]
NCES. 2006. Academic Libraries: 2006. URL: http://nces.ed.
gov/pubs2008/2008337.pdf [viewed 22 October 2009]
Saracevic, T. 2000. Digital Library Evaluation: Toward an
Evaluation of Concept. Library Trends 49(2): 350–369.
Shim, W., & PB. Kantor. 1999. Evaluation of Digital Libraries:
A DEA Approach. In: Knowledge Creation, Organization
and Use: Proceedings of the 62nd ASIS Annual Meeting,
Washington, D.C., October 31–November 4, 1999: 605–615.
Shim, W. 2009. Strategies for Expanding Foreign Electronic
Scholarly Information Resources. Journal of the Korean So-
ciety for Library and Information Science 43(1): 293–311.
Suh, H-R. 1996. A Study on the Evaluation of Academic Li-
braries in the Context of Accrediting Universities in Korea.
Journal of the Korean Society for Library and Information
Science 30(1): 19–34.
Suh, H-R. 2004. Recent Trends of the Assessment of Academic
Library Services in the Context of American Regional Ac-
creditation Standards. Journal of the Korean Biblia Society
for Library and Information Science 15(2): 255–270.
Yoon, H-Y. 2001a. Multiplicity of Meaning and Directivity of
Evaluation Indicators for Academic Libraries. Journal of the
Korean Library and Information Science Society 32(3): 91–115.
Yoon, H-Y. 2001b. A Study on the Development of a Com-
prehensive Evaluation Model for Korean University Librar-
ies. Journal of the Korean Library and Information Science
Society 32(4): 45–75.

Received 14 May 2009; revised version received 7 August 2009;
accepted 7 August 2009.
