Monday, September 24, 2012

A Study of E-Resource Usage Stats

After my experience building SUSHI_PY, which you can read about elsewhere on this blog, I thought it would be worthwhile to find out how other libraries are faring at collecting e-resource usage stats. I wrote up the results in an article that will appear in the November 2012 issue of Computers in Libraries, and I'll post a copy here once the time period specified in my contract with CIL has passed. Until then, here is a breakdown of the data, along with an Excel download of the raw responses.

1. How often does your institution use the following methods for gathering usage data?

Method | Never | Sometimes | Often
Manually collecting COUNTER data from vendor websites | 3 (2.3%) | 23 (17.7%) | 104 (80.0%)
Manually collecting non-COUNTER data from vendor websites | 8 (6.3%) | 51 (39.8%) | 69 (53.9%)
SUSHI via a commercial electronic resource management system | 93 (75.0%) | 16 (12.9%) | 15 (12.1%)
SUSHI via a custom implementation | 113 (94.2%) | 4 (3.3%) | 3 (2.5%)
A custom usage counting system your library maintains on its own | 90 (75.0%) | 17 (14.2%) | 13 (10.8%)
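
Given how few respondents harvest via a custom SUSHI implementation, it may be worth showing how small such a harvester can be. Here is a minimal Python sketch along the lines of SUSHI_PY: it POSTs a SOAP ReportRequest envelope to a vendor's SUSHI server and saves the COUNTER XML that comes back. The endpoint URL, requestor ID, and customer ID are placeholders, and the report name, release number, and SOAPAction value all vary by vendor, so treat this as a starting point rather than a working client.

```python
# Minimal sketch of "SUSHI via a custom implementation" (NISO Z39.93):
# POST a SOAP ReportRequest and save the COUNTER XML response.
import urllib.request

ENDPOINT = "https://sushi.example.com/SushiService"  # placeholder endpoint

ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:sus="http://www.niso.org/schemas/sushi">
  <soap:Body>
    <sus:ReportRequest>
      <sus:Requestor><sus:ID>my-requestor-id</sus:ID></sus:Requestor>
      <sus:CustomerReference><sus:ID>my-customer-id</sus:ID></sus:CustomerReference>
      <sus:ReportDefinition Name="JR1" Release="3">
        <sus:Filters>
          <sus:UsageDateRange>
            <sus:Begin>2012-01-01</sus:Begin>
            <sus:End>2012-06-30</sus:End>
          </sus:UsageDateRange>
        </sus:Filters>
      </sus:ReportDefinition>
    </sus:ReportRequest>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    ENDPOINT,
    data=ENVELOPE.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": '"SushiService:GetReportIn"'},  # check vendor WSDL
)
with urllib.request.urlopen(request) as response:
    with open("jr1_response.xml", "wb") as out:
        out.write(response.read())  # COUNTER XML, ready to parse or archive
```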

2. Approximately how much time per year does your institution spend gathering usage statistics for electronic resources such as databases, e-books, and e-journals?

Time spent per year | Responses
Less than 1 week | 14 (10.8%)
1 week | 17 (13.1%)
2 weeks | 31 (23.8%)
3 weeks | 15 (11.5%)
4 weeks | 25 (19.2%)
More than 4 weeks | 28 (21.5%)

3. How useful are the following metrics to your institution for evaluating electronic resource usage?

Metric | Not useful (0) | Somewhat useful (1) | Very useful (2) | N/A | Average Score
Full-text retrievals by journal/book | 4 (3.1%) | 28 (21.7%) | 94 (72.9%) | 3 (2.3%) | 1.71
Full-text retrievals by database/collection | 4 (3.1%) | 31 (23.7%) | 91 (69.5%) | 5 (3.8%) | 1.69
Full-text retrievals by platform | 46 (35.7%) | 39 (30.2%) | 39 (30.2%) | 5 (3.9%) | 0.94
Sessions by journal/book | 42 (32.1%) | 56 (42.7%) | 25 (19.1%) | 8 (6.1%) | 0.86
Sessions by database/collection | 27 (20.6%) | 52 (39.7%) | 49 (37.4%) | 3 (2.3%) | 1.17
Sessions by platform | 63 (48.5%) | 36 (27.7%) | 23 (17.7%) | 8 (6.2%) | 0.67
Searches by journal/book | 24 (18.9%) | 55 (43.3%) | 42 (33.1%) | 6 (4.7%) | 1.15
Searches by database/collection | 9 (6.9%) | 46 (35.1%) | 73 (55.7%) | 3 (2.3%) | 1.50
Searches by platform | 50 (39.1%) | 51 (39.8%) | 21 (16.4%) | 6 (4.7%) | 0.76
Turnaways by journal/book | 26 (19.8%) | 65 (49.6%) | 32 (24.4%) | 8 (6.1%) | 1.05
Turnaways by database/collection | 22 (16.8%) | 69 (52.7%) | 31 (23.7%) | 9 (6.9%) | 1.07
Turnaways by platform | 60 (46.2%) | 46 (35.4%) | 15 (11.5%) | 9 (6.9%) | 0.63
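
For readers wondering how the Average Score column is derived, here and in question 4 below: it is the mean of the responses weighted by the values shown in the column headers, with N/A responses excluded from the denominator. The published figures check out against that formula:

```python
# Average Score = mean of responses weighted by the column-header values:
# Not useful = 0, Somewhat useful = 1, Very useful = 2, N/A excluded.
def average_score(not_useful, somewhat_useful, very_useful):
    answered = not_useful + somewhat_useful + very_useful  # N/A left out
    return (0 * not_useful + 1 * somewhat_useful + 2 * very_useful) / answered

# Two rows from the table above:
print(round(average_score(9, 46, 73), 2))   # Searches by database/collection -> 1.5
print(round(average_score(4, 28, 94), 2))   # Full-text retrievals by journal/book -> 1.71
```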

4. How useful are the following types of COUNTER reports to your institution?

Report | Not useful (0) | Somewhat useful (1) | Very useful (2) | N/A | Average Score
JR1 – Full-Text Article Requests by Month and Journal | 3 (2.3%) | 22 (17.2%) | 97 (75.8%) | 6 (4.7%) | 1.77
JR2 – Access Denied to Full-Text Articles by Month, Journal, and Category | 35 (27.1%) | 62 (48.1%) | 21 (16.3%) | 11 (8.5%) | 0.88
JR3 – Number of Successful Item Requests by Month, Journal, and Page Type | 46 (35.7%) | 50 (38.8%) | 19 (14.7%) | 14 (10.9%) | 0.77
JR4 – Total Searches Run by Month and Collection | 36 (28.1%) | 41 (32.0%) | 39 (30.5%) | 12 (9.4%) | 1.03
JR5 – Number of Successful Full-Text Article Requests by Year-of-Publication and Journal | 29 (22.5%) | 56 (43.4%) | 30 (23.3%) | 14 (10.9%) | 1.01
DB1 – Total Searches, Result Clicks, and Record Views by Month and Database | 11 (8.6%) | 32 (25.0%) | 78 (60.9%) | 7 (5.5%) | 1.55
DB2 – Access Denied by Month, Database, and Category | 33 (25.8%) | 63 (49.2%) | 24 (18.8%) | 8 (6.3%) | 0.93
DB3/PR1 – Total Searches, Result Clicks, and Record Views by Month and Platform | 44 (34.1%) | 43 (33.3%) | 28 (21.7%) | 14 (10.9%) | 0.86
BR1 – Number of Successful Title Requests by Month and Title | 11 (8.5%) | 40 (31.0%) | 67 (51.9%) | 11 (8.5%) | 1.47
BR2 – Number of Successful Section Requests by Month and Title | 30 (23.4%) | 51 (39.8%) | 34 (26.6%) | 13 (10.6%) | 1.03
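
JR1 tops the list, and as question 1 showed, most respondents still collect it by hand from vendor sites. Totaling a downloaded JR1 doesn't have to be manual, though. Here is a rough sketch; the header-row count and monthly column positions are assumptions that differ across vendors and COUNTER releases, so check them against the actual file before trusting the numbers.

```python
# Rough sketch: total monthly full-text requests from a JR1 CSV export.
# HEADER_ROWS, MONTH_START, and MONTH_END are assumptions; vendor exports
# differ in header length, column order, and trailing YTD-total columns.
import csv

HEADER_ROWS = 7    # assumed rows of report titles/dates before the data
MONTH_START = 5    # assumed first monthly column (after title/ISSN columns)
MONTH_END = 17     # assumed end of the 12 monthly columns (YTD columns follow)

totals = {}
with open("jr1_export.csv", newline="") as f:
    for row in list(csv.reader(f))[HEADER_ROWS:]:
        if not row or not row[0].strip():
            continue  # skip blank or summary rows
        months = row[MONTH_START:MONTH_END]
        totals[row[0]] = sum(int(c) for c in months if c.strip().isdigit())

# Print the ten highest-use journals.
for journal, count in sorted(totals.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{count:6d}  {journal}")
```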

5. What are the greatest challenges and frustrations your institution faces when collecting electronic resource usage data? (open response)

Theme | Responses
Vendor compliance and consistency | 66
Time consumption and poor interfaces | 71
Granularity of reporting | 7
Difficulty of cost-per-use analysis | 10
Platform changes | 12
Time periods (calendar vs. fiscal, month vs. year) | 9

6. What changes would make COUNTER statistics more useful in analyzing electronic resource usage? (open response)

Theme | Responses
Better vendor implementation | 29
Including access points and IP addresses | 6
Time periods (calendar vs. fiscal, month vs. year) | 6
Availability of automation tools (SUSHI, ERM, emails, etc.) | 15
Including cost-per-use | 6
Hiding non-subscribed titles from reports | 3
More granular full-text reporting | 9
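
Cost-per-use shows up as a pain point in question 5 and as a wished-for COUNTER feature here. The arithmetic itself is trivial; what respondents describe as hard is lining up invoice costs with usage for the same titles and time period. A toy example with made-up figures:

```python
# Toy cost-per-use calculation with made-up figures: annual subscription
# cost divided by the year's COUNTER full-text requests (e.g. a JR1 total).
annual_cost = 4500.00        # assumption: invoice amount for the year
full_text_requests = 1800    # assumption: JR1 full-text total, same period

cost_per_use = annual_cost / full_text_requests
print(f"${cost_per_use:.2f} per full-text request")  # $2.50 per full-text request
```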

The raw data download below contains the complete responses to the open-response questions. All identifying information has been removed; if you notice a name or email address that was left in an open response, please let me know so I can remove it.

Download Raw Data

1 comment:

  1. Super cool, Josh. Thanks for doing the research and sharing your data.
