
  1. Does COUNTER tell the whole story? Case-by-case examples demonstrating the limitations of COUNTER, and suggestions for alternative evaluation metrics
  CARLI Spring Forum on Collections Data Analysis and Maintenance
  Governors State University, April 28th, 2017
  Jonathan Shank, Acquisitions & E-Resources Librarian
  Northwestern University, Galter Health Sciences Library

  2. Disclaimer

  3. Institutional Context
  Galter Health Sciences Library
  • Serves Northwestern's Feinberg School of Medicine in Chicago
  • Administratively separate from University Library in Evanston
    - Cost sharing with Evanston on big deal agreements
    - Separate standalone subscriptions and a medical-specific collection
  • Member of CARLI, but not part of I-Share or the union Voyager catalog
  • Entire NU system migrated to Alma in Summer 2015
  • Galter maintains a custom Primo front-end
  • Currently in a transitional phase for handling COUNTER
    - No ERMS or usage client; efforts currently focused on JR1 stats
    - Usage functionality coming to Alma this summer

  4. COUNTER usage statistics: What works well
  • Standard format, impressive data set and BIG numbers
  • “Consistency” across vendors
  • Ease of use for cost-per-use (CPU) analysis
  • Increasing compliance among vendors
  • Growing interoperability
  • Iterative improvements with each new release
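The CPU analysis mentioned above is straightforward arithmetic, which is part of COUNTER's appeal. A minimal sketch, with hypothetical costs and download counts:

```python
# Minimal cost-per-use (CPU) sketch: annual subscription cost divided by
# annual JR1 full-text downloads. All figures below are made up.

def cost_per_use(annual_cost: float, jr1_downloads: int) -> float:
    """Return cost per full-text download; zero-use titles get infinity."""
    if jr1_downloads <= 0:
        return float("inf")  # flag zero-use titles for closer review
    return annual_cost / jr1_downloads

subscriptions = {
    "Journal A": (1200.00, 480),  # (annual cost, JR1 downloads)
    "Journal B": (3500.00, 70),
    "Journal C": (900.00, 0),
}

for title, (cost, uses) in subscriptions.items():
    cpu = cost_per_use(cost, uses)
    label = f"${cpu:.2f} per use" if uses else "no recorded use"
    print(f"{title}: {label}")
```

The zero-use case matters in practice: as the later false-negative example shows, a near-zero JR1 figure is a prompt to investigate, not an automatic cancellation signal.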

  5. COUNTER usage statistics: What works well
  Active and engaged community of librarians, publishers and vendors.

  6. COUNTER usage statistics: What doesn’t work so well
  • Merging multiple providers and platforms, unless you have an aggregator client (e.g., UStat, 360 Resource Manager, CORAL)
  • Manual retrieval of reports
    - Still necessary despite major improvements from SUSHI
    - Login credentials must be stored and maintained, which is difficult with shared licenses
  • Issues with accuracy and title consistency for historical titles and for title changes, splits and merges
    - Stats may be inaccurate or useless as a result
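The merging problem above is mostly a normalization problem. A sketch of combining title-level totals across providers, assuming each report has been trimmed to two columns, "Journal" and "Reporting Period Total" (real JR1 files carry extra header rows and monthly columns that would need handling first):

```python
# Sketch: merge JR1-style reports from multiple providers into one
# title-level total. The two-column CSV layout is a simplifying
# assumption, not the actual JR1 file format.
import csv
from collections import defaultdict

def merge_jr1(paths):
    totals = defaultdict(int)
    for path in paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # Normalize so "Journal of X" matches across vendors;
                # real data also needs ISSN matching for splits/merges
                title = row["Journal"].strip().lower()
                totals[title] += int(row["Reporting Period Total"])
    return dict(totals)
```

Even this toy version shows why title changes, splits and merges break the numbers: string normalization alone cannot link a journal's old and new names, which is exactly the accuracy issue the slide describes.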

  7. COUNTER usage statistics: What doesn’t work so well
  • Occasional issues with accuracy, compliance and reliability
  • Overlapping accounts, IP ranges and multiple access points can inflate or deflate numbers
  • No distinction by location, school, department, or affiliation
  • Not available for some resources

  8. COUNTER usage statistics: What doesn’t work so well
  Individual usage is a relatively flat or static indicator of impact and value.
  “Statistics are a measurement of users’ actions that we try to correlate to their intentions.” - Oliver Pesch, EBSCO Publishing

  9. Specific Examples
  Demonstrating the limitations of COUNTER

  10. Example 1: Inflated numbers
  Numbers can be inflated by a publisher’s interface and platform design
  • Some platforms load HTML full text automatically; if the user then clicks the PDF, the download can be counted twice
  • Some linking mechanisms like CrossRef allow publishers to choose the linking level, e.g. link to TOC, abstract, HTML, or PDF
  • COUNTER is continuously working to improve and resolve these issues
  • Publisher interference, or at the very least optimization for high stats, is still possible

  11. Example 2: IP issues
  Incorrect IP information can distort figures
  • On the vendor side, most usage in COUNTER reports is ultimately attributed to accounts based on IP addresses
  • According to a recent study/audit, 58% of the IP addresses held by publishers to authenticate libraries are wrong (Spence, PSI Ltd)

  12. Example 3: Problems distinguishing locations
  COUNTER still has limitations with location- or account-specific reporting
  • IPs often overlap between departments, schools and campuses, making usage indistinguishable by location
    - NU has campuses in Evanston, Chicago and Qatar with overlapping IPs
    - Content at NU is licensed by several different entities for different groups of users
  • Accounts themselves also overlap in locations and access entitlements, which are lumped together in COUNTER
  “There is no single way [outlined in the COUNTER code of practice] for providers to categorize usage transactions to capture reporting by subsets.” - Project COUNTER

  13. Example 3: Problems distinguishing locations
  Overview of NU’s Elsevier landscape
  [Table: Elsevier products (EMBASE, ClinicalKey, ScienceDirect, Scopus, Cell Press, ClinicalKey Nursing) cross-tabulated against NU entities (GHSL, NUL, NMH, LCH), showing which entities license and which access each product]

  14. Example 4: Lack of context or normalization
  Not all usage is created equal, but it’s treated equally
  • An undergraduate padding out the works cited for an English 101 paper vs. a faculty member conducting research for a major grant or high-impact publication
  • Usage and information-seeking behaviors may vary widely by discipline, research area, or department

  15. Example 5: False negatives
  Journal of Dermatological Science
  • Journal is licensed by Galter Library through Elsevier’s ClinicalKey
    - Showed only 1 full-text download in ClinicalKey’s 2016 JR1
  • Citation analysis indicated the journal was cited 46 times by NU scholars in the same time period, an obvious discrepancy
  • Title is also available through NUL’s ScienceDirect Freedom Collection
    - 397 full-text downloads in ScienceDirect’s 2016 JR1
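Discrepancies like this can be surfaced systematically by comparing citation counts against COUNTER downloads per title. A sketch, where the threshold and the second title are illustrative choices (the Journal of Dermatological Science figures are the ones from the slide):

```python
# Sketch: flag possible false negatives - titles whose local citation
# counts far exceed their COUNTER-reported downloads. The ratio of 10
# is an arbitrary illustrative threshold.

def flag_false_negatives(titles, ratio=10):
    """Yield titles cited at least `ratio` times per recorded download."""
    for name, downloads, citations in titles:
        if citations >= ratio * max(downloads, 1):
            yield name

# First entry uses the figures from this slide; "Journal B" is made up.
data = [
    ("Journal of Dermatological Science (ClinicalKey)", 1, 46),
    ("Journal B", 400, 30),
]
print(list(flag_false_negatives(data)))
```

Flagged titles are candidates for the kind of manual follow-up described on the slide: checking whether usage is simply happening on another platform before treating the low number as real.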

  16. Alternative usage metrics
  Substitutes and supplements for COUNTER

  17. Alternative usage metrics: Proxy logs
  • Pros
    - Data is potentially stored in one place with a single access point
    - Possibility to capture user affiliation, domain or location
    - Integration with Google Analytics or other log analysis tools
  • Cons
    - Initial setup is manual and can be complicated
    - Some programming knowledge may be required
    - Not all traffic goes through the proxy (on campus, VPN, etc.)
    - Not all institutions have a single proxy server
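The "some programming knowledge" con above is modest in practice. A sketch of tallying platform hits from proxy logs, assuming EZproxy-style Apache common log lines; the sample lines and host names are made up for illustration:

```python
# Sketch: count hits per target platform from proxy log lines.
# Assumes an Apache common-log-style layout (EZproxy default is
# similar); real deployments vary and may need a different pattern.
import re
from collections import Counter

LOG_LINE = re.compile(r'^(\S+) \S+ (\S+) \[[^\]]+\] "(?:GET|POST) (\S+)')

def tally_hosts(lines):
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue  # skip malformed lines rather than fail
        ip, user, url = m.groups()
        # Reduce the requested URL to its host, e.g. www.sciencedirect.com
        host = re.sub(r'^https?://([^/]+).*$', r'\1', url)
        hits[host] += 1
    return hits

sample = [
    '10.0.0.5 - jdoe [28/Apr/2017:10:00:01 -0500] "GET http://www.sciencedirect.com/science/article/pii/X HTTP/1.1" 200 512',
    '10.0.0.9 - asmith [28/Apr/2017:10:02:11 -0500] "GET http://www.sciencedirect.com/ HTTP/1.1" 200 128',
]
print(tally_hosts(sample))
```

Because the log line also carries the authenticated username and client IP, this is where the affiliation and location detail that COUNTER lacks can potentially be recovered.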

  18. Alternative usage metrics: Link resolver logs, stats and analytics
  • Pros
    - Can be much easier to retrieve, depending on your resolver
      • Alma has some functionality built into Analytics, with more coming in the next release
    - Generally found to correlate closely with COUNTER stats
    - Potential to capture user affiliation, domain, and/or location
  • Cons
    - Manual setup may be required
    - Does all of your traffic really go through the link resolver?
      • Galter routes PubMed traffic back to its customized resolver

  19. Alternative usage metrics: Citation data
  • Pros
    - Identifies usage based on actual research output; demonstrates impact
    - Depending on how it’s collected, data can be normalized and contextualized by school, subject or research area
    - Could identify low-use, high-impact titles and save them from cancellation
  • Cons
    - Not as useful for non-research-oriented institutions (e.g., liberal arts and community colleges)
    - Doesn’t capture scholarly usage outside of publishing
  • Galter team is currently working on a project in this area for NASIG, stay tuned!

  20. Main takeaways
  No single metric is a silver bullet
  • Useful to have multiple evaluation metrics to check against
    - Outliers or anomalies from one metric can be investigated further with others
    - Different metrics for different titles
  • Institutional context plays a large role
    - Systems, licensing, and locations
    - Mission of the school, level of research activity

  21. Questions?

  22. Thank You!
  j-shank@northwestern.edu
  @ShankLib

  23. References
  Bennett, N. (2015). “Could we ever get rid of usage statistics?” Insights, 28(1), 83–84. http://doi.org/10.1629/uksg.222
  Davis, P. M., & Price, J. S. (2006). “eJournal interface can influence usage statistics: Implications for libraries, publishers, and Project COUNTER.” Journal of the American Society for Information Science and Technology, 57(9), 1243–1248.
  De Groote, S. L., Blecic, D. D., & Martin, K. (2013). “Measures of health sciences journal use: A comparison of vendor, link-resolver, and local citation statistics.” Journal of the Medical Library Association: JMLA, 101(2), 110.
  Haustein, S. (2012). Multidimensional Journal Evaluation: Analyzing Scientific Periodicals beyond the Impact Factor. De Gruyter.
  Kennedy, M. R., & LaGuardia, C. (2013). Marketing Your Library’s Electronic Resources: A How-To-Do-It Manual for Librarians. American Library Association.
  Orcutt, D. (2010). Library Data: Empowering Practice and Persuasion. Libraries Unlimited.
  Rathemacher, A. J. (2010). “E-journal usage statistics in collection management decisions: A literature review.” In D. Orcutt (Ed.), Library Data: Empowering Practice and Persuasion (pp. 71–89). Libraries Unlimited.
  Stamison, C., Niemeyer, T., & Tucker, C. (2009). “Usage statistics: The perks, perils and pitfalls.” Proceedings of the Charleston Library Conference. http://dx.doi.org/10.5703/1288284314761
