Assessing the scholarly impact of ImageCLEF




  1. Assessing the scholarly impact of ImageCLEF
     Theodora Tsikrika, Alba García Seco de Herrera, Henning Müller
     University of Applied Sciences Western Switzerland (HES-SO), Sierre, Switzerland
     CLEF 2011, Sept 21, 2011

  2. Evaluation campaigns
     • Enable reproducible and comparative evaluation through the use of standardised resources and common evaluation methodologies
     • Benefits:
       • Provide access to evaluation infrastructure
       • Build reusable resources for benchmarking
       • Promote exchange of ideas
       • Encourage collaboration and interaction
     • Resulting in:
       • Development of new approaches
       • Increased quality of evaluation methodologies
       • Advancement of the field

  3. Evaluation campaigns: assessing impact?
     • Assess the impact of the research they foster
       • research that would otherwise not have been possible
     • Scientific impact:
       • Scholarly impact, filed patents, …
     • Economic impact:
       • Technology transfer, time & effort saved for researchers, …

  4. Evaluation campaigns: impact assessment
     • B. R. Rowe, D. W. Wood, A. N. Link, and D. A. Simoni. Economic impact assessment of NIST's Text REtrieval Conference (TREC) Program. Technical Report, Project Number 0211875, RTI International, 2010.
     • C. V. Thornley, A. C. Johnson, A. F. Smeaton, and H. Lee. The scholarly impact of TRECVid (2003–2009). JASIST, 62(4):613–627, 2011.

  5. Objectives
     • Assess the scholarly impact of ImageCLEF
     • Compare different methods of performing such an assessment

  6. ImageCLEF: cross-language image retrieval

  7. Assessing the scholarly impact
     • Perform bibliometric analysis
       • Obtain derived publications and their citations
       • Calculate metrics (e.g., h-index)
     • Sources for publication and citation data
       • Thomson Reuters (ISI) Web of Science
       • Scopus
       • Google Scholar
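The h-index mentioned on this slide is straightforward to compute from a list of per-paper citation counts; a minimal sketch (the citation counts in the example are invented, purely for illustration):

```python
def h_index(citations):
    """h-index: the largest h such that at least h papers
    have at least h citations each."""
    # Rank papers by citation count, highest first
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Hypothetical citation counts for a small set of derived publications
print(h_index([10, 8, 5, 4, 3, 1]))  # → 4
```

Tools such as Publish or Perish compute this (and related metrics) automatically over the retrieved publication list.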

  8. Comparison of data sources and tools
     • Thomson Reuters (ISI) Web of Science
       + Coverage of more than 10,000 journals
       − Very limited coverage of conference proceedings
       + High-quality citation data
       + Provides metrics
     • Scopus
       + Coverage of more than 18,000 titles
       + Includes many conference proceedings
       − Citation coverage only after 1996
       + High-quality citation data
       + Provides metrics
     • Google Scholar
       + Wider coverage that includes additional journals and conference proceedings, plus books, technical reports, white papers, etc.
       − Errors in citation data
       − Does not provide metrics
       + … but Publish or Perish (PoP) does


  10. Google Scholar shortcomings
     • Several entries for the same publication
       • Due to misspellings
       • May deflate citation counts
       • PoP allows manual merging of publications
     • Grouping citations of different papers
       • With similar titles and author lists
       • E.g., conference and journal versions of a paper
       • Requires manual data cleaning
     • Inability to correctly identify the publication year
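The manual merging that PoP supports can be roughly approximated by grouping entries under a normalised title key; a sketch (the sample records are invented, not real Google Scholar data):

```python
import re
from collections import defaultdict

def normalise(title):
    """Lowercase and strip non-alphanumerics so differently
    punctuated or spaced variants of a title collapse to one key."""
    return re.sub(r"[^a-z0-9]", "", title.lower())

def merge_entries(entries):
    """Group (title, citations) records by normalised title
    and sum the citation counts within each group."""
    merged = defaultdict(int)
    for title, cites in entries:
        merged[normalise(title)] += cites
    return dict(merged)

# Invented Google Scholar-style records, including a duplicate entry
records = [
    ("Overview of the ImageCLEF 2005 task", 40),
    ("Overview of the ImageCLEF 2005 Task.", 3),   # same paper, split entry
    ("The CLEF cross-language track", 12),
]
print(merge_entries(records))
```

Note the flip side the slide warns about: such grouping can also wrongly merge distinct papers with near-identical titles (e.g. conference and journal versions), which is why manual cleaning remains necessary.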

  11. Dataset of ImageCLEF publications
     • ImageCLEF papers in CLEF working notes
       • not indexed by Scopus
     • ImageCLEF papers in CLEF proceedings
       • published in the year following the workshop
     • Papers describing ImageCLEF resources published elsewhere
       • written by ImageCLEF organisers
     • Papers using ImageCLEF resources published elsewhere
       • written by groups that have participated, just registered without submitting runs, or acquired the data at a later stage


  13. ImageCLEF publications and citations
     • ~70% of citations are from papers not in CLEF proceedings
     • 8.62 citations per paper on average
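The per-paper average can be reproduced from the totals reported on the conclusions slide (249 publications, 2,147 Google Scholar citations), assuming those totals are the basis of the 8.62 figure:

```python
publications = 249  # ImageCLEF papers in the dataset (conclusions slide)
citations = 2147    # Google Scholar citations (conclusions slide)

average = citations / publications
print(round(average, 2))  # → 8.62
```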

  14. Overview vs. Participants' papers
     [Charts: CLEF proceedings; All (CLEF proceedings + ImageCLEF resources)]
     • 90% of the papers with # citations ≥ h-index are overviews

  15. The 2005 overview paper
     • Single overview for both medical and general image tasks
     • Half-life of approximately three years

  16. General vs. Medical image annotation and retrieval
     • Publications in the medical domain have slightly higher impact

  17. General vs. Medical image annotation and retrieval
     [Charts: publications and citations per domain, from Scopus and PoP]
     • 2006–2008 publications in the medical domain have high impact

  18. Citations per task
     • Peak in the second or third year of operation
     • Followed by a decline
       • … unless there is a major overhaul of the task
     • Tasks with the greatest impact:
       • Photographic retrieval
       • Medical image retrieval
       • Medical annotation

  19. Conclusions and Future Work
     • Preliminary analysis shows important impact of ImageCLEF
       • 249 publications
       • 303 citations in Scopus
       • 2,147 citations in Google Scholar
     • Scopus vs. Google Scholar
       • Both have advantages and limitations
     • Next steps:
       • Automate the process (as much as possible)
       • Include working notes (~500 papers in total)
       • Include ImageCLEF-derived papers (~1,000 papers in total)
       • Perform the analysis for the whole of CLEF
