Summing up


  1. Summing up. Mark Sanderson

  2. Summing up? • What did we talk about? • Where to next?

  3. Content • Main conference • EVIA

  4. EVIA – highlights • First Mongolian test collection • New patent collection • CLEF • Examination of evaluation orthodoxies • Sakai and Robertson • Scholer et al. • Karlgren

  5. NTCIR Clusters • Advanced CLIA • CCLQA • IR4QA • Focused domain • Patent translation • Patent mining • MOAT • MUST

  6. IR4QA and CCLQA • Great collaboration • Not the first • TREC – SDR, Blog • Not the last • GRID CLEF

  7. Patent • Mining and translation • Patent processing long worked on • CLIR finally important?

  8. CLIR – finally • Yahoo image search • Domains • Patent • Legal • Google

  9. Knowledge management • MOAT • Novel text analysis • MUST • Good to include other types of researchers at an IR forum • Is there a way to integrate more research groups?

  10. Why have campaigns? • Research together • Build community • Learn how to evaluate • Build collections

  11. Where to next? • Diversity • Diversify use • Users • Look out for plateaus • Cross-campaign evaluation?

  12. Diversity • Different people want different relevant documents

  13. SIGIR

  14. Diversity collections • CLEF • imageCLEF • New TREC web track • NTCIR?

  15. Diversify use • Hundreds of test collections • Very few used in big conferences • Problem with reviewing? • Problem with researchers?

  16. Smith & Kantor 2008 16 • Show some users • Google 1-10 Google 1 10 • Google 301-310 • Users equally effective Users equally effective • 301-310 more searches 24/12/2008

  17. Lessons? • What are test collections predicting? • Consider interaction more

  18. Plateaus

  19. Crazy idea • Cross-campaign evaluation? • Cooperation within NTCIR, TREC • How about across NTCIR, CLEF and TREC?

  20. Thank you • Noriko Kando • Tetsuya Sakai • NII
