About the Event
In recent years, immense progress has been made in the development of recommendation, retrieval, and personalization techniques. The evaluation of these systems, however, is still based on traditional metrics, e.g., precision, recall, or RMSE, which often ignore the use case and situation of the system. The rapid evolution of novel IR and recommender systems fosters the need for new evaluation paradigms.

This workshop serves as a venue for work on novel, personalization-centric benchmarking approaches to evaluating adaptive retrieval and recommender systems. New evaluation approaches for such systems should assess both functional and non-functional requirements. Functional requirements go beyond traditional relevance metrics and focus on user-centered utility metrics such as novelty, diversity, and serendipity. Non-functional requirements concern performance and technical aspects, e.g., scalability and reactivity. The aim of this workshop is to foster research on the evaluation of adaptive retrieval and recommender systems, thus joining benchmarking efforts from the information retrieval and recommender systems communities.
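To make the contrast between these metric families concrete, here is a minimal Python sketch. It is illustrative only: the function names, the genre-overlap dissimilarity, and all values are assumptions for this example, not part of the workshop's materials. It computes two traditional accuracy metrics named above (precision/recall at k, RMSE) alongside one user-centered utility metric (intra-list diversity).

```python
import math

def precision_recall_at_k(recommended, relevant, k):
    # Traditional accuracy metrics: share of the top-k items that are
    # relevant, and share of the relevant items recovered in the top k.
    top_k = recommended[:k]
    hits = len(set(top_k) & set(relevant))
    precision = hits / k
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

def rmse(predicted, actual):
    # Root mean squared error between predicted and observed ratings.
    return math.sqrt(
        sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)
    )

def intra_list_diversity(recommended, dissimilarity):
    # A user-centered utility metric: mean pairwise dissimilarity of the
    # recommended items; higher values indicate a more diverse list.
    pairs = [(a, b) for i, a in enumerate(recommended) for b in recommended[i + 1:]]
    if not pairs:
        return 0.0
    return sum(dissimilarity(a, b) for a, b in pairs) / len(pairs)

# Hypothetical usage with a toy genre-overlap dissimilarity (Jaccard distance).
genres = {"a": {"drama"}, "b": {"drama", "comedy"}, "c": {"sci-fi"}}

def dissim(x, y):
    return 1.0 - len(genres[x] & genres[y]) / len(genres[x] | genres[y])

print(precision_recall_at_k(["a", "b", "c"], {"a", "c", "d"}, k=3))  # (0.667, 0.667)
print(rmse([3.5, 4.0], [4.0, 3.0]))                                 # ~0.791
print(intra_list_diversity(["a", "b", "c"], dissim))                # ~0.833
```

Passing the dissimilarity function as a parameter keeps the diversity metric independent of any particular item representation, which matches the workshop's premise that evaluation should adapt to the use case at hand.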
Call for Papers

Scope

We invite the submission of papers reporting relevant research on the benchmarking and evaluation of recommendation and adaptive IR systems. We welcome submissions presenting contributions within this scope, addressing topics including new metrics.
Important Dates
  • August 1, 2013: Conference date
  • August 1, 2013: Registration deadline

Organizer
Association for Computing Machinery (ACM)