This full-day workshop, co-located with the HCOMP 2018 conference, aims to bring together a latent community of researchers who treat disagreement (and subjectivity and ambiguity) as signal rather than noise, and who induce and derive value from uncertainty, ambiguity, and disagreement. The workshop will include invited talks, short technical talks, and a discussion of medium- and long-term challenges to fuel future work. We invite participants from fields such as computer science, information science, law, communication science, and political science, as well as those working on human computation and crowdsourcing. Solutions to these problems will benefit from a diverse set of perspectives.
Ambiguity creates uncertainty in practically every facet of crowdsourcing. This includes the information presented to workers as part of a task, the instructions for what to do with it, and the information they are asked to provide. Beyond lexical ambiguities, ambiguity can result from missing details, contradictions, and subjectivity. Subjectivity may stem from differences in cultural context, life experiences, or individual perception of hard-to-quantify properties. All of these can leave workers with conflicting interpretations, leading to results that requesters (including the end-users of crowd-powered systems) would regard as "wrong".
Historically, the human computation community has largely attributed disagreement to low-quality workers. This led to mathematical approaches intended to minimize the supposed noise. Strategies included aggregation (e.g., majority voting, expectation maximization), linguistic approaches, statistical filtering, and incentive design, among many others. These are all applied after the data is collected.
Another line of work draws on interaction design and computer-supported collaborative work to refine task designs until disagreement is minimized. This is akin to the methodology of task refinement using inter-rater reliability in linguistic annotation and social content analysis (Krippendorff, 2013). Here the focus is on minimizing the perceived ambiguity or subjectivity before the data has been collected.
Lora Aroyo, Vrije Universiteit Amsterdam
Anca Dumitrache, Vrije Universiteit Amsterdam
Praveen Paritosh, Google
Alex Quinn, Purdue University
Chris Welty, Google
We invite and encourage interdisciplinary submissions from the broad spectrum of crowdsourcing and human computation research, in fields and application areas dealing with (but not limited to) the topics below:
Submissions may present ideas, results from experimental studies, methodologies, work-in-progress, and/or applications of systems that explore the topics of subjectivity, ambiguity, and disagreement in crowdsourcing and human computation in relation to computational systems, applications, or services. All submissions should follow the main conference formatting guidelines.
July 2, 2018
July 5, 2018
Submission deadline
Notification of acceptance
Registration deadline